How do you make scientific collaboration across colleagues and continents run smoothly? That is the question computer scientist Ewa Deelman, a research director at USC ISI, works on. As scientists grapple with myriad data points and pull in data from sensors all over the world, they need to work collaboratively and utilize distributed resources to perform complex scientific computations. Instead of reinventing the wheel for each project, Deelman creates computational tools that let scientists collaborate. One could say she builds the complex cyber ‘plumbing’ that lets data flow freely between researchers and be crunched easily, advancing scientific knowledge.
Deelman’s systems have been leveraged by the Nobel Prize-winning scientists who first directly detected gravitational waves, as well as by biologists and seismologists. Her lab at USC ISI will now lead a pilot study for a potential Cyberinfrastructure Center of Excellence. Collaborating with computer scientists at the University of North Carolina’s Renaissance Computing Institute, the University of Utah, Indiana University, and the University of Notre Dame, she will work on developing a cyberinfrastructure blueprint that could support scientists working on various high-profile National Science Foundation (NSF) programs.
The two-year effort, supported by NSF’s Office of Advanced Cyberinfrastructure in the Directorate for Computer and Information Science and Engineering and the Division of Emerging Frontiers in the Directorate for Biological Sciences, aims to support the sharing of best practices, software solutions, and architectures from more than 20 of NSF’s largest facilities. The project team hopes to learn from programs such as the Large Synoptic Survey Telescope, a wide-field survey telescope under construction that will photograph the entire available sky; OOI, a networked ocean research observatory helping scientists study coastal regions with instruments such as autonomous underwater vehicles; IceCube, a neutrino detector at the South Pole; and NEON, a research platform designed to study the biosphere and conduct real-time ecological studies at the scales required to address grand challenges in ecology. The computer scientists will study how these large-scale projects develop, re-use, and deploy complex cyberinfrastructure; how they develop and nurture their cyberinfrastructure workforce; and how they meet the needs of the thousands of scientists they serve.
The participating team of computer scientists will engage with these large-scale NSF projects, which often have collected data over 20 to 30 years, to discover commonalities in their data collection, processing, and dissemination methods and to facilitate knowledge sharing and community building centered around large facilities’ cyberinfrastructure. Next, they will develop a model for a Cyberinfrastructure Center of Excellence that can help enhance the existing cyberinfrastructure of current facilities and help new facilities and projects leverage the existing wealth of knowledge and solutions.
“The NSF Large Facilities have invested a tremendous amount of effort developing sophisticated cyberinfrastructure that delivers data and computing to their scientists. They have also nurtured and trained a talented cyberinfrastructure workforce. Our team aims to capture and augment this knowledge and experience, and provide a forum for exchanging this knowledge among the Large Facilities and other cyberinfrastructure projects and users. I am very much looking forward to working with my colleagues on designing a strategic plan for a Cyberinfrastructure Center of Excellence that can have an impact not only on the large NSF projects but also on the diverse and numerous communities they serve,” said Deelman.
Project website: http://cicoe-pilot.org