Cracking the Code to Help Planet Earth


50 Years of ISI: How ISI has used computer science to mitigate the adverse effects of climate change, aid natural disaster detection and management, and support paleoclimatologists

[Image: a heart-shaped green leaf on a computer circuit board. Photo credit: Weerapatkiatdumrong/Getty Images]

Climate change is a modern problem, and it requires modern solutions. AI, deep learning, knowledge graphs and cyberinfrastructure are all tools that researchers at USC’s Information Sciences Institute have been leveraging for the past two decades in remediation efforts.

From detecting wildfires from space to modeling flash floods in Dallas-Fort Worth, ISI has pursued research, both independently and with collaborators, to protect our environment from further damage and from natural disasters caused by climate change.

Modeling for Natural Disaster Management  

The effective management of natural disasters relies on sound decision-making, namely: where, when and how to deploy limited resources most efficiently. AI and computer science are especially helpful to that end, enabling automated reasoning and modeling that streamline those decisions.

In the case of wildfires, one preemptive mitigation strategy firefighters use is the controlled burn of flammable brush to reduce the available tinder that feeds megafires. To help emergency responders perform this safely and effectively, ISI collaborated with UC San Diego and Florida’s Tall Timbers Research Station to deliver BurnPro3D, a decision-support platform that helps the fire response and mitigation community quickly and accurately understand the risks and tradeoffs presented by a fire, so they can plan controlled burns more effectively.

Yolanda Gil, ISI’s Senior Director for Strategic Artificial Intelligence and Data Science Initiatives, explains that AI can perform automated reasoning about factors like wind speed and direction, slope, and vegetation type and density, quickly assembling accurate models of how a controlled fire will evolve under different initial conditions. Firefighters can then use the platform to plan burns and anticipate how they will progress.
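
To make the concept concrete, here is a minimal sketch of a grid-based fire-spread simulation; it is not BurnPro3D’s actual model, and every coefficient and variable name is an illustrative assumption.

    import numpy as np

    # Illustrative cellular-automaton step for fire spread (NOT BurnPro3D's model).
    # Fire jumps to a neighboring cell with a probability that grows with fuel
    # density, upslope angle, and wind alignment; all coefficients are made up.
    def spread_probability(fuel, slope_deg, wind_alignment,
                           base=0.2, k_fuel=0.5, k_slope=0.02, k_wind=0.3):
        """Per-step probability that fire spreads into a cell."""
        p = base + k_fuel * fuel + k_slope * slope_deg + k_wind * wind_alignment
        return np.clip(p, 0.0, 1.0)

    def step(burning, fuel, slope_deg, wind_alignment, rng):
        """Advance the fire front by one time step on a 2D grid."""
        # A cell is a candidate if any 4-neighbor is already burning.
        neighbors = (np.roll(burning, 1, 0) | np.roll(burning, -1, 0) |
                     np.roll(burning, 1, 1) | np.roll(burning, -1, 1))
        candidates = neighbors & ~burning & (fuel > 0)
        p = spread_probability(fuel, slope_deg, wind_alignment)
        return burning | (candidates & (rng.random(burning.shape) < p))

    rng = np.random.default_rng(42)
    shape = (50, 50)
    burning = np.zeros(shape, dtype=bool)
    burning[25, 25] = True                   # single ignition point
    fuel = rng.uniform(0.1, 1.0, shape)      # vegetation density per cell
    slope = rng.uniform(0.0, 10.0, shape)    # degrees uphill toward the cell
    wind = rng.uniform(-1.0, 1.0, shape)     # -1 = against wind, +1 = with wind
    for _ in range(20):
        burning = step(burning, fuel, slope, wind, rng)
    print(f"Cells burned after 20 steps: {burning.sum()}")

Running a simulation like this many times under different initial wind and fuel conditions is what lets a planner compare burn scenarios before striking a match.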

Modeling is effective in other settings as well, such as predicting flash floods. Since 2008, NOAA (the National Oceanic and Atmospheric Administration) and the National Weather Service have used a flash flood forecasting system in Dallas-Fort Worth that incorporates parameters like soil moisture and temperature, permeability, vegetation, and the primary forcing mechanism, rainfall. However, the overwhelming volume of data created the need for Pegasus, a workflow management system developed by Principal Scientist Ewa Deelman and her team at ISI.

“Pegasus automatically chains dependent tasks together, so that a single scientist can complete complex computations that once required many different people,” explained Deelman. “It compiles not only software and data, but also the expertise of the people who developed the software.” 
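
As a rough illustration of that chaining, here is a minimal sketch in the spirit of the Pegasus 5.x Python API; the job names, executables, and files are hypothetical, and a real workflow would also declare sites, transformations, and replica catalogs.

    from Pegasus.api import *

    # Hypothetical two-step flood-forecasting workflow. Pegasus infers the
    # dependency between the two jobs from the shared intermediate file.
    rainfall = File("rainfall.nc")
    soil_state = File("soil_state.nc")
    streamflow = File("streamflow.nc")

    preprocess = (Job("preprocess_forcing")     # hypothetical executable
                  .add_inputs(rainfall)
                  .add_outputs(soil_state))

    forecast = (Job("run_hydrology_model")      # hypothetical executable
                .add_inputs(soil_state)
                .add_outputs(streamflow))

    wf = Workflow("flash-flood-forecast")
    wf.add_jobs(preprocess, forecast)
    wf.write("workflow.yml")   # ready to plan and submit with pegasus-plan

Because dependencies are inferred from the files each job consumes and produces, a scientist describes what each step needs rather than hand-ordering the computation.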

The model takes precipitation data and other parameters as input, and outputs streamflow, runoff, and timing estimates valuable for flash flood forecasting. These outputs allow flash floods to be mapped; when areas on the map exceed set thresholds, targeted flash flood alerts are sent to people in the risk area, as well as to city emergency managers and stormwater management personnel.
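
The alerting step itself reduces to a threshold comparison. The sketch below shows the general idea; the basin names, thresholds, and forecast values are hypothetical, not NOAA’s operational configuration.

    # Illustrative threshold check for flood alerts; basin names, thresholds,
    # and forecast values are hypothetical.
    FLOOD_THRESHOLDS_CMS = {       # threshold streamflow, cubic meters/second
        "upper_trinity": 120.0,
        "mountain_creek": 85.0,
        "white_rock": 60.0,
    }

    def flood_alerts(forecast_cms):
        """Return alert messages for basins whose forecast exceeds threshold."""
        alerts = []
        for basin, flow in forecast_cms.items():
            threshold = FLOOD_THRESHOLDS_CMS.get(basin)
            if threshold is not None and flow > threshold:
                alerts.append(f"FLASH FLOOD ALERT: {basin} forecast "
                              f"{flow:.0f} cms exceeds threshold {threshold:.0f} cms")
        return alerts

    print(flood_alerts({"upper_trinity": 150.0, "white_rock": 40.0}))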

Deep Learning and Disaster Detection

Although remediation and mitigation research is vital, such efforts rely on the rapid detection of climate events and natural disasters. This is often the challenge with wildfires, which can spread faster than they can be detected. Enter SARFire, a rapid wildfire detection solution that uses a novel synthetic aperture radar approach.

“Synthetic Aperture Radar is a remote sensing imaging modality that produces images that depend on the reflectivity of the region of interest,” explain ISI researchers Andrew Rittenbach and JP Walters. SAR image collection begins with the emission of radar signals from an aircraft, satellite, or UAV, which are reflected off the region of interest and received by an antenna. This signal is then transmitted to a ground station, where the echo data is processed into an image using an image formation algorithm.
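
For intuition, here is a highly simplified time-domain backprojection sketch, one classic image formation approach, though not necessarily the algorithm SARFire uses; array shapes and radar parameters are illustrative.

    import numpy as np

    C = 3e8            # speed of light, m/s
    WAVELENGTH = 0.03  # radar wavelength in meters (illustrative)

    def backproject(echoes, platform_pos, pixels, t_axis):
        """Simplified time-domain backprojection (not SARFire's algorithm).

        echoes:       (n_pulses, n_samples) complex echo data
        platform_pos: (n_pulses, 3) antenna position at each pulse
        pixels:       (n_pixels, 3) ground coordinates to image
        t_axis:       (n_samples,) fast-time sample instants
        """
        image = np.zeros(len(pixels), dtype=complex)
        for pulse, pos in zip(echoes, platform_pos):
            r = np.linalg.norm(pixels - pos, axis=1)   # range to each pixel
            delay = 2.0 * r / C                        # round-trip travel time
            sample = np.interp(delay, t_axis, pulse)   # echo value at that delay
            # Phase correction so returns from the same pixel add coherently.
            image += sample * np.exp(4j * np.pi * r / WAVELENGTH)
        return np.abs(image)                           # reflectivity estimate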

Awarded the 2020 Keston Research Award for their work, Rittenbach and Walters designed SARFire to address the limitations of previously used satellites, whose imagery lacked definition. Unlike its predecessors, SAR can capture a clear picture of a region even during a raging fire or when the area is covered in smoke.

“We believe that when real-time SAR imagery is used in conjunction with data collected from other remote sensing satellites, we will be able to rapidly detect, localize, and monitor wildfires with resolution on the meter scale, improving the imaging resolution used for early wildfire detection by nearly 1,000 times beyond what is currently used, while also substantially reducing detection time, greatly increasing the chance for early wildfire detection and thus limiting the damages caused by it,” they add.  

Communicating for the Environment: Databases and Networks  

With no standardization and no database to house it all, the volume of data available to paleoclimatologists has become a roadblock rather than an advantage for remediation.

“Life without standards is miserable!” noted Julien Emile-Geay, an associate professor in the Department of Earth Sciences at USC Dornsife. “Imagine needing a different plug type for every single item in your house—that’s currently the state of paleoclimate data, forcing early-career folks who want to integrate their data to spend months of their life reinventing the wheel every time they do something.”

To mitigate this, Emile-Geay worked with Yolanda Gil to create the Paleoclimate Community reporTing Standard (PaCTS). Described as a socio-technical system, PaCTS uses AI to draw links between data from different disciplines, making them more accessible. The researchers also set a standard for this idiosyncratic data, built on three elements: data representation, vocabulary, and reporting requirements.

“We also construct what we call the ‘Linked Earth knowledge graph’ that expresses connections among datasets, researchers, locations, publications, etc.” explained Gil. 
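
In practice, a knowledge graph stores such connections as subject-predicate-object triples. The sketch below uses the open-source rdflib library with a made-up namespace and hypothetical names; it is not the actual Linked Earth ontology.

    from rdflib import Graph, Literal, Namespace, URIRef

    # Hypothetical triples in the spirit of the Linked Earth knowledge graph;
    # the namespace, predicates, and names are illustrative, not the real ontology.
    EX = Namespace("http://example.org/linkedearth/")

    g = Graph()
    dataset = EX["dataset/ExampleRecord"]
    g.add((dataset, EX.collectedBy, Literal("J. Researcher")))
    g.add((dataset, EX.locatedAt, Literal("Sierra Nevada, CA")))
    g.add((dataset, EX.describedIn, URIRef("https://doi.org/10.0000/example")))

    # Query: everything the graph knows about this dataset.
    for _, predicate, obj in g.triples((dataset, None, None)):
        print(predicate, "->", obj)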

Paving the way for PaCTS was the George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES), first designed at ISI in 2001. Funded by the National Science Foundation, NEES is a first-of-its-kind network that allows engineers from a range of disciplines to pool their knowledge and experience to study earthquakes and design public infrastructure to withstand them.

“Normally, geotechnical engineers, structural engineers, tsunami researchers, and so on never even sit in the same room together,” explained ISI’s Informatics Systems Research Division Director and William M. Keck Chair in Engineering Carl Kesselman, who organized the workshop where NEES was born. “The fact that so many different types of civil engineers are now talking with computer scientists about how to build a common networked infrastructure is going to advance not only earthquake engineering research but practice as well.” 

NEES also served as a launching pad for NEESGrid, a database that stores research findings and enables communication among researchers. Connected by Internet2, a network that moves data far faster than the conventional Internet, NEESGrid allows researchers to plan collaborative experiments addressing earthquake management.

Published on August 30th, 2022

Last updated on October 21st, 2022
