Convergence Science and Research

I-GUIDE was envisioned on the principles of convergence science and merges the expertise of dozens of scholars and practitioners in pursuit of its goals.

"Convergence research intentionally brings together intellectually diverse researchers to develop effective ways of communicating across disciplines. As experts from different disciplines pursue a common research challenge, their knowledge, theories, methods, data and research communities increasingly intermingle." - National Science Foundation

Read more about some of our recent achievements.

Integrating Privacy-Protecting Practices into Workflows

A challenge: Maintaining data confidentiality is a key tenet of ethical data practices and may be a mandatory obligation for data access and usage. What are effective approaches to protecting location information when the data themselves must be placed onto maps, and how can these approaches be applied in a readily replicable and robust manner?

Our approach: Yue Lin of The Ohio State University, one of our 2021-22 I-GUIDE Community Champions, focused her efforts on two frequently used types of data: point locations of individuals and Census data. For each case, she created a Jupyter Notebook demonstrating how techniques such as geomasking and differential privacy address the key requirements for protecting individuals.
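
The notebooks themselves are available on CyberGISX. As a rough illustration of the two techniques (this is our sketch, not Lin's implementation), the Python example below applies a simple donut geomask to a point location and the Laplace mechanism to a census-style count; the coordinates, counts, and parameter values are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(42)  # fixed seed so the demo is repeatable

def donut_geomask(lat, lon, min_m=100.0, max_m=1000.0):
    """Displace a point by a random distance between min_m and max_m
    meters in a random direction -- a simple 'donut' geomask."""
    bearing = rng.uniform(0.0, 2.0 * np.pi)
    dist = rng.uniform(min_m, max_m)
    # Rough meters-to-degrees conversion near the point
    dlat = dist * np.cos(bearing) / 111_320.0
    dlon = dist * np.sin(bearing) / (111_320.0 * np.cos(np.radians(lat)))
    return lat + dlat, lon + dlon

def dp_count(true_count, epsilon=1.0):
    """Release a count under epsilon-differential privacy via the Laplace
    mechanism; a count query has sensitivity 1, so the noise scale is 1/epsilon."""
    return true_count + rng.laplace(0.0, 1.0 / epsilon)

# Hypothetical inputs: one residence location and one tract-level count
print(donut_geomask(40.0019, -83.0197))
print(dp_count(1234, epsilon=0.5))
```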

Our impact: Technical solutions with low barriers to integration into existing workflows are more likely to see consistent use. Jupyter Notebooks are a common tool of spatial data scientists, and by producing and openly distributing these notebooks on our CyberGISX website, we encourage their adoption and integration. Highlighting known, effective digital strategies for protecting privacy is a component of our I-GUIDE Geoethics initiative. Our contributions enable practitioners to go beyond posing questions about ethics and to make ethical intent a professional practice.


Expanding the I-GUIDE Platform as a Community Tool

A challenge: Researchers face a plethora of cyberinfrastructure tools and services when choosing a solution for their data management and computational challenges. In data-intensive research workflows, the barriers to locating, wrangling, and engineering data, and then transferring those data for computational processing, are significant. Current cyberinfrastructure tools solve only some of these challenges, or solve them only partially, and easy access to high-performance computing too often remains elusive.

Our approach: The I-GUIDE platform integrates disparate existing cyberinfrastructure (CI) tools to create a single point of entry where researchers can conduct end-to-end research workflows, from data acquisition and wrangling to HPC execution and results retrieval. The platform leverages JupyterHub (for interactive workflow development and reproducibility), CyberGIS-Compute (for seamless integration with HPC resources, including job submission templates, web-based model configuration, job status monitoring, data transfer, and results retrieval), and GeoEDF (for containerized workflow building blocks that implement various data acquisition and processing operations). When researchers log in to I-GUIDE using their institutional credentials, they can start developing workflows or re-run existing Jupyter notebooks without having to worry about the CI tools implementing the end-to-end workflows behind the scenes.
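
To make the submit-monitor-retrieve pattern concrete, here is a minimal sketch from a platform notebook. It follows the open-source cybergis_compute_client SDK as we understand it, but treat the hub URL, the job template name, and the exact method signatures as assumptions to verify against the current SDK documentation rather than a definitive recipe.

```python
# Hedged sketch of submitting an HPC job from a platform notebook.
# The cybergis_compute_client SDK is real and open source, but the hub
# URL, the "hello_world" template, and the calls below are assumptions
# to check against the current SDK documentation.
from cybergis_compute_client import CyberGISCompute

cybergis = CyberGISCompute(url="cgjobsup.cigi.illinois.edu", isJupyter=True)

job = cybergis.create_job("hello_world")  # assumed name of a job template
job.submit()                              # ship the job to the HPC resource
job.events(liveOutput=True)               # stream status until completion
# Results are then pulled back into the notebook environment; the SDK
# exposes download helpers for the job's result folder.
```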

Our impact: The benefits are numerous. As one example, the I-GUIDE platform has enabled researchers to scale up their analysis of the social vulnerability of aging dam infrastructure across the entire US by deploying GeoEDF building blocks to acquire dam flood inundation maps and carry out census tract-level inundation analysis on a large national HPC resource. The high CPU core counts and large node memory of the HPC resource enabled analysis of flood inundation maps that were too large to process on the researcher's desktop machine. The researcher was then able to analyze individual dams across the US interactively by submitting jobs from the I-GUIDE platform and to use the results returned from the HPC resource to develop interactive visualizations of social vulnerability for each dam in question.
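
As an illustration of what one such building block computes, the hedged sketch below overlays a flood inundation depth raster on census tract polygons and derives a flooded-area fraction per tract. The file names, column name, and depth threshold are invented for the example; the production workflow runs comparable steps inside GeoEDF containers on HPC.

```python
# Hedged sketch of tract-level inundation analysis. File paths, the
# "flooded_frac" column, and the depth > 0 threshold are illustrative
# assumptions, not the I-GUIDE workflow's actual inputs.
import geopandas as gpd
import rasterio
from rasterio.mask import mask

tracts = gpd.read_file("census_tracts.shp")             # hypothetical tracts layer

with rasterio.open("dam_inundation_depth.tif") as src:  # hypothetical depth raster
    tracts = tracts.to_crs(src.crs)                     # align coordinate systems
    fractions = []
    for geom in tracts.geometry:
        cells, _ = mask(src, [geom], crop=True, filled=False)
        depths = cells[0]                    # masked array of depths in the tract
        wet = float((depths > 0).sum())      # inundated cell count
        total = float(depths.count()) or 1.0 # valid cell count (avoid divide by 0)
        fractions.append(wet / total)

tracts["flooded_frac"] = fractions           # feeds the vulnerability analysis
```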


Is Climate Change Exacerbating Dam Failures?

A challenge: Most of the 90,000 dams in the United States were constructed more than 50 years ago, and at least 1,680 are rated both high hazard and in unsatisfactory condition. Complete dam collapses do occur, but short-term overtopping also counts as a dam failure. Nearly 34% of historical dam failures are attributed to overtopping, which most often occurs not after a single extreme rainfall event but when an extreme event is compounded by a stretch of other recent rainfall. Older dams were not designed for the irregularity, or non-stationarity, of these extremes, and increased exposure to overtopping from high flood volumes raises a dam's risk of catastrophic failure.

Our approach: Exploring the spatial variation of trends in precipitation that may influence flood volumes, and hence overtopping, would help prioritize where to focus risk analysis and mitigation for aging dams. Since the antecedent wetness or soil moisture of a catchment often plays a prominent role in triggering floods under extreme rainfall, a joint analysis of both univariate and bivariate trends in extreme daily rainfall and antecedent rainfall is useful even beyond the concern of localized dam filling and overtopping. We considered the joint occurrence of annual maximum daily rainfall and total antecedent rainfall over the prior k days (k = 5, 15, and 30) as the compound event of interest for the analysis of trends across the country. Check back soon for links to our published results.
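
The core series extraction can be sketched in a few lines. The example below builds the two series from a synthetic daily rainfall record and checks each for a monotonic trend; the source does not name the exact test used, so Kendall's tau against time (the kernel of the common Mann-Kendall trend test) stands in here as an assumption, as does the synthetic data.

```python
# Hedged sketch: compound-event series extraction and a simple trend check.
# The gamma-distributed synthetic rainfall and the Kendall-tau trend test
# are illustrative stand-ins, not the study's actual data or method.
import numpy as np
import pandas as pd
from scipy.stats import kendalltau

k = 15  # antecedent window in days; the study also used k = 5 and 30

days = pd.date_range("1950-01-01", "2020-12-31", freq="D")
rain = pd.Series(np.random.default_rng(0).gamma(0.4, 6.0, len(days)), index=days)

antecedent = rain.rolling(k).sum().shift(1)          # total over the prior k days
event_days = rain.groupby(rain.index.year).idxmax()  # date of each annual maximum
amax = rain.loc[event_days].to_numpy()               # annual max daily rainfall
ante = antecedent.loc[event_days].to_numpy()         # antecedent total before it
years = event_days.index.to_numpy()

for name, series in [("annual max daily rainfall", amax),
                     ("antecedent rainfall", ante)]:
    # nan_policy="omit" skips the first year if its window is incomplete
    tau, p = kendalltau(years, series, nan_policy="omit")
    print(f"{name}: tau = {tau:.3f}, p = {p:.3f}")
```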

Our impact: Using this analysis, we can identify regions experiencing extreme annual maximum daily rainfall events together with heavy antecedent rainfall conditions. This combination is still relatively uncommon, but the trend is increasing, and where these trends are statistically significant, the potential for dam overtopping is greater. Modeling these combinations with greater accuracy and reliability can contribute to better prediction of events that have powerfully negative socio-economic impacts.