
Google’s AI Tool Predicts Flash Floods 24 Hours Ahead

Overview

  • Google has created the most extensive flash flood dataset to date by utilizing Gemini to analyze two decades’ worth of global news articles.
  • This dataset now supports an AI model that can forecast urban flash floods up to 24 hours in advance.
  • The initiative addresses a significant data deficiency that has hindered effective flash flood predictions for years.

Flash floods claim the lives of thousands each year, often striking suddenly and severely impacting urban areas. For many years, scientists lacked sufficient data to detect these events in advance, which hampered their ability to develop predictive models.

On Thursday, Google announced it has found a solution by leveraging news reports.

The tech giant introduced Groundsource, a system that employs Gemini AI to sift through millions of news articles published since 2000, extracting references to flood occurrences and associating each with specific locations and dates. This effort has yielded a dataset comprising 2.6 million historical flash flood events across more than 150 countries, now available for public access.

This dataset was subsequently used to train a new AI model capable of predicting the likelihood of flash flooding in urban areas within the next 24 hours. These forecasts are now accessible through Google’s Flood Hub, which already provides warnings to approximately 2 billion people about river-related flooding globally.

The problem Groundsource addresses is straightforward. Rivers have physical gauges that monitor water levels, but urban areas have no comparable infrastructure for real-time flood monitoring. When heavy rainfall occurs, it can produce swift flooding that overwhelms drainage systems, which conventional tools struggle to track.

Without adequate historical data, it becomes impossible to train an AI model to recognize flooding patterns. Google’s solution was to treat news articles as a substitute for those missing sensors.

“Transforming public information into actionable data allows us to do more than analyze history; it helps us create a future where natural disasters don’t take people by surprise,” Google stated.


After removing ads, navigation elements, duplicates, and translating foreign language articles into English, the team transformed millions of unstructured text descriptions into precise, geolocated time-series data.
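Google has not published Groundsource's pipeline, but the core step described above can be sketched roughly: each article yields candidate (place, date) pairs, and candidates referring to the same place on the same day collapse into a single event, producing a geolocated time series. Everything here is an illustrative assumption, not Google's API: the toy gazetteer and keyword match stand in for Gemini's extraction and geocoding.

```python
from dataclasses import dataclass
from datetime import date

# Toy gazetteer mapping place names to coordinates - an illustrative
# stand-in for the real (non-public) geocoding step.
GAZETTEER = {"jakarta": (-6.2, 106.8), "lagos": (6.5, 3.4)}

@dataclass(frozen=True)  # frozen -> hashable, so events dedupe in a set
class FloodEvent:
    place: str
    lat: float
    lon: float
    day: date

def extract_candidates(text: str, published: date):
    """Crude stand-in for the Gemini extraction step: flag any gazetteer
    place name co-occurring with a flood keyword in the article text."""
    text = text.lower()
    if "flood" not in text and "inundat" not in text:
        return []
    return [FloodEvent(p, lat, lon, published)
            for p, (lat, lon) in GAZETTEER.items() if p in text]

def build_dataset(articles):
    """Deduplicate: many articles covering the same flood collapse into
    one (place, day) event."""
    events = set()
    for text, published in articles:
        events.update(extract_candidates(text, published))
    return sorted(events, key=lambda e: (e.day, e.place))

articles = [
    ("Flash flood sweeps through Jakarta streets", date(2024, 3, 1)),
    ("Jakarta flood death toll rises", date(2024, 3, 1)),  # same event
    ("Heavy rain floods parts of Lagos", date(2024, 3, 2)),
]
dataset = build_dataset(articles)  # two events, not three
```

The key design point is the deduplication: a widely reported flood generates many articles, but the training data needs one row per physical event.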

The AI model, which employs an LSTM (Long Short-Term Memory) neural network, analyzes hourly weather forecasts along with local conditions such as urban density, soil absorption rates, and topography to generate a straightforward output: medium or high flood risk for any urban area with a population density exceeding 100 people per square kilometer within the next 24 hours.
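The model itself is not public, but the shape of the computation described above can be sketched: an LSTM consumes a sequence of hourly forecast features, its final hidden state is concatenated with static local features, and a small head emits a risk score. The weights below are random placeholders, and the feature names, dimensions, and 0.5 threshold are all assumptions for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_forward(seq, W, U, b, hidden):
    """Run a single-layer LSTM over a (T, F) sequence; return the final
    hidden state. Gates are stacked in z as [input, forget, cell, output]."""
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x in seq:
        z = W @ x + U @ h + b
        i = sigmoid(z[:hidden])
        f = sigmoid(z[hidden:2 * hidden])
        g = np.tanh(z[2 * hidden:3 * hidden])
        o = sigmoid(z[3 * hidden:])
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

rng = np.random.default_rng(0)
T, F, H = 24, 3, 8  # 24 hourly steps; assumed features: rain rate, temp, humidity
weather = rng.normal(size=(T, F))
static = np.array([0.8, 0.3, 0.1])  # assumed: urban density, soil absorption, slope

# Random placeholder weights; a real model would learn these from the dataset.
W = rng.normal(scale=0.1, size=(4 * H, F))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
head = rng.normal(scale=0.1, size=H + static.size)

h = lstm_forward(weather, W, U, b, H)
score = sigmoid(head @ np.concatenate([h, static]))
risk = "high" if score > 0.5 else "medium"  # placeholder threshold
```

The split between sequential inputs (the hourly forecast) and static inputs (terrain and land use) mirrors the article's description: the LSTM handles the time dimension, and the fixed local conditions are folded in before the final classification.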

However, the system does have limitations. It can only cover areas of about 20 square kilometers at a time, cannot predict the severity of potential flooding, and may not be effective in regions where media coverage is limited.

Nonetheless, the preliminary outcomes are promising. A regional disaster management team in Southern Africa received a Flood Hub alert during the testing phase, validated the flood occurrence, and sent a humanitarian worker to oversee the response. As pointed out by Google’s crisis resilience director, Juliet Rothenberg, “this sequence of events—from prediction in the Flood Hub to action on the ground—is precisely what Flood Hub was designed to achieve.”


