When Disaster Strikes: Information to the Rescue


Ten years ago, on March 11, 2011, one of the most powerful earthquakes ever recorded struck off the east coast of Japan's Tohoku region and triggered a massive ocean wave: the Tohoku Tsunami, as it became known, surged up to 40 meters, far higher than the 10-meter seawalls that had been built to protect the region from precisely such an event. The waters came too fast to be outrun or outdriven, and the ten-minute warning that the 600,000 people living in the affected area received proved tragically insufficient: more than 20,000 people, over half of them older than 65, perished in the tsunami, which pushed water and debris up to 10 kilometers inland. This alone would have made the Tohoku Tsunami the most damaging and, in terms of insurance cost, most expensive natural disaster in human history: the World Bank estimates the total economic cost of the event at over 230 billion dollars.

But the tsunami also triggered an essentially man-made disaster that could have caused devastation far beyond Japan: the waters flooded the Fukushima nuclear power plant, forcing its reactors into automatic shutdown, but they also knocked out the emergency cooling systems of three reactors, damaging their cores and releasing radiation that will remain measurable across large areas of the Pacific for years to come. Around 200,000 people living within a 30-kilometer radius of the Fukushima plant were forced to evacuate.

Not merely a function of physical processes

This tragic example not only highlights the inevitability and frequent unpredictability of such events, even in a country like Japan, which has arguably the highest level of technical skill to deal with them. It also demonstrates that disasters like the Tohoku tsunami and the Fukushima meltdown are not merely functions of physical processes, but also of information: information about surface features like geology, terrain, and coastlines, as well as weather patterns and sea state, is needed to predict potential risks of flooding, landslides, and building collapses, and to mitigate them as much as possible; insurance companies need the same information to assess risks and allocate resources. Information also shapes the response when an event actually happens: knowing the extent and impact of a flood or storm is necessary to plan and manage the disaster response effectively, to assess damages, and to assist in locating victims.

For all these information domains, satellite-based data are indispensable. Digital Surface Models (DSM) and Digital Terrain Models (DTM), elevation datasets derived from satellite imagery, might have alerted the planners of the Fukushima plant that their emergency generators would be flooded once water breached the perimeter of the plant; GPS data from people's cell phones helped locate victims and assess the viability of transportation systems after the tsunami.
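To make the terrain-model idea concrete, here is a minimal sketch in Python of how a DTM turns a predicted water level into a map of cells at risk. The elevation grid and the water levels are synthetic and purely illustrative, not data from the Fukushima site; a real analysis would load a satellite-derived DTM raster instead.

```python
# Minimal sketch: using a Digital Terrain Model (an elevation grid) to flag
# cells at risk of inundation for a given water level. The terrain here is
# synthetic; in practice the grid would come from a satellite-derived DTM.
import numpy as np

def inundation_mask(dtm: np.ndarray, water_level_m: float) -> np.ndarray:
    """Return a boolean grid: True where terrain lies below the water level."""
    return dtm < water_level_m

# Synthetic 5x5 terrain (elevations in meters above sea level)
rng = np.random.default_rng(42)
dtm = rng.uniform(0.0, 15.0, size=(5, 5))

# Illustrative comparison: cells "safe" behind a 10 m design height
# may still flood under a 14 m surge -- the kind of gap a DTM can expose.
for level in (10.0, 14.0):
    flooded = inundation_mask(dtm, level)
    print(f"water level {level} m: {flooded.sum()} of {dtm.size} cells flooded")
```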

Some programs are conceived for very specific events. Hydrological disasters, for example, are among the most common on the globe, and with the increase of severe storms as a result of climate change, they are likely to grow in frequency and severity. In 2011, more than half of all global disasters involved floods and landslides, with almost 140 million victims and total damage of more than 70 billion dollars. One example of a flood-specific monitoring and warning system is the Namibia Flood SensorWeb, a multi-national cooperation that includes NASA and the German aerospace agency DLR and that builds on Landsat 5 time series data to create terrain models for areas at frequent risk of inundation. The Cyclone Global Navigation Satellite System (CYGNSS), another specific example, was created by the University of Michigan and is sponsored by NASA; CYGNSS is designed to map and predict the paths of hurricanes and other cyclones.

Other systems may not have been designed primarily with disaster relief in mind but can become invaluable for mitigating risks and managing relief and response, like the Global Urban Footprint, which is run by the DLR, and the Global Human Settlement Layer (GHSL), a data tool offered by the European Union. Both projects use satellite data to determine the extent of human settlement, which is crucial especially in the population centers of developing countries, where rapid and uncontrolled urban growth outpaces official planning and zoning capabilities. Many of these informal settlements (called slums or shantytowns in English-speaking countries, favelas in Brazil, chabolas or barrios in Latin America, and bidonvilles in Africa and the Caribbean) are built on terrain at high risk of flooding and landslides, and they are disproportionately affected by natural disasters. A simple overlay of settlement and hazard layers, sketched below, illustrates why such data matter.
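The overlay logic behind such exposure assessments is straightforward. The sketch below uses synthetic boolean grids as stand-ins for a settlement footprint and a flood-risk zone; in a real workflow, both layers would be loaded from the respective raster products (for example with rasterio) and resampled to a common grid.

```python
# Sketch of the overlay idea behind settlement layers like the Global Urban
# Footprint or GHSL: intersect a settlement mask with a hazard mask to
# estimate exposed settlement area. Both grids are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(7)
settlement = rng.random((100, 100)) > 0.7   # True where built-up
flood_zone = rng.random((100, 100)) > 0.8   # True where high flood risk

exposed = settlement & flood_zone
cell_area_km2 = 0.01  # assumed 100 m x 100 m cells

print(f"settled area:          {settlement.sum() * cell_area_km2:.2f} km^2")
print(f"settled area at risk:  {exposed.sum() * cell_area_km2:.2f} km^2")
print(f"share of settlement at risk: {exposed.sum() / settlement.sum():.1%}")
```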

Satellite-based Imagery is the Best Option for Information-Gathering

Other systems, like the already mentioned mobile GPS, proved useful mainly in the aftermath, helping locate victims after the Tohoku/Fukushima catastrophe or helping assess and monitor the state of transportation systems after Hurricane Sandy, which struck and partially paralyzed New York City in 2012. This kind of damage assessment is particularly challenging when large areas are affected, which includes practically every major storm, earthquake, and flood. Aerial imagery may be unattainable: airports may be damaged or destroyed, and available aircraft are needed for the transport of rescue personnel and relief materials; unmanned aerial vehicles (UAVs) might seem tempting, but their short range and limited battery life make them all but useless in these situations. Satellite imagery is by far the best option to resolve this problem.

Of particular interest here is satellite-based synthetic aperture radar (SAR): SAR images from low-Earth-orbit platforms like the European Sentinel mission or the Italian COSMO-SkyMed system can be taken at any time, independently of daylight or cloud cover; their swath widths range from some ten kilometers to several hundred kilometers, covering fairly large regions at resolutions of a few meters. The main drawback is that it usually takes several days until the same satellite revisits the same area, which limits availability and quality (multiple images taken in short sequence allow for the capture of more detail). One solution to this challenge, which the DLR has been exploring for some time, is to bring the data of multiple sources together, thus increasing information density and assessment speed, which is crucial in the aftermath of major disasters.
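As a rough illustration of how SAR imagery can be turned into a flood map, the following sketch applies a simple change-detection threshold to synthetic pre- and post-event backscatter values. Real workflows, for example with Sentinel-1 scenes, additionally involve coregistration, speckle filtering, and scene-dependent thresholds; the numbers below are assumptions for the sake of the example.

```python
# Sketch of a common SAR change-detection approach to flood mapping: calm
# open water reflects radar away from the sensor, so newly flooded land
# shows a sharp drop in backscatter between a pre-event and a post-event
# image. The arrays here are synthetic stand-ins for coregistered
# backscatter rasters in dB.
import numpy as np

rng = np.random.default_rng(3)
pre_db = rng.normal(-8.0, 1.5, size=(200, 200))   # typical land backscatter
post_db = pre_db.copy()

# Simulate a flooded patch: backscatter collapses toward open-water values
post_db[50:120, 40:160] = rng.normal(-18.0, 1.0, size=(70, 120))

drop_db = pre_db - post_db
flooded = drop_db > 6.0   # threshold (in dB) is scene-dependent in practice

print(f"flagged as flooded: {flooded.mean():.1%} of the scene")
```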

This is, in essence, less a problem of data collection than of data access: practically all major international space agencies, as well as a number of private operators, already have dozens of satellites in orbit. The challenge in using SAR images, as well as optical imagery, which can refine the data, is to create a platform where data from multiple providers can be accessed through a simple user interface. This idea is being explored in multiple international and inter-European projects, and it is no coincidence that many of them have invited cloudeo to contribute platform and API know-how: after all, this is what cloudeo has been doing successfully for almost a decade, providing high-quality, ready-to-use data from multiple geodata creators, regardless of the type of sensors, software, and processing power.
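To illustrate the kind of abstraction such a platform provides, here is a purely hypothetical sketch of a unified catalog sitting in front of several providers, so that a user queries an area of interest once instead of dealing with each provider's interface separately. All class and method names are illustrative assumptions, not cloudeo's actual API.

```python
# Hypothetical sketch: one thin interface fanning a single search out to
# several geodata providers. Names are illustrative, not a real API.
from dataclasses import dataclass
from typing import Protocol

@dataclass
class Scene:
    provider: str
    sensor: str        # e.g., "SAR" or "optical"
    acquired: str      # ISO timestamp
    resolution_m: float

class Provider(Protocol):
    def search(self, bbox: tuple[float, float, float, float]) -> list[Scene]: ...

class Catalog:
    """Fan a single bounding-box query out to every registered provider."""
    def __init__(self, providers: list[Provider]):
        self.providers = providers

    def search(self, bbox: tuple[float, float, float, float]) -> list[Scene]:
        scenes: list[Scene] = []
        for p in self.providers:
            scenes.extend(p.search(bbox))
        # newest first, so responders see the freshest post-event imagery
        return sorted(scenes, key=lambda s: s.acquired, reverse=True)

class StubProvider:
    """Illustrative provider returning canned results."""
    def __init__(self, scenes: list[Scene]):
        self.scenes = scenes
    def search(self, bbox):
        return self.scenes

catalog = Catalog([
    StubProvider([Scene("agency-a", "SAR", "2021-03-12T05:10:00Z", 5.0)]),
    StubProvider([Scene("operator-b", "optical", "2021-03-12T09:45:00Z", 0.5)]),
])
for scene in catalog.search((140.0, 37.0, 142.0, 39.5)):
    print(scene)
```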