
IMPLEMENTING GEOSS

GEO’s Tohoku-oki (Japan) earthquake supersite

By Falk Amelung, Lead of Task DA-09-01c on Geohazard Supersites and Natural Laboratories, University of Miami, Florida, USA.

The tragic 11 March 2011 earthquake offshore northern Japan (known as the Tohoku-oki earthquake) and the tsunami that followed left more than 27,000 dead or missing. The International Charter on Space and Major Disasters provided satellite imagery to support the rescue efforts. The GEO Geohazard Supersites initiative went into action to collect Synthetic Aperture Radar (SAR) and multispectral imagery, as well as GPS and seismic data, to better understand what exactly happened during the earthquake. Space agencies and other contributors to the Global Earth Observation System of Systems (GEOSS) supported these and other actions.

The Geohazard Supersites

A year prior to the Japanese disasters the Group on Earth Observations (GEO) selected several “Supersites” that were highly exposed to earthquake or volcano hazards for detailed monitoring. It then established data portals for the earthquake-prone Supersites of Tokyo-Mt. Fuji, Istanbul, Los Angeles and Seattle-Vancouver, and volcano Supersites for Hawaii and Italy’s Mt. Etna and Vesuvius-Campi Flegrei.

These Supersite web portals, for simplicity also referred to as Supersites, provide rapid access to multi-satellite data, and they allow scientists to post preliminary research results on the Internet that can then be used by other scientists for their own work. The Supersites activities are coordinated by UNAVCO, a science support facility in Boulder, Colorado, that is funded by the US National Aeronautics and Space Administration (NASA) and the US National Science Foundation (NSF).

The Tohoku-oki Supersite

The first GPS-measured ground displacement field produced by the Geospatial Information Authority of Japan (GSI) was available on the Supersite on the day of the earthquake. It provided the global community with a first idea of the severity and type of the earthquake. During the next few days, independent research groups in Japan, the United States, China and Europe obtained similar results using GPS and InSAR (interferometric synthetic aperture radar) data. The Tohoku-oki Supersite also featured links to the relevant research institutions in Japan, such as the Earthquake Research Institute (ERI) at the University of Tokyo and the Japanese Space Agency's Research and Application project, providing visitors with an easy overview of ongoing research into the earthquake and tsunami.

To infer where and by how much the ground moved, scientists apply the InSAR technique to imagery acquired before and after the earthquake. This information is combined with in-situ data from Japan's excellent seismic and GPS networks to infer in unprecedented detail the processes along the interface between the two tectonic plates that moved during the earthquake. The results show that during the 2-3 minute earthquake the Pacific plate dove by more than 20 m under the Eurasian plate. The details of the plate movement, however, are not easy to determine, given that the earthquake happened offshore and at a depth of 10-30 km.
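
To illustrate the basic InSAR measurement principle, the following minimal sketch (not part of the Supersite analyses; the wavelength and phase values are illustrative assumptions for a C-band sensor such as Envisat) converts an unwrapped interferometric phase change into line-of-sight ground displacement:

    # Minimal sketch: map an unwrapped InSAR phase change to line-of-sight (LOS)
    # ground displacement. One full phase cycle (2*pi radians) corresponds to half
    # a radar wavelength of LOS motion because of the two-way signal path.
    # Wavelength and phase values are illustrative assumptions.
    import math

    WAVELENGTH_M = 0.056  # approximate C-band radar wavelength in metres

    def phase_to_los_displacement(delta_phase_rad):
        """Return LOS displacement in metres; the sign convention depends on the processor."""
        return (WAVELENGTH_M / (4 * math.pi)) * delta_phase_rad

    # Example: four phase cycles of unwrapped phase correspond to about 11 cm of LOS motion
    print(f"{phase_to_los_displacement(4 * 2 * math.pi):.3f} m")

Turning such line-of-sight measurements from several viewing geometries into a fault-slip model is, of course, a far more involved inversion problem; the sketch only shows the first step from radar phase to ground displacement.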

The SAR imagery that has been provided includes data from the European Space Agency’s Envisat and ERS-2 satellites, from Japan’s ALOS and from Germany’s TerraSAR-X satellites. The imagery is provided via ESA’s Virtual Archive, a cloud-based cyberinfrastructure that ensures rapid online access anywhere in the world.

During the first two weeks after the earthquake, the Tohoku-oki Supersite had as many as 4,500 visitors a day. During the month of March it served about 34,000 unique IP addresses, with about 6% of the users being from Japan. Independent research groups all over the world downloaded a total of 11 terabytes of data from the Supersite, corresponding to about 20,000 SAR images.

The direct impact of the GEO Supersites is difficult to measure. Emergency managers and decision makers draw their information from scientists, who in turn draw their conclusions after consulting many different sources of information, including the Supersites. The long-lasting legacy of the Supersites will hopefully be that they provided the international community with much sought-after information in the first days and weeks after the earthquake, and that this triggered research projects around the world that will ultimately lead to a better assessment of earthquake and tsunami risk in Japan, Haiti and Chile.

The historical record

In 2001, Koji Minoura, a professor at Tohoku University in Sendai, and his colleagues published a paper on tsunami deposits found in the Sendai plain that they attributed to a disastrous earthquake in 869 AD [1]. Additional evidence for this 1,100-year-old tsunami was found in sediment cores from a coastal lake south of Sendai [2]. In 2010, an international tsunami field symposium was held in Sendai which included presentations and field trips on the deposits of the 869 AD tsunami.

The tectonic plates offshore Honshu are converging at a rate of 7-8 cm per year. If the plate interface stays locked for 100 years, a displacement deficit of 7-8 m accumulates. A magnitude 9 earthquake with 15-20 m of fault displacement should therefore be expected roughly every 250 years. Since the meticulous Japanese historical records lack evidence for such big earthquakes, many scientists believed that most of the plate interface is permanently unlocked and moving aseismically, with only occasional smaller earthquakes (magnitude 7.5-8) rupturing the locked segments (such as occurred in 1933).
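
The back-of-the-envelope arithmetic behind that recurrence estimate can be written out explicitly; the following sketch simply uses the convergence-rate and slip ranges quoted above:

    # Rough recurrence estimate: how long it takes to accumulate the displacement
    # deficit released in a magnitude-9 event, at the quoted convergence rate.
    # The ranges are those cited in the text; nothing here is a new result.
    convergence_rate_m_per_yr = (0.07, 0.08)   # 7-8 cm per year
    coseismic_slip_m = (15.0, 20.0)            # 15-20 m of fault displacement

    shortest = coseismic_slip_m[0] / convergence_rate_m_per_yr[1]   # 15 m at 8 cm/yr
    longest = coseismic_slip_m[1] / convergence_rate_m_per_yr[0]    # 20 m at 7 cm/yr
    print(f"Expected recurrence: {shortest:.0f}-{longest:.0f} years")  # roughly 190-290 years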

The 2011 earthquake demonstrated that this was a misconception. We should have known better: the magnitude 9.3 Sumatra-Andaman earthquake of 2004, which generated a tsunami that killed 230,000 people in Indonesia, Sri Lanka and elsewhere, also occurred along a fault that was not thought to be capable of generating big earthquakes.

Japan’s future earthquake risk

After northern Japan’s recent experience of an earthquake that may have been the largest for more than 1,000 years, the question now is: what is the earthquake risk in other parts of Japan? Of particular concern is the Tokai district southwest of Tokyo, where the Suruga trough has been generating magnitude 8 and larger earthquakes every 100-150 years. A new earthquake is considered overdue because the last one occurred 157 years ago, in 1854. But of greatest concern is the Tokyo metropolitan area, home to 35 million people, which was destroyed by big earthquakes in 1703, 1855 and 1923. This last event, including its associated fires, killed 140,000 people in Japan’s worst earthquake disaster ever. The next big Tokyo earthquake will have a global economic impact. The probability of an earthquake with 3,000-10,000 fatalities and 1 trillion dollars in damage occurring in the next 30 years has been estimated at 35% [3]. The costs could be even higher if a tsunami develops within Tokyo Bay.
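
For readers who want to relate such a multi-decade figure to a yearly number, the following sketch converts the quoted 30-year probability into an equivalent annual rate, assuming a time-independent Poisson model (a deliberate simplification; the cited hazard assessment [3] is more sophisticated):

    # Convert a 30-year exceedance probability into an equivalent annual rate,
    # assuming a time-independent Poisson process (a simplifying assumption).
    import math

    p_30yr = 0.35                                  # probability quoted for the next 30 years
    annual_rate = -math.log(1.0 - p_30yr) / 30.0   # events per year
    print(f"Equivalent annual rate: {annual_rate:.3f} per year (~{annual_rate:.1%} chance each year)")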

Scientists are now scrambling to estimate how the Great Japan Earthquake changed the earthquake risk in Japan. The GEO Supersite provides useful support for this work. Once the processes underlying the recent great earthquake have been determined, the next step will be to estimate which of the known faults were brought closer to, or further away from, rupture. Preliminary results that have been posted by a Japanese-US team on the Supersite convey mixed news. The good news is that the fault in the Tokai district was not much affected, which means that the probability of it rupturing has not changed. The not-so-good news is that a suspected fault under Tokyo has been brought closer to failure. If true, Tokyo’s earthquake risk has been raised. The caveats are the uncertainties still associated with the source of the recent earthquake, and the uncertainty over which fault is the culprit in Tokyo earthquakes.
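
The standard quantity used to judge whether a fault was brought "closer to, or further away from, rupture" is the Coulomb failure stress change. A minimal sketch follows; the friction coefficient and stress values are purely illustrative assumptions, not results from the Supersite analyses:

    # Minimal sketch of the Coulomb failure stress change used to judge whether an
    # earthquake pushed a nearby receiver fault toward or away from failure.
    # dCFS = d(shear stress) + mu' * d(normal stress), with unclamping counted positive.
    # All values are illustrative assumptions.
    def coulomb_stress_change(delta_shear_mpa, delta_normal_mpa, effective_friction=0.4):
        """Return dCFS in MPa; positive values bring the receiver fault closer to failure."""
        return delta_shear_mpa + effective_friction * delta_normal_mpa

    # Hypothetical receiver fault: +0.05 MPa of shear loading, +0.02 MPa of unclamping
    dcfs = coulomb_stress_change(0.05, 0.02)
    verdict = "closer to" if dcfs > 0 else "further from"
    print(f"dCFS = {dcfs:.3f} MPa -> fault brought {verdict} failure")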

Estimating earthquake probabilities or, more accurately, estimating the probabilities for ground shaking and tsunami inundation, is the scientifically accepted form of long-term earthquake prediction. Short-term earthquake prediction, which requires a statement about location, time and size of an imminent earthquake, is not possible at this time and probably never will be.

Nevertheless, a law enacted in 1978 required the Japan Meteorological Agency (JMA) to develop an earthquake prediction program for the Tokai district, the area where the next big one was expected. Critics now say that this program is part of the reason for the disastrous consequences of the recent earthquake and tsunami [4]. It provided an exaggerated sense of security to the public and took attention away from the real earthquake risk in Japan. In the same way, meter-high seawalls everywhere along the Japanese coast gave the public a sense of security regarding tsunamis, only to be overtopped on 11 March.

How GEOSS fills the gaps

Because the European Space Agency recognizes the power of SAR imagery for estimating earthquake risk, it initiated the development of a virtual Japan natural laboratory for geohazards two years ago by placing all of its data on Japan in the Tokyo-Mt Fuji Supersite. In 2010, when the orbit of the aging Envisat satellite had to be lowered because the satellite was running out of fuel, ESA selected an orbit that could continue the satellite radar interferometric observations needed by the Supersites, including the Supersite for Tokyo-Mt Fuji. The pay-off is the excellent InSAR imagery of the recent earthquake, which is now so important for understanding what happened during the event.

Last month, Japan’s ALOS satellite suddenly reached the end of its operational life. This makes the availability of Envisat and other satellite data on the Supersite that much more essential. The fact that Japan can continue to benefit from the availability of imagery gathered by other space agencies is a powerful confirmation of the value of the Supersites, of GEOSS as a whole, and of the GEO Data Sharing Principles.

A Japan natural laboratory for geohazards

GEO, then, can provide extremely useful support to Japan and other earthquake-prone countries as they seek to better prepare for future risks. By collaborating through GEO, Japan and other governments can sustain a Japan natural laboratory for geohazards that uses global observing satellites for background monitoring and high-resolution satellites for frequent monitoring of particular high-risk areas. Global scientific expertise would then be tapped by providing open access to all relevant space- and ground-based data sets. The best scientists from around the world would use these data for comparative research into Japanese and other subduction zones, which ultimately will lead to a better understanding of earthquake and tsunami risk in Japan and elsewhere.

Data sharing – the critical ingredient

The Japan disasters are a timely reminder of the societal relevance of fundamental geohazard research. If taken into account by decision makers, research findings can lead to better disaster preparedness and ultimately help to mitigate disasters.

For some of the sites covered by the GEO Geohazards Supersites most of the in-situ data sets are available online, but this is not the case for others, in particular for the volcano Supersites. Even in some of the countries that are at the forefront of implementing GEO’s Data Sharing Principles, a commitment to full and open access to data has not yet trickled down to the volcano observatories. For the Japan earthquake, it is not yet possible to access some of the GPS data that are so vital for understanding earthquakes and volcanic eruptions. As for the space segment, the European Space Agency, the German Space Agency and NASA are spearheading the initiative, whereas other agencies have still to join. One consequence has been that it was not possible to establish event Supersites for last year’s volcanic crises involving Eyjafjallajökull in Iceland or Merapi in Indonesia, or for last February’s earthquake in New Zealand.

If we are to advance our understanding of earthquake and volcanic events and protect people from these risks, it is vital that governments take action to implement the GEO Data Sharing Principles.

Further reading

[1] Minoura, K., Imamura, F., Sugawara, D., Kono, Y. & Iwashita, T. (2001). The 869 Jōgan tsunami deposit and recurrence interval of large-scale tsunami on the Pacific coast of northeast Japan. Journal of Natural Disaster Science, 23(2), 83-88.

[2] Sawai, Y., Fujii, Y., Fujiwara, O., Kamataki, T., Komatsubara, J., Okamura, Y., Satake, K. & Shishikura, M. (2008). Marine incursions of the past 1500 years and evidence of tsunamis at Suijin-numa, a coastal lake facing the Japan Trench. The Holocene, 18, 517-528.

[3] Stein, R. S., Toda, S., Parsons, T. & Grunewald, E. (2006). A new probabilistic seismic hazard assessment for greater Tokyo. Phil. Trans. R. Soc. A, 364, 1965-1988. doi:10.1098/rsta.2006.1808

[4] Geller, R. J. (2011). Shake-up time for Japanese seismology. Nature. doi:10.1038/nature10105