Issue No. 02 - February (2006 vol. 7)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/MDSO.2006.11
A UK-based grid computing project that merges two earthquake data modeling methods is one of the latest tools in earthquake prediction. The new technique provides a 2D data map of tectonic structures by combining a macro modeling method based on surface deformation measurements from satellite imagery with a micro method based on finite element analysis and simulation within local geophysical constraints.
The technique has already revealed a secondary fault near a 2001 quake under study in Tibet's Kunlun Mountains. The secondary fault couldn't have been predicted without this technique, says Moustafa Ghanem, a research scientist for the project at Imperial College London.
The technique was a finalist in the High Performance Computing (HPC) Analytics Challenge at the SC05 ( http://sc05.supercomputing.org) supercomputing conference in November. In a paper documenting the work ( http://doi.ieeecomputersociety.org/10.1109/SC.2005.16), researchers describe it as a step toward developing an Earth geome, "a grid-based warehouse of tectonic features, geodetic and remote sensing data, and associated modeling results which will be similar in concept to the Human Genome Project."
Grid power and service-oriented science
"It was the power of grid computing that allowed us to consider designing this type of analysis," says Ghanem. The Grid provides enough processing power for researchers to easily rotate data and manage the models and analysis. InforSense KDE ( http://www.inforsense.com), a software analytics platform originally developed at Imperial College, manages the workflow for data analysis resources and services declared as Grid services.
In the SC05 HPC demonstration, an image-mining method developed at Imperial College executed on Grid servers in London. This "macro" analysis reveals surface deformation patterns in Landsat imagery before and after the Kunlun quake. The demonstration coupled this analysis in real time with finite element models to simulate the responses of brick-sized volumes of rocks to stresses and strains according to known regional tectonic boundary conditions. This "micro" analysis executed on servers at the University of Oklahoma's Fears Structural Engineering Laboratory.
InforSense KDE bridges the gap between macro and micro data models by building complex analytic workflows that integrate access to data, software, and other services, enabling the real-time distributed analytical environment that couples the disparate modeling methods. Without InforSense KDE, according to Ghanem, you would have to move the output from one analysis stage to the next by saving it in a file and moving it to another machine. "Now, you can run analyses with complex analytic workflows that coordinate the execution of distributed services," he says. "Using Grid technologies, the data and analytic components used in the data mining—and the workflows themselves—can be distributed all over the world."
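The composition Ghanem describes — chaining distributed analysis stages so that one stage's output flows directly into the next without manual file transfer — can be sketched in a few lines. The sketch below is purely illustrative: the function and field names (`macro_analysis`, `micro_analysis`, `deformation_map`, and so on) are hypothetical stand-ins, not InforSense KDE's actual API.

```python
from typing import Callable

# A workflow stage takes a result dictionary and returns an enriched one.
Service = Callable[[dict], dict]

def compose(*stages: Service) -> Service:
    """Chain services so each stage's output feeds the next stage's input."""
    def workflow(data: dict) -> dict:
        for stage in stages:
            data = stage(data)
        return data
    return workflow

def macro_analysis(data: dict) -> dict:
    # Stand-in for the image-mining step (surface deformation from
    # satellite imagery) running on servers in London.
    data["deformation_map"] = f"deformation({data['region']})"
    return data

def micro_analysis(data: dict) -> dict:
    # Stand-in for the finite element simulation step running on
    # servers in Oklahoma, consuming the macro stage's output.
    data["stress_model"] = f"fem({data['deformation_map']})"
    return data

# The composed workflow is itself a service that can be reused or
# combined further, mirroring the service-oriented design in the article.
pipeline = compose(macro_analysis, micro_analysis)
result = pipeline({"region": "Kunlun"})
```

In the real system, each stage would be a remote Grid service invocation rather than a local function, but the composition pattern is the same.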
InforSense KDE users can build workflows without doing any programming, says Vasa Curcin, a research scientist at Imperial College. "It's also important to note that we do not tie ourselves to any particular standard," he says. "We support multiple standards for publishing services and for integrating them as well." Results from the earthquake study are published as Grid services, using Keyhole Markup Language (KML; http://www.keyhole.com/kml/kml_doc.html) to enable visualization through the Google Earth toolset.
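A published result of this kind amounts to a small KML document that Google Earth can render. The fragment below is a minimal, hypothetical example of such an annotation — the placemark name, description, and coordinates are illustrative placeholders, not the project's actual output:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://earth.google.com/kml/2.0">
  <Placemark>
    <name>Kunlun fault segment (illustrative)</name>
    <description>Surface deformation result published as a Grid service</description>
    <Point>
      <!-- longitude,latitude,altitude; values are placeholders -->
      <coordinates>90.5,35.9,0</coordinates>
    </Point>
  </Placemark>
</kml>
```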
Developers can store the workflows for reuse via Web services, Java portlets, or other visual desktop applications. According to Curcin, the technology easily scales to other research domains and has been successfully applied to bioinformatics, cheminformatics, and financial analysis as well as earthquake modeling.
The infrastructure for this work is a product of Discovery Net ( http://www.discovery-on-the.net), a UK e-science project to develop information technologies that exploit Grid resources and services.
Imperial College researchers say the 2D earthquake modeling technique could be used to create a comprehensive geological fault line map of the Earth. The map could detail the areas of greatest risk, annotated with data models from around the world and made accessible through a set of services offered across the Grid.
The researchers compare this earth geome map to the work of integrative biology in linking the Human Genome Project's sequence analysis (the micro level) to gene expression data and clinical records (the macro level). "We're not trying to predict whether an earthquake will actually happen," says Ghanem, but rather the scale of damage when an earthquake does occur in the measured area. Coupling macro-scale tectonic models with micro-scale finite element models of local regions can significantly advance the ability to predict and mitigate earthquake risks.
The 2D modeling technique uses the same Web services and grid technologies for integrating science data that businesses are using for enterprise data integration, says Heather Kreger, a Web services and management standards architect for IBM and colead of the OASIS Web Services Distributed Management Technical Committee ( http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=wsdom). The use of KML in the workflow is similar to the use of BPEL (Business Process Execution Language) in enterprise Web service applications, she says.
Kreger likes the Web services approach as a preparedness technique for the large-scale natural disasters we've seen over the past couple of years. "It's going to be very important for predicting other large systems," she says, "not just earthquakes but also weather."
The whole workflow-based architecture is really an implementation of the service-oriented idea, says Ghanem. You are composing services, local or remote, and building new services out of the compositions. "What we provide is a way of managing that composition and putting it together in a transparent fashion."