Public Sector

Disaster and Emergency

System for real-time earthquake simulation with data assimilation


For:
Information Technology Centre (ITC), The University of Tokyo
Earthquake Research Institute (ERI), The University of Tokyo
National Institute of Informatics (NII)
National Research Institute for Earth Science and Disaster Resilience (NIED)
Japan Meteorological Agency (JMA), Meteorological Research Institute (MRI)
Local governments in Japan
Transportation companies (railway, highway)
Goal:
Other
Problem addressed
The system conducts large-scale simulation of 3D seismic wave propagation, and results are improved by real-time data assimilation using observations and machine learning.
Scope of use case
This system provides accurate information for evacuation in an earthquake
disaster.
Description
1 New direction in supercomputing:
The majority of SCD/ITC/U. Tokyo's (Supercomputing
Research Division, Information Technology Centre, The
University of Tokyo) supercomputer
system users belong to the fields of computational science
and engineering, including engineering simulations (fluid
dynamics, structural dynamics, and electromagnetics), earth
sciences (atmosphere, ocean, solid earth, and earthquakes),
and material sciences. Recently, the number of users related
to data science, machine learning, and artificial intelligence
(AI) has been increasing. Examples of new research topics
are weather prediction by data assimilation, medical image
recognition, and human genome analyses. Towards Society
5.0, a new type of method for solving scientific problems that
integrates simulation (S), data (D) and learning (L)
(S+D+L) is emerging.
2 BDEC: Big data and extreme computing
The BDEC system (Big data and extreme computing), which
is scheduled to be introduced to SCD/ITC in April 2021, is a
hierarchical, hybrid, heterogeneous (h3) system. The BDEC
is the platform for integration of simulation, data and
learning (S+D+L), and consists of computing nodes for
computational science, those for data science/machine
learning, and those for integration. The aggregated peak
performance of the BDEC system is expected to be 40+
PFLOPS, with an aggregated memory bandwidth of 5.0+ PB/sec.
It comprises three types of compute nodes: simulation
nodes (SIM, 90 % of total resources) for traditional
supercomputing applications; data/learning nodes (DL,
5 %) for data and learning; and integration nodes (INT,
5 %). The SIM and INT nodes must share the same
architecture, while that of the DL nodes can differ. Some of the DL nodes
would be connected to external resources (e.g. data storage,
servers, sensor networks, etc.) directly through an external
network (e.g., SINET, Japan). DL and INT would share a fast
file system (capacity: 4+ PB, bandwidth: 2+ TB/sec), while all
nodes share a large-scale file system (shared file system,
60+ PB, 500+ GB/sec).
3 h3-Open-BDEC: Innovative software platform for
integration of (S+D+L)
We develop an innovative software platform, h3-Open-BDEC,
for integration of (S+D+L), and evaluate the effects of this
integration on the BDEC (Figure.A.1). The h3-Open-BDEC is
designed to extract maximum performance from
supercomputers with minimum energy consumption, focusing
on (1) innovative methods for numerical analysis with high
performance, high reliability and power saving, based on new
principles of computing by adaptive precision, accuracy
verification and automatic tuning, and (2) a hierarchical
data-driven approach (hDDA) based on machine learning.
This work is supported by the Japanese Government from
FY.2019 to FY.2023 (JSPS Grant-in-Aid for Scientific
Research (S), P.I.: Kengo Nakajima (ITC/U.Tokyo)).
Figure.A.1 Overview of h3-Open-BDEC
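The adaptive-precision idea above can be illustrated with a minimal sketch: start an iterative solver in cheap single precision, then promote to double precision once the low-precision residual stalls. This is an illustrative toy (a Jacobi solver on a synthetic diagonally dominant system), not h3-Open-BDEC's actual implementation; all names here are hypothetical.

```python
import numpy as np

def jacobi_sweeps(A, b, x, iters, dtype):
    """Plain Jacobi iterations carried out in the given precision."""
    A = A.astype(dtype)
    b = b.astype(dtype)
    x = x.astype(dtype)
    D = np.diag(A)              # diagonal of A
    R = A - np.diagflat(D)      # off-diagonal part
    for _ in range(iters):
        x = (b - R @ x) / D
    return x.astype(np.float64)

def adaptive_jacobi(A, b, tol=1e-12, switch_tol=1e-4, chunk=50, max_chunks=100):
    """Run cheap float32 sweeps first; promote to float64 once the
    residual approaches single-precision limits (adaptive precision)."""
    x = np.zeros_like(b, dtype=np.float64)
    dtype = np.float32
    for _ in range(max_chunks):
        x = jacobi_sweeps(A, b, x, chunk, dtype)
        res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
        if res < tol:
            break
        if dtype is np.float32 and res < switch_tol:
            dtype = np.float64  # promote: float32 can no longer improve x
    return x

# Diagonally dominant test system (Jacobi converges for such matrices).
rng = np.random.default_rng(0)
n = 50
A = rng.random((n, n)) + n * np.eye(n)
b = rng.random(n)
x = adaptive_jacobi(A, b)
```

Most of the sweeps run in float32, which halves memory traffic per iteration; the float64 phase only polishes the last digits, which is the power-saving rationale behind adaptive precision.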
In the data-driven approach (DDA), machine learning is
introduced to predict the results of simulations with
different parameters. DDA generally requires many
simulations to generate training data. We propose the hDDA,
in which simplified models for generating training data are
constructed automatically by machine learning with feature
detection, model order reduction (MOR), uncertainty
quantification (UQ), sparse modelling and adaptive mesh
refinement (AMR).
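The surrogate idea behind hDDA can be sketched in miniature: a handful of runs of a model produces training data over a parameter range, and a fitted surrogate then predicts results for parameters that were never simulated. The "simulation" below is a closed-form stand-in, not a real seismic model, and the polynomial fit stands in for the ML/MOR machinery of hDDA.

```python
import numpy as np

# Stand-in for an expensive simulation: scalar response y(p) for a
# source parameter p (illustrative closed form, not a seismic code).
def expensive_simulation(p):
    return np.sin(3.0 * p) * np.exp(-0.5 * p)

# A small number of runs produces the training data.
p_train = np.linspace(0.0, 2.0, 15)       # parameters of the sampled runs
y_train = expensive_simulation(p_train)   # training data

# Surrogate: a simple least-squares polynomial fit over the range.
coeffs = np.polyfit(p_train, y_train, deg=8)
surrogate = np.poly1d(coeffs)

# Predict responses at parameters that were never simulated.
p_new = np.linspace(0.1, 1.9, 50)
max_err = np.max(np.abs(surrogate(p_new) - expensive_simulation(p_new)))
```

Once trained, each surrogate evaluation is essentially free compared with a full 3D run, which is where the claimed reduction in computation and power consumption comes from.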
The h3-Open-BDEC is the first innovative software platform
to realize integration of (S+D+L) on supercomputers in the
Exascale Era, where computational scientists can achieve
such integration without support from other experts. Source
codes and documents are open to the public for various kinds
of computational environments. This integration by
h3-Open-BDEC enables a significant reduction in computation
and power consumption compared with conventional
simulations.
The idea of h3-Open-BDEC is an extension of that of
ppOpen-HPC [232]. ppOpen-HPC was part of a (five + three)-year
project (FY.2011-2015, FY.2016-2018) supported by JST-CREST
and by DFG-SPPEXA in Germany.
Possible applications on the BDEC system with h3-Open-
BDEC are combined simulations/data assimilations for
climate/weather simulations and earthquake simulations,
and real-time disaster simulations, such as flood, earthquake
and tsunami.
4 System for real-time earthquake simulation with data
assimilation
The system conducts large-scale simulation of 3D seismic
wave propagation, and results are improved by real-time
data assimilation using observations and machine learning.
Observations of seismic activity at more than 2 000 points
in Japan are obtained in real time through SINET by JDXnet,
developed by ERI/U. Tokyo. Constructing a detailed and
accurate underground model is crucial for accurate
simulation; an optimized underground model is also
constructed by integration of (S+D+L).
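The correction step of data assimilation can be sketched with a single Kalman-style update: a biased model forecast is nudged toward noisy station observations, weighted by their relative reliability. This 1-D toy is only illustrative; the real system assimilates data from 2 000+ stations into a full 3D wave-propagation simulation, and all values below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# "True" ground motion at 100 stations (illustrative 1-D stand-in).
truth = np.sin(np.linspace(0.0, 2.0 * np.pi, 100))

# Model forecast with a systematic bias, and noisy real-time observations.
sigma_model, sigma_obs = 0.3, 0.05
simulated = truth + sigma_model                     # biased forecast
observed = truth + rng.normal(0.0, sigma_obs, 100)  # station data

# Kalman-style gain: weight observations by relative error variance.
gain = sigma_model**2 / (sigma_model**2 + sigma_obs**2)
analysis = simulated + gain * (observed - simulated)  # assimilated state

rmse_before = np.sqrt(np.mean((simulated - truth) ** 2))
rmse_after = np.sqrt(np.mean((analysis - truth) ** 2))
```

Because the observations are far more reliable than the biased forecast here, the gain is close to 1 and the assimilated state tracks the truth much more closely than the raw simulation does.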