The Extreme-scale Discontinuous Galerkin Environment (EDGE) is a solver for hyperbolic partial differential equations with emphasis on seismic simulations. EDGE targets model setups with high geometric complexity and aims at increasing the throughput of extreme-scale ensemble simulations. The entire software stack is tailored to the execution of “fused” simulations, which make it possible to study multiple model setups within a single execution of the forward solver.

Disclaimer: We do not provide legal advice. The provided information is incomplete and is not a substitute for legal advice.

EDGE’s core is open-source under the permissive BSD 3-Clause license. Additionally, EDGE’s user and developer guides are CC0’d. Many of EDGE’s assets, e.g., example setups or scripts for benchmarks, are typically CC0’d or licensed under BSD 3-Clause as well.


Alexander Breuer
High Performance Geocomputing Lab (HPGeoC)
San Diego Supercomputer Center
University of California, San Diego
9500 Gilman Drive
La Jolla, CA 92093-0505


EDGE’s prototyping efforts have been supported since 2016 by the Intel Parallel Computing Center (Intel PCC) at the San Diego Supercomputer Center (SDSC). Support for spontaneous rupture simulations was initiated through the Southern California Earthquake Center (SCEC) contribution #16247: “Increasing the Efficiency of Dynamic Rupture Simulations by Concurrently Executed Forward Runs”. The development of nonlinear earthquake simulations is supported by SCEC contribution #18211: “Nonlinear Earthquake Simulations Through Robust and Accurate A Posteriori Sub-Cell Limiting”.

High performance computing resources are required for development, testing, and accurate simulations. EDGE ran in the Amazon Elastic Compute Cloud and on the Google Cloud Platform, as well as on Cori Phase 2, Theta, Stampede 2, Blue Waters, Comet, and Endeavour. This research was supported by the AWS Cloud Credits for Research program. This research used resources of the Google Cloud. This research used resources of the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. This research used resources of the Argonne Leadership Computing Facility (ALCF), which is a DOE Office of Science User Facility supported under Contract DE-AC02-06CH11357. This work used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1053575. This research is part of the Blue Waters sustained-petascale computing project, which is supported by the National Science Foundation (awards OCI-0725070 and ACI-1238993) and the state of Illinois. Blue Waters is a joint effort of the University of Illinois at Urbana-Champaign and its National Center for Supercomputing Applications.

EDGE relies heavily on the contributions of many authors to open-source software. This software includes, but is not limited to: ASan (memory error detection), Catch (unit tests), CGAL (surface meshing), Clang (compiler), Cppcheck (static code analysis), Easylogging++ (logging), ExprTk (expression parsing), GCC (compiler), Git (versioning), Git LFS (versioning), GitBook (documentation), Gmsh (volume meshing), GMT (DEM pre-processing), GoCD (continuous delivery), HDF5 (I/O), Jekyll (homepage), LIBXSMM (matrix kernels), METIS (partitioning), MOAB (mesh interface), NetCDF (I/O), ParaView (visualization), PROJ.4 (map projections), pugixml (XML interface), SAGA-Python (automated remote job submission), SCons (build tool), TF-MISFIT GOF CRITERIA (signal analysis), UCVMC (velocity model), Valgrind (memory debugging), VisIt (visualization).