Enabling integration across research infrastructures for accelerating science

April 30, 2025

Part 4 of 5: General Atomics: Higher Fidelity Plasma Reconstructions and Particle Tracking Across Facilities for DIII-D

The experimental DIII-D National Fusion Facility is connected with the advanced computing resources at the National Energy Research Scientific Computing Center and Argonne Leadership Computing Facility via ESnet.

The DIII-D National Fusion Facility is a U.S. Department of Energy user facility that is discovering and optimizing key science and technology solutions for fusion energy commercialization. The DIII-D tokamak, a donut-shaped fusion device that confines plasma with strong magnetic fields, is the largest operating tokamak in North America. Its world-leading flexibility and comprehensive diagnostic suite enable researchers to explore complex plasma physics questions, and the extensive resulting datasets provide insights that advance fusion.

The DIII-D program is leveraging Globus to streamline and automate its analysis pipelines, significantly enhancing the efficiency and responsiveness of its research workflows. With help from Argonne National Laboratory and Lawrence Berkeley National Laboratory, the DIII-D team, led by General Atomics developers, has implemented two workflows that perform higher-fidelity reconstructions and enable near-real-time particle tracking within the DIII-D research program.

The Consistent Automatic Kinetic Equilibria (CAKE) workflow triggers reconstructions in lockstep with the pulsed operation of the DIII-D tokamak. By leveraging high-performance computing (HPC) resources, CAKE rapidly analyzes data as it is collected, delivering timely insights into kinetic equilibria that allow scientists to make more informed experimental decisions and better understand their data. The workflow uses Globus Flows and Globus Compute to run remote OMFIT processes at the National Energy Research Scientific Computing Center (NERSC), facilitated by a Globus App credential that securely links service accounts at DIII-D and NERSC. Results are moved efficiently between Perlmutter's scratch and persistent storage via Globus Transfer, and reprocessing tasks are interleaved throughout the day to maintain a continuous flow of data analysis.
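To make the pipeline concrete, the sketch below shows what a CAKE-style Globus Flow definition could look like: a Compute step that runs a reconstruction at an HPC site, chained to a Transfer step that archives results from scratch to persistent storage. This is an illustrative sketch only; all endpoint, collection, and function IDs, the paths, and the state names are hypothetical placeholders, and the actual DIII-D flow definition is not published in this article.

```python
# Hypothetical Globus Flow definition in the Amazon-States-Language-style
# format Globus Flows uses. IDs, paths, and action URLs are placeholders,
# not the real DIII-D/NERSC configuration.
flow_definition = {
    "Comment": "Run a kinetic-equilibrium reconstruction, then archive results",
    "StartAt": "RunReconstruction",
    "States": {
        "RunReconstruction": {
            # Invoke a registered Globus Compute function at the HPC site.
            "Type": "Action",
            "ActionUrl": "https://compute.actions.globus.org",
            "Parameters": {
                "endpoint": "NERSC_COMPUTE_ENDPOINT_ID",   # placeholder ID
                "function": "OMFIT_CAKE_FUNCTION_ID",      # placeholder ID
                "kwargs": {"shot_number.$": "$.input.shot_number"},
            },
            "ResultPath": "$.reconstruction",
            "Next": "ArchiveResults",
        },
        "ArchiveResults": {
            # Move outputs from scratch to persistent storage via Transfer.
            "Type": "Action",
            "ActionUrl": "https://transfer.actions.globus.org/transfer",
            "Parameters": {
                "source_endpoint": "SCRATCH_COLLECTION_ID",        # placeholder
                "destination_endpoint": "ARCHIVE_COLLECTION_ID",   # placeholder
                "DATA": [
                    {
                        "source_path.$": "$.input.scratch_path",
                        "destination_path.$": "$.input.archive_path",
                    }
                ],
            },
            "End": True,
        },
    },
}
```

A flow like this would be registered once with the Globus Flows service and then started for each shot, with the shot number and paths supplied as run input.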

The IonOrb particle-following code generates a heat map of deposited high-energy particles on the walls of the tokamak (blue). This information is crucial for protecting sensitive equipment and preventing excessive impurity generation.

Another innovative workflow, initially developed by General Atomics and now available to the entire DIII-D team, is IonOrb, which has reduced the time required to simulate energy deposition for each shot from 4.5 hours to just 7 minutes. IonOrb uses Globus Compute to invoke a GPU-accelerated code that calculates particle trajectories within the tokamak and identifies where the particles strike the vessel. This rapid computation can run on resources at either the Argonne Leadership Computing Facility (ALCF) or NERSC, giving researchers swift feedback. During experiments, the workflow dynamically updates an interactive view of any impacted instruments. When fully integrated, the code could alert researchers to potential instrument damage in real time, supporting changes in operation for machine protection.
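The remote-invocation pattern IonOrb relies on can be sketched with the Globus Compute SDK. The trajectory function below is a deliberately simplified stand-in (a straight-line particle in a circular vessel), not the actual GPU-accelerated IonOrb kernel, and the endpoint ID is a placeholder; the sketch only illustrates how a function is shipped to an ALCF or NERSC endpoint and its result returned.

```python
import math

def trace_particle(x0, y0, vx, vy, dt=1e-3, wall_radius=1.0, max_steps=100_000):
    """Toy stand-in for a particle-trajectory kernel: advance a particle in
    straight-line motion until it crosses a circular "vessel wall" and report
    the impact point. The real IonOrb code models tokamak fields on GPUs;
    this placeholder only illustrates the call interface."""
    x, y = x0, y0
    for _ in range(max_steps):
        x += vx * dt
        y += vy * dt
        if math.hypot(x, y) >= wall_radius:
            return {
                "impact_x": x,
                "impact_y": y,
                "impact_angle": math.atan2(y, x),
            }
    return None  # particle never reached the wall within max_steps

# Remote execution via Globus Compute (requires an authenticated session and
# a registered endpoint; the endpoint ID below is a placeholder):
#
# from globus_compute_sdk import Executor
# with Executor(endpoint_id="ALCF_OR_NERSC_ENDPOINT_ID") as ex:
#     future = ex.submit(trace_particle, 0.0, 0.0, 1.0, 0.5)
#     print(future.result())
```

Because Globus Compute returns a future, many particles (or whole shots) can be submitted concurrently and the heat map assembled as results stream back.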

These Globus-integrated workflows have accelerated research progress at the DIII-D National Fusion Facility and, once fully implemented, have the potential to enhance machine protection and operational precision. Available within the DOE Integrated Research Infrastructure to all participants in the DIII-D program, they enable researchers to rapidly advance the science and technology needed to commercialize fusion energy.

Contributors: S.P. Smith, Z. Xing, T. Amara, S. Denk, W. DeShazer, O. Meneghini, T. Neiser, L. Stephey, O. Antepara, M. Clark, E. Dart, P. Ding, S. Flanagan, R. Nazikian, D. Schissel, C. Simpson, N. Tyler, T. Uram, S. Williams, M. Kostuk, J. Colmenares, A. Deshpande, N. Logan, R. Chard