Atherosclerotic cardiovascular disease is the leading global cause of death, accounting for more than 17.3 million deaths per year, a number expected to exceed 23.6 million by 2030. Personalized computational models of the coronary arteries can be used to analyze how a narrowing (stenosis) of a blood vessel alters the associated hemodynamics. HARVEY is a lattice Boltzmann fluid dynamics code, developed and optimized in collaboration with the Randles research group at Duke University, that pushes the boundaries of circulatory modeling in arterial geometries.
Our study explored whether and how HARVEY can make the transition from the parallelism of homogeneous message-passing interface systems to heterogeneous architectures with the portable performance needed to reach exascale. To achieve the necessary scale and precision in circulatory models with explicit fluid coupled to deformable cell models, our simulation code must be designed to efficiently use current and next-generation, high-performance computing architectures. We assessed and redesigned the code's core data structure to prepare for accelerated GPU-based systems, such as Lawrence Livermore National Laboratory's Sierra, as well as next-generation architectures such as the Aurora exascale system. Ultimately, we demonstrated the potential for significant advances in circulatory simulation capability through the use of heterogeneous architectures. Our research paves the way to increased capability, performance, and portability of the code.
This study leveraged the Laboratory's significant expertise in exascale computing and its core competencies in high-performance computing, simulation, and data science. The results of this project support the Laboratory Director's predictive biology initiative by advancing technologies that combine biosciences and high-performance computing to solve human health challenges. Our findings regarding the best use of heterogeneous computing and optimized data storage support DOE and NNSA goals to advance the science and technology that drives each organization's missions forward.
Ames, J. et al. 2019. "Low-Overhead In Situ Visualization Using Halo Replay." The 9th IEEE Symposium on Large Data Analysis and Visualization (LDAV), Vancouver, BC, Canada, October 2019. LLNL-CONF-787574.
Herschlag, G. et al. 2019. "Multi-Physics Simulations of Particle Tracking in Arterial Geometries with a Scalable Moving Window Algorithm." IEEE Cluster, Albuquerque, NM, September 2019. LLNL-CONF-795181.
Gounley, J. et al. 2019. "Immersed Boundary Method Halo Exchange in a Hemodynamics Application." International Conference on Computational Science (ICCS), Faro, Algarve, Portugal, June 2019. LLNL-CONF-776797.
Vardhan, M. et al. 2019. "Moment Representation in the Lattice Boltzmann Method on Massively Parallel Hardware." The International Conference for High Performance Computing, Networking, Storage, and Analysis (Supercomputing 2019), Denver, CO, November 2019. LLNL-CONF-773341.
Lawrence Livermore National Laboratory • 7000 East Avenue • Livermore, CA 94550
Operated by Lawrence Livermore National Security, LLC, for the Department of Energy's National Nuclear Security Administration.