A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion

Author(s): Lee, Seungjoon; Kevrekidis, Yannis G.; Karniadakis, George Em

To refer to this page use: http://arks.princeton.edu/ark:/88435/pr1hg3f
Full metadata record
Field: Value (Language)
dc.contributor.author: Lee, Seungjoon
dc.contributor.author: Kevrekidis, Yannis G.
dc.contributor.author: Karniadakis, George Em
dc.date.accessioned: 2021-10-08T19:58:17Z
dc.date.available: 2021-10-08T19:58:17Z
dc.date.issued: 2017-09-01 (en_US)
dc.identifier.citation: Lee, S., Kevrekidis, I.G., Karniadakis, G.E. (2017). A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion. Journal of Computational Physics, 344, 516–533. doi:10.1016/j.jcp.2017.05.021 (en_US)
dc.identifier.issn: 0021-9991
dc.identifier.uri: http://arks.princeton.edu/ark:/88435/pr1hg3f
dc.description.abstract: © 2017 Elsevier Inc. Exascale-level simulations require fault-resilient algorithms that are robust against repeated and expected software and/or hardware failures during computations, which may otherwise render the simulation results unsatisfactory. If each processor can share some global information about the simulation from a coarse, limited-accuracy but relatively costless auxiliary simulator, we can effectively fill in the missing spatial data at the required times, on the fly, by a statistical learning technique, multi-level Gaussian process regression; this was demonstrated in previous work [1]. Building on that work, we also employ another (nonlinear) statistical learning technique, Diffusion Maps, which detects computational redundancy in time and hence accelerates the simulation by projective time integration, giving the overall computation a "patch dynamics" flavor. Furthermore, we are now able to perform information fusion with multi-fidelity and heterogeneous data (including stochastic data). Finally, we lay the foundations of a new framework in CFD, called patch simulation, that combines information fusion techniques from, in principle, simulations at multiple fidelities and resolutions (and even experiments) with a new adaptive time-step refinement technique. We present two benchmark problems (the heat equation and the Navier–Stokes equations) to demonstrate the new capability that statistical learning tools can bring to traditional scientific computing algorithms. For each problem, we rely on heterogeneous and multi-fidelity data, either from a coarse simulation of the same equation or from a stochastic, particle-based, more "microscopic" simulation. As such "auxiliary" models we consider a Monte Carlo random walk for the heat equation and a dissipative particle dynamics (DPD) model for the Navier–Stokes equations. More broadly, in this paper we demonstrate the symbiotic and synergistic combination of statistical learning, domain decomposition, and scientific computing in exascale simulations. (en_US)
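The gap-filling idea sketched in the abstract — conditioning a Gaussian process on surviving field values to reconstruct data lost on a failed processor — can be illustrated with a minimal, self-contained example. This is a hypothetical single-level sketch, not the authors' multi-level implementation: the function names (`rbf_kernel`, `gp_fill`), the squared-exponential kernel, and the toy sine field are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(xa, xb, length=0.2, var=1.0):
    # Squared-exponential covariance between two sets of 1D points.
    d = xa[:, None] - xb[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_fill(x_obs, y_obs, x_miss, noise=1e-6):
    # Posterior mean of a zero-mean GP conditioned on observations:
    #   m(x*) = K(x*, X) [K(X, X) + noise * I]^{-1} y
    K = rbf_kernel(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf_kernel(x_miss, x_obs)
    return Ks @ np.linalg.solve(K, y_obs)

# Toy failure scenario: the patch x in (0.4, 0.6) is lost; the field
# (here sin(2*pi*x)) is reconstructed from the surviving samples.
x = np.linspace(0.0, 1.0, 51)
y = np.sin(2 * np.pi * x)
alive = (x < 0.4) | (x > 0.6)   # data outside the failed patch
y_hat = gp_fill(x[alive], y[alive], x[~alive])
print("max reconstruction error in gap:",
      float(np.max(np.abs(y_hat - y[~alive]))))
```

In the paper's setting the observations would instead mix coarse auxiliary-simulator data with the surviving fine-grid data across fidelity levels, but the conditioning step is of this same form.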
dc.format.extent: 516–533 (en_US)
dc.language.iso: en_US (en_US)
dc.relation.ispartof: Journal of Computational Physics (en_US)
dc.rights: Author's manuscript (en_US)
dc.title: A resilient and efficient CFD framework: Statistical learning tools for multi-fidelity and heterogeneous information fusion (en_US)
dc.type: Journal Article (en_US)
dc.identifier.doi: doi:10.1016/j.jcp.2017.05.021
dc.identifier.eissn: 1090-2716
pu.type.symplectic: http://www.symplectic.co.uk/publications/atom-terms/1.0/journal-article (en_US)

Files in This Item:
File: A_resilient_CFD_framework_heterogeneous_fusion.pdf (1.54 MB, Adobe PDF)


Items in OAR@Princeton are protected by copyright, with all rights reserved, unless otherwise indicated.