Multifidelity Strategies in Uncertainty Quantification: an overview of some recent trends in sampling-based approaches


In recent decades, advances in computer hardware and architectures and in scientific computing algorithms have enabled engineers and scientists to study and design complex systems more rapidly by relying heavily on numerical simulations. The increased need for predictive numerical simulations has sharpened the requirement for an accurate quantification of simulation errors, beyond the more classical algorithmic verification activities. As a consequence, Uncertainty Quantification (UQ) has been introduced as a task that allows for a formal characterization and propagation of physical and numerical uncertainty through computational codes in order to obtain statistics of the system's response. Despite recent efforts and successes in advancing the efficiency of UQ algorithms, the simultaneous combination of a large number of uncertain parameters (which often correlates with the complexity of the numerical/physical assumptions) and the lack of regularity of the system's response still represents a formidable challenge for UQ. One possible way of circumventing these difficulties is to rely on sampling-based approaches, which are generally robust, easy to implement, and possess a rate of convergence that is independent of the number of parameters. However, for many years the extreme computational cost of these methods prevented their widespread use for UQ in the context of high-fidelity simulations. More recently, several multilevel/multifidelity Monte Carlo strategies have been proposed to decrease the Monte Carlo cost without penalizing its accuracy. Several different versions of multifidelity methods exist, but they all share the same main idea: whenever a set/cloud/sequence of system realizations with varying accuracy can be obtained, it is often more efficient to fuse data coming from all of them instead of relying on the highest-fidelity model only. In this talk we summarize our recent efforts in investigating novel ways of increasing the efficiency of these multifidelity approaches. We will provide several theoretical and numerical results, and we will discuss a collection of numerical examples ranging from simple analytical/verification test cases to more complex and realistic engineering systems.
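To make the data-fusion idea concrete, the following is a minimal Python sketch of a two-fidelity control-variate Monte Carlo estimator of a mean. The model functions f_hi and f_lo, the sample sizes, and the covariance-based weight are illustrative assumptions only, not the specific estimators discussed in the talk.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for an expensive high-fidelity model and a cheap
# low-fidelity model of the same quantity of interest.
def f_hi(x):
    return np.exp(x) + 0.1 * np.sin(5.0 * x)

def f_lo(x):
    return 1.0 + x + 0.5 * x**2  # crude surrogate of exp(x)

N_hi, N_lo = 100, 10_000         # few expensive runs, many cheap ones
x_hi = rng.standard_normal(N_hi)
x_lo = rng.standard_normal(N_lo)

y_hi = f_hi(x_hi)
y_lo_paired = f_lo(x_hi)         # low fidelity evaluated at the SAME inputs
y_lo = f_lo(x_lo)                # plus a much larger independent cheap set

# Control-variate weight estimated from the sample covariance of the
# paired runs (a sample-based estimate, which introduces a small bias).
alpha = np.cov(y_hi, y_lo_paired)[0, 1] / np.var(y_lo_paired, ddof=1)

# Two-fidelity estimator: the high-fidelity sample mean is corrected by
# the discrepancy between the two low-fidelity sample means.
mu_mf = y_hi.mean() + alpha * (y_lo.mean() - y_lo_paired.mean())
print(f"two-fidelity estimate: {mu_mf:.4f}  (plain MC: {y_hi.mean():.4f})")

The cheap model is evaluated both at the high-fidelity inputs, to estimate the correction weight, and on a much larger independent sample; the stronger the correlation between the two models, the larger the variance reduction over plain Monte Carlo at the same high-fidelity budget.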

18/09/2020

BIO

Gianluca Geraci is a Senior Member of Technical Staff in the Optimization and Uncertainty Quantification Department at Sandia National Laboratories in Albuquerque, New Mexico. Gianluca received his Master's Degree in Aeronautical Engineering (with a major in aerodynamics) from Politecnico di Milano, Italy, in 2010, with a thesis on hybrid finite element/finite volume schemes for compressible flows in curvilinear coordinates supervised by Prof. Alberto Guardone. He completed a PhD in Applied Mathematics and Scientific Computing in 2013, working under the direction of Prof. Remi Abgrall and Dr. Pietro Marco Congedo on intrusive finite volume multiresolution schemes for UQ at the French Institute for Research in Computer Science and Automation (INRIA) in Bordeaux, France. After his PhD he moved to the USA to join the Center for Turbulence Research at Stanford University as a Postdoctoral Fellow, where he worked on efficient UQ strategies for particle-laden turbulent flows in radiation environments within the Predictive Science Academic Alliance Program (PSAAP II) project directed by Prof. Gianluca Iaccarino. At the end of 2016 he moved to Sandia National Laboratories in Albuquerque, where his current research interests span the broad areas of theoretical and algorithmic development of multifidelity approaches for UQ and optimization under uncertainty, along with their deployment to realistic engineering applications.


