
Towards Breaking the Curse of Dimensionality: Sparse Polynomial and Reduced Basis Methods for Saddle Point Problems with Random Inputs

  • Speaker: CHEN Peng (The University of Texas at Austin)

  • Time: Dec 28, 2017, 16:20-17:20

  • Location: Conference Room 518, Wisdom Valley 3#

Uncertainties are ubiquitous in the mathematical modeling and computational simulation of many scientific and engineering systems. In the probability framework, such uncertainties can often be modeled as random fields or stochastic processes, which can in turn be represented by countably infinite-dimensional random variables/parameters. Monte Carlo methods are widely applied to solve such problems. However, they suffer from slow convergence and become prohibitively expensive when large-scale partial differential equations (PDEs) must be solved many times. On the other hand, most classical fast-convergent methods face the curse of dimensionality, i.e., their complexity grows exponentially with the parameter dimension.
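The following minimal Python sketch (not from the talk) illustrates the Monte Carlo rate mentioned above: the sample-mean error decays like N^{-1/2} independently of the parameter dimension. The quantity_of_interest function is a hypothetical cheap stand-in for what would, in the setting of the talk, be a full PDE solve per sample.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 100                                     # number of random parameters (illustrative)
weights = 1.0 / np.arange(1, dim + 1) ** 2    # decaying influence of each parameter

def quantity_of_interest(y):
    """Cheap stand-in for a PDE-based quantity of interest Q(y)."""
    return np.exp(weights @ y)

# For y ~ N(0, I), E[exp(w . y)] = exp(||w||^2 / 2), so the exact mean is known here.
exact_mean = np.exp(0.5 * np.sum(weights ** 2))

for n_samples in [10**2, 10**3, 10**4, 10**5]:
    samples = rng.standard_normal((n_samples, dim))
    estimate = np.mean([quantity_of_interest(y) for y in samples])
    # The error shrinks roughly like n_samples**(-1/2), regardless of dim.
    print(n_samples, abs(estimate - exact_mean))
```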
In this talk, we present two classes of fast and scalable approximation methods, sparse polynomial and reduced basis methods, that can break the curse of dimensionality by exploiting the intrinsic low-dimensional structure of large-scale PDEs with infinite-dimensional inputs. In particular, we focus on saddle point PDEs with infinite-dimensional random coefficients. Under suitable assumptions on the sparsity of the random coefficients, we provide feasible constructions and prove dimension-independent convergence rates for both methods. These methods are promising for uncertainty quantification problems such as system prediction, control and optimization under uncertainty, parameter estimation, and optimal experimental design.
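To convey the flavor of the reduced basis idea, here is a minimal sketch under strong simplifying assumptions: a small affine-parametric symmetric positive definite system stands in for a discretized saddle point PDE, and the true approximation error is used where a practical method would use a cheap a posteriori error estimator. It is an illustrative toy, not the speaker's construction.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200                                    # "high-fidelity" dimension (illustrative)
G = 0.01 * rng.standard_normal((n, n))
A0 = 3.0 * np.eye(n) + G + G.T             # symmetric positive definite base operator
A1 = np.diag(np.linspace(0.0, 1.0, n))     # affine parametric components
A2 = np.diag(np.linspace(1.0, 0.0, n))
f = np.ones(n)

def solve_full(y):
    """High-fidelity solve of A(y) u = f with A(y) = A0 + y1*A1 + y2*A2."""
    return np.linalg.solve(A0 + y[0] * A1 + y[1] * A2, f)

def solve_reduced(V, y):
    """Galerkin projection of the same system onto the reduced space span(V)."""
    A = A0 + y[0] * A1 + y[1] * A2
    return V @ np.linalg.solve(V.T @ A @ V, V.T @ f)

train = rng.uniform(-1.0, 1.0, size=(50, 2))    # training set of parameter samples
V = np.empty((n, 0))                            # reduced basis, built greedily
for k in range(5):
    # Pick the training parameter currently worst approximated by the basis.
    errors = [np.linalg.norm(solve_full(y) - (solve_reduced(V, y) if V.shape[1] else 0.0))
              for y in train]
    y_star = train[int(np.argmax(errors))]
    # Enrich the basis with the corresponding snapshot and re-orthonormalize.
    V, _ = np.linalg.qr(np.column_stack([V, solve_full(y_star)]))
    print(f"basis size {V.shape[1]}, max training error {max(errors):.2e}")
```

In this toy the maximum training error drops rapidly with a handful of basis functions, which is the low-dimensional structure that the reduced basis method exploits; after the offline construction, each reduced solve costs only a small dense linear system instead of a full PDE solve.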
