
Efficient Uncertainty Quantification via Sparse Representation

Hermann G. Matthies, Alexander Litvinenko, Dishi Liu, Elmar Zander
Institute of Scientific Computing
Technische Universität Braunschweig, Brunswick, Germany
wire@tu-bs.de, http://www.wire.tu-bs.de

Overview

1. Uncertainty and stochastic models
2. General problem description
3. Solution in tensor product space
4. Model reduction and sparse representation
5. Low-rank representation and algorithms
6. Conclusion

Sources of Uncertainty

Mechanical/physical systems may contain uncertain elements, as some details are not precisely known:

• Actions on the system from the rest of the world (the surrounding environment).
• The system itself may contain only incompletely known parameters, processes, or fields (not possible or too costly to measure).
• There may be small, unresolved scales in the model; they act as a kind of background noise.

All these items introduce some uncertainty into the model.

Ontology and Modelling

A bit of ontology. Uncertainty may be

• aleatoric, which means random and not reducible (action from outside is often in this category), or
• epistemic, which means due to incomplete knowledge (uncertainty in the system or model is often of this kind).

Stochastic models give quantitative information about uncertainty and are used here to model both types.

Possible areas of use: reliability, heterogeneous materials, upscaling, incomplete knowledge of details, uncertain [inter-]action with the environment, random loading, etc.

Probability

What is probability? We may understand probability as

• A mathematical concept: the theory of a finite measure.
• Applicable to aleatoric phenomena, i.e. frequencies of occurrence (Bernoulli's law of large numbers).
• Applicable also to epistemic concepts: an extension of Aristotelian propositional logic to uncertain propositions (Cox's theorem). This is the realm of Bayesian ideas and methods.

Exclusive application to the first area is today often labelled classical or frequentist; Bernoulli and Laplace had both in mind.

General Problem Description

Consider a mechanical/physical system (stationary for simplicity)

    A(u) = f,   u ∈ U, f ∈ F,

where A models the system, F is the space of actions/loadings, and U is the dual space of possible states of the system.

Solution is usually by first discretising,

    A(u) = f,   u ∈ U_N ⊂ U, f ∈ F_N ⊂ F,

and then running an (iterative) numerical solution process

    u_{k+1} = Φ(u_k),   lim_{k→∞} u_k = u.

The process is efficient if Φ can be evaluated efficiently and if the number of iterations k can be kept low.

Model with Uncertainties

With uncertainties modelled by an appropriate probability space (Ω, A, P):

    A[ω](u(ω)) = f(ω)   a.s. in ω ∈ Ω.

The state u(ω) is a U-valued random variable (RV), viewed as an element of the tensor product S ⊗ U =: W, with S a space of R-valued RVs. Similarly after discretisation,

    A[ω](u(ω)) = f(ω)   a.s. in ω ∈ Ω;

assume {v_j}_{j=1}^N is a basis of U_N. Then the approximate solution in S ⊗ U_N,

    u(ω) = Σ_{j=1}^N u_j(ω) v_j,

is represented by N real-valued RVs u_j(ω), which are to be approximated.

Direct Integration / Sampling Solution

Builds on the fact that ultimately E(Ψ(u)) is wanted:

    E(Ψ(u)) = ∫_Ω Ψ(ω, u(ω)) P(dω) ≈ Σ_{z=0}^Z w_z Ψ(ω_z, u(ω_z)).

Pick points {ω_z}_{z=0}^Z (e.g. by Monte Carlo), and for each ω_z run the solution process

    u_{k+1}(ω_z) = Φ[ω_z](u_k(ω_z)),

giving u(ω_z) = Σ_j u_j(ω_z) v_j = Σ_j u_j^z v_j. (Usually u(ω_z) is discarded after use in the integration.)

The random state is represented by the collection [u(ω_0), ..., u(ω_Z)], or by the tensor u_N^Z := {u_j^z}_{z=0,...,Z; j=1,...,N}.
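The quadrature formula above, with Monte Carlo points and equal weights w_z = 1/(Z+1), can be sketched in a few lines. The toy solver and quantity of interest Ψ below are illustrative stand-ins for the full iterative process Φ, not from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

def solve(omega):
    # Stand-in for the converged iterative solver u(omega) = lim Phi[omega](u_k);
    # here a cheap closed-form "response" of the system to omega.
    return np.array([np.sin(omega), np.cos(omega)])

def psi(u):
    # Quantity of interest Psi(u), here a norm of the state.
    return np.linalg.norm(u)

# Monte Carlo quadrature: Z+1 samples omega_z, equal weights w_z = 1/(Z+1)
Z = 9999
omegas = rng.standard_normal(Z + 1)
estimate = np.mean([psi(solve(w)) for w in omegas])
```

Each sample requires one full solver run, which is exactly why the samples u(ω_z) are usually discarded immediately after they enter the quadrature sum.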
Solution by Emulation

Emulation is the replacement of an expensive simulation by an inexpensive approximation (alias response surfaces, proxy/surrogate models, etc.).

Choose a subspace S_B ⊂ S with basis {X_β}_{β=0}^B and make the ansatz u_j(ω) ≈ Σ_β u_j^β X_β(ω) for each u_j, giving

    u(ω) = Σ_{j,β} u_j^β X_β(ω) v_j = Σ_{j,β} u_j^β X_β(ω) ⊗ v_j.

The solution lies in the tensor product W_{B,N} := S_B ⊗ U_N ⊂ S ⊗ U = W. The random state u(ω) is represented by the tensor u_N^B := {u_j^β}_{β=0,...,B; j=1,...,N}.

Tensor Product Structure

The story does not end here, as one may choose S = ⊗_m S_m, approximated by S_B = ⊗_{m=1}^M S_{B_m} with S_{B_m} ⊂ S_m. The solution is then represented as a tensor of grade M+1 in W_{B,N} = ⊗_{m=1}^M S_{B_m} ⊗ U_N, but that is a story for another talk. For a higher-grade tensor product structure more reduction is possible; here we stay with M = 1, i.e. grade 2.

Sparse representation entails:

• reducing u_N^B := [u_j^β] to the important information u ≈ u_N^B,
• never storing all of u_N^B, but only u,
• operating efficiently on the sparse representation u.

Computation of Coefficients

The coefficients may be computed by neural nets, Gaussian process emulation, kriging, stochastic collocation, stochastic Galerkin, ...

Weighted residual method with weighting functions {Y_γ(ω)}_{γ=1}^B:

    ∀γ: E((f − A(u_N^B)) Y_γ) = 0.

Often X_α(ω) = H_α(ω), Norbert Wiener's polynomial chaos expansion (PCE) with Hermite polynomials in Gaussian variables; many extensions exist (gPCE, ME-gPCE, wavelets, Fourier, ...).

In stochastic collocation (and many other methods) one takes Y_z(ω) = δ(ω − ω_z); in stochastic Galerkin one takes Y_β(ω) = X_β(ω).

Model Reduction / Discretisation

On the continuous level, discretisation is the choice of a subspace W_{B,N} := S_B ⊗ U_N ⊂ S ⊗ U = W and, importantly for computation, of a basis in it.
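A minimal sketch of computing PCE coefficients, here by least-squares regression on samples rather than by the Galerkin projection; the scalar model, sample count, and polynomial degree are illustrative assumptions:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

def model(omega):
    # Stand-in for one expensive solver run u(omega); scalar for simplicity.
    return np.exp(0.3 * omega)

# Sample Gaussian inputs omega_z and evaluate the model there
omegas = rng.standard_normal(200)
u_samples = model(omegas)

# Design matrix of probabilists' Hermite polynomials He_0 .. He_B
B = 5
Psi = hermevander(omegas, B)                      # shape (200, B+1)

# Least-squares fit of the PCE coefficients u^beta
coeffs, *_ = np.linalg.lstsq(Psi, u_samples, rcond=None)

def surrogate(omega):
    # Cheap emulator u(omega) ~ sum_beta u^beta He_beta(omega)
    return hermevander(np.atleast_1d(omega), B) @ coeffs
```

Once fitted, the surrogate replaces the solver in sampling or integration at negligible cost per evaluation.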
On the discrete level, reduced models find a subspace W_R ⊂ W_{B,N} with smaller dimensionality dim W_R = R ≪ B × N. They can work on S_B or U_N, or both.

Different approaches to choosing the reduced model:

• before the solution process (e.g. proper generalised decomposition);
• after the solution process (essentially data compression);
• during the solution, computing solution and reduction simultaneously.

Low-Rank Approximation

Focus on the array of numbers u_N^B := [u_j^β], which one may view as the matrix

    Σ_{β=0}^B Σ_{j=1}^N u_j^β e_β ⊗ e_j

with unit vectors e_β, e_j. The sum has M = (B+1) · N terms, the number of entries in u_N^B. An approximation

    Σ_{β=0}^B Σ_{j=1}^N u_j^β e_β ⊗ e_j ≈ Σ_{ℓ=1}^R y_ℓ ⊗ w_ℓ

is called a rank-R representation. It contains only R · (B+1+N) numbers.

Use in Sampling Solution I

Example: UQ computations of RANS flow around airfoils.

Use in Sampling Solution II

Here the reduction is achieved by truncating the Karhunen-Loève expansion (alias singular value decomposition (SVD), proper orthogonal decomposition (POD)). Relative errors and memory requirements:

    rank R | pressure | turb. kin. energy | memory [MB]
    10     | 1.9e-2   | 4.0e-3            |  21
    20     | 1.4e-2   | 5.9e-3            |  42
    50     | 5.3e-3   | 1.5e-4            | 104

Each tensor is in R^{260000×600}; dense matrix format costs 1.25 GB. The data come from 600 MC simulations, with the SVD updated every 10 samples.

Use in Time-Space Randomness I

Example: UQ computations of time-dependent shallow-water flow.
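For a grade-2 tensor (a matrix), the rank-R representation is exactly a truncated SVD. The sketch below builds a matrix with decaying singular values and compares error and storage; the sizes are illustrative, far smaller than the 260000 × 600 tensors of the airfoil example:

```python
import numpy as np

rng = np.random.default_rng(2)

# A (B+1) x N "tensor" u with rapidly decaying singular values
Bp1, N = 100, 80
U0, _ = np.linalg.qr(rng.standard_normal((Bp1, Bp1)))
V0, _ = np.linalg.qr(rng.standard_normal((N, N)))
s = 2.0 ** -np.arange(min(Bp1, N))        # prescribed decaying spectrum
u = U0[:, :N] * s @ V0.T

# Rank-R truncation: keep the R largest singular triples
R = 10
U, S, Vt = np.linalg.svd(u, full_matrices=False)
u_R = (U[:, :R] * S[:R]) @ Vt[:R]

rel_err = np.linalg.norm(u - u_R) / np.linalg.norm(u)
full_storage = Bp1 * N                    # (B+1) * N numbers
lowrank_storage = R * (Bp1 + N)           # R * (B+1+N) numbers
```

With fast spectral decay, a small R already gives a small relative error while storing only a fraction of the entries, which is the effect exploited in the tables above.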
A 1:50 scale model of the Toce river (Italy).

Time-Space Randomness II

[Figure: topography in the model, with random elevation.]

Time-Space Randomness III

[Figure: 5% exceedance water level.]

Use in Emulation

Example: a diffusion equation with random conductivity, solved via stochastic Galerkin. With discretisation by finite elements and PCE, the solution process u_{k+1} = Φ(u_k) may be written as

    u_{k+1} = u_k − Ξ(u_k) = u_k − Σ_{i=1}^M Y_i ⊗ G_i(u_k).

With u_0 = Σ_{j=1}^{R_0} y_{0,j} ⊗ g_{0,j}, this gives

    u_1 = Σ_{j=1}^{R_0} y_{0,j} ⊗ g_{0,j} − Σ_{i=1}^M Y_i(u_0) ⊗ G_i(u_0).

Truncated Iteration

If iteration and rank truncation are alternated, the rank stays low. Rank truncation is done by an updated SVD:

    û_{k+1} = Σ_{j=1}^{R_k} y_{k,j} ⊗ g_{k,j} − Σ_{i=1}^M Y_i(u_k) ⊗ G_i(u_k),
    u_{k+1} = T(û_{k+1}).

It can be shown that the truncated iteration converges until stagnation for

• super-linearly convergent iterative processes (Hackbusch, Tyrtyshnikov);
• linearly convergent processes, with an enlarged stagnation range (Zander).

Truncation Accuracy

[Figure: accuracy of the k-term tensor approximation.]

Iteration Accuracy

[Figure: convergence of the truncated iteration; relative error ||u_ε − u|| / ||u|| over iterations 0-70, for truncation tolerances ε from 1e-2 down to 1e-13.]

Number of Iterations

[Figure: iteration count of the truncated iteration as a function of log10(ε).]

Conclusion

• Stochastic calculations produce huge amounts of data.
• For efficiency, try to use sparse representations throughout: the ansatz in tensor products, as well as storage of the solution, the residuum, and the iterator in tensor products; sparse grids for integration.
• This works also for non-linear problems and solvers.
• This works also for time-dependent problems.
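The truncated iteration described earlier can be sketched on a simple matrix equation: below, a linearly convergent Richardson iteration for A X + X B = F with diagonal A, B alternates with an SVD-based truncation operator T. The equation, step size, and tolerance are illustrative assumptions, not the stochastic Galerkin operator of the talk:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 60

# Lyapunov-type equation A X + X B = F with SPD diagonal A, B, rank-1 F
a = 1.0 + rng.random(n)                   # diagonal entries of A
b = 1.0 + rng.random(n)                   # diagonal entries of B
F = np.outer(rng.standard_normal(n), rng.standard_normal(n))

def truncate(X, eps):
    # Truncation operator T: drop singular values below eps * sigma_max
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    R = max(1, int(np.sum(s > eps * s[0]))) if s[0] > 0 else 1
    return (U[:, :R] * s[:R]) @ Vt[:R]

# Truncated Richardson iteration: X_{k+1} = T(X_k - alpha*(A X_k + X_k B - F))
alpha = 1.0 / (a.max() + b.max())
X = np.zeros((n, n))
for k in range(200):
    residual = a[:, None] * X + X * b[None, :] - F
    X = truncate(X - alpha * residual, 1e-10)

# The exact solution is X_ij = F_ij / (a_i + b_j)
X_exact = F / (a[:, None] + b[None, :])
rel_err = np.linalg.norm(X - X_exact) / np.linalg.norm(X_exact)
```

The iterate stays low-rank throughout, and the error decreases linearly until it stagnates near the truncation tolerance, matching the qualitative behaviour reported for linearly convergent processes.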
