Orthonormal Basis Construction and Optimal Design Criteria with Derivative Information
Description
In computer experiments where each run is costly, one may wish to construct a surrogate, f̂(x) = Ψᵀ(x)β, that approximates the true response f : Ω → R. Here Ψ(x) = (Ψ1(x), . . . , Ψk(x))ᵀ is a vector of basis functions and β is the vector of regression coefficients estimated by least squares. Choosing a design that makes the information matrix, M, nearly diagonal minimizes the aliasing errors in β. This can be accomplished by choosing a design whose empirical distribution approximates some distribution F and choosing the Ψl to be orthonormal with respect to this distribution, i.e. ∫_Ω Ψl(x)Ψm(x) dF(x) = δlm. If Ω = Ω1 × · · · × Ωd and F has independent marginals, then the Ψl might be chosen as tensor products of univariate orthonormal polynomials.
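As a concrete illustration of the orthonormality condition above, the following sketch (an assumption for illustration, not part of the talk) takes F to be the uniform distribution on [-1, 1], for which the normalized Legendre polynomials Ψn(x) = √(2n+1) Pn(x) are the standard orthonormal family, and verifies ∫ Ψl Ψm dF = δlm numerically with Gauss-Legendre quadrature:

```python
import numpy as np
from numpy.polynomial import legendre as L

# Assumed example: F uniform on [-1, 1], so dF = dx/2 and the
# normalized Legendre polynomials psi_n(x) = sqrt(2n+1) * P_n(x)
# are orthonormal with respect to F.
def psi(n, x):
    c = np.zeros(n + 1)
    c[n] = 1.0
    return np.sqrt(2 * n + 1) * L.legval(x, c)

# Gauss-Legendre nodes/weights: exact for the polynomial integrands below
nodes, weights = np.polynomial.legendre.leggauss(20)

def inner(l, m):
    # integral over [-1, 1] of psi_l(x) psi_m(x) dF(x), with dF = dx/2
    return 0.5 * np.sum(weights * psi(l, nodes) * psi(m, nodes))

# Gram matrix of the first four basis functions: should be the identity
G = np.array([[inner(l, m) for m in range(4)] for l in range(4)])
print(np.round(G, 6))
```

A bivariate tensor-product basis function is then simply Ψlm(x1, x2) = psi(l, x1) * psi(m, x2), which inherits orthonormality when F has independent uniform marginals.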
We consider the case where first derivatives of f are available, a case arising in the simulation of nuclear reactors. Derivative information improves the estimation of the response function, and computing the derivatives can be less time consuming than sampling an extra point. When derivative information is included, the inner product of the basis polynomials must be generalized to include derivatives. Moreover, tensor product polynomial bases are only possible under certain restrictions on the polynomial degree.
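One natural way to generalize the inner product to include derivatives is a Sobolev-type form ⟨f, g⟩ = ∫ fg dF + λ ∫ f′g′ dF (the talk's exact choice is not specified here; λ and the uniform F below are assumptions for illustration). The sketch re-orthonormalizes the Legendre family under this derivative-augmented inner product by Gram-Schmidt:

```python
import numpy as np
from numpy.polynomial import legendre as L

nodes, weights = np.polynomial.legendre.leggauss(30)

def sobolev_inner(c1, c2, lam=1.0):
    # Assumed derivative-augmented inner product on [-1, 1] with dF = dx/2:
    # <f, g> = int f g dF + lam * int f' g' dF
    f, g = L.legval(nodes, c1), L.legval(nodes, c2)
    df, dg = L.legval(nodes, L.legder(c1)), L.legval(nodes, L.legder(c2))
    return 0.5 * np.sum(weights * (f * g + lam * df * dg))

# Gram-Schmidt on Legendre coefficient vectors e_0, ..., e_3
# (i.e. on P_0, ..., P_3, which are NOT orthonormal under this inner product)
k = 4
ortho = []
for c in np.eye(k):
    c = c.astype(float)
    for q in ortho:
        c = c - sobolev_inner(c, q) * q
    c = c / np.sqrt(sobolev_inner(c, c))
    ortho.append(c)

# Gram matrix under the derivative-augmented inner product: identity
G = np.array([[sobolev_inner(a, b) for b in ortho] for a in ortho])
print(np.round(G, 6))
```

The resulting polynomials differ from the plain Legendre family precisely because the derivative term reweights higher degrees, which is one way to see why degree restrictions arise when one asks for a tensor-product structure.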
The talk will also discuss different types of optimality criteria and how these criteria change when derivative information is included in the model.
Event Topic:
Computational Mathematics & Statistics