Fast Automatic Bayesian Cubature Using Matching Kernels and Designs

Time:

-

Location:

RE 036

Speaker: 

Jagadeeswaran Rathinavel, Illinois Tech Ph.D. candidate (AMAT)

Description: 

Automatic cubatures approximate integrals to user-specified error tolerances. For high-dimensional problems it is difficult to adaptively change the sampling pattern, but one can automatically determine the sample size, $n$, given a reasonable, fixed sampling pattern. We take this approach here from a Bayesian perspective. We postulate that the integrand is an instance of a Gaussian stochastic process parameterized by a constant mean and a covariance kernel defined by a scale parameter times a parameterized function specifying how the integrand values at two different points in the domain are related. These hyperparameters are inferred or integrated out using integrand values via one of three techniques: empirical Bayes, full Bayes, or generalized cross-validation. The sample size, $n$, is increased until the half-width of the credible interval for the Bayesian posterior mean is no greater than the error tolerance. The process outlined above typically incurs a computational cost of $O(N_{\text{opt}} n^3)$, where $N_{\text{opt}}$ is the number of optimization steps required to identify the hyperparameters. Our innovation is to pair low-discrepancy nodes with matching covariance kernels to lower the computational cost to $O(N_{\text{opt}} n \log n)$. This approach is demonstrated explicitly for two constructions: 1) rank-1 lattice sequences with shift-invariant kernels, and 2) Sobol' nodes with Walsh kernels. Our algorithm is implemented in the Guaranteed Automatic Integration Library (GAIL).
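
To make the lattice case concrete, below is a rough Python sketch of the doubling loop described above; it is not the GAIL implementation. It simplifies the model to a zero prior mean with only the empirical-Bayes scale, uses a Gaussian ~99% quantile for the credible interval, keeps the kernel hyperparameter fixed (no $N_{\text{opt}}$ optimization loop), and takes an arbitrary placeholder generating vector z rather than the extensible lattice sequences used in practice. The point it illustrates is the cost reduction: because the kernel is shift-invariant and the nodes form a rank-1 lattice, the Gram matrix is circulant, so a single FFT of its first column yields the eigenvalues and every quantity in the stopping rule in $O(n \log n)$ operations.

    import numpy as np

    def lattice_points(n, z):
        # Rank-1 lattice nodes x_i = (i * z mod n) / n, i = 0, ..., n-1.
        i = np.arange(n)[:, None]
        return (i * np.asarray(z)[None, :] % n) / n

    def kernel_first_column(x, eta=1.0):
        # First column of the Gram matrix for the shift-invariant product kernel
        # K(x, t) = prod_j (1 + eta * B2({x_j - t_j})), B2(u) = u^2 - u + 1/6.
        # On a rank-1 lattice the Gram matrix is circulant, so this one column
        # (the kernel evaluated against the first node, the origin) determines it.
        b2 = x * x - x + 1.0 / 6.0
        return np.prod(1.0 + eta * b2, axis=1)

    def bayes_cubature_lattice(f, z, tol, n0=64, n_max=2**20, quantile=2.58):
        # Double n until the credible-interval half-width is no greater than tol.
        # Simplifications vs. the talk: zero prior mean, empirical-Bayes scale only,
        # Gaussian ~99% quantile, fixed kernel hyperparameter.
        n = n0
        while True:
            x = lattice_points(n, z)
            y = f(x)                                        # integrand values at the nodes
            lam = np.fft.fft(kernel_first_column(x)).real   # circulant eigenvalues, O(n log n)
            y_tilde = np.fft.fft(y)
            s2 = np.sum(np.abs(y_tilde) ** 2 / lam) / n**2  # empirical-Bayes scale y^T G^{-1} y / n
            mu = n * y.mean() / lam[0]                      # posterior mean of the integral
            # Posterior variance factor 1 - n/lam[0] is nonnegative in exact arithmetic;
            # the max(..., 0) only guards against round-off.
            half_width = quantile * np.sqrt(s2 * max(1.0 - n / lam[0], 0.0))
            if half_width <= tol or n >= n_max:
                return mu, half_width, n
            n *= 2

    # Hypothetical usage: a smooth 3-d integrand with exact integral 0;
    # z is an arbitrary placeholder generating vector, not a published one.
    mu, hw, n = bayes_cubature_lattice(
        lambda x: np.cos(2 * np.pi * x).sum(axis=1), z=[1, 364981, 245389], tol=1e-3)

In the talk's second pairing, Sobol' nodes with Walsh kernels, the fast Walsh-Hadamard transform plays the role the FFT plays above, and the hyperparameter search ($N_{\text{opt}}$ steps) wraps around these transform-based computations.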

Event Topic:

Computational Mathematics & Statistics
