Bayesian versus Frequentist Supervised Learning with Shape Constraints





Matthew Dixon, Assistant Professor, Department of Applied Math, Illinois Institute of Technology



A ubiquitous problem in surrogate modeling of dynamical systems is how to effectively satisfy shape constraints, which often originate from physical laws. Unfortunately, imposing shape constraints can violate the universal approximation theorem and consequently result in poor model performance. Furthermore, the tradeoffs between Bayesian and frequentist machine learning under shape constraints are little understood. In this talk, we shall demonstrate how to modify Gaussian Processes and Neural Networks to enforce shape constraints, highlighting relevant theory and methodological developments, some of which were developed by the authors. We present numerical results for financial derivative modeling and show how to construct an arbitrage-free approximation of the price or implied volatility surface. We conclude with a discussion of the relative merits of each approach. This is joint work with Stéphane Crépey (University of Paris) and Areski Cousin (University of Strasbourg).
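As a minimal illustration of the kind of shape-constrained model the abstract alludes to (a sketch, not the specific method presented in the talk), the following one-hidden-layer network is decreasing and convex in the strike by construction, mimicking the static no-arbitrage shape of a call price curve. All names and the synthetic data are hypothetical; positivity of the weights is enforced by squaring free parameters.

```python
import numpy as np
from scipy.optimize import minimize

def softplus(z):
    # Smooth, convex, increasing activation.
    return np.logaddexp(0.0, z)

def model(params, K, n_hidden=4):
    # With a_j = u_j**2 >= 0 and w_j = v_j**2 >= 0, each unit
    # a_j * softplus(b_j - w_j * K) is convex and decreasing in K,
    # and so is their sum plus a constant -- a shape constraint
    # built into the architecture rather than the loss.
    u, v, b, c = np.split(params, [n_hidden, 2 * n_hidden, 3 * n_hidden])
    return c[0] + (u**2) @ softplus(b[:, None] - (v**2)[:, None] * K)

# Hypothetical data: noisy call prices, roughly decreasing in strike.
rng = np.random.default_rng(0)
K = np.linspace(0.5, 1.5, 40)
prices = np.maximum(1.05 - K, 0.02) + 0.01 * rng.standard_normal(K.size)

loss = lambda p: np.mean((model(p, K) - prices) ** 2)
res = minimize(loss, rng.standard_normal(3 * 4 + 1), method="L-BFGS-B")
fit = model(res.x, K)

# The shape constraints hold by construction, whatever the fit quality:
assert np.all(np.diff(fit) <= 1e-10)      # decreasing in K
assert np.all(np.diff(fit, 2) >= -1e-10)  # convex in K
```

A Bayesian analogue would instead constrain a Gaussian Process, e.g. by conditioning on derivative-sign information, which is one of the tradeoffs the talk contrasts.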


This will be a one-hour talk with a 15-minute Q&A session to follow.


Mathematical Finance, Stochastic Analysis, and Machine Learning

