Advances in Machine Learning: Applications in Time Series Prediction

Stuart School of Business research presentation by: Justin London, Stuart Management Science Ph.D. student

Time: -

Location: Virtual—Online

Abstract:

A new time series modeling framework for prediction, regime switching, and dynamic modeling using new types of recurrent neural networks (RNNs) is introduced. I introduce a new class of RNN models, the α-RNN and dynamic α_t-RNN, which use an exponential smoothing parameter α to avoid the estimation and memory problems described below. I also introduce a regime-switching RNN (RS-RNN), a new dynamic model with time series regime-switching capabilities. These advanced supervised learning methods overcome the parameter estimation and convergence problems of traditional econometric autoregression (AR) models, whose maximum likelihood estimation (MLE) and expectation-maximization (EM) methods are computationally expensive, assume linearity and Gaussian-distributed errors, and suffer from the curse of dimensionality. Because of these estimation problems and the limited number of lags that can be estimated, traditional AR and Markov-switching AR (MS-AR) models are limited in their ability to capture long memory and temporal dependencies. Plain RNNs, on the other hand, suffer from the vanishing gradient problem, which also limits their long-memory capability.
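The exponential smoothing idea behind the α-RNN can be illustrated with a short sketch. The snippet below is only an illustrative assumption (the function name, weight shapes, and the exact placement of the smoothing step are mine, not the presentation's definition): a plain tanh recurrent cell whose hidden state is exponentially smoothed with a parameter α, so that a smaller α retains more of the past and extends the cell's memory.

```python
import numpy as np

def alpha_smoothed_step(x_t, h_tilde_prev, Wx, Wh, b, alpha):
    """One recurrent step with an exponentially smoothed hidden state.

    h_t is the usual tanh recurrence; h_tilde_t blends it with the
    previous smoothed state, so alpha controls how quickly the past
    is forgotten (alpha close to 0 keeps a long memory).
    """
    h_t = np.tanh(Wx @ x_t + Wh @ h_tilde_prev + b)          # plain RNN update
    h_tilde_t = alpha * h_t + (1.0 - alpha) * h_tilde_prev   # exponential smoothing
    return h_tilde_t

# Toy usage: run a random univariate sequence through the smoothed cell.
rng = np.random.default_rng(0)
d_in, d_h, alpha = 1, 8, 0.3
Wx = rng.normal(size=(d_h, d_in))
Wh = 0.1 * rng.normal(size=(d_h, d_h))
b = np.zeros(d_h)
h = np.zeros(d_h)
for x in rng.normal(size=(50, d_in)):
    h = alpha_smoothed_step(x, h, Wx, Wh, b, alpha)
```

In the dynamic α_t-RNN described in the abstract, this smoothing coefficient is allowed to vary over time rather than being a single fixed constant.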

These new models have long-memory capabilities, can handle non-linear dynamics, and do not necessarily require stationary data or Gaussian error distributions. They make no assumptions about the data generating process (DGP) and better capture long-term temporal dependencies, leading to higher forecasting and prediction accuracy than traditional econometric models and plain RNNs. At the same time, the partial autocorrelation function (PACF) and classical econometric tools, such as the augmented Dickey-Fuller (ADF) and Ljung-Box tests and the Akaike information criterion (AIC), can be used to diagnose serial correlation and to determine the optimal sequence lag order, which can greatly influence prediction performance on test data. I show that such traditional econometric tools can provide interpretability to ML models and bridge econometrics with ML.
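As a sketch of how such classical diagnostics might be used to choose the sequence lag order fed to an ML model, the snippet below (an illustrative pipeline of my own, assuming statsmodels and a univariate series, not the presentation's code) runs the ADF test, inspects the PACF, selects an AR order by AIC, and checks the fitted residuals with the Ljung-Box test.

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller, pacf
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.ar_model import AutoReg

def diagnose_lag_order(series, max_lag=24):
    """Suggest a sequence lag order using classical econometric diagnostics."""
    series = np.asarray(series, dtype=float)

    # ADF unit-root test: a small p-value supports stationarity.
    _, adf_pvalue, *_ = adfuller(series)

    # PACF: lags outside the approximate 95% confidence band are candidate orders.
    band = 1.96 / np.sqrt(len(series))
    pacf_vals = pacf(series, nlags=max_lag)
    pacf_lags = [k for k in range(1, max_lag + 1) if abs(pacf_vals[k]) > band]

    # AIC: fit AR(p) for each candidate order and keep the minimizer.
    aic = {p: AutoReg(series, lags=p).fit().aic for p in range(1, max_lag + 1)}
    best_p = min(aic, key=aic.get)

    # Ljung-Box on the residuals of the chosen fit: large p-values
    # indicate little remaining serial correlation.
    resid = AutoReg(series, lags=best_p).fit().resid
    lb_pvalue = float(acorr_ljungbox(resid, lags=[10], return_df=True)["lb_pvalue"].iloc[0])

    return {"adf_pvalue": adf_pvalue, "pacf_lags": pacf_lags,
            "aic_lag": best_p, "ljung_box_pvalue": lb_pvalue}
```

The selected lag order could then serve as the input sequence length for the RNN models, which is one way the abstract's bridge between econometric diagnostics and ML interpretability might look in practice.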

Model and memory comparisons of the α-RNN are made to echo state networks (ESNs) with leaky neurons and to exponentially smoothed RNNs (ES-RNNs). I show that these neural networks can be analyzed in a stochastic process framework rather than a general neural network framework. The models are evaluated in numerical experiments on numerous data sets, including energy, wind power, crude oil, natural gas, commodity, and high-frequency data, to establish their empirical validity. It is shown that the new models often outperform plain RNNs and other econometric models, and that the bias-variance tradeoff may not hold in certain cases.


All Illinois Tech faculty, students, and staff are invited to attend.

The Friday Research Presentations series showcases ongoing academic research projects conducted by Stuart School of Business faculty and students, as well as guest presentations by Illinois Tech colleagues, business professionals, and faculty from other leading business schools.

Join session on Blackboard Collaborate