Recurrent Neural Networks and Universal Approximation of Bayesian Filters

Publication Type:
Conference Proceeding
Citation:
Proceedings of Machine Learning Research, 2023, 206, pp. 6956-6967
Issue Date:
2023-01-01
We consider the Bayesian optimal filtering problem: that is, estimating conditional statistics of a latent time-series signal from an observation sequence. Classical approaches typically rely on assumed or estimated transition and observation models. Instead, we formulate a generic recurrent neural network framework and seek to directly learn a recursive mapping from observational inputs to the desired estimator statistics. The main focus of this article is the approximation capability of this framework. We provide approximation error bounds for filtering in general non-compact domains. We also establish strong time-uniform approximation error bounds that guarantee good long-time performance. We discuss and illustrate a number of practical concerns and implications of these results.
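As a minimal illustrative sketch (not taken from the paper, all model parameters hypothetical): for a scalar linear-Gaussian state-space model, the steady-state Kalman filter is itself a fixed linear recursion from observations to estimates, i.e. exactly the kind of recursive mapping the RNN framework would seek to learn directly, without being told the transition and observation models.

```python
import random

# Illustrative scalar linear-Gaussian model (parameters chosen arbitrarily):
#   x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)   (latent signal)
#   y_t = x_t + v_t,          v_t ~ N(0, r)   (observations)
# Its steady-state Kalman filter is the fixed recursion
#   h_t = (1 - k) * a * h_{t-1} + k * y_t,
# a one-unit linear "RNN" with hand-set weights.
a, q, r = 0.9, 0.1, 0.5

# Steady-state predicted variance: fixed point of the Riccati recursion.
p = 1.0
for _ in range(500):
    p = a * a * (p - p * p / (p + r)) + q
k = p / (p + r)  # steady-state Kalman gain

# Simulate the latent signal and its noisy observations.
rng = random.Random(0)
T = 20000
x, y = [0.0], [0.0]
for _ in range(T - 1):
    x.append(a * x[-1] + rng.gauss(0.0, q ** 0.5))
    y.append(x[-1] + rng.gauss(0.0, r ** 0.5))

# Run the recursion over the observation sequence.
h = [0.0]
for t in range(1, T):
    h.append((1 - k) * a * h[-1] + k * y[t])

mse_filter = sum((hi - xi) ** 2 for hi, xi in zip(h, x)) / T
mse_raw = sum((yi - xi) ** 2 for yi, xi in zip(y, x)) / T
print(mse_filter, mse_raw)  # the recursive estimate improves on raw observations
```

In the paper's setting, the weights of such a recursion are not hand-derived from a known model but learned from data, and the approximation results quantify how well a generic RNN can match the optimal filter, including uniformly over time.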