Approximate Gaussian conjugacy: parametric recursive filtering under nonlinearity, multimode, uncertainty, and constraint, and beyond
Author(s): Tian-cheng Li, Jin-ya Su, Wei Liu, Juan M. Corchado
Affiliation(s): School of Sciences, University of Salamanca, Salamanca 37007, Spain
Corresponding email(s): t.c.li@usal.es, J.Su2@lboro.ac.uk, w.liu@sheffield.ac.uk, corchado@usal.es
Key Words: Kalman filter, Gaussian filter, time series estimation, Bayesian filtering, nonlinear filtering, constrained filtering, Gaussian mixture, maneuver, unknown inputs
Abstract: Since the landmark work of R. E. Kalman in the 1960s, considerable efforts have been devoted to time series state space models for a large variety of dynamic estimation problems. In particular, parametric filters that seek exact analytical estimates based on closed-form Markov-Bayes recursion, e.g., recursion from a Gaussian or Gaussian mixture (GM) prior to a Gaussian/GM posterior (termed Gaussian conjugacy in this paper), form the backbone of general time series filter design. Due to challenges arising from nonlinearity, multimode (including target maneuver), intractable uncertainties (such as unknown inputs and/or non-Gaussian noises), and constraints (including circular quantities), new theories, algorithms, and technologies are continuously being developed in order to maintain, or more precisely to approximate, such a conjugacy. They have in large part driven the development of time series parametric filters over the last six decades. This paper reviews the state-of-the-art in distinctive categories and highlights some insights that may otherwise be overlooked. In particular, specific attention is paid to nonlinear systems with very informative observations, multimodal systems including Gaussian mixture posteriors and maneuvers, intractable unknown inputs, and constraints, to fill the voids in existing reviews/surveys. To go beyond a pure review, we also provide some new thoughts on alternatives to the first-order Markov transition model and on filter evaluation with regard to computing complexity.
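As a minimal illustration of the Gaussian conjugacy described in the abstract: for a scalar linear-Gaussian model, the Markov-Bayes recursion maps a Gaussian prior to a Gaussian posterior in closed form. The sketch below is illustrative only; the model parameters (F, Q, H, R) are arbitrary choices, not values from the paper.

```python
import numpy as np

def kf_step(m, P, y, F=1.0, Q=0.5, H=1.0, R=1.0):
    """One closed-form Gaussian-to-Gaussian (conjugate) recursion step
    for the scalar linear model x_k = F x_{k-1} + v,  y_k = H x_k + w."""
    # Predict: Gaussian prior N(m, P) -> Gaussian predictive N(m_pred, P_pred)
    m_pred = F * m
    P_pred = F * P * F + Q
    # Update: Bayes' rule keeps the posterior exactly Gaussian
    S = H * P_pred * H + R          # innovation variance
    K = P_pred * H / S              # Kalman gain
    m_post = m_pred + K * (y - H * m_pred)
    P_post = (1.0 - K * H) * P_pred
    return m_post, P_post

# Simulate the model and run the recursion
rng = np.random.default_rng(1)
x, m, P = 0.0, 0.0, 1.0
for _ in range(50):
    x = x + rng.normal(0.0, np.sqrt(0.5))   # process noise v ~ N(0, Q)
    y = x + rng.normal(0.0, 1.0)            # observation noise w ~ N(0, R)
    m, P = kf_step(m, P, y)
```

Note how the posterior variance converges to the fixed point of the scalar Riccati recursion (0.5 for these parameters): the recursion never leaves the Gaussian family, which is exactly the conjugacy that nonlinearity, multimodality, and constraints break.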
10 Highlights presented in the paper:
The CRLB (Cramér-Rao lower bound) limits only the variance of unbiased estimators; a lower MSE (mean squared error) can be obtained by allowing a bias in the estimator, provided the overall estimation error is thereby reduced.
The KF (Kalman filter) is conditionally biased when the process noise realization in the given [deterministic] state sequence is non-zero, and is not an efficient estimator in the conditional sense, even in a linear and Gaussian system.
Among all possible distributions of the observation noise $\mathbf{w}$ with a fixed covariance matrix, the CRLB for $\mathbf{x}$ attains its maximum when $\mathbf{w}$ is Gaussian, i.e., the Gaussian scenario is the ``worst-case'' for estimating $\mathbf{x}$.
For sufficiently precise measurements, none of the KF variants, including the KF itself, are based on an accurate approximation of the joint density. Conversely, for imprecise measurements all KF variants accurately approximate the joint density, and therefore the posterior density. Differences between the KF variants become evident for moderately precise measurements.
While the BCRLB (Bayesian Cramér-Rao lower bound) sets the best line (in the MMSE sense) that any unbiased sequential estimator can at best achieve, the O2 inference sets the bottom line that any ``effective'' estimator should at worst achieve.
Many adaptive-model approaches proposed for MTT (maneuvering target tracking) may show superiority when the target indeed maneuvers, but perform disappointingly, or even significantly worse than approaches without an adaptive model, when there is actually no maneuver. We call this over-reaction due to adaptability.
The theoretically best achievable second order error performance, namely the CRLB, in target state estimation is independent of knowledge (or the lack of it) of the observation noise variance.
Robust filtering is much more related to robustness with respect to statistical variations than it is to optimality with respect to a specified statistical model. Typically, the worst case estimation error rather than the MSE needs to be minimized in a robust filter. As a result, robustness is usually achieved by sacrificing the performance in terms of other criteria such as MSE and computing efficiency.
The standard structure of recursive filtering is that of an infinite impulse response (IIR) filter: all observations prior to the present time affect the current state estimate, and the filter therefore suffers from legacy errors.
Computing speed matters!
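The ``worst-case'' highlight above can be checked numerically: for a location parameter observed in additive noise of fixed variance, the Fisher information is 1/σ² for Gaussian noise but 2/σ² for Laplace noise of the same variance, so the Gaussian CRLB is the larger of the two. A Monte Carlo sketch (illustrative only, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0            # common noise variance for both distributions
n = 200_000

# Gaussian noise: score d/dx log p(y - x) = w / sigma^2
w_g = rng.normal(0.0, np.sqrt(sigma2), n)
I_gauss = np.mean((w_g / sigma2) ** 2)          # Fisher info, approx 1/sigma^2

# Laplace noise with the same variance: var = 2 b^2  =>  b = sqrt(sigma2 / 2)
b = np.sqrt(sigma2 / 2.0)
w_l = rng.laplace(0.0, b, n)
I_lap = np.mean((np.sign(w_l) / b) ** 2)        # score is sign(w)/b, so 1/b^2

crlb_gauss, crlb_lap = 1.0 / I_gauss, 1.0 / I_lap
# crlb_gauss ~ sigma2, crlb_lap = sigma2 / 2: Gaussian is the worst case
```

With sigma2 = 4, the Gaussian CRLB is about 4 while the Laplace CRLB is exactly 2, confirming the highlight for this pair of distributions.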
open access page: http://www.jzus.zju.edu.cn/iparticle.php?doi=10.1631/FITEE.1700379
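The IIR highlight can likewise be made concrete: unrolling a steady-state scalar recursion m_k = (1-K) m_{k-1} + K y_k shows that every past observation retains a geometrically decaying, but never zero, weight in the current estimate, which is why legacy errors persist. A sketch with an assumed gain K (illustrative only):

```python
import numpy as np

# Steady-state scalar KF with F = H = 1: m_k = (1-K) m_{k-1} + K y_k.
# Unrolling shows every past observation keeps a geometrically decaying,
# but never zero, weight -- the IIR structure behind "legacy errors".
K = 0.5                              # assumed steady-state gain
rng = np.random.default_rng(2)
y = rng.normal(0.0, 1.0, 30)

m = 0.0
for yk in y:
    m = (1.0 - K) * m + K * yk       # recursive (IIR) form

# Equivalent unrolled form: y_i enters m with weight K * (1-K)^(n-1-i)
n = len(y)
weights = K * (1.0 - K) ** np.arange(n)[::-1]
m_unrolled = np.dot(weights, y)
```

The recursive and unrolled estimates agree, and the smallest weight is strictly positive: the oldest observation never fully leaves the estimate.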