State space models (SSM)¶
An SSM is just a hidden Markov model, except the hidden states are continuous:
\(z_t = g(u_t, z_{t-1}, \epsilon_t)\)
\(y_t = h(z_t, u_t, \delta_t)\)
where:
\(z_t\) is the hidden state
\(u_t\) is an optional input or control signal
\(y_t\) is the observation
\(g\) is the transition model
\(h\) is the observation model
\(\epsilon_t\) is the system noise at time \(t\)
\(\delta_t\) is the observation noise at time \(t\).
Our goal is to recursively estimate the belief state \(p(z_t \mid y_{1:t}, u_{1:t})\).
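As a sketch of the generative process, here is a minimal simulation, assuming a hypothetical scalar model with additive Gaussian noise standing in for \(\epsilon_t\) and \(\delta_t\) (the `g = np.cos` and `h = z**2` choices are purely illustrative, and the control input \(u_t\) is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ssm(g, h, z0, T, sys_std=0.1, obs_std=0.5):
    """Roll an SSM forward: z_t = g(z_{t-1}) + eps_t, y_t = h(z_t) + delta_t."""
    zs, ys = [], []
    z = z0
    for _ in range(T):
        z = g(z) + rng.normal(0.0, sys_std)   # transition model + system noise
        y = h(z) + rng.normal(0.0, obs_std)   # observation model + observation noise
        zs.append(z)
        ys.append(y)
    return np.array(zs), np.array(ys)

# Hypothetical non-linear transition and observation models, for illustration only
zs, ys = simulate_ssm(g=np.cos, h=lambda z: z**2, z0=0.0, T=100)
```

Inference then means recovering (a distribution over) the hidden trajectory `zs` from the observations `ys` alone.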
Linear-Gaussian SSM (LG-SSM), a.k.a. linear dynamical system (LDS)¶
The transition model is a linear function: \(z_t = A_t z_{t-1} + B_t u_t + \epsilon_t\)
The observation model is a linear function: \(y_t = C_t z_t + D_t u_t + \delta_t\)
The system noise is Gaussian: \(\epsilon_t \sim \mathcal{N}(0, Q_t)\)
The observation noise is Gaussian: \(\delta_t \sim \mathcal{N}(0, R_t)\)
If all the parameters \(\theta_t = (A_t, B_t, C_t, D_t, Q_t, R_t)\) are independent of time, the model is called stationary.
LG-SSM supports exact inference. The initial belief state is a Gaussian, \(p(z_1) = \mathcal{N}(z_1 \mid \mu_{1|0}, \Sigma_{1|0})\).
All subsequent belief states are also Gaussian, \(p(z_t \mid y_{1:t}) = \mathcal{N}(z_t \mid \mu_{t|t}, \Sigma_{t|t})\).
The Kalman filtering algorithm can compute these quantities efficiently.
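A minimal sketch of the Kalman filter in the notation above, assuming no control input (i.e. \(B_t = D_t = 0\)) and stationary parameters; the 1D random-walk demo model at the bottom is an illustrative choice, not from the text:

```python
import numpy as np

def kalman_filter(ys, A, C, Q, R, mu0, Sigma0):
    """Return filtered means and covariances of p(z_t | y_{1:t})."""
    mu, Sigma = mu0, Sigma0
    means, covs = [], []
    for y in ys:
        # Predict: push the current belief through the linear dynamics
        mu_pred = A @ mu
        Sigma_pred = A @ Sigma @ A.T + Q
        # Update: correct the prediction with the new observation y_t
        S = C @ Sigma_pred @ C.T + R             # innovation covariance
        K = Sigma_pred @ C.T @ np.linalg.inv(S)  # Kalman gain
        mu = mu_pred + K @ (y - C @ mu_pred)
        Sigma = (np.eye(len(mu)) - K @ C) @ Sigma_pred
        means.append(mu)
        covs.append(Sigma)
    return np.array(means), np.array(covs)

# Demo: track a 1D Gaussian random walk observed with noise
rng = np.random.default_rng(0)
A = np.eye(1); C = np.eye(1); Q = 0.01 * np.eye(1); R = 0.25 * np.eye(1)
z, ys = np.zeros(1), []
for _ in range(50):
    z = A @ z + rng.multivariate_normal(np.zeros(1), Q)
    ys.append(C @ z + rng.multivariate_normal(np.zeros(1), R))
means, covs = kalman_filter(ys, A, C, Q, R, np.zeros(1), np.eye(1))
```

Each iteration is one predict/update cycle, so the whole filter is \(O(T)\) in the number of observations, with a cost per step that is cubic in the state dimension.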
Application of SSM¶
Object tracking
Robotic SLAM (simultaneous localization and mapping; essentially the basis for robotic vacuums)
Time-series forecasting: the popular ARMA model can be viewed as a form of SSM; e.g. an AR(1) model \(y_t = a y_{t-1} + \epsilon_t\) is already an SSM with \(z_t = y_t\), \(A = a\), \(C = 1\).
Non-Linear and non-Gaussian SSM¶
Many models are non-linear, or have non-Gaussian noise. The posterior is then no longer Gaussian, so we have to approximate the belief state \(p(z_t \mid y_{1:t})\).
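One common approximation in this setting is the bootstrap particle filter, which represents the belief state by a cloud of weighted samples. A minimal sketch, assuming the same hypothetical scalar model as above (\(z_t = \cos z_{t-1} + \epsilon_t\), \(y_t = z_t^2 + \delta_t\), Gaussian noise); the model choice is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_pf(ys, n_particles=500, sys_std=0.5, obs_std=0.5):
    """Bootstrap particle filter for z_t = cos(z_{t-1}) + eps_t, y_t = z_t**2 + delta_t."""
    z = rng.normal(0.0, 1.0, n_particles)  # samples from the prior over z_0
    means = []
    for y in ys:
        # Propagate particles through the (non-linear) transition model
        z = np.cos(z) + rng.normal(0.0, sys_std, n_particles)
        # Weight each particle by the observation likelihood p(y_t | z_t)
        w = np.exp(-0.5 * ((y - z**2) / obs_std) ** 2)
        w /= w.sum()
        means.append(w @ z)  # weighted estimate of E[z_t | y_{1:t}]
        # Resample to avoid weight degeneracy
        z = rng.choice(z, size=n_particles, p=w)
    return np.array(means)

# Demo: generate observations from the model, then filter them
z_true, ys = 0.0, []
for _ in range(30):
    z_true = np.cos(z_true) + rng.normal(0.0, 0.5)
    ys.append(z_true**2 + rng.normal(0.0, 0.5))
means = bootstrap_pf(ys)
```

Unlike the Kalman filter, this makes no linearity or Gaussianity assumptions, at the cost of sampling error that shrinks only as the particle count grows.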