
For a state-space model it is assumed that

A1) $ (\theta_{t}) $ is a Markov chain; that is, $ \theta_{t} $ and $ (\theta_{0},\ldots, \theta_{t-2}) $ are conditionally independent given $ \theta_{t-1} $.

A2) Conditionally on $ (\theta_{t}) $, the $ Y_{t} $'s are independent and $ Y_{t} $ depends only on $ \theta_{t} $.

The dynamic linear model is a class of state space models defined by:

$$ \theta_{t} = G\theta_{t-1} + w_{t} \quad \mbox{and} \quad Y_{t} = F\theta_{t} + v_{t}, $$ where $ w_{t} \sim N(0,W) $ and $ v_{t} \sim N(0, V) $. The matrices $ G $ and $ F $ are known and the two sequences $ (w_{t}) $ and $ (v_{t}) $ are independent. It is also assumed that $ \theta_{0} \sim N(m_{0}, C_{0}) $.
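To make the definition concrete, here is a minimal simulation sketch of these two equations; the particular choices of $ G $, $ F $, $ W $, $ V $, $ m_{0} $, $ C_{0} $ below are illustrative, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (not from the question): 2-dimensional state, scalar observation.
G = np.array([[1.0, 1.0],
              [0.0, 1.0]])          # state transition matrix
F = np.array([[1.0, 0.0]])          # observation matrix
W = 0.1 * np.eye(2)                 # state noise covariance
V = np.array([[0.5]])               # observation noise covariance
m0, C0 = np.zeros(2), np.eye(2)     # prior: theta_0 ~ N(m0, C0)

T = 50
theta = rng.multivariate_normal(m0, C0)
thetas, ys = [theta], []
for t in range(1, T + 1):
    theta = G @ theta + rng.multivariate_normal(np.zeros(2), W)  # theta_t = G theta_{t-1} + w_t
    y = F @ theta + rng.multivariate_normal(np.zeros(1), V)      # Y_t = F theta_t + v_t
    thetas.append(theta)
    ys.append(y)
```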

Questions:

1) How can I show that the dynamic linear model satisfies A1) and A2) with $ \theta_{t}\mid \theta_{t-1} \sim N(G\theta_{t-1}, W) $ and $ Y_{t}\mid \theta_{t} \sim N(F\theta_{t}, V) $? I guess it is fairly straightforward to deduce the means and variances, but why are they also normal?

2) How can I show that the random vector $ (\theta_{0},\ldots, \theta_{t}, Y_{1},\ldots, Y_{t}) $ has a Gaussian distribution for any $ t\geq 1 $? I know it has something to do with classic results about the multivariate normal distribution, but I'm not sure how to apply them. From this it would follow that the marginal and conditional distributions are also Gaussian, and since all the relevant distributions are Gaussian, it suffices to compute their means and covariances in the proof.

I have included a dependence structure (below) in which A1) and A2) are summarized: the graphical representation of the model can be used to deduce conditional independence properties of the random variables occurring in a state-space model. In fact, two sets of random variables, A and B, can be shown to be conditionally independent given a third set of variables, C, if and only if C separates A and B, i.e., if every path connecting one variable in A to one in B passes through C.

As an example, $ Y_{t} $ and $ (\theta_{0:t-1}, Y_{1:t-1}) $ are conditionally independent given $ \theta_{t} $. It follows that $$ p(Y_{t}\mid \theta_{0:t}, Y_{1:t-1}) = p(Y_{t}\mid \theta_{t}). $$

In a similar way, one can show that $ \theta_{t} $ and $ (\theta_{0:t-2}, Y_{1:t-1}) $ are conditionally independent given $ \theta_{t-1} $. A numerical check of the first of these statements is sketched after the figure below.

[Figure: graphical representation (DAG) of the dependence structure of the state-space model]
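The separation criterion can be checked numerically. The sketch below (with illustrative scalar parameters, not from the question) builds the joint covariance of $ (\theta_{0}, \theta_{1}, \theta_{2}, Y_{1}, Y_{2}) $ and verifies that the conditional cross-covariance of $ Y_{2} $ and $ (\theta_{0}, \theta_{1}, Y_{1}) $ given $ \theta_{2} $ vanishes, which for jointly Gaussian variables is equivalent to the conditional independence stated above.

```python
import numpy as np

# Illustrative scalar model: these numbers are arbitrary.
G, F, W, V, C0 = 0.9, 1.5, 0.3, 0.4, 1.0

# Write (theta0, theta1, theta2, Y1, Y2) as a linear map A of the
# independent Gaussian vector (theta0, w1, w2, v1, v2).
A = np.array([
    [1.0,      0.0,  0.0, 0.0, 0.0],  # theta0
    [G,        1.0,  0.0, 0.0, 0.0],  # theta1 = G*theta0 + w1
    [G**2,     G,    1.0, 0.0, 0.0],  # theta2 = G*theta1 + w2
    [F*G,      F,    0.0, 1.0, 0.0],  # Y1 = F*theta1 + v1
    [F*G**2,   F*G,  F,   0.0, 1.0],  # Y2 = F*theta2 + v2
])
D = np.diag([C0, W, W, V, V])         # variances of (theta0, w1, w2, v1, v2)
Sigma = A @ D @ A.T                   # joint covariance of (theta0, theta1, theta2, Y1, Y2)

# Conditional cross-covariance Cov(Y2, (theta0, theta1, Y1) | theta2):
# Sigma_ab - Sigma_ac Sigma_cc^{-1} Sigma_cb, with a = {Y2}, b = {theta0, theta1, Y1}, c = {theta2}.
a, b, c = [4], [0, 1, 3], [2]
cond_cross = (Sigma[np.ix_(a, b)]
              - Sigma[np.ix_(a, c)] @ np.linalg.inv(Sigma[np.ix_(c, c)]) @ Sigma[np.ix_(c, b)])
print(cond_cross)  # ~0: Y2 is conditionally independent of (theta0, theta1, Y1) given theta2
```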

Answer:

Question 1: $\theta_{t} = G\theta_{t-1} + w_{t}$. Since you condition on $\theta_{t-1}$, you can treat it as a known constant. So, conditional on $\theta_{t-1}$, $\theta_{t}$ is a constant plus a normally distributed random variable ($w_{t}$), and is therefore itself normally distributed, with mean $G\theta_{t-1}$ and covariance $W$. The same reasoning applies to $Y_{t}$ conditional on $\theta_{t}$.
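As a quick sanity check of this argument (a sketch with illustrative scalar values, not from the question): holding $\theta_{t-1}$ fixed, draws of $\theta_{t} = G\theta_{t-1} + w_{t}$ are simply $N(0, W)$ draws shifted by the constant $G\theta_{t-1}$, so their empirical mean and variance match $G\theta_{t-1}$ and $W$.

```python
import numpy as np

rng = np.random.default_rng(1)
G, W = 0.9, 0.3                    # illustrative scalar values
theta_prev = 2.0                   # conditioning on theta_{t-1}: treat it as a fixed constant

w = rng.normal(0.0, np.sqrt(W), size=100_000)   # w_t ~ N(0, W)
theta_t = G * theta_prev + w                    # theta_t | theta_{t-1}: constant + Gaussian

print(theta_t.mean(), G * theta_prev)           # empirical mean     ~ G * theta_{t-1}
print(theta_t.var(), W)                         # empirical variance ~ W
```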

Question 2: We know that $p(\theta_{t},\theta_{t-1}) = p(\theta_{t}\mid\theta_{t-1})\cdot p(\theta_{t-1})$. If the marginal on the rhs is Gaussian and the conditional is Gaussian with mean linear in $\theta_{t-1}$ and constant covariance, as is the case here, then the lhs is jointly Gaussian (see Product of Two Multivariate Gaussians Distributions). This conditional structure comes from the fact that the $w_{t}$ are i.i.d. and independent of $\theta_{0}$. The same reasoning applies to the $Y_{t}$, and iterating the argument (by induction on $t$) shows that the whole vector $(\theta_{0},\ldots,\theta_{t}, Y_{1},\ldots,Y_{t})$ is jointly Gaussian. Equivalently, every component of this vector is a linear function of the independent Gaussian variables $\theta_{0}, w_{1},\ldots,w_{t}, v_{1},\ldots,v_{t}$, and a linear map of a Gaussian vector is Gaussian.
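A sketch that makes this linear-map argument explicit, for a scalar model with illustrative parameters: stacking the defining equations expresses $(\theta_{0},\ldots,\theta_{t}, Y_{1},\ldots,Y_{t})$ as a matrix $A$ applied to the independent Gaussian vector $(\theta_{0}, w_{1},\ldots,w_{t}, v_{1},\ldots,v_{t})$, so the joint distribution is Gaussian with covariance $ADA^{\top}$.

```python
import numpy as np

def joint_cov(G, F, W, V, C0, t):
    """Covariance of (theta_0, ..., theta_t, Y_1, ..., Y_t) for a scalar DLM.

    The whole vector is a linear map A of the independent Gaussian vector
    (theta_0, w_1, ..., w_t, v_1, ..., v_t), hence jointly Gaussian with
    covariance A D A^T, where D holds the variances of the driving vector.
    """
    n = 2 * t + 1                        # theta_0, t state noises, t observation noises
    A = np.zeros((n, n))
    # Rows 0..t are the states: theta_i = G^i theta_0 + sum_{j=1}^{i} G^(i-j) w_j
    for i in range(t + 1):
        A[i, 0] = G ** i
        for j in range(1, i + 1):
            A[i, j] = G ** (i - j)
    # Rows t+1..2t are the observations: Y_i = F theta_i + v_i
    for i in range(1, t + 1):
        A[t + i, :t + 1] = F * A[i, :t + 1]
        A[t + i, t + i] = 1.0
    D = np.diag([C0] + [W] * t + [V] * t)
    return A @ D @ A.T

Sigma = joint_cov(G=0.9, F=1.5, W=0.3, V=0.4, C0=1.0, t=3)  # illustrative values
print(Sigma.shape)   # (7, 7): covariance of (theta_0, ..., theta_3, Y_1, ..., Y_3)
```

The joint mean is obtained the same way, as $A$ applied to $(m_{0}, 0, \ldots, 0)$; the matrix-valued case is analogous with block entries in place of the scalar powers of $G$ and $F$.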