
First, let me clarify what I mean by a strictly stationary time series. Let $(X_t)_{t\in \mathbb{Z}}$ be a sequence of random variables on some probability space. We call $(X_t)$ strictly stationary if $$(X_t, X_{t+1},\ldots,X_{t+h}) \stackrel{d}{=} (X_s, X_{s+1},\ldots,X_{s+h})$$ for every $s,t \in \mathbb{Z}$ and every $h \in \mathbb{Z}_{\geq 0}$.
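For example, an i.i.d. sequence with common law $\mu$ is strictly stationary: for any $s,t,h$, both vectors above have the same product law $\mu^{\otimes (h+1)}$ on $\mathbb{R}^{h+1}$.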

Next, I will explain what I mean by a linear transform. Given a sequence of reals $(\psi_j)_{j\in \mathbb{Z}}$ such that $\sum_j\lvert\psi_j\rvert < \infty$, the linear transform of $(X_t)$, which we denote by $(Y_t)$, is defined as

$$Y_t = \sum_j\psi_jX_{t-j}$$

Now suppose that the sum above is well-defined, i.e. $\sum_j\psi_jX_{t-j}$ converges to a finite limit almost surely. ($E\lvert X_t\rvert < \infty$ would be sufficient for this, but we don't assume that.)
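Concretely, the two-sided sum can be read as the almost sure limit of its symmetric partial sums, $$Y_t = \lim_{M\to\infty}\sum_{\lvert j\rvert \le M}\psi_j X_{t-j} \quad \text{a.s.},$$ which is the convention matched by the truncation arguments below.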

How do I then show that $(Y_t)$ is also strictly stationary? Intuitively speaking, $$\sum_j\psi_jX_{t-j} \stackrel{d}{=} \sum_j\psi_jX_{s-j}$$ since $(X_t)$ is strictly stationary. But shifting a finite sequence is not quite the same as "shifting an infinite sequence". How do I make this rigorous?

Calculon
  • you can clamp all the $X_t$ with the function $f_M(x) = x$ if $|x| < M$, $f_M(x) = \pm M$ otherwise; clearly $Y_t(M) = \sum_j \psi_j f_M(X_{t-j})$ is stationary, and when $M \to \infty$, $Y_t(M)$ converges in distribution to $Y_t$ – reuns Apr 24 '16 at 21:52
  • @user1952009 What do you mean by "clamp all the.."? – Calculon Apr 24 '16 at 21:55
  • it's not clear? and I guess if there are other ways, they won't be fundamentally different. and what you call a linear transform is called a linear time-invariant transformation: https://en.wikipedia.org/wiki/LTI_system_theory (a.k.a. a filtering) – reuns Apr 24 '16 at 22:03

2 Answers


Similar to what @user1952009 was saying, we can fix a positive integer $M$ and define the truncated (finite) linear transformation:

$$(Y_t^M,\dots,Y_{t+h}^M) = (\sum_{|j|\le M}\psi_jX_{t-j},\dots, \sum_{|j|\le M}\psi_jX_{t+h-j})$$

For each fixed $M$, this vector is a fixed measurable (indeed continuous) function of the finite vector $(X_{t-M},\dots,X_{t+h+M})$, so strict stationarity of $(X_t)$ gives:

$$(\sum_{|j|\le M}\psi_jX_{t-j},\dots, \sum_{|j|\le M}\psi_jX_{t+h-j}) \stackrel{d}{=} (\sum_{|j|\le M}\psi_jX_{s-j},\dots, \sum_{|j|\le M}\psi_jX_{s+h-j})$$

Since the series defining each $Y_u$, $u = t,\dots,t+h$, converges almost surely by assumption, its symmetric partial sums converge, i.e. $(Y_t^M,\dots,Y_{t+h}^M) \to (Y_t,\dots,Y_{t+h})$ a.s. as $M \to \infty$, and hence also in distribution, which is all that is necessary for our purposes (no integrability assumption is needed here).

Likewise $(\sum_{|j|\le M}\psi_jX_{t-j},\dots, \sum_{|j|\le M}\psi_jX_{t+h-j}) \to (\sum_{j}\psi_jX_{t-j},\dots, \sum_{j}\psi_jX_{t+h-j})$ and $(\sum_{|j|\le M}\psi_jX_{s-j},\dots, \sum_{|j|\le M}\psi_jX_{s+h-j}) \to (\sum_{j }\psi_jX_{s-j},\dots, \sum_{j}\psi_jX_{s+h-j})$ almost surely, hence in distribution. Since the two truncated vectors are equal in distribution for every $M$ and a limit in distribution is unique, the two limits must have the same law, i.e. we have the desired conclusion:

$$(\sum_{j}\psi_jX_{t-j},\dots, \sum_{j}\psi_jX_{t+h-j}) \stackrel{d}{=} (\sum_{j }\psi_jX_{s-j},\dots, \sum_{j}\psi_jX_{s+h-j})$$
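To spell out the limit step: for any bounded continuous $f:\mathbb{R}^{h+1}\to\mathbb{R}$, stationarity of the truncated vectors gives $$E\,f\big(Y_t^M,\dots,Y_{t+h}^M\big) = E\,f\big(Y_s^M,\dots,Y_{s+h}^M\big) \quad \text{for every } M,$$ and letting $M\to\infty$ on both sides (using almost sure convergence together with the boundedness of $f$) yields $E\,f(Y_t,\dots,Y_{t+h}) = E\,f(Y_s,\dots,Y_{s+h})$. Since expectations of bounded continuous functions determine the distribution of a random vector, the two limits are equal in law.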

I hope this helps even though I glossed over some details.

EDIT: For any measurable $g$, if $Z_1, Z_2$ are two random vectors with the same distribution $\mathbb{P}( \cdot)$, then both $g(Z_1)$ and $g(Z_2)$ have the distribution $\mathbb{P}(g^{-1}(\cdot))$, so it follows immediately that $g(Z_1)$ and $g(Z_2)$ have the same distribution. See for example Kallenberg's book on probability theory -- if I recall correctly, he treats strictly stationary sequences extensively in the context of ergodic theory.
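In symbols, this is a one-line computation: for every measurable set $B$, $$P\big(g(Z_1)\in B\big) = P\big(Z_1 \in g^{-1}(B)\big) = P\big(Z_2 \in g^{-1}(B)\big) = P\big(g(Z_2)\in B\big),$$ where the middle equality uses $Z_1 \stackrel{d}{=} Z_2$.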

Chill2Macht
  • Please read the definition of strict stationarity I gave in the question carefully. You need to consider a finite-length vector of the $Y_t$'s. What I wrote at the end of my post concerns only the marginal distribution of $Y_t$. – Calculon May 04 '16 at 13:40
  • I know what the definition of stationarity (strict and weak) is. However, my argument generalizes immediately to that case. – Chill2Macht May 04 '16 at 13:42
  • I edited the argument to make my point clearer. – Chill2Macht May 04 '16 at 13:48
  • I think I get the proof and it seems OK to me. Denoting the vector at the top by $Y_t^M$, we have $Y_t^M \stackrel{d}{=} Y_s^M$. Furthermore, $Y_t^M \to Y_t$ and $Y_s^M \to Y_s$ a.s., hence also in distribution. By the uniqueness of limits, $Y_t \stackrel{d}{=} Y_s$. Thanks for the clarification. – Calculon May 04 '16 at 14:34
  • How much extra effort would it require to show that $Y_t = g(X_t,X_{t-1},\ldots)$ is strictly stationary for any measurable function $g$? – Calculon May 05 '16 at 19:25
  • For any measurable $g$, if $Z_1, Z_2$ are two random VECTORS with the same distribution $\mathbb{P}( \cdot)$, then both $g(Z_1)$ and $g(Z_2)$ have the distribution $\mathbb{P}(g^{-1}(\cdot))$, i.e. it follows immediately that $g(Z_1)$ and $g(Z_2)$ have the same distribution. See for example Kallenberg's book on probability theory -- if I recall correctly, he treats strictly stationary sequences extensively in the context of ergodic theory. – Chill2Macht May 05 '16 at 19:51
  • when you say vectors you mean vectors of possibly infinite length? – Calculon May 05 '16 at 20:02
  • Short answer: yes. It suffices to check that equality in distribution holds for all finite-dimensional distributions of a sequence of random variables in a Polish space. See for example Fristedt and Gray, "A Modern Approach to Probability Theory". It's better to formulate the result in terms of finite-dimensional distributions, since what "distribution" is supposed to mean for a bi-infinite sequence of random variables is more difficult to formulate correctly and precisely. This is the reasoning/motivation behind the definition of strict stationarity given in your question. – Chill2Macht May 05 '16 at 20:41
  • Thank you very much. I am looking at Kallenberg now. I will check the other one also. – Calculon May 05 '16 at 20:52
  • Kallenberg's definition of strict stationarity is already based on the infinite sequence of random variables. I guess I have to show the equivalence between the finite-dimensional definition and Kallenberg's definition then. – Calculon May 05 '16 at 21:58
  • I believe Theorem 19 in section 18.5 of Fristedt and Gray is relevant (p. 359), since we are assuming that all of the finite-dimensional distributions are determined. Theorem 20 (which is effectively a special case) might also be applicable. – Chill2Macht May 05 '16 at 23:16

Fix some $s,t \in \Bbb Z$. Note that since $(X_t, X_{t+1},\ldots,X_{t+h}) \stackrel{d}{=} (X_s, X_{s+1},\ldots,X_{s+h})$ for all $h$, we have that $$(X_{t+k})_{k \in \Bbb Z} \stackrel{d}{=} (X_{s+k})_{k \in \Bbb Z}$$ since both processes have the same finite-dimensional distributions. If you are not familiar with this fact from elementary measure theory, see for example Theorem 23 here (there $\Xi$ denotes the space of functions endowed with product $\sigma$-algebra) or alternatively Proposition 3.1 here.
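(For completeness, the measure-theoretic fact being used: sets of the form $\{x \in \Bbb R^{\Bbb Z} : (x_{k_1},\dots,x_{k_n}) \in B\}$ with $B \in \mathcal{B}(\Bbb R^n)$ form a $\pi$-system generating the product $\sigma$-algebra on $\Bbb R^{\Bbb Z}$, and two probability measures that agree on a generating $\pi$-system agree everywhere by Dynkin's $\pi$–$\lambda$ theorem; hence equal finite-dimensional distributions force equal laws of the full sequences.)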

Now define the map $\Psi: \Bbb R^{\Bbb Z} \to \Bbb R^{\Bbb Z}$ by sending $(x_j)_{j \in \Bbb Z} \mapsto (\sum_j \psi_j x_{k-j})_{k \in \Bbb Z}$, and note that it is measurable with respect to the product $\sigma$-algebra (since each of its components is). With $s,t$ fixed as above, since $(X_{t+k})_{k \in \Bbb Z} \stackrel{d}{=} (X_{s+k})_{k \in \Bbb Z}$ we easily get that $$(Y_{t+k})_{k\in\Bbb Z}=\Psi\big((X_{t+k})_{k \in \Bbb Z}\big) \stackrel{d}{=} \Psi\big((X_{s+k})_{k \in \Bbb Z}\big)=(Y_{s+k})_{k\in\Bbb Z},$$ which is precisely strict stationarity of $(Y_t)$.
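Indeed, the $k$-th component works out as claimed: writing $x = (X_{t+m})_{m\in\Bbb Z}$, we have $$\big(\Psi(x)\big)_k = \sum_j \psi_j x_{k-j} = \sum_j \psi_j X_{t+k-j} = Y_{t+k},$$ so $\Psi$ maps the shifted input sequence to the correspondingly shifted output sequence.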

shalin