
Thinking about a specific problem led me to a more general question, so here are two questions:

1) As the title suggests: does there exist an r.v. $X$ with $E[X]= \mu$ and $Var[X]=\sigma^2$ for given fixed $\mu$ and $\sigma^2$? What restrictions need to be imposed on $\mu$ and $\sigma^2$?

2) Now consider the following problem: Is it possible to give an example of a sequence of random variables $(X_t)_{t \in T}$, where, say, $T= \mathbb{N}_0$ or $T= \mathbb{Z}$, such that $$Cov(X_t,X_{t+h})= \begin{cases} 1, & \text{if $h=0$} \\ 0.4, & \text{if $|h|=1$} \\ 0, & \text{otherwise?} \end{cases}$$

Holden
  • The answer to your first question is obviously "yes" if $\sigma^2\ge0$ (if $\sigma^2=0$ take $X\equiv\mu$, if $\sigma^2>0$ take $X\sim\mathcal N(\mu,\sigma^2)$) and "no" if $\sigma^2<0$ (variance cannot be negative). I am unsure what $T$ refers to in your second question. – Jason May 02 '17 at 17:12
  • Since $(X_t)_{t \in T}$ is a sequence, $T$ is a countable set, say $T= \mathbb{N}_0$ or $T=\mathbb{Z}$. – Holden May 02 '17 at 17:16
  • In that case, you can again use Gaussian random variables as an example. – Jason May 02 '17 at 17:18
  • Can you give more details, or post an answer? – Holden May 02 '17 at 17:19
  • For the second question, start from some i.i.d. sequence $(Y_t)$ centered with unit variance and define $$X_t=\cos\theta\cdot Y_t+\sin\theta\cdot Y_{t+1}$$ where $\theta$ solves $$\sin(2\theta)=0.8$$ – Did May 02 '17 at 18:40
  • @Did This is indeed very beautiful! Could you explain how you came up with the idea? – Holden May 02 '17 at 18:59
  • The idea to look at linear combinations of i.i.d. processes is natural if one thinks of MA processes. The calibration of the parameters $\cos\theta$ and $\sin\theta$ is then direct. – Did May 02 '17 at 19:31
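Spelling out the calibration in Did's comment: with $(Y_t)$ i.i.d., centered, with unit variance, one gets $Var(X_t)=\cos^2\theta+\sin^2\theta=1$, $Cov(X_t,X_{t+1})=\cos\theta\sin\theta=\frac12\sin(2\theta)=0.4$, and $Cov(X_t,X_{t+h})=0$ for $|h|\ge2$ because the two linear combinations then share no $Y$'s. Here is a minimal numerical sanity check of the construction (a sketch; the Gaussian choice for $Y_t$ and the sample size are my own assumptions, since any i.i.d. centered unit-variance sequence works):

```python
import numpy as np

# Sanity check of Did's construction: X_t = cos(theta)*Y_t + sin(theta)*Y_{t+1}
# with sin(2*theta) = 0.8. Gaussian Y is an assumption -- any i.i.d. centered
# unit-variance sequence works.
rng = np.random.default_rng(0)
theta = 0.5 * np.arcsin(0.8)                 # solves sin(2*theta) = 0.8
n = 1_000_000
Y = rng.standard_normal(n + 1)               # i.i.d., mean 0, variance 1
X = np.cos(theta) * Y[:-1] + np.sin(theta) * Y[1:]

# Empirical Cov(X_t, X_{t+h}); X is centered, so a product mean estimates it.
for h in range(4):
    print(h, f"{np.mean(X[:n - h] * X[h:]):.3f}")   # expect ~1, ~0.4, ~0, ~0
```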

1 Answer


As I said in the comments, for the first question you can just take $X\sim\mathcal N(\mu,\sigma^2)$ if $\sigma^2>0$, $X\equiv\mu$ if $\sigma^2=0$, and otherwise it is impossible ($\mu$ can be any real number, but a variance cannot be negative). For the second question, recall that given $R:T\times T\to\mathbb R$ there exists a mean-zero Gaussian process $(X_t)_{t\in T}$ such that $E(X_sX_t)=R(s,t)$ if and only if $R$ is symmetric and positive semidefinite, that is, $R(s,t)=R(t,s)$ and $\sum_{i,j=1}^nR(t_i,t_j)x_ix_j\ge0$ for all $t_1,\ldots,t_n\in T$ and all $x\in\mathbb R^n$. In your case this follows from, e.g., diagonal dominance: $R(t,t)>0$ and $\sum_{s\neq t}|R(s,t)|<R(t,t)$ for all $t$ (here $1>0.4+0.4$), which by the Gershgorin circle theorem makes every finite matrix $(R(t_i,t_j))_{i,j}$ positive definite.
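For a concrete feel for the last step, here is a minimal numerical check (a sketch; the truncation size $n=50$ is an arbitrary choice of mine) that a finite section of $R$ is positive definite:

```python
import numpy as np

# Finite section of R: 1 on the diagonal, 0.4 on the first off-diagonals,
# 0 elsewhere (a tridiagonal Toeplitz matrix).
n = 50
R = np.eye(n) + 0.4 * (np.eye(n, k=1) + np.eye(n, k=-1))

# Diagonal dominance (1 > 0.4 + 0.4) keeps every Gershgorin disc to the right
# of zero, so all eigenvalues are positive. For this tridiagonal Toeplitz
# matrix they equal 1 + 0.8*cos(k*pi/(n+1)), so the minimum tends to 0.2.
print(np.linalg.eigvalsh(R).min())           # ~0.2, strictly positive
```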

Jason
  • Can you give a source/link to support the statement that for a matrix to be positive definite it is sufficient that $R(t,t)>0$ and $\sum_{s\neq t}|R(s,t)|<R(t,t)$ for all $t$? I assume it should be $Cov(X_s,X_t)=R(s,t)$. – Holden May 02 '17 at 18:13
  • This very site has a question on this topic: https://math.stackexchange.com/questions/87528/a-practical-way-to-check-if-a-matrix-is-positive-definite – Jason May 02 '17 at 18:40