Questions tagged [time-series]

This tag is used for questions related to time-series models such as AR, ARMA, ARCH, and GARCH, their properties, and the techniques used for inference.

A time-series model is one which postulates a relationship amongst a number of temporal sequences or time series. An example is provided by the simple regression model

$$y(t) = x(t) \beta + \epsilon(t)$$

or, more generally, by the ARMAX model

$$y(t) = \sum_{i=1}^p \phi_i y(t-i) + \sum_{i=1}^k \beta_i x(t-i) + \sum_{i=1}^q \mu_i \epsilon(t-i) + \epsilon(t)$$
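A model of this kind is easy to simulate directly from its recursion. The sketch below uses illustrative coefficients (not from the source) with $p = k = q = 1$:

```python
import numpy as np

# Minimal sketch (illustrative coefficients): simulate an ARMAX-type model
# with p = k = q = 1, i.e.
#   y(t) = phi*y(t-1) + beta*x(t-1) + mu*eps(t-1) + eps(t)
rng = np.random.default_rng(0)
n = 500
phi, beta, mu = 0.6, 0.3, 0.4
x = rng.normal(size=n)      # exogenous series x(t)
eps = rng.normal(size=n)    # white-noise innovations eps(t)
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t-1] + beta * x[t-1] + mu * eps[t-1] + eps[t]
```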

980 questions
21
votes
2 answers

What is lag in a time series?

I am curious about what a lagged time series is. On Investopedia, I saw an article that said: "Autocorrelation is the degree of similarity between a time series and a lagged version of itself over successive intervals." Someone please explain to me…
TopCoder
  • 321
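In short, the lag-$k$ version of a series is the same series shifted back $k$ steps, and the lag-$k$ autocorrelation measures how similar the series is to that shifted copy. A minimal sketch with illustrative AR(1) data, whose lag-1 autocorrelation should be near its coefficient 0.8:

```python
import numpy as np

# Sketch: a "lagged" series is the same series shifted by k steps; the lag-k
# autocorrelation compares the series with that shifted copy.
# Illustrative AR(1) data with coefficient 0.8.
rng = np.random.default_rng(1)
n = 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t-1] + rng.normal()

def autocorr(series, k):
    """Sample autocorrelation at lag k."""
    c = series - series.mean()
    return np.dot(c[:-k], c[k:]) / np.dot(c, c)

r1 = autocorr(x, 1)   # close to 0.8 for this AR(1)
```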
14
votes
3 answers

Scaling factor and weights in Unscented Transform (UKF)

I'm trying to implement the UKF for parameter estimation as described by Eric A. Wan and Rudolph van der Merwe in Chapter 7 of the book Kalman Filtering and Neural Networks (free PDF). I am confused by the setting of $\lambda$ (used in the selection…
12
votes
1 answer

The only strictly stationary random walk in $\mathbb{R}$ is degenerate

An $\mathbb{R}$-valued discrete-time stochastic process $\{X_n\}_{n \in \mathbb{Z}}$ is said to be strictly stationary if for all choices of times $t_1, \ldots , t_n \in \mathbb{Z}$ and lags $h \in \mathbb{Z}$ the following holds $$(X_{t_1}, \ldots…
8
votes
1 answer

Bound on expectation of absolute value in terms of variance

In my book it says that a white noise process $\{Z_t\}$ with mean zero and variance $\sigma^2$ has the following property: E$|Z_t| \leq \sigma$. This had me thinking of Jensen's inequality, that $\text{E}(g(X)) \geq g(\text{E}(X))$, for convex…
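The stated bound is Jensen's inequality applied to the *concave* square root: $\text{E}|Z| = \text{E}\sqrt{Z^2} \leq \sqrt{\text{E} Z^2} = \sigma$. A quick numeric sanity check, using Gaussian noise purely for illustration (normality is not needed for the bound):

```python
import numpy as np

# Sanity check of E|Z_t| <= sigma. The bound is Jensen's inequality with the
# *concave* function sqrt: E|Z| = E sqrt(Z^2) <= sqrt(E Z^2) = sigma.
# Gaussian noise is used only for illustration; for N(0, sigma^2),
# E|Z| equals sigma*sqrt(2/pi) =~ 0.798*sigma, comfortably below sigma.
rng = np.random.default_rng(2)
sigma = 2.0
z = rng.normal(0.0, sigma, size=100_000)
mean_abs = np.abs(z).mean()
```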
8
votes
1 answer

Autocovariance of Ornstein–Uhlenbeck and AR(1) processes

The autocovariance of an Ornstein–Uhlenbeck process $$ dX(t) = \theta (\mu - X(t))dt + \sigma dW(t) $$ is given on Wikipedia as $$ \operatorname{Cov}(X(s),X(t)) = \frac{\sigma^2}{2\theta}(e^{-\theta|t-s|} - e^{-\theta(t+s)}) . \tag{1} $$ which…
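Sampling an OU process at a fixed step gives an AR(1) process exactly, which offers a quick way to check formula (1) in its stationary regime, where the $e^{-\theta(t+s)}$ transient has died out. A sketch with illustrative parameters:

```python
import numpy as np

# Sketch: the OU process sampled at step dt is exactly an AR(1) with
# coefficient a = exp(-theta*dt). Started in stationarity, the sample
# autocovariance at lag tau should match (sigma^2 / (2 theta)) * exp(-theta*tau),
# i.e. formula (1) without the transient term. Parameters are illustrative.
rng = np.random.default_rng(3)
theta, sigma, dt, n = 1.5, 1.0, 0.01, 100_000
a = np.exp(-theta * dt)
innov_sd = np.sqrt(sigma**2 * (1 - a**2) / (2 * theta))
eta = rng.normal(0.0, innov_sd, size=n)
x = np.zeros(n)
x[0] = rng.normal(0.0, np.sqrt(sigma**2 / (2 * theta)))  # stationary start
for t in range(1, n):
    x[t] = a * x[t-1] + eta[t]

tau = 0.5
lag = int(tau / dt)
c = x - x.mean()
cov_hat = np.dot(c[:-lag], c[lag:]) / (n - lag)
cov_theory = sigma**2 / (2 * theta) * np.exp(-theta * tau)
```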
7
votes
1 answer

Seasonal adjustment and Fourier analysis

I've been reading up on seasonal adjustment (removing "seasonal" periodic components from a time series) recently and although I see a lot of fancy work around ARIMA models and fancy ways to detect the seasonality, I see comparatively little work on…
7
votes
2 answers

Book Recommendation on Time Series Statistics

Professionally I am analysing high-frequency data coming from motion sensors and the like. I would like to "up my theoretical background game" in this area and am therefore looking for recommendations for books about (I guess) statistics and probability…
6
votes
1 answer

Show that $X_{t}:=\alpha X_{t-1}+\epsilon_{t}$ is strictly stationary for $|\alpha|<1$ and $\epsilon_{t}$ i.i.d. $\sim N(0,\sigma^{2})$.

The title can be shortened to "prove that $AR(1)$ processes are strictly stationary when $|\alpha|<1$". This has been discussed many times on MSE and Cross Validated, but I found no mathematical proof of why it is strictly stationary. For…
6
votes
1 answer

Computing an Exponential Moving Average Via Convolution

According to the Wikipedia page on moving averages, "This is also why sometimes an EMA is referred to as an $N$-day EMA. Despite the name suggesting there are $N$ periods, the terminology only specifies the $\alpha$ factor. $N$ is not a stopping…
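One way to read this: the EMA is an infinite-impulse-response filter, but it can be approximated to arbitrary precision by convolving with a truncated, renormalized exponential kernel. A sketch using the usual $\alpha = 2/(N+1)$ convention (the data and truncation length are illustrative):

```python
import numpy as np

# Sketch: approximate an "N-day" EMA by convolution with a truncated
# exponential kernel w_k = alpha*(1-alpha)^k, renormalized to sum to 1.
# alpha = 2/(N+1) is the usual convention; the data are illustrative.
rng = np.random.default_rng(4)
prices = 100.0 + np.cumsum(rng.normal(size=300))

N = 10
alpha = 2.0 / (N + 1)
K = 5 * N                                   # truncate once weights are negligible
kernel = alpha * (1 - alpha) ** np.arange(K)
kernel /= kernel.sum()                      # renormalize after truncation
# mode="valid": each output combines the K most recent points, newest weighted most
ema = np.convolve(prices, kernel, mode="valid")
```

After a burn-in of $K$ points this agrees with the standard recursion $e(t) = \alpha p(t) + (1-\alpha) e(t-1)$ up to the truncation error.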
6
votes
2 answers

Linear transform of a strictly stationary time series

First, let me clarify what I mean by a strictly stationary time series. Let $(X_t)_{t\in \mathbb{Z}}$ be a sequence of random variables on some probability space. If it holds that $$(X_t, X_{t+1},\ldots,X_{t+h}) \stackrel{d}{=} (X_s,…
Calculon
  • 5,843
6
votes
1 answer

Empirical Kullback-Leibler divergence of two time series

I have two vectors (time series) of the same length (1200 elements), $x$ and $y$. Further, both time series are stationary. I don't know the theoretical distribution of $x$ and $y$. I would like to calculate the relative entropy of these r.v.s. I…
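Absent a parametric model, one crude option is a plug-in estimate: bin both samples on a common grid and compute the KL divergence between the empirical histograms. This ignores serial dependence in the series and is sensitive to the bin count; all names and numbers below are illustrative:

```python
import numpy as np

# Crude plug-in sketch: KL(P||Q) between two samples via histograms on shared
# bins. Ignores serial dependence; sensitive to the number of bins; the small
# eps guards against log(0) and division by zero in empty bins.
rng = np.random.default_rng(5)
x = rng.normal(0.0, 1.0, size=1200)   # stand-ins for the two series
y = rng.normal(0.5, 1.2, size=1200)

def kl_histogram(a, b, bins=30, eps=1e-10):
    lo, hi = min(a.min(), b.min()), max(a.max(), b.max())
    p, edges = np.histogram(a, bins=bins, range=(lo, hi))
    q, _ = np.histogram(b, bins=edges)
    p = p / p.sum() + eps
    q = q / q.sum() + eps
    return float(np.sum(p * np.log(p / q)))

kl_xy = kl_histogram(x, y)   # positive when the distributions differ
```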
5
votes
2 answers

Are these statements of my professor about periodicity of harmonic processes in time series analysis correct?

Assume $X_t$ is a harmonic stochastic process, i.e., $$X_t = \sum_{j=-k}^k A_j \exp(i \lambda_j t)$$ where the frequencies $\lambda_j$ are given and $A_j$ are uncorrelated random variables with zero mean and variance $\sigma_j^2$. Then the spectral…
5
votes
1 answer

Proof that a series converges to zero

I am working on the following problem arising in time series analysis. Let us assume that $\sum_{h \in \mathbb{Z}} |\gamma(h)|<\infty$. I would like to prove that \begin{equation*} 1) \; \; \; \lim_{n\to +\infty} \sum_{h >n } \gamma(h) =…
5
votes
0 answers

Definition of ergodicity and ergodicity for second moments of a stochastic process

According to this topic, the relation between ergodicity for a dynamical system and the mean ergodicity of a stochastic process is well understood. More exactly, we have the Ergodic Theorem (see page 413 of A. N. Shiryaev): The following…
5
votes
2 answers

Definition of white noise vectors

As is shown in Wikipedia: click [here](http://en.wikipedia.org/wiki/White_noise#Mathematical_definitions). A random vector (that is, a partially indeterminate process that produces vectors of real numbers) is said to be a white noise vector or white…