
Suppose $M$ is a Poisson random measure on $(E,\mathcal{E}) \equiv (\mathbb{R}_+\times\mathbb{R}^d, \mathcal{B}(\mathbb{R}_+\times\mathbb{R}^d))$ with mean measure $\nu\equiv \mathrm{Leb}\times\lambda$, where $\lambda$ is a measure on $\mathcal{B}(\mathbb{R}^d)$. This means that:

  1. $\forall A \in \mathcal{E}$, the random variable $M(A)$ has a Poisson distribution with mean $\nu(A)$;
  2. $\forall A_1, A_2, \cdots, A_n$ in $\mathcal{E}$ that are disjoint, the random variables $M(A_1), \cdots, M(A_n)$ are independent.

Now, define $X_t(\omega) \equiv \int_{[0,t]\times \mathbb{R}^d} M_\omega(ds,dx)x$. This is a random process.
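(Here I am implicitly assuming enough integrability for this to be well defined; for instance, if $\lambda(\mathbb{R}^d)<\infty$, then $M$ has a.s. finitely many atoms $(S_i,Y_i)$ in $[0,t]\times\mathbb{R}^d$ and $X_t=\sum_{i:\,S_i\le t}Y_i$ is just the sum of the marks arriving up to time $t$.)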

I want to show that X has stationary and independent increments.

I know that, in general, there is an explicit formula for the characteristic function of integrals like this one, and that it can be used to show that the increments are stationary and independent.
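For concreteness, the formula I have in mind is the exponential formula for Poisson integrals: assuming, say, $\int_{\mathbb{R}^d}(1\wedge|x|)\,\lambda(dx)<\infty$ so that everything converges,
$$\mathbb{E}\left[e^{i\langle u,X_t\rangle}\right]=\exp\left(t\int_{\mathbb{R}^d}\left(e^{i\langle u,x\rangle}-1\right)\lambda(dx)\right),\qquad u\in\mathbb{R}^d,$$
and the same expression, with $t$ replaced by the length of the interval, for any increment.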

But I feel that stationary and independent increments should follow almost directly from the definition. Suppose I take $X_t$ and $X_{s+t} - X_s$ with $s \ge t$. Then, since $[0,t]\times B$ is disjoint from $(s,s+t]\times \tilde{B}$ for all $B, \tilde{B}\in \mathcal{B}(\mathbb{R}^d)$, the random variables $M([0,t]\times B)$ and $M((s,s+t]\times \tilde{B})$ are independent.
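In particular, the increment itself can be written as an integral over that strip,
$$X_{s+t}-X_s=\int_{(s,s+t]\times\mathbb{R}^d}M_\omega(ds',dx)\,x,$$
so it only involves the restriction of $M$ to a set disjoint from $[0,t]\times\mathbb{R}^d$.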

But how would I proceed from here to show that $X_t$ and $X_{s+t} - X_s$, as defined above, are independent and have the same distribution?


1 Answer


In case anyone else finds it useful, let me post my own update.

We could define $L(\omega,A)\equiv \int_{A\times \mathbb{R}^d} M_\omega(ds,dx)x$ for $A\in\mathcal{B}(\mathbb{R}_+)$, and then show that $L$ is an additive random measure with independent values on disjoint sets. Measurability is a consequence of Fubini's theorem, and additivity and independence come from the defining properties of the Poisson random measure $M$.
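Concretely, the properties to check are (sketch): for disjoint $A_1,\cdots,A_n \in \mathcal{B}(\mathbb{R}_+)$,
$$L\Big(\omega,\bigcup_{i=1}^n A_i\Big)=\sum_{i=1}^n L(\omega,A_i)\quad\text{a.s.},\qquad L(\cdot,A_1),\cdots,L(\cdot,A_n)\ \text{independent}.$$
Both follow by approximating $x\mapsto x$ with simple functions, so that each $L(\cdot,A_i)$ becomes a limit of linear combinations of variables of the form $M(A_i\times B_j)$, and these families are independent across $i$ by property 2 of $M$.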

Then $X_{s+t}-X_s = L(\cdot,(s,s+t])$, so independence of the increments follows from the independence of $L$ over disjoint time sets. Stationarity of the increments follows because the mean measure $\mathrm{Leb}\times\lambda$ is invariant under time shifts: the restriction of $M$ to $(s,s+t]\times\mathbb{R}^d$, shifted back by $s$, is again a Poisson random measure with the same mean measure as the restriction to $(0,t]\times\mathbb{R}^d$, and hence $X_{s+t}-X_s \overset{d}{=} X_t$.
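For anyone who wants a quick numerical sanity check, here is a rough simulation sketch. The concrete choices ($d=1$, $\lambda$ equal to a finite multiple of the standard normal law, the particular $s$ and $t$) are purely illustrative assumptions, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: d = 1 and lambda = rate * N(0, 1), so that
# lambda(R) = rate < infinity and M has finitely many atoms in any bounded window.
rate = 3.0
n_paths = 20_000
t, s = 2.0, 5.0

def sample_X(times):
    """For each of n_paths independent copies of M, return X_u for every u in `times`."""
    t_max = max(times)
    out = np.zeros((n_paths, len(times)))
    for i in range(n_paths):
        # M([0, t_max] x R) is Poisson with mean rate * t_max; given the count,
        # atoms are i.i.d. uniform in time with marks distributed as lambda / lambda(R).
        n_atoms = rng.poisson(rate * t_max)
        atom_times = rng.uniform(0.0, t_max, size=n_atoms)
        atom_marks = rng.standard_normal(n_atoms)
        for j, u in enumerate(times):
            out[i, j] = atom_marks[atom_times <= u].sum()   # X_u = sum of marks up to time u
    return out

X_t, X_s, X_st = sample_X([t, s, s + t]).T

# Stationarity: X_t and X_{s+t} - X_s should have the same distribution.
print("mean/var of X_t        :", X_t.mean(), X_t.var())
print("mean/var of X_{s+t}-X_s:", (X_st - X_s).mean(), (X_st - X_s).var())

# Independence (necessary condition): X_s should be uncorrelated with the later increment.
print("corr(X_s, X_{s+t}-X_s) :", np.corrcoef(X_s, X_st - X_s)[0, 1])
```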
