
It is well known that the interarrival times of a standard (i.e. homogeneous) Poisson process follow an Exponential Distribution (What is the correct inter-arrival time distribution in a Poisson process?) and that the arrival times (the time of the $n$-th event) follow a Gamma Distribution (https://www.youtube.com/watch?v=OdU-nKq6C-U @ 25:51).

I have the following question: if we are instead dealing with a non-homogeneous Poisson process, will the interarrival times still follow some type of modified Exponential Distribution, and will the arrival times still follow some type of modified Gamma Distribution?

I tried to reason this out myself:

In a non-homogeneous Poisson process, the number of events $N(T)$ in the interval $[0, T]$ follows a Poisson distribution. The rate function of this process is now time dependent and denoted by $\lambda(t)$, so I think we can write the (cumulative) rate parameter as: $$ \Lambda(0, T) = \int_0^T \lambda(t) \, dt. $$ Therefore, the probability of observing exactly $k$ events in the interval $[0, T]$ is: $$ P(N(T) = k) = \frac{[\Lambda(0, T)]^k e^{-\Lambda(0, T)}}{k!}, \quad k = 0, 1, 2, \ldots $$
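As a quick numerical illustration of this formula (a minimal sketch, using an arbitrarily chosen example rate $\lambda(t) = 1 + \sin t$), one can compute $\Lambda(0, T)$ by numerical integration and plug it into the Poisson pmf:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import poisson

def rate(t):
    """Illustrative (assumed) time-dependent rate function lambda(t)."""
    return 1.0 + np.sin(t)

T = 5.0
Lam, _ = quad(rate, 0.0, T)        # Lambda(0, T) = integral of lambda(t) over [0, T]

# P(N(T) = k) = Lam^k * exp(-Lam) / k! for the first few k
for k in range(5):
    print(k, poisson.pmf(k, Lam))
```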

From here, I tried to write the distributions for interarrival times.

If $T_1$ represents the time until the first event, the probability that no event has occurred by time $t$ is: $$ P(T_1 > t) = \exp\left(-\int_0^t \lambda(u) \, du\right). $$ The density function of the first interarrival time can be obtained by differentiating this survival function: $$ f_{T_1}(t) = \lambda(t) \exp\left(-\int_0^t \lambda(u) \, du\right), \quad t \ge 0. $$
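This survival function is easy to sanity-check by simulation. Here is a minimal sketch that simulates $T_1$ by thinning a homogeneous process, again assuming the illustrative rate $\lambda(t) = 1 + \sin t \le 2$; the helper `first_arrival` is hypothetical:

```python
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(0)

def rate(t):
    return 1.0 + np.sin(t)      # illustrative rate, bounded above by 2

LAM_MAX = 2.0                   # upper bound on rate(t), needed for thinning

def first_arrival():
    """Simulate T_1 of the inhomogeneous process by thinning (Lewis-Shedler)."""
    t = 0.0
    while True:
        t += rng.exponential(1.0 / LAM_MAX)    # candidate from a rate-LAM_MAX process
        if rng.uniform() < rate(t) / LAM_MAX:  # keep with probability rate(t)/LAM_MAX
            return t

samples = np.array([first_arrival() for _ in range(20_000)])
t0 = 1.0
print(np.mean(samples > t0))                   # empirical P(T_1 > t0)
print(np.exp(-quad(rate, 0.0, t0)[0]))         # exp(-integral of rate over [0, t0])
```

The two printed numbers should agree up to Monte Carlo error.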

And finally, for the distribution of the actual events.

In the standard case (i.e. a homogeneous Poisson process), let $X_1, X_2, \ldots, X_n$ be the interarrival times of the process. These are iid Exponential($\lambda$) random variables, and $T_n = X_1 + X_2 + \cdots + X_n$ is the time of the $n$-th event. In general, the sum of $n$ independent Exponential($\lambda$) random variables follows a Gamma($n$, $1/\lambda$) distribution (shape $n$, scale $1/\lambda$).
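For completeness, a small sketch of this homogeneous fact (with arbitrarily chosen $\lambda = 2$ and $n = 5$):

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(1)
lam, n = 2.0, 5

# n-th arrival time as a sum of n iid Exponential(lam) interarrival times
T_n = rng.exponential(1.0 / lam, size=(50_000, n)).sum(axis=1)

# Compare with Gamma(shape=n, scale=1/lam); a large p-value indicates a good fit
print(kstest(T_n, gamma(a=n, scale=1.0 / lam).cdf))
```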

While I am not sure how the "mathematical gymnastics" would work out, I am guessing that the sum of exponential-type interarrival times with time-dependent rate parameters might result in a Gamma Distribution with time-dependent parameters:

$$ f(x; k(t), \theta(t)) = \frac{x^{k(t)-1} e^{-x/\theta(t)}}{\theta(t)^{k(t)} \Gamma(k(t))}, \quad x \ge 0. $$

Here, $k(t)$ could represent the number of events up to time $t$, and $\theta(t)$ could be related to the integral of the rate function up to time $t$:

$$\theta(t) = \int_0^t \lambda(u) \, du$$

Is this line of reasoning correct?


1 Answer


Your first argument (the distribution of $T_1$) is correct assuming $\lambda$ is a reasonable function. However, notice that this is not an exponential distribution. For example, if $\lambda(t) = t$, then $T_1$ has pdf $$f_{T_1}(t) = t\exp\left(-\frac{t^2}{2}\right).$$ Things can get strange if $\lambda(t)$ has an infinite integral in finite time (e.g. $\lambda(t) = \frac{1}{1-t}$) or if $\int_0^\infty \lambda(t)\,dt$ is finite: for example, if $\lambda(t) = e^{-t}$, then $$\lim_{t\to\infty} \int_0^t \lambda(s)\,ds = 1,$$ so $$P(T_1 < \infty) = 1 - e^{-1} < 1,$$ i.e. with positive probability the first event never occurs.
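A quick numerical confirmation of that last, defective case (just a sketch): with $\lambda(t) = e^{-t}$, the density $f_{T_1}(t) = \lambda(t)\exp\left(-\int_0^t \lambda(s)\,ds\right)$ integrates to $1 - e^{-1}$ rather than $1$.

```python
import numpy as np
from scipy.integrate import quad

# f_{T_1}(t) = lambda(t) * exp(-integral of lambda) with lambda(t) = exp(-t),
# so the integral of lambda over [0, t] is 1 - exp(-t)
f_T1 = lambda t: np.exp(-t) * np.exp(-(1.0 - np.exp(-t)))

total_mass, _ = quad(f_T1, 0.0, np.inf)
print(total_mass, 1.0 - np.exp(-1.0))   # both approx 0.632: the density is defective
```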

Likewise, $T_n$ is not generally Gamma distributed. You can compute the distribution of $T_n$ the same way you did $T_1$ (with some induction). Let $\theta(t) = \int_0^t \lambda(s)\,ds$. I claim, $$f_{T_n}(t) = \lambda(t)e^{-\theta(t)}\left[\frac{(\theta(t))^{n-1}}{(n-1)!}\right].$$

Clearly, when $n = 1$, $$f_{T_1}(t) = \lambda(t)e^{-\theta(t)}\left[\frac{(\theta(t))^{0}}{(0)!}\right] = \lambda(t)e^{-\theta(t)},$$ so the expression is correct for $n=1$. Suppose it is correct up to $n-1$. Then, \begin{align*} P(T_n > t) &= P(T_{n-1} > t) + P(T_{n-1}\leq t < T_n)\\ &= P(T_{n-1} > t) + P(N(t) = n-1)\\ &= P(T_{n-1}>t) + e^{-\theta(t)}\frac{(\theta(t))^{n-1}}{(n-1)!}. \end{align*} Taking a derivative and multiplying by $-1$ yields: \begin{align*} f_{T_n}(t) &= f_{T_{n-1}}(t) - \frac{d}{dt}e^{-\theta(t)}\frac{(\theta(t))^{n-1}}{(n-1)!}\\ &= \lambda(t)e^{-\theta(t)}\left(\frac{(\theta(t))^{n-2}}{(n-2)!} + \frac{(\theta(t))^{n-1}}{(n-1)!} - \frac{(\theta(t))^{n-2}}{(n-2)!}\right)\\ &= \lambda(t)e^{-\theta(t)}\left[\frac{(\theta(t))^{n-1}}{(n-1)!}\right] \end{align*} as desired.
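One can also sanity-check this claimed density numerically via the identity $P(T_n \le t) = P(N(t) \ge n)$. A minimal sketch, again assuming the illustrative rate $\lambda(t) = 1 + \sin t$, so that $\theta(t) = t + 1 - \cos t$:

```python
import numpy as np
from math import factorial
from scipy.integrate import quad

def theta(t):
    return t + 1.0 - np.cos(t)           # integral of 1 + sin(u) over [0, t]

def f_Tn(t, n):
    """Claimed density lambda(t) e^{-theta(t)} theta(t)^(n-1) / (n-1)!."""
    return (1.0 + np.sin(t)) * np.exp(-theta(t)) * theta(t) ** (n - 1) / factorial(n - 1)

n, t0 = 3, 4.0
lhs = quad(f_Tn, 0.0, t0, args=(n,))[0]  # P(T_n <= t0) from the claimed density
rhs = 1.0 - sum(np.exp(-theta(t0)) * theta(t0) ** k / factorial(k) for k in range(n))
print(lhs, rhs)                          # both equal P(N(t0) >= n)
```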

Relation with the Gamma Distribution: That said, there is actually a relation between $T_n$ and the Gamma Distribution. Let's assume $\lambda$ is nice. That is,

  1. $\lambda$ is continuous,
  2. $\lambda(t) > 0$ for all $t$,
  3. $\theta(\infty) = \infty$, and
  4. $\theta(t) < \infty$ for all $t < \infty$.

Then instead of looking at $T_n$, we can examine $\theta(T_n)$. The conditions above ensure that $\theta$ is invertible and its inverse is differentiable. So setting $u = \theta(t)$,

\begin{align*} f_{\theta(T_n)}(u) &= f_{T_n}(\theta^{-1}(u))\left|\frac{d}{du} \theta^{-1}(u)\right|\\ &= f_{T_n}(t)\left|\frac{1}{\lambda(\theta^{-1}(u))}\right|\\ &= \lambda(t)e^{-\theta(t)}\left[\frac{(\theta(t))^{n-1}}{(n-1)!}\right]\frac{1}{\lambda(t)}\\ &= e^{-u}\frac{u^{n-1}}{\Gamma(n)}. \end{align*}

So, $\theta(T_n)$ is Gamma$(n,1)$ distributed. Naturally, when $n = 1$, $\theta(T_1)$ is Gamma$(1,1) = $ Exponential($1$) distributed.
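Here is a short simulation sketch of that Gamma relation (the rate $\lambda(t) = 1 + \sin t$ and the helper `nth_arrival` are illustrative choices, not from the question): simulate $T_n$ by thinning, apply $\theta$, and test against Gamma$(n, 1)$.

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(2)
LAM_MAX, n = 2.0, 3                      # rate(t) = 1 + sin(t) <= 2

def nth_arrival():
    """Simulate T_n of the inhomogeneous process by thinning."""
    t, accepted = 0.0, 0
    while accepted < n:
        t += rng.exponential(1.0 / LAM_MAX)              # candidate point
        if rng.uniform() < (1.0 + np.sin(t)) / LAM_MAX:  # keep with prob rate(t)/LAM_MAX
            accepted += 1
    return t

T_n = np.array([nth_arrival() for _ in range(10_000)])
theta_T_n = T_n + 1.0 - np.cos(T_n)      # theta(t) = t + 1 - cos(t)
print(kstest(theta_T_n, gamma(a=n, scale=1.0).cdf))   # large p-value: Gamma(n, 1) fits
```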

What does this mean? You can think of an inhomogeneous Poisson process as a Poisson process for which time has been distorted. At time $t$, an inhomogeneous Poisson process with rate function $\lambda$ looks just like a unit rate homogeneous Poisson process at time $\theta(t)$. In fact, if $P$ is a unit rate homogeneous Poisson process, then the point process $N$ defined by $$N(t) = P(\theta(t))$$ is an inhomogeneous Poisson point process with rate function $\lambda$.
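A minimal sketch of this time-change construction (again assuming the illustrative rate $\lambda(t) = 1 + \sin t$, so $\theta(t) = t + 1 - \cos t$; `theta_inv` is a hypothetical helper that inverts $\theta$ numerically):

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import poisson

rng = np.random.default_rng(3)

def theta(t):
    return t + 1.0 - np.cos(t)                   # integral of 1 + sin(u) over [0, t]

def theta_inv(u):
    # theta is increasing with theta(0) = 0 and theta(u) >= u, so [0, u] brackets the root
    return brentq(lambda t: theta(t) - u, 0.0, u)

# One path: arrival times of the inhomogeneous process via T_n = theta^{-1}(S_n),
# where S_1 < S_2 < ... are the arrival times of a unit-rate Poisson process P
S = np.cumsum(rng.exponential(1.0, size=20))
T = np.array([theta_inv(s) for s in S])
print(T[:5])

# Count check: N(t0) = #{T_n <= t0} = #{S_n <= theta(t0)} should be Poisson(theta(t0))
t0 = 4.0
counts = (np.cumsum(rng.exponential(1.0, size=(20_000, 60)), axis=1) <= theta(t0)).sum(axis=1)
print(counts.mean(), theta(t0))                         # means agree
print(np.mean(counts == 3), poisson.pmf(3, theta(t0)))  # pmf at k = 3 agrees
```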