
Usually, to define the relative entropy between two probability measures, one assumes absolute continuity. Is it possible to extend the usual definition to the non-absolutely-continuous case?

J.R.
mellow

1 Answer


Relative entropy between two probability measures $P$ and $Q$ can be defined even if $P$ is not absolutely continuous with respect to $Q$. Indeed, $P$ and $Q$ are always both absolutely continuous with respect to some common measure $\mu$ (one can take $\mu=\frac{P+Q}{2}$). The relative entropy between $P$ and $Q$ is then defined as $$D(P\|Q)=\int p\log\frac{p}{q}\,d\mu,$$ where $p=dP/d\mu$ and $q=dQ/d\mu$. One can check that the value does not depend on the choice of $\mu$.
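Here is a minimal numerical sketch of this construction in Python, for distributions on a finite set (the function name and the discrete setting are illustrative assumptions, not part of the original thread). It uses the conventions discussed in the comments below: $0\cdot\log\frac{0}{0}=0$ and $\log\frac{a}{0}=+\infty$ for $a>0$.

```python
import numpy as np

def relative_entropy(P, Q):
    """D(P||Q) for finite distributions P, Q (arrays of probabilities),
    computed via densities with respect to the common measure mu = (P+Q)/2."""
    P = np.asarray(P, dtype=float)
    Q = np.asarray(Q, dtype=float)
    mu = (P + Q) / 2.0  # both P and Q are absolutely continuous w.r.t. mu
    total = 0.0
    for Pi, Qi, mi in zip(P, Q, mu):
        if mi == 0.0:            # the point carries no mass under P or Q
            continue
        p, q = Pi / mi, Qi / mi  # densities p = dP/dmu and q = dQ/dmu
        if p == 0.0:             # convention: 0 * log(0/q) = 0
            continue
        if q == 0.0:             # convention: p * log(p/0) = +inf for p > 0
            return float("inf")
        total += p * np.log(p / q) * mi  # integrate p*log(p/q) against mu
    return total

print(relative_entropy([0.5, 0.5], [0.9, 0.1]))  # ~0.5108, i.e. log(5/3)
print(relative_entropy([1.0, 0.0], [0.0, 1.0]))  # inf
```

In the first call $P\ll Q$, and the value agrees with the usual formula $\sum_i P_i\log\frac{P_i}{Q_i}$; in the second call the measures are mutually singular and the result is $+\infty$.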

Ashok
  • It's a bit unclear how this integral is defined: in some cases mutual singularity can lead to the existence of $A$ s.t. $P(A) = 0$ and $Q(A) = 0$ (as with a point mass and Lebesgue measure). Then $p$ and $q$ are never non-zero together, and $\log\frac pq$ is undefined. Moreover, for any measurable function $f$ we have $\int pf\,d\mu = \int f\,dP$, so even if $\log\frac pq$ can be defined, we would have to integrate it over values of $x$ for which $q=0$, so $D(P\|Q) = \infty$, at least informally. Though it does make sense for mutually singular measures to have infinite relative entropy. – SBF Nov 11 '11 at 09:24
  • Yes, you are right. But in the formula above one always adopts the following conventions: $\log 0=-\infty$, $\log\frac{a}{0}=+\infty$ for any $a> 0$, and $0\cdot (\pm \infty)=0$. I hope it makes sense now. – Ashok Nov 11 '11 at 10:20
  • I had a typo in my first comment: there may exist $A$ s.t. $P(A) = 0$ and $Q(A^c) =0$. Does it then mean that $D(P\|Q) = +\infty$ in such a case? – SBF Nov 11 '11 at 10:48
  • I think what you have in mind is that $P$ and $Q$ have disjoint supports; in that case $D(P\|Q)=\infty$. In fact, the case $P(A)=0$ and $Q(A^c)=0$ is of that kind: then $P(A^c)=1$ while $Q(A^c)=0$, so $D(P\|Q)=\infty$ as well. In general, if for some $A$ we have $P(A)\neq 0$ but $Q(A)=0$, then necessarily $D(P\|Q)=\infty$. – Ashok Nov 11 '11 at 11:02
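To illustrate the last point with a concrete computation (an example not in the original thread): take $P=\delta_0$ and $Q$ the uniform (Lebesgue) measure on $[0,1]$, with $\mu=\frac{P+Q}{2}$. Then $\mu(\{0\})=\frac12$, so $p=2$ on $\{0\}$ and $p=0$ elsewhere, while $q=0$ on $\{0\}$ and $q=2$ on $(0,1]$. Hence $$D(P\|Q)=\int p\log\frac{p}{q}\,d\mu = 2\log\frac{2}{0}\cdot\mu(\{0\})=+\infty,$$ in line with the convention $\log\frac{a}{0}=+\infty$ for $a>0$.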