
Consider a measure on $\mathbb{R}$ given by $\mu(dx) = e^{-\Psi(x)} \, dx$. The domain need not be all of $\mathbb{R}$; it could, for example, be $[0, \infty)$. For now, I impose no conditions on $\Psi$ other than smoothness. I'm wondering whether there is an easy way to write down explicitly a set of functions $f_k(x)$ that are orthogonal with respect to $\mu$.
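To fix notation: by orthogonal with respect to $\mu$, I mean that for all $j \neq k$,
$$\int f_j(x) f_k(x) \, \mu(dx) = \int f_j(x) f_k(x) e^{-\Psi(x)} \, dx = 0.$$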

For example, when $\Psi(x) = \frac{x^2}{2}$, then $\mu$ is just the Gaussian measure (modulo the normalization constant). In that case the functions can be taken to be the Hermite polynomials, given (up to a factor of $(-1)^k$) by $P_k(x) = e^{x^2/2} \frac{d^k}{dx^k} e^{-x^2/2}$. It seems intuitive that one might therefore choose $f_k(x) = e^{\Psi(x)} \frac{d^k}{dx^k} e^{-\Psi(x)}.$ However, I'm not quite sure how to show that these are orthogonal with respect to $\mu$ (I'm not even convinced that they are). In particular, the proofs I've seen of the orthogonality of the Hermite polynomials rely heavily on the fact that they are indeed polynomials (i.e., that $P_k(x)$ is a polynomial of degree exactly $k$); see, for example, here. The argument integrates by parts repeatedly to move all $k$ derivatives onto the lower-degree polynomial $P_j$ (with $j < k$), which is then differentiated more times than its degree and so vanishes identically. This strategy doesn't seem to adapt to a general function $\Psi(x)$, since for general $\Psi$ the $f_k$ need not be polynomials of degree $k$.
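As a sanity check (not a proof), the conjecture is easy to test numerically. Below is a small sketch using sympy and scipy; the helper names `rodrigues_candidates` and `inner_product` are mine, and the quartic test case $\Psi(x) = x^4$ is an arbitrary non-Gaussian choice.

```python
import numpy as np
import sympy as sp
from scipy.integrate import quad

x = sp.symbols('x')

def rodrigues_candidates(Psi, n):
    """Symbolically build f_k = e^{Psi} * d^k/dx^k e^{-Psi} for k = 0..n-1."""
    w = sp.exp(-Psi)
    return [sp.simplify(sp.exp(Psi) * sp.diff(w, x, k)) for k in range(n)]

def inner_product(f, g, Psi, lo=-np.inf, hi=np.inf):
    """Numerically approximate <f, g> = ∫ f g e^{-Psi} dx."""
    integrand = sp.lambdify(x, f * g * sp.exp(-Psi), 'numpy')
    val, _err = quad(integrand, lo, hi)
    return val

# Gaussian case: Psi = x^2/2 recovers the Hermite polynomials (up to sign),
# so all off-diagonal inner products should vanish up to quadrature error.
Psi_gauss = x**2 / 2
fs = rodrigues_candidates(Psi_gauss, 4)
for j in range(4):
    for k in range(j):
        print(j, k, inner_product(fs[j], fs[k], Psi_gauss))

# Non-Gaussian test case: Psi = x^4 probes whether the same recipe
# still yields orthogonality for a general smooth Psi.
Psi_quartic = x**4
gs = rodrigues_candidates(Psi_quartic, 4)
for j in range(4):
    for k in range(j):
        print(j, k, inner_product(gs[j], gs[k], Psi_quartic))
```

Comparing the output of the two loops (whether the off-diagonal inner products are numerically zero) should at least indicate whether the formula can be expected to work beyond the Gaussian case.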

I'm wondering whether the formula I wrote above works (and, if so, how to prove that the $f_k$ are orthogonal), or, if not, whether there is another easy and explicit way to write down such functions $f_k(x)$. Thanks!

