
I am not fully clear about the reparameterization trick.

The below is from the VAE paper (title: Auto-Encoding Variational Bayes)

Let z be a continuous random variable, and $z ∼ q_\phi (z|x)$ be some conditional distribution. It is then often possible to express the random variable z as a deterministic variable $z = g_\phi(\epsilon, x)$, where $\epsilon$ is an auxiliary variable with independent marginal $p(\epsilon)$, and $ g_\phi(.) $ is some vector-valued function parameterized by $\phi$.

Given the deterministic mapping $z = g_\phi(\epsilon, x)$ we know that $q_\phi(z|x) \prod_i dz_i = p(\epsilon) \prod_i d\epsilon_i$
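
To make this concrete for myself, here is a minimal numerical sketch of the univariate Gaussian case (which I believe is the paper's running example), assuming $q_\phi(z|x) = \mathcal{N}(\mu, \sigma^2)$, $p(\epsilon) = \mathcal{N}(0, 1)$ and $g_\phi(\epsilon, x) = \mu + \sigma\epsilon$; the values of `mu` and `sigma` below are made up for illustration:

```python
import numpy as np
from scipy.stats import norm

# Hypothetical encoder outputs for one fixed x (illustrative values only)
mu, sigma = 1.5, 0.7

# One fixed draw of the auxiliary noise eps ~ p(eps) = N(0, 1)
eps = 0.3

# Deterministic mapping z = g_phi(eps, x) = mu + sigma * eps
z = mu + sigma * eps

# Change of variables: dz/deps = sigma, so the claim is
#   q_phi(z|x) * |dz/deps| = p(eps)
lhs = norm.pdf(z, loc=mu, scale=sigma) * sigma
rhs = norm.pdf(eps)
print(lhs, rhs)  # the two values agree up to floating-point error
```

In this one-dimensional case the identity clearly reduces to multiplying by the derivative $|dz/d\epsilon| = \sigma$, which is what makes me suspect a change-of-variables argument in general.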

My question is:

How can the equation $q_\phi(z|x) \prod_i dz_i = p(\epsilon) \prod_i d\epsilon_i$ be proved? Is it using the Jacobian change-of-variables trick (although no Jacobian determinant is visible in the equation, nor any sum over probabilities)? Also, what is the impact of conditioning on $x$ here? The formula I have in mind is written out below.
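
For reference, the change-of-variables formula I suspect is being used (assuming $g_\phi(\cdot, x)$ is invertible in $\epsilon$ for fixed $x$) is

$$q_\phi(z|x) = p(\epsilon) \left| \det \frac{\partial \epsilon}{\partial z} \right|,$$

which, if $\prod_i dz_i$ and $\prod_i d\epsilon_i$ are read as volume elements, would give exactly $q_\phi(z|x) \prod_i dz_i = p(\epsilon) \prod_i d\epsilon_i$. Is this reading correct?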
