I know that:
$$ \text{var}(XY)=E(X^2Y^2)-E(XY)^2=\text{var}(X)\text{var}(Y)+\text{var}(X)E(Y)^2+\text{var}(Y)E(X)^2 $$
But what is $var(X/Y)$? It doesn't seem to be as simple as treating $Y$ above as $1/Y$.
Your second equation for $\text{var}(XY)$ holds if $X$ and $Y$ are independent, but not in general. I don't know what kind of an answer you're expecting for $\text{var}(X/Y)$. There is no formula expressing $E[1/Y]$ or $\text{var}(1/Y)$ in terms of $E[Y]$ and $\text{var}(Y)$, if that's what you want.
EDIT: One thing you can say is this. Suppose the distribution of $Y$ is supported in an interval $[c-r,c+r]$ with $0 < r < c$. We have $$\dfrac{1}{y} = \sum_{j=0}^\infty \dfrac{(-1)^j}{c^{j+1}} (y-c)^j$$ the series converging uniformly on $[c-r,c+r]$, so that $$ E\left[ \dfrac{1}{Y}\right] = \sum_{j=0}^\infty \dfrac{(-1)^j}{c^{j+1}} E[(Y-c)^j]$$ a convergent series in the moments of $Y$ about $c$. In order to determine $E[1/Y]$ exactly, you need all the moments, but partial sums of the series can be used as approximations. Similarly, $$E\left[ \dfrac{1}{Y^2}\right] = \sum_{j=0}^\infty \dfrac{(-1)^j (j+1)}{c^{j+2}} E[(Y-c)^j]$$ and of course this is $\text{var}(1/Y) + E[1/Y]^2$.
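As a quick numerical check of the series above, here is a sketch that compares its partial sums against the exact value of $E[1/Y]$ for one concrete choice of distribution. The choice $Y \sim \text{Uniform}(c-r,\, c+r)$ with $c=2$, $r=0.5$ is my own assumption for illustration; for the uniform case the central moments and $E[1/Y]$ both have closed forms, so the convergence can be verified directly.

```python
import math

# Assumed example: Y ~ Uniform(c - r, c + r) with 0 < r < c,
# so the support condition in the series derivation is satisfied.
c, r = 2.0, 0.5

def central_moment(j):
    # E[(Y - c)^j] for Uniform(c - r, c + r): zero for odd j, r^j / (j + 1) for even j
    return 0.0 if j % 2 else r**j / (j + 1)

def series_E_inv(n_terms):
    # Partial sum of  E[1/Y] = sum_j (-1)^j / c^(j+1) * E[(Y - c)^j]
    return sum((-1)**j / c**(j + 1) * central_moment(j) for j in range(n_terms))

# Exact value for the uniform case: integral of 1/y over [c-r, c+r] divided by 2r
exact = math.log((c + r) / (c - r)) / (2 * r)
approx = series_E_inv(20)
print(exact, approx)  # the partial sum matches the exact value closely
```

Because $r/c = 1/4$ here, the terms decay geometrically and only a handful of moments are needed for high accuracy.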
I wanted the question to be as general as possible, but perhaps I'll get more specific here:
I have two groups of independent samples (each with a mean and a variance), and I would like a measure of their ratio, also with a mean and variance. There are various ways of getting this with simulations, but I had hoped there might be an analytical solution.
– Geoff Mar 23 '15 at 21:04
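For the specific setting in the comment, one common analytical approximation (not an exact formula) is the first-order delta method, $\text{var}(X/Y) \approx (\mu_X/\mu_Y)^2\left(\sigma_X^2/\mu_X^2 + \sigma_Y^2/\mu_Y^2\right)$, valid when $Y$ stays well away from zero. The sketch below compares it against a Monte Carlo estimate; the normal distributions and their parameters are assumptions chosen purely for illustration.

```python
import random
import statistics

random.seed(0)

# Hypothetical group parameters (assumed for illustration);
# mu_y is several standard deviations from 0 so X/Y is well behaved.
mu_x, sd_x = 10.0, 1.0
mu_y, sd_y = 5.0, 0.5

# Monte Carlo estimate of the mean and variance of the ratio X/Y
ratios = [random.gauss(mu_x, sd_x) / random.gauss(mu_y, sd_y)
          for _ in range(200_000)]
mc_mean = statistics.fmean(ratios)
mc_var = statistics.variance(ratios)

# First-order delta-method (Taylor) approximation
approx_mean = mu_x / mu_y
approx_var = (mu_x / mu_y) ** 2 * ((sd_x / mu_x) ** 2 + (sd_y / mu_y) ** 2)

print(mc_mean, approx_mean)
print(mc_var, approx_var)
```

The simulated mean sits slightly above $\mu_X/\mu_Y$ (a known second-order bias of ratio estimators), which is exactly the kind of correction the higher moments in the series answer above account for.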