Let $Y=g(X)$ be a nonlinear transformation of a continuous random variable $X$. Assume $Y$ has no well-defined moments, e.g. $Y=1/X$ with $X\sim\mathcal N(\mu,1)$ and $\mu\neq 0$. Expanding $g$ as a first-order Taylor polynomial about $\mu_X$ yields a new random variable $$ Y^\ast= g(\mu_X)+g^\prime(\mu_X)(X-\mu_X). $$ Now, if $X$ falls with high probability in a sufficiently small neighborhood of $\mu_X$, then intuitively $Y\approx Y^\ast$ in the sense of distribution, that is, the distributions of $Y$ and $Y^\ast$ will be close.
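To make the claim concrete (this is only an illustration, not a proof), here is a small Monte Carlo check of the $Y=1/X$ example. The parameters $\mu_X=2$ and $\sigma=0.05$ are arbitrary choices of mine, picked so that $X$ concentrates near $\mu_X$; with them the sample quantiles of $Y$ and $Y^\ast$ agree closely:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: small sigma so X concentrates near mu_x
mu_x, sigma = 2.0, 0.05
x = rng.normal(mu_x, sigma, size=100_000)

g = lambda t: 1.0 / t                       # the nonlinear transformation
y = g(x)                                    # Y = g(X)
# First-order Taylor expansion: g(mu_x) + g'(mu_x)(X - mu_x), with g'(t) = -1/t^2
y_star = g(mu_x) - (x - mu_x) / mu_x**2

# Compare a few sample quantiles of Y and Y*; they nearly coincide
for q in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(f"q={q}: Y -> {np.quantile(y, q):.5f}, Y* -> {np.quantile(y_star, q):.5f}")
```

With a larger $\sigma$ (so that $X$ puts appreciable mass near $0$) the agreement breaks down badly, which is exactly the "high probability in a small neighborhood" caveat above.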
What I am looking for is a formal proof of this fact. Some sort of limit theorem. I would think such a limit theorem would be very closely related to the delta method.
For example, it is easy to see (this is probably an abuse of notation, don't shoot me) that as $\mathsf{Var}\,X\to0$ we have $X\overset{d}{\to}\mu_X$, and hence both $Y\overset{d}{\to}g(\mu_X)$ and $Y^\ast\overset{d}{\to}g(\mu_X)$, so long as $g$ is continuous at $\mu_X$ (by the continuous mapping theorem). But how would we formally state this?
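For reference, the classical delta method (as stated in standard asymptotic statistics texts) makes a statement of exactly this flavor, but for a sequence $X_n$ rather than a variance taken to zero: $$ \sqrt n\,(X_n-\mu)\overset{d}{\to}\mathcal N(0,\sigma^2) \quad\Longrightarrow\quad \sqrt n\,\bigl(g(X_n)-g(\mu)\bigr)\overset{d}{\to}\mathcal N\bigl(0,\sigma^2\,g^\prime(\mu)^2\bigr), $$ provided $g$ is differentiable at $\mu$. What I am after seems to be a non-asymptotic (or $\mathsf{Var}\,X\to0$) analogue of this.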