Suppose that $Y$ is a random variable, $\{X_n\}$ a sequence of random variables, and $f,g$ functions. It's given that for each nonrandom $y$, $$ f(X_n,y)\overset{\text{a.s.}}{\longrightarrow}g(y). $$ For a nonrandom $\alpha\in(0,1)$, can we say something about the convergence of $\Pr[f(X_n,Y)\leq \alpha]$ to $\Pr[g(Y)\leq\alpha]$? (It's also given that $g(y)$ is monotonically decreasing.)
Some thoughts: letting $I$ denote the indicator function and $F_Y(\cdot)$ denote the distribution function of $Y$, we have
$$\begin{aligned} \Pr[f(X_n,Y)\leq\alpha]&=E[I(f(X_n,Y)\leq\alpha)]\\ &=E[E(I(f(X_n,Y)\leq\alpha)\mid Y)]\\ &=E[\Pr(f(X_n,Y)\leq\alpha\mid Y)]\\ &\color{red}{=\int_y\Pr(f(X_n,y)\leq\alpha)\,dF_Y(y)}\\ &\color{red}{\to \int_y\Pr(g(y)\leq\alpha)\,dF_Y(y)}\\ &=\Pr[g(Y)\leq\alpha]. \end{aligned}$$
I'm a bit shaky on the two red steps. I think the convergence step works because almost sure convergence implies convergence in distribution:
$$ f(X_n,y)\overset{\text{a.s.}}{\longrightarrow}g(y)\implies\Pr[f(X_n,y)\leq\alpha]\to\Pr[g(y)\leq\alpha], $$
though since the limit $g(y)$ is a constant, this only holds at continuity points of its distribution function, i.e. for $y$ with $g(y)\neq\alpha$; presumably the monotonicity of $g$ is meant to make the exceptional set $\{y:g(y)=\alpha\}$ negligible. The limit then passes through the integral by the Dominated Convergence Theorem, since the integrand is bounded by $1$. Is this argument correct? I'm almost certain now that the step before the convergence step is wrong as written: to replace $\Pr(f(X_n,Y)\leq\alpha\mid Y=y)$ by $\Pr(f(X_n,y)\leq\alpha)$, it seems I need $Y$ to be independent of each of the $X_n$.
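As a numerical sanity check, here is a minimal Monte Carlo sketch in Python under hypothetical choices that are not part of the question: $f(x,y)=e^{-y}+x$, $g(y)=e^{-y}$ (monotonically decreasing), $X_n$ the sample mean of $n$ i.i.d. standard normals (so $X_n\to 0$ almost surely by the SLLN), and $Y\sim\text{Exp}(1)$ drawn independently of the $X_n$, so the suspect conditioning step holds by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.3
n_rep = 200_000  # Monte Carlo replications

# Hypothetical setup (not from the question itself):
#   X_n = sample mean of n iid N(0,1) draws, so X_n -> 0 a.s. by the SLLN;
#   the sample mean is exactly N(0, 1/n), so we draw it directly.
#   f(x, y) = exp(-y) + x,  g(y) = exp(-y)  (monotonically decreasing),
#   Y ~ Exp(1), drawn independently of the X_n.
def f(x, y):
    return np.exp(-y) + x

def g(y):
    return np.exp(-y)

y = rng.exponential(1.0, size=n_rep)

# Target: if Y ~ Exp(1) then exp(-Y) ~ Uniform(0,1), so Pr[g(Y) <= alpha] = alpha.
print("Pr[g(Y) <= alpha]  approx", np.mean(g(y) <= alpha))

for n in [10, 100, 1_000, 10_000]:
    x_n = rng.standard_normal(n_rep) / np.sqrt(n)  # X_n ~ N(0, 1/n)
    print(f"n={n:>6}: Pr[f(X_n, Y) <= alpha]  approx", np.mean(f(x_n, y) <= alpha))
```

With these choices $e^{-Y}\sim\text{Uniform}(0,1)$, so the target probability is exactly $\alpha$, and the estimates should drift toward it as $n$ grows; note $\alpha$ is a continuity point here since $g(Y)$ has a continuous distribution.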