I've been trying to prove that the initial value problem $$y'=y^2+t^2,\quad y(0)=0$$ has a (maximal) solution defined on all of $[0,2[$. The countless simulations I've run make it clear to me that this is the case. Moreover, the solution seems to be asymptotic to $\frac1{2-t}$, as one can check by making the change of variables $u=(2-t)y$ and solving numerically for $u$ (one then sees that $u\to 1$ as $t\to 2^-$).
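For reproducibility, here is a minimal sketch of the kind of simulation I mean (a plain fixed-step RK4 in Python, no libraries; all names are mine): it integrates the IVP up to $t=1.9$ and reports $u=(2-t)y$, which is already close to $1$ there.

```python
def f(t, y):
    # right-hand side of the IVP: y' = y^2 + t^2
    return y * y + t * t

def rk4(f, t0, y0, t_end, h):
    # classical fixed-step fourth-order Runge-Kutta
    t, y = t0, y0
    for _ in range(round((t_end - t0) / h)):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

y19 = rk4(f, 0.0, 0.0, 1.9, 1e-4)
print("y(1.9) =", y19, "   u(1.9) = (2 - 1.9) * y(1.9) =", (2 - 1.9) * y19)
```

(I stop at $t=1.9$ because the solution grows very fast as $t\to 2^-$ and a fixed step eventually loses accuracy.)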
I've been able to prove the existence of a solution up to $t=1.96$ using a comparison criterion, taking as bounds the polynomials produced by the usual Picard iteration step $p_{n+1}(t)=y_0+\int_{t_0}^t f(s,p_n(s))\,ds$. I stopped at $1.96$ because the polynomials become quite large (degree $500$+), but I'm confident I could prove existence up to any point arbitrarily close to $2$.
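To make the iteration concrete, here is a small exact-arithmetic sketch (pure Python, `Fraction` coefficient lists; the helper names are mine) computing the first few Picard iterates $p_{n+1}(t)=\int_0^t\bigl(p_n(s)^2+s^2\bigr)\,ds$ with $p_0\equiv 0$. The degree roughly doubles at each step ($3, 7, 15, 31, \dots$), which is why the polynomials get to degree $500$+ so quickly.

```python
from fractions import Fraction

def poly_mul(a, b):
    # product of two polynomials given as coefficient lists (lowest degree first)
    out = [Fraction(0)] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] += ai * bj
    return out

def poly_add(a, b):
    n = max(len(a), len(b))
    a = a + [Fraction(0)] * (n - len(a))
    b = b + [Fraction(0)] * (n - len(b))
    return [x + y for x, y in zip(a, b)]

def poly_integrate(a):
    # antiderivative with zero constant term, i.e. the integral from 0 to t
    return [Fraction(0)] + [c / (k + 1) for k, c in enumerate(a)]

def picard_step(p):
    # p_{n+1}(t) = \int_0^t (p_n(s)^2 + s^2) ds,  with y_0 = 0, t_0 = 0
    return poly_integrate(poly_add(poly_mul(p, p), [Fraction(0), Fraction(0), Fraction(1)]))

p = [Fraction(0)]  # p_0 = y(0) = 0
for n in range(4):
    p = picard_step(p)
    print("p_%d has degree %d" % (n + 1, len(p) - 1))  # degrees 3, 7, 15, 31
```

The low-order coefficients stabilize immediately: every iterate from $p_2$ on begins $\frac{t^3}{3}+\frac{t^7}{63}+\cdots$.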
Using the same comparison criterion, I was unable to prove that the solution is controlled by either $\frac1{2-t}$ or $\frac1{2-t}+t-2$, even though it is numerically obvious that it is.
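A quick computation (easily checked by hand) may explain part of the difficulty: neither candidate satisfies the barrier inequality $\varphi'\ge\varphi^2+t^2$ on all of $[0,2[$. For $\varphi=\frac1{2-t}$,
$$\varphi'-\varphi^2-t^2=\frac1{(2-t)^2}-\frac1{(2-t)^2}-t^2=-t^2<0\quad\text{for }t>0,$$
so it is a strict subsolution; and for $\varphi=\frac1{2-t}+t-2$,
$$\varphi'-\varphi^2-t^2=3-t^2-(2-t)^2,$$
which changes sign at $t=1\pm\frac{\sqrt2}{2}$. So neither function works as an upper barrier on the whole interval in the naive way.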
I've also tried the change of variables $v=\frac1{y+1}$ (the $+1$ keeps the denominator from vanishing at $t=0$), which seems to produce a more stable ODE; the goal then becomes proving that the first positive root of $v$ occurs at $t=2$, i.e. $v(2)=0$. However, I am struggling to extract a nice comparison from that as well.
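Concretely, differentiating $v=\frac1{y+1}$ and substituting $y=\frac1v-1$ gives the polynomial ODE $$v'=-(1-v)^2-t^2v^2,\qquad v(0)=1,$$ whose right-hand side is bounded while $v\in[0,1]$. A minimal sketch (same hand-rolled RK4 idea as above; nothing library-specific) integrates it up to $t=2$ without difficulty, and the computed $v(2)$ comes out very close to $0$, consistent with the claim:

```python
def g(t, v):
    # transformed ODE: with v = 1/(y+1), one gets v' = -(1-v)^2 - t^2 * v^2
    return -(1 - v) ** 2 - t * t * v * v

def rk4(g, t0, v0, t_end, h):
    # classical fixed-step fourth-order Runge-Kutta
    t, v = t0, v0
    for _ in range(round((t_end - t0) / h)):
        k1 = g(t, v)
        k2 = g(t + h / 2, v + h / 2 * k1)
        k3 = g(t + h / 2, v + h / 2 * k2)
        k4 = g(t + h, v + h * k3)
        v += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return v

v2 = rk4(g, 0.0, 1.0, 2.0, 1e-4)
print("v(2) =", v2)  # small; numerically v is nearly 0 at t = 2
```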
How could one go about proving this fact?
Thanks!