Given the following ODE,
$$\frac{dy}{dx}=\cos(x)-\sin(y)+x^{2}, \qquad y(x_{0}=-1)=y_{0}=3$$
I have to use the Taylor series method to compute $y(x)$ at $x=-0.8$, using a second-order Taylor polynomial with step size $h=0.1$.
How should the method be applied to solve this problem?
My attempt at a solution:
I'm not sure whether this is the correct way to apply the method, but I have written the second-order Taylor polynomial centred at $x_0=-1$:
$$y(x) \approx y(x_{0}) + (x-x_{0})\,y'(x_{0}) + \frac{1}{2}(x-x_{0})^{2}\,y''(x_{0}) = 3.0 + 1.39918\,(x+1) + 0.11333\,(x+1)^{2}$$
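where the coefficients come from the initial condition and from differentiating the right-hand side of the ODE along the solution:
$$y''=\frac{d}{dx}\bigl(\cos x-\sin y+x^{2}\bigr)=-\sin x-y'\cos y+2x,$$
so that $y'(-1)=\cos(-1)-\sin(3)+1\approx 1.39918$ and $y''(-1)\approx 0.22665$, i.e. $\tfrac{1}{2}\,y''(-1)\approx 0.11333$.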
Evaluating this polynomial at $x=-0.8$ gives $y(-0.8) \approx 3.28437$.
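To make the arithmetic easy to check, here is a minimal Python sketch of this direct evaluation (the helper names `f` and `f2` are just for illustration):

```python
import math

# Right-hand side of the ODE: y' = f(x, y)
def f(x, y):
    return math.cos(x) - math.sin(y) + x**2

# y'' along the solution, by the chain rule: f_x + f_y * y'
def f2(x, y):
    return -math.sin(x) + 2.0 * x - math.cos(y) * f(x, y)

x0, y0 = -1.0, 3.0
x = -0.8

yp, ypp = f(x0, y0), f2(x0, y0)   # ~1.39918 and ~0.22665
y_approx = y0 + (x - x0) * yp + 0.5 * (x - x0)**2 * ypp
print(y_approx)                    # ~3.28437
```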
However, this matches neither my textbook's solution, $3.2850$, nor Wolfram|Alpha's value, $3.28687$.
Is this the right way to apply the method, or am I missing something?