I have just been looking at the error in Euler's Method, and I noticed something strange.
I understand that the global error is $O(h)$ via the argument that the local truncation error is $O(h^2)$, and there are $n$ of these errors, where $n$ is proportional to $1/h$.
Given this, I agree that as we gradually increase the number of steps $n$, we obtain a smaller and smaller value of $h$, and hence the error decreases. It also makes intuitive sense that the error of a single step should be smaller than the accumulated error of many successive steps: $h^2 < h$ when $h < 1$, so the local truncation error is smaller than the global error as $h\rightarrow0$ (or $n\rightarrow\infty$).
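For what it's worth, here is a minimal numerical check I put together (the test problem $y' = y$, $y(0) = 1$ on $[0,1]$ and the helper name `euler_error` are just my own choices for illustration): halving $h$ roughly halves the global error at $t = 1$, which is consistent with the $O(h)$ behaviour above.

```python
# Minimal sketch: Euler's method on y' = y, y(0) = 1 over [0, 1],
# measuring the global error at t = 1 for several step counts n.
import math

def euler_error(n):
    """Global error at t = 1 using n Euler steps of size h = 1/n."""
    h = 1.0 / n
    y = 1.0
    for _ in range(n):
        y += h * y          # Euler step: y_{k+1} = y_k + h * f(t_k, y_k), with f(t, y) = y
    return abs(y - math.e)  # exact solution is y(1) = e

for n in (10, 20, 40, 80):
    print(n, euler_error(n))  # doubling n (halving h) roughly halves the error
```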
However, what if we vary $n$ in the opposite direction? Once $h$ increases above 1, it seems to me that the local truncation error will actually be larger than the global error, since $h^2 > h$ when $h > 1$ (i.e. as $h\rightarrow\infty$, or $n\rightarrow0$).
How can this be?