I would like to better understand the power series expansion of $\log(1+x)$ and how we know that it converges.
To begin, using the standard formula for Taylor expansion we quickly obtain:
$$\log(1+x) =-\sum_{k=1}^\infty\frac{(-x)^k}{k}~~.$$
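To spell out how the standard formula produces these coefficients: the derivatives of $f(x)=\log(1+x)$ are
$$f^{(k)}(x)=\frac{(-1)^{k-1}(k-1)!}{(1+x)^k},\qquad\text{so}\qquad\frac{f^{(k)}(0)}{k!}=\frac{(-1)^{k-1}}{k}\quad\text{for }k\ge1~~,$$
which gives exactly the series above.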
Now, comparing with a geometric series (or simply noting that the terms do not tend to zero), it is clear that the RHS diverges for $|x|>1$. But then how do we know that for $|x|<1$, when the series does converge, it converges to $\log(1+x)$?
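To justify this convergence behaviour precisely, the ratio test gives
$$\left|\frac{(-x)^{k+1}/(k+1)}{(-x)^k/k}\right|=\frac{k}{k+1}\,|x|\;\longrightarrow\;|x|\qquad(k\to\infty)~~,$$
so the series converges absolutely for $|x|<1$ and diverges for $|x|>1$; but the test says nothing about what the sum actually is.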
I have found another "proof" that goes as follows. Let $|x|<1$; then:
$$\log(1+x) =\int_0^x\frac{1}{1+t}~dt =\int_0^x\sum_{k=0}^\infty(-t)^k~dt =\sum_{k=0}^\infty\int_0^x(-t)^k~dt =-\sum_{k=1}^\infty\frac{(-x)^k}{k}~~.$$
We obtain the same result, again valid for $|x|<1$, which is what we want. But we relied on term-wise integration, and as far as I know the geometric series does not converge uniformly on all of $(-1,1)$, so I do not see why this step is justified.
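To make the gap explicit, the same computation can be written with a finite sum and an explicit remainder:
$$\frac{1}{1+t}=\sum_{k=0}^{n-1}(-t)^k+\frac{(-t)^n}{1+t}\qquad\Longrightarrow\qquad\log(1+x)=\sum_{k=1}^{n}\frac{(-1)^{k-1}x^k}{k}+\int_0^x\frac{(-t)^n}{1+t}\,dt~~,$$
so the questionable step amounts to the claim that the last integral tends to $0$ as $n\to\infty$.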
So to summarize my questions:
- How can we justify that $-\sum_{k=1}^\infty\frac{(-x)^k}{k}$ does converge to $\log(1+x)$ for $x\in(-1,1]$?
- How do we know whether the Taylor series of a function converges at all, and whether it converges to the function itself?
Thanks a lot guys,
Alex