
So, essentially, I want to start with a summation:

$$s = \sum_{k=a}^b{ f(k,x) }$$

and differentiate with respect to $x$, inside the summation. My question is: When does the following equality hold:

$$\frac{ds}{dx} = \sum_{k=a}^b{ \frac{d}{dx} f(k,x) }$$

IMPORTANT NOTE

I am especially interested in the case of infinite sums. Any information on differentiation inside infinite sums will be extremely helpful.

This question is similar to this one.

EXAMPLE

For example, suppose I have the following sum:

$$s = \sum_{k=a}^b{ \cos(k x) }$$

If I now differentiate with respect to $x$ inside the summation, I get, on the right-hand side:

$$\sum_{k=a}^b{ -k \sin(k x) }$$

I'm wondering when differentiating $s$ with respect to $x$ (the left-hand side) is guaranteed to give this same result.

Please note: This is just one example. I'm wondering when I can do this in general, for any $f(k,x)$.

Matt Groff
    For finite sums this reduces to $(f+g)'=f'+g'$. – Carsten S Jan 18 '14 at 21:47
  • @CarstenSchultz: I wouldn't mind having this condition explained in an answer, but I am also interested in infinite sums. – Matt Groff Jan 18 '14 at 21:55
  • I suggest that you ask specifically about infinite sums, because currently it seems as if you only ask about finite sums, and that case is trivial. – Carsten S Jan 18 '14 at 22:00
  • @CarstenSchultz: I edited the question. Thanks for the suggestion. – Matt Groff Jan 18 '14 at 22:12
  • A sufficient condition to be able to differentiate a series termwise is the locally uniform convergence of the differentiated series (plus pointwise convergence of the original). Then integration of the differentiated series shows the original to be a primitive of the differentiated series. – Daniel Fischer Jan 18 '14 at 22:25
  • @DanielFischer: I believe I would accept this as an answer if you can explain it a little further. I'm taking your comment to mean that one sufficient condition for equality is being able to recover the right-hand terms by integration of the left-hand side. – Matt Groff Jan 18 '14 at 22:30

2 Answers

2

The following theorem applies:

If $f_n$ converges pointwise to $f$, and if all the $f_n$ are differentiable, and if the derivatives $f'_n$ converge uniformly to $g$, then $f$ is differentiable and its derivative is $g$.

http://en.wikipedia.org/wiki/Uniform_convergence
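
For a concrete illustration close to the question's example, consider the partial sums

$$f_n(x) = \sum_{k=1}^n \frac{\cos(kx)}{k^3}.$$

Each $f_n$ is differentiable, the $f_n$ converge pointwise (even absolutely, since $\sum 1/k^3 < \infty$), and the derivatives

$$f'_n(x) = \sum_{k=1}^n \frac{-\sin(kx)}{k^2}$$

converge uniformly on all of $\mathbb{R}$ by the Weierstrass M-test, because $\left|\frac{-\sin(kx)}{k^2}\right| \le \frac{1}{k^2}$ and $\sum 1/k^2 < \infty$. The theorem then justifies differentiating termwise:

$$\frac{d}{dx} \sum_{k=1}^\infty \frac{\cos(kx)}{k^3} = \sum_{k=1}^\infty \frac{-\sin(kx)}{k^2}.$$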

2

For finite sums where each term is differentiable, the derivative of the sum is always the sum of the derivatives, as is seen inductively from $(f+g)' = f' + g'$.
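
For instance, taking the question's example with $a=1$ and $b=3$:

$$\frac{d}{dx}\left( \cos x + \cos 2x + \cos 3x \right) = -\sin x - 2\sin 2x - 3\sin 3x,$$

which is exactly the termwise result; no convergence questions arise for finitely many terms.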

For infinite sums or series, to be able to differentiate termwise, one needs good enough convergence of the differentiated series. A sufficient condition is locally uniform convergence of the differentiated series, together with pointwise convergence of the original series. If the differentiated series converges locally uniformly, we may interchange integration and summation in

$$\begin{align} \int_a^b \sum_{n=1}^\infty \frac{df_n}{dx}(x)\,dx &= \sum_{n=1}^\infty \int_a^b \frac{df_n}{dx}(x)\,dx\\ &= \sum_{n=1}^\infty \left(f_n(b) - f_n(a)\right)\\ &= \left(\sum_{n=1}^\infty f_n(b)\right) - \left(\sum_{n=1}^\infty f_n(a)\right). \end{align}$$

The argument shows that any kind of convergence that allows the interchange of integration and summation is sufficient. For example, if the derivatives are all non-negative, and the sum function is locally integrable, the monotone convergence theorem yields the result.
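
A standard concrete instance of this sufficient condition is the geometric series: on $(-1,1)$ we have the pointwise convergent series $\sum_{n=0}^\infty x^n = \frac{1}{1-x}$, and the differentiated series $\sum_{n=1}^\infty n x^{n-1}$ converges locally uniformly, since on each interval $[-r,r]$ with $r<1$ it is dominated by the convergent series $\sum_{n=1}^\infty n r^{n-1}$. Termwise differentiation is therefore justified:

$$\sum_{n=1}^\infty n x^{n-1} = \frac{d}{dx}\,\frac{1}{1-x} = \frac{1}{(1-x)^2}, \qquad |x| < 1.$$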

Daniel Fischer