
Does anyone know of a proof or disproof of this conjectured formula?

$\sum_{k=1}^{n}\frac{\mathrm{d} }{\mathrm{d} k}f(k)=C+\frac{\mathrm{d} }{\mathrm{d} n}\sum_{k=1}^{n}f(k)$

I was working on evaluating this infinite sum from a question that I found:

$\sum_{k=1}^{\infty}2^{-k}\tan(2^{-k})$

While working on that infinite sum, I ran into the issue of taking derivatives out of a summation: I expressed $\tan(2^{-k})$ as $\frac{-\frac{\mathrm{d} }{\mathrm{d} k}\ln\cos(2^{-k})}{\frac{\mathrm{d} }{\mathrm{d} k}2^{-k}}$ (using the chain rule with $\frac{\mathrm{d} }{\mathrm{d} x}\ln\cos x = -\tan x$). Pulling the derivative out of the sum (after cancelling the $2^{-k}$ in the summand against the one produced by the derivative) turns the summation into a repeated product of $\cos(2^{-k})$ inside the $\ln$.

You can use a formula obtained from repeated application of the double-angle formula, then take $x$ to be $1$ and rearrange:

$\sin x = 2^{n}\sin\frac{x}{2^{n}}\prod_{k=1}^{n}\cos\frac{x}{2^{k}}$

$\prod_{k=1}^{n}\cos(2^{-k}) = \frac{2^{-n}}{\sin(2^{-n})}\sin 1$

This finally allows you to evaluate the infinite sum by applying the formula to the repeated product.
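A quick numeric sanity check of the product identity at $x=1$ (a sketch using standard library floats):

```python
import math

# Numeric check of the telescoped product at x = 1:
#   prod_{k=1}^n cos(2^-k) = 2^-n * sin(1) / sin(2^-n)
for n in range(1, 12):
    prod = math.prod(math.cos(2.0**-k) for k in range(1, n + 1))
    rhs = 2.0**-n * math.sin(1.0) / math.sin(2.0**-n)
    assert abs(prod - rhs) < 1e-12
```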

The issue is that, to take the derivative out of the summation, I had to rely on this conjectured formula, which I wasn't able to prove but also couldn't find any counterexamples to. In particular, the solution I arrived at, $\frac{1}{2}(\tan(\frac{1}{2})-\cot(\frac{1}{2})+2)$, agrees with numerical estimates of the infinite sum to 6 digits.
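The claimed value can be checked numerically (a sketch; the partial sums converge quickly since the terms decay roughly like $4^{-k}$):

```python
import math

# Numeric check of the claimed closed form:
#   sum_{k>=1} 2^-k * tan(2^-k)  =  (tan(1/2) - cot(1/2) + 2) / 2
partial = sum(2.0**-k * math.tan(2.0**-k) for k in range(1, 40))
closed = 0.5 * (math.tan(0.5) - 1.0 / math.tan(0.5) + 2.0)
assert abs(partial - closed) < 1e-12
```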

  • In your first equation, $k$ is a whole number that goes from $1$ to $n$, so I'm not sure what $\dfrac{\mathrm d}{\mathrm dk}$ is intended to mean. It would seem to be asking for $\dfrac{\mathrm d}{\mathrm d3}$, etc. Do you just mean $f'(k)$? – Mark S. Dec 26 '24 at 23:01
  • @MarkS. Yeah, $f'(k)$, like the derivative of $\ln \cos (2^{-k})$ from the example – IntegralEnjoyer Dec 26 '24 at 23:08
  • $d/dn$ doesn't mean anything because the sum is only defined for $n$ an integer. Basically, this doesn't make sense. – Thomas Andrews Dec 26 '24 at 23:12
  • @ThomasAndrews Why? It's just differentiating the resulting value of the sum with respect to n, and the sum's value depends on n – IntegralEnjoyer Dec 26 '24 at 23:13
  • The resulting function is only defined for $n$ an integer, so the limit that defines the derivative is not defined. – Thomas Andrews Dec 26 '24 at 23:19
  • As an example of this conjecture in practice, $\sum_{k = 1}^{n}\frac{\mathrm{d} }{\mathrm{d} k}k = \sum_{k = 1}^{n}1 = n$, $\frac{\mathrm{d} }{\mathrm{d} n}\sum_{k = 1}^{n}k = \frac{\mathrm{d} }{\mathrm{d} n}(\frac{n}{2}(n + 1)) = \frac{\mathrm{d} }{\mathrm{d} n}(\frac{n^2}{2} + \frac{n}{2})= n + \frac{1}{2}$, so $\sum_{k = 1}^{n}\frac{\mathrm{d} }{\mathrm{d} k}k = C + \frac{\mathrm{d} }{\mathrm{d} n}\sum_{k = 1}^{n}k$ – IntegralEnjoyer Dec 26 '24 at 23:20
  • @ThomasAndrews Why not? The summation is a function of n, so you can take the derivative of it in respect to n – IntegralEnjoyer Dec 26 '24 at 23:21
  • In your example, the sum is $n$ for $n$ an integer. But it is also $n+\sin(n\pi)$ for $n$ an integer. The derivatives of those extensions to the real numbers are different. – Thomas Andrews Dec 26 '24 at 23:22
  • There isn't just one way to extend a function on integers to the entire real line, and those different ways yield different derivatives. You can't find a well-defined way to say that $g(x)=x$ is the "real" extension, rather than $g(x)=x+\sin(\pi x).$ – Thomas Andrews Dec 26 '24 at 23:25
  • @ThomasAndrews Where did you get $\sin(\pi x)$ from? It is just a derivative of the result of the summation – IntegralEnjoyer Dec 26 '24 at 23:27
  • $\sin(\pi x)$ is $0$ at all integers, but nowhere else, so we can add it, or any multiple of it, to a function and get a new function that is the same on all the integers. – Thomas Andrews Dec 26 '24 at 23:47
  • @ThomasAndrews I'm not trying to extend the sum to real numbers though – IntegralEnjoyer Dec 26 '24 at 23:49
  • You could take $g(x)=x+x^2\sin(\pi x)$. Really, there are a lot of functions we can choose to add. I just picked the simplest (other than $0.$) – Thomas Andrews Dec 26 '24 at 23:49
  • But you are extending to the whole real line. You have a function $f(n)=n,$ defined for $n$ an integer, and you are then using the derivative of $f(x)=x,$ the latter defined on the entire real line. The function you start with is not the real function, it is the integer function, and you are using the fact that it extends to a differentiable function on the reals. – Thomas Andrews Dec 26 '24 at 23:51
  • @ThomasAndrews Long ago, I had OP's idea in mind too. I think OP has something "reasonable" but can't make it rigorous. – Quý Nhân Dec 26 '24 at 23:54
  • @ThomasAndrews I don't know how to solve that issue, but when using regular summation solutions, such as $ \sum_{k = 1}^{n}k = \frac{n}{2}(n + 1)$, the conjecture seems to work in all the cases that I've tested – IntegralEnjoyer Dec 27 '24 at 00:11
  • I don't know which specific rules this constitutes – IntegralEnjoyer Dec 27 '24 at 00:19
  • It might always be true for polynomials, for example. It doesn't appear to be true for $f(n)=a^n.$ Then $g(n)=\sum_{k=1}^n f(k)$ does not have your property, I don't think. But if you stick with $f$ a polynomial, there is always a unique polynomial for $g(n).$ – Thomas Andrews Dec 27 '24 at 03:50
  • possibly of interest https://en.wikipedia.org/wiki/Ramanujan_summation – Abdelmalek Abdesselam Dec 28 '24 at 22:33
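The two cases discussed in the comments can be checked numerically (a sketch: the closed forms $n(n+1)/2$ and $2^{n+1}-2$ are taken as the "natural" extensions, and $d/dn$ is read as the derivative of that closed form):

```python
import math

# Check whether  sum_{k=1}^n f'(k) - d/dn [closed form of sum_{k=1}^n f(k)]
# stays constant in n, for the natural closed-form extensions.

# f(k) = k: sum of f'(k) is n; the closed form n(n+1)/2 differentiates to n + 1/2.
diffs_poly = [n - (n + 0.5) for n in range(1, 8)]
assert all(abs(d - diffs_poly[0]) < 1e-12 for d in diffs_poly)  # constant -1/2

# f(k) = 2^k: sum of f'(k) is ln(2) * (2^(n+1) - 2); the closed form
# 2^(n+1) - 2 differentiates to 2^(n+1) * ln(2).
L = math.log(2.0)
diffs_exp = [L * (2.0**(n + 1) - 2.0) - 2.0**(n + 1) * L for n in range(1, 8)]
assert all(abs(d + 2.0 * L) < 1e-9 for d in diffs_exp)          # constant -2 ln 2
```

Interestingly, with this natural extension the difference also comes out constant for $f(k)=2^k$, so the hedged doubt about exponentials may be worth revisiting.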

3 Answers

5

Given $f(x)\in C^{1}(\mathbb{Z}_{+})$ (by this I mean its value and derivative exist at every positive integer), let $g(x)$ be a smooth function satisfying $h(x)=0\ \forall x\in\mathbb{Z}_{+}$ (*), where $h(x)=g(x)-g(x-1)-f(x)$; this implies $\sum_{k=1}^{n}f(k)=g(n)-g(0)$.
To construct such a $g(x)$ so that the reformulated conjecture is true, we must have $h(x)\in C^{1}(\mathbb{Z}_{+})/H$, where
$$\forall y(x)\in H:(y(x)=0,\ \forall x\in\mathbb{Z}_{+})\wedge(\exists x\in\mathbb{Z}_{+}:y'(x)\not =0)$$ This quotient space ensures that: $$h(x)=0\ \forall x\in\mathbb{Z}_{+}\implies h'(x)=0\ \forall x\in\mathbb{Z}_{+}$$ Consequently, all "pseudo-zeros" like $\sin(\pi x)$ never affect our results.
Now we have $h'(x)=g'(x)-g'(x-1)-f'(x)=0\ \forall x\in\mathbb{Z}_{+}$. OP's informal statement: $$\sum_{k=1}^{n}\frac{d}{dk}f(k)=C+\frac{d}{dn}\sum_{k=1}^{n}f(k)\tag{1}$$ can be restated as $$\sum_{k=1}^{n}f'(k)=\sum_{k=1}^{n}(g'(k)-g'(k-1))=-g'(0)+g'(n)\tag{2}$$
where $f'(k)$ replaces $\frac{d}{dk}f(k)$, $-g'(0)$ replaces $C$, and $g'(n)$ replaces $\frac{d}{dn}\sum_{k=1}^{n}f(k)$.
OP's conjecture $(1)$ is blatantly false, but its reformulated version $(2)$ is true.


Just for fun, one may wish to define "derivative on integers" by temporarily transforming domain:
$$\mathbb{Z}_{+}\xrightarrow{\textrm{interpolate}}\mathbb{R}\xrightarrow{\textrm{collapse}}\mathbb{Z}_{+}$$ I will leave this idea open.


Note: The function space $C^{1}(\mathbb{Z}_{+})/H$ is never empty, so we will always be able to find $g(x)$. As an example, take the modified Whittaker–Shannon interpolation:
$$g(x)=\sum_{n=1}^{\infty}\left(\operatorname{sinc}(x-n)^2\sum_{k=1}^{n}f(k)+\operatorname{sinc}(x-n)\int_{0}^{x-n}\operatorname{sinc}(t)dt\sum_{k=1}^{n}f'(k)\right)$$
Note that all the $\operatorname{sinc}(x)$ above are normalized, and this construction only holds for a certain class of functions $f(x)$.
When $f(x)$ is a polynomial or an exponential, we can easily choose $g(x)$; just remember that $g(x)$ is not necessarily unique, as it depends on how we choose to interpolate.
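As a concrete sanity check of the reformulated identity $(2)$ (a sketch: $f(x)=x^2$, with $g$ taken as the standard polynomial extension $x(x+1)(2x+1)/6$ of the partial sums, in exact rational arithmetic):

```python
from fractions import Fraction

# f(x) = x^2, with g the standard polynomial extension of the partial sums:
#   g(x) = x(x+1)(2x+1)/6, so g(k) - g(k-1) = k^2 at the integers.
def g_prime(x):
    # d/dx [(2x^3 + 3x^2 + x)/6] = (6x^2 + 6x + 1)/6
    return Fraction(6 * x * x + 6 * x + 1, 6)

for n in range(1, 10):
    lhs = sum(2 * k for k in range(1, n + 1))  # sum of f'(k), with f'(k) = 2k
    rhs = g_prime(n) - g_prime(0)              # g'(n) - g'(0)
    assert lhs == rhs                          # identity (2) holds exactly
```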

Quý Nhân
  • 2,706
  • Does this fix the problem shown by @ThomasAndrews? E.g. does adding $\sin(\pi n)$ to $g(n)$ still satisfy $g(x)−g(x−1)=f(x)?$ Because, in this case, it would make $g'(n)$ different, so it wouldn't satisfy the reformulated conjecture – IntegralEnjoyer Dec 27 '24 at 09:41
  • @IntegralEnjoyer Thank you for pointing this out. I was a bit uncareful. Fixed. – Quý Nhân Dec 27 '24 at 12:18
0

Take $f(x) = \sin(2 \pi x)$

Then $$\sum_{k=1}^{n}\frac{d}{d k}f(k)=\sum_{k=1}^{n}2\pi\cos(2\pi k)=2\pi n$$ while $$\frac{d}{dn}\sum_{k=1}^{n}f(k)=\frac{d}{dn}\sum_{k=1}^{n}0=0$$

These two functions do not differ by a constant.
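The counterexample is easy to verify numerically (a sketch; $d/dn$ of the sum is read as the derivative of the constant-$0$ extension):

```python
import math

# f(x) = sin(2*pi*x): the termwise derivative sum is 2*pi*n, while the
# sum of values is identically 0 at the integers.
for n in range(1, 6):
    deriv_sum = sum(2 * math.pi * math.cos(2 * math.pi * k) for k in range(1, n + 1))
    value_sum = sum(math.sin(2 * math.pi * k) for k in range(1, n + 1))
    assert abs(deriv_sum - 2 * math.pi * n) < 1e-9  # grows linearly in n
    assert abs(value_sum) < 1e-9                    # identically zero
```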

jjagmath
  • 22,582
  • In this case, you could extend $\sum_{k = 1}^{n}0$ to the real numbers as $n \sin(2\pi n)$, similar to what @ThomasAndrews brought up, which would differentiate to $2\pi n \cos(2\pi n) + \sin(2\pi n) = 2\pi n$ since $n$ is an integer – IntegralEnjoyer Dec 27 '24 at 12:50
  • I wonder if you can always find an extension of the summation to the real numbers which satisfies the conjecture, and what the rules for it would be. And if not always, for which kinds of $f(k)$ such an extension exists. – IntegralEnjoyer Dec 27 '24 at 12:52
  • @IntegralEnjoyer That's a different question, I just give a simple counterexample to your conjecture. – jjagmath Dec 27 '24 at 12:55
  • Yeah, I didn't think of needing to extend the summation to the real numbers when making this post, I was only made aware of it afterwards by @ThomasAndrews – IntegralEnjoyer Dec 27 '24 at 12:58
  • We have to be careful about those notations $d/dk$ and $d/dn$, they don't make any sense. This is why the conjecture is already false even without counterexamples. – Quý Nhân Dec 27 '24 at 12:59
  • @QuýNhân I know the notation doesn't really make sense. If this were my post I would have chosen a better notation, but I kept the notation from OP so he/she understands my answer. – jjagmath Dec 27 '24 at 13:08
  • @jjagmath Yeah. Nonsense notation leads to funny results. Like your counterexample can be countered by letting: $$\sum_{k=1}^{n}\sin(2\pi k)=n\sin(2\pi n)$$ – Quý Nhân Dec 27 '24 at 15:17
0

As Thomas Andrews suggested in a comment, we can verify this claim concretely when treating everything as a polynomial.

Suppose $f(x)=\sum_{j=0}^{m}a_{j}x^{j}$. Then $\sum_{k=1}^{n}f'(k)=\sum_{k=1}^{n}\sum_{j=0}^{m}ja_{j}k^{j-1}=\sum_{k=1}^{n}\sum_{j=1}^{m}ja_{j}k^{j-1}=\sum_{j=1}^{m}ja_{j}\sum_{k=1}^{n}k^{j-1}$. We can apply Faulhaber's formula (involving the Bernoulli numbers $B_r$, with the convention $B_1=+\frac{1}{2}$) to write this in terms of powers of $n$ instead of $k$, and obtain $$\sum_{j=1}^{m}ja_{j}\left(\dfrac{1}{j-1+1}\right)\sum_{r=0}^{j-1}{j-1+1 \choose r}B_{r}n^{j-1-r+1}=\boxed{\sum_{j=1}^{m}a_{j}\sum_{r=0}^{j-1}{j \choose r}B_{r}n^{j-r}}\text{.}$$

Similarly, we can calculate $\sum_{k=1}^{n}f(k)=\sum_{k=1}^{n}\sum_{j=0}^{m}a_{j}k^{j}=\sum_{k=1}^{n}\sum_{j=1}^{m}a_{j}k^{j}=\sum_{j=1}^{m}a_{j}\sum_{k=1}^{n}k^{j}$ and apply Faulhaber's formula to obtain $\sum_{j=1}^{m}a_{j}\left(\dfrac{1}{j+1}\right)\sum_{r=0}^{j}{j+1 \choose r}B_{r}n^{j-r+1}$.

And then since this is now written as a polynomial in $n$, we can take $\dfrac{\mathrm{d}}{\mathrm{d}n}$ of it in a straightforward way (ignoring any non-polynomials that happen to equal it for nonnegative integers), to obtain: \begin{align*}\phantom{=}&\sum_{j=1}^{m}a_{j}\left(\dfrac{1}{j+1}\right)\sum_{r=0}^{j}{j+1 \choose r}\left(j-r+1\right)B_{r}n^{j-r}\\=&\sum_{j=1}^{m}a_{j}\left(\dfrac{1}{j+1}\right)\sum_{r=0}^{j}\left(j+1\right){j \choose r}B_{r}n^{j-r}\\=&\sum_{j=1}^{m}a_{j}\sum_{r=0}^{j}{j \choose r}B_{r}n^{j-r}\\=&\sum_{j=1}^{m}a_{j}\left(B_{j}+\sum_{r=0}^{j-1}{j \choose r}B_{r}n^{j-r}\right)\\=&\sum_{j=1}^{m}a_{j}B_{j}+\boxed{\sum_{j=1}^{m}a_{j}\sum_{r=0}^{j-1}{j \choose r}B_{r}n^{j-r}}\\=&\sum_{j=1}^{m}a_{j}B_{j}+\sum_{k=1}^{n}f'(k)\text{.}\end{align*}

Therefore, in the case of polynomials, the claim is true with the constant $C=-\sum_{j=1}^{m}a_{j}B_{j}$ (note the sign: the last line above reads $\frac{\mathrm{d}}{\mathrm{d}n}\sum_{k=1}^{n}f(k)=\sum_{j=1}^{m}a_{j}B_{j}+\sum_{k=1}^{n}f'(k)$).
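A quick exact-arithmetic check of the constant (a sketch for $f(x)=x^2$, so $a_2=1$ and $a_1=a_0=0$, using the convention $B_1=+\frac12$ consistent with the Faulhaber form above):

```python
from fractions import Fraction

# f(x) = x^2: the only surviving Bernoulli term is a_2 * B_2 with B_2 = 1/6.
B2 = Fraction(1, 6)
for n in range(1, 10):
    lhs = sum(2 * k for k in range(1, n + 1))   # sum of f'(k)
    rhs = Fraction(6 * n * n + 6 * n + 1, 6)    # d/dn [n(n+1)(2n+1)/6]
    assert lhs - rhs == -B2                     # the constant is -B_2 = -1/6
```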

However, we cannot naively extend this result from polynomials to a common power series like that of $\cos(k)$, since the Bernoulli numbers grow quickly. There may be an argument for functions that decay very fast, like the $2^{-k}$ terms in the OP, but that is out of my wheelhouse and deserving of its own separate question.

Mark S.
  • 25,893