
Let $P_n(x)=x^n-nx+1$ be a sequence of polynomials, where $P_n\colon[1, +\infty) \to \mathbb{R}$ and $n \ge 2$.
a) Show that for each $n$, $P_n(x)=0$ has exactly one solution, and for each $n$ let $x_n$ be that unique number such that $P_n(x_n)=0$.
b) Show that $\lim_{n \to \infty}x_n=1$ and study the convergence of the series $\sum_{n \ge 2} (x_n-1)^{\alpha}$, where $\alpha \in \mathbb{R}$.

My approach:
First, $\frac{dP_n(x)}{dx}=n(x^{n-1}-1) \ge 0$ for all $x \ge 1$, with equality only at $x=1$, so $P_n$ is strictly increasing on $[1,+\infty)$. Since $P_n(1)=2-n \le 0$ and $\lim_{x \to +\infty}P_n(x)=+\infty$, the intermediate value theorem gives at least one point $x_n$ with $P_n(x_n)=0$, and it is unique by strict monotonicity.
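
As a quick numerical sanity check (not part of the argument), here is a minimal bisection sketch; the helper names `P` and `root` are my own:

```python
def P(n, x):
    """P_n(x) = x^n - n*x + 1."""
    return x**n - n * x + 1

def root(n, tol=1e-12):
    """Unique zero of P_n on [1, +inf), located by bisection.

    P_n(1) = 2 - n <= 0, P_n(2) = 2^n - 2n + 1 > 0 for n >= 2,
    and P_n is increasing on [1, +inf), so [1, 2] brackets the zero.
    """
    lo, hi = 1.0, 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if P(n, mid) <= 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for n in (2, 3, 10, 100, 1000):
    x = root(n)
    print(n, x, P(n, x))  # the residual P_n(x) should be ~0
```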

Now for the limit.
Take $b \in (0, 1)$ and calculate $$P_n\left(1+\frac{1}{n^b}\right)=\left(1+\frac{1}{n^b}\right)^n-n\left(1+\frac{1}{n^b}\right)+1=2-n+\sum_{k=2}^n {n\choose k}\frac{1}{n^{kb}} \ge 2-n + \sum_{k=2}^m{n\choose k}\frac{1}{n^{kb}},$$ where $m \le n$ is fixed. The right-hand side behaves like a sum of powers of $n$: the highest power with a positive coefficient is $n^{m(1-b)}$ (coming from $\binom{n}{m}n^{-mb}\sim n^{m(1-b)}/m!$) and the highest one with a negative coefficient is $n$. For the right-hand side to tend to $+\infty$ we want $m(1-b) > 1 \iff b < 1-\frac{1}{m}$, which holds once $m$ is chosen large enough for the fixed $b<1$. Therefore, for every $b \in (0, 1)$ and all sufficiently large $n$, $P_n\!\left(1+\frac{1}{n^b}\right) > 0$, which by monotonicity means $x_n < 1 + \frac{1}{n^b}$.

Now calculate $$P_n\left(1+\frac{1}{n}\right)=\left(1+\frac{1}{n}\right)^n-n\left(\frac{n+1}{n}\right)+1=\left(1+\frac{1}{n}\right)^n-n.$$ Since $\left(1+\frac{1}{n}\right)^n \to e$, the right-hand side tends to $-\infty$, so for all sufficiently large $n$, $P_n\!\left(1+\frac{1}{n}\right) \leq 0$, i.e. $1+\frac{1}{n} \leq x_n$.

Now we have the double inequality: for every $b \in (0, 1)$ and all sufficiently large $n$, $\frac{1}{n} \leq x_n - 1 \leq \frac{1}{n^b}$, so by the squeeze theorem $\lim_{n \to \infty}x_n=1$.
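
For illustration only (the proof is the argument above), a quick numerical check of the sandwich for one admissible exponent, say $b=\tfrac12$; the bisection helper is again my own sketch:

```python
def x_root(n, tol=1e-14):
    # bisection for the unique zero of x^n - n*x + 1 on [1, 2]
    lo, hi = 1.0, 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if mid**n - n * mid + 1 <= 0 else (lo, mid)
    return (lo + hi) / 2

b = 0.5  # one fixed b in (0, 1); both bounds are only claimed for n large enough
for n in (10, 100, 1000):
    gap = x_root(n) - 1
    print(n, 1 / n <= gap <= n**(-b), gap)
```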

Now to decide whether the series converges. For $0 < \alpha \leq 1$ the above inequality gives $\frac{1}{n^\alpha} \leq (x_n-1)^\alpha$ and $\sum \frac{1}{n^\alpha}$ diverges, so the series diverges by comparison; for $\alpha \leq 0$ the terms $(x_n-1)^\alpha$ do not tend to $0$ (since $x_n-1 \to 0$), so the series diverges as well.

To take care of the case $\alpha > 1$, I will use the following lemma.

Let $L^*=\limsup_{n \to \infty}\frac{u_n}{v_n} \in [0, +\infty)$, where $u_n \geq 0$ and $v_n > 0$. Then $\sum v_n \text{ converges} \implies \sum u_n \text{ converges}$. (There is a similar theorem for $\liminf$, but it is not needed here.)
Proof:
For any fixed $\epsilon > 0$ and all sufficiently large $n$ we have that $\frac{u_n}{v_n} \leq \sup_{m \geq n}\frac{u_m}{v_m} \leq L^*+\epsilon \implies u_n \leq (L^*+\epsilon)v_n$, and the result follows by direct comparison.

Back to our problem: for every $b \in (0, 1)$ and all sufficiently large $n$, $$n^{b-1} \leq n^b(x_n-1) \leq 1 \implies \limsup_{n \to \infty}n^b(x_n-1) \in [0, 1] \implies \limsup_{n \to \infty} \frac{(x_n-1)^\alpha}{1/n^{b\alpha}} = \Big(\limsup_{n \to \infty}n^b(x_n-1)\Big)^{\alpha} \in [0, +\infty).$$ So by the lemma, $\sum_{n \ge 2}(x_n-1)^\alpha$ converges if $\sum_{n \ge 2}\frac{1}{n^{b\alpha}}$ converges, and the latter converges iff $b\alpha > 1 \iff b > \frac{1}{\alpha}$. For a given (fixed) $\alpha > 1$ we can always choose $b \in \left(\frac{1}{\alpha}, 1\right)$. So the series $\sum_{n \ge 2}(x_n-1)^\alpha$ converges if $\alpha>1$ and diverges otherwise.
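
Purely as an illustration of the conclusion (partial sums cannot prove convergence), here is a comparison of partial sums of $(x_n-1)^\alpha$ at two cutoffs; the solver and the cutoffs are my own choices:

```python
def x_root(n, tol=1e-14):
    # bisection for the unique zero of x^n - n*x + 1 on [1, 2]
    lo, hi = 1.0, 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if mid**n - n * mid + 1 <= 0 else (lo, mid)
    return (lo + hi) / 2

gaps = [x_root(n) - 1 for n in range(2, 1001)]
for alpha in (1.0, 2.0):
    s_500 = sum(g**alpha for g in gaps[:499])   # n = 2..500
    s_1000 = sum(g**alpha for g in gaps)        # n = 2..1000
    print(alpha, s_500, s_1000)
# alpha = 1: the partial sums are still climbing noticeably;
# alpha = 2: they have almost stopped moving.
```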

This problem was from some sort of contest, and I wonder how it would be appropriate to approach a problem like this. The inequalities for $x_n$ make the problem seem easier, but it took me 2-3 hours to find them, which is not feasible under contest time, keeping in mind you have 4 problems of the same-ish difficulty. The limit of $x_n$ can be found more easily: $x_n$ is clearly decreasing and bounded below by $1$, so it converges by the Weierstrass theorem, and from the expression $x_n=(nx_n-1)^{\frac{1}{n}}$ we can apply the Cauchy-d'Alembert (root) criterion, so $\lim\frac{(n+1)x_{n+1}-1}{nx_n-1}=1=\lim x_n$. The series, however, is atrocious to handle without first finding the appropriate inequalities. How would one approach problems like this?

P.S.: Let me know if my proof is correct.

  • Didn't read much, but I think your proof for a single root is wrong: you proved it increases on $(1,\infty)$ and has at least one root in $(0,1)$; the question asks to prove exactly one root in $[1,\infty)$ – RandomGuy Feb 13 '24 at 00:32
  • Oh wait, your writing is wrong: line 3 is $P(1)$ – RandomGuy Feb 13 '24 at 00:34
  • Thanks for pointing out! I meant $P(1)=2-n$, not $P(0)$ – Shthephathord23 Feb 13 '24 at 01:25
  • Just for your interest, $$ x_n = 1 + \frac{{\log n}}{n} + \frac{{\log ^2 n + 2\log n - 2}}{{2n^2}} + \mathcal{O}\bigg( {\frac{{\log ^3 n}}{{n^3 }}} \bigg). $$ – Gary Feb 13 '24 at 02:56
  • I did something quite similar in https://math.stackexchange.com/questions/4856703/solution-for-42r-x1r%e2%88%92x%e2%88%921-0/4860851#4860851 – Claude Leibovici Feb 13 '24 at 10:23
  • @Gary. Just for the fun: do you think that we could invert $$n=\frac 1 x-\frac{1}{\log (x)}\,W_{-1}\left(-x^{\frac{1}{x}-1} \log (x)\right)$$ – Claude Leibovici Feb 14 '24 at 04:42

1 Answer


Too long for a comment.

Consider that you look for the largest zero of the function $$f(x)=x^n -n x +1$$ It is negative at $x=1$ but very steep near its zero. So, it looks to me that it is better to look for the zero of the function $$g(x)=n\log(x)-\log(nx-1)$$ which, for $x>\frac 1n$, vanishes exactly where $f$ does.
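
A quick numerical sanity check of this reformulation (plain Newton iteration on $g$, with $1+\frac{\log n}{n}$ as a convenient, assumed starting guess):

```python
import math

def g(n, x):
    # g(x) = n*log(x) - log(n*x - 1); for x > 1/n it vanishes exactly where
    # f(x) = x^n - n*x + 1 does
    return n * math.log(x) - math.log(n * x - 1)

def dg(n, x):
    return n / x - n / (n * x - 1)

def newton_root(n, steps=20):
    x = 1 + math.log(n) / n   # convenient starting guess
    for _ in range(steps):
        x -= g(n, x) / dg(n, x)
    return x

n = 10
x = newton_root(n)
print(x)                 # ~1.2799280..., cf. the value quoted below
print(x**n - n * x + 1)  # residual of the original polynomial, ~0
```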

Expanded as a series $$g(x)=-\log (n-1)+\sum_{k=1}^\infty (-1)^k \,\frac 1 k \left(\left(\frac{n}{n-1}\right)^k-n\right)\,(x-1)^k $$

Using power series reversion $$\large\color{red}{x=1+t\Bigg(1+ \sum_{k=1}^\infty \frac {P_k(n)}{(k+1)!}\,u^k\Bigg)}$$ where $$\color{blue}{t=\frac{(n-1) }{(n-2) n}\,\log (n-1)}\qquad \text{and}\qquad \color{blue}{u=\frac{t}{(n-2) (n-1)}}$$ The coefficients of the first polynomials are (they are given from the constant term to the highest power $n^{2k}$) $$\left( \begin{array}{cc} k & P_k(n) \\ 1 & \{1,-3,1\} \\ 2 & \{-1,-4,11,-6,1\} \\ 3 & \{-1,5,16,-43,30,-9,1\} \\ 4 & \{13,-22,6,-76,173,-140,58,-12,1\} \\ 5 & \{-47,89,-125,154,197,-645,621,-326,95,-15,1\} \\ 6 & \{-73,480,-795,842,-648,-876,2657,-2814,1728,-624,141,-18,1\} \\ \end{array} \right)$$

Using only the terms above (we can generate as many as we want using the explicit formula given by Morse and Feshbach for the $n^{\text{th}}$ term), the residual is $$g(x)=-\frac{\log ^8(n)}{40320 \,n^7}$$

For $n=10$ it gives $x=\color{red}{1.279928035}48$.

Expanding all the above for large $n$ leads to the asymptotics @Gary wrote in comments, since $$t=\frac{\log (n)}{n}+O\!\left(\frac{\log (n)}{n^2}\right)\qquad \text{and}\qquad u=\frac{\log (n)}{n^3}+O\!\left(\frac{\log (n)}{n^4}\right)$$

So, for large $n$ $$x=1+\sum_{k=1}^\infty \frac {Q_k(L)}{k!\,n^k}\qquad \text{where} \qquad L=\log(n)$$ and the first polynomials of degree $k$ are $$\left( \begin{array}{cc} k & Q_k(L) \\ 1 & L \\ 2 & L^2+2 L-2 \\ 3 & L^3+6 L^2+6 L-9 \\ 4 & L^4+12 L^3+36 L^2+36 L-56 \\ 5 & L^5+20 L^4+120 L^3+210 L^2+380 L-490 \\ 6 & L^6+30 L^5+300 L^4+1260 L^3+600 L^2+5820 L-5634 \\ \end{array} \right)$$ which is less accurate than the previous one.
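
As a quick check of this last expansion (my own bisection solver against the first three $Q_k$ above):

```python
import math

def x_root(n, tol=1e-14):
    # bisection for the unique zero of x^n - n*x + 1 on [1, 2]
    lo, hi = 1.0, 2.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if mid**n - n * mid + 1 <= 0 else (lo, mid)
    return (lo + hi) / 2

def x_asym(n):
    # 1 + sum_{k=1}^{3} Q_k(L) / (k! n^k), with the Q_k tabulated above
    L = math.log(n)
    Q = [L, L**2 + 2*L - 2, L**3 + 6*L**2 + 6*L - 9]
    return 1 + sum(Q[k - 1] / (math.factorial(k) * n**k) for k in range(1, 4))

for n in (10, 100, 1000):
    print(n, x_root(n), x_asym(n))
```

The agreement improves quickly as $n$ grows, as expected from the size $Q_4(L)/(4!\,n^4)$ of the first omitted term.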