
I would like to determine the initial values such that the sequence $(a_n)$ given by $a_{n+1}=2-a_{n}^2$ converges. Trivially, the map has the fixed points $1$ and $-2$. Some obvious initial values are as follows:

Initial values s.t. $a_n\to -2$: $\{-2,2,0,-\sqrt{2},\sqrt{2},\sqrt{2+\sqrt{2}},\sqrt{2-\sqrt{2}},\cdots\}$

Initial values s.t. $a_n\to 1$: $\{1,-1,-\sqrt{3},\sqrt{3},\sqrt{2+\sqrt{3}},\sqrt{2-\sqrt{3}},\cdots\}$

By continuing in this way (see the sketch below), we can find two sets of initial values for which the sequence converges to the respective fixed points. However, I can't show whether these are all of the initial values for which the sequence converges.
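A minimal sketch of this preimage construction (the helper names `preimages` and `basin_candidates` and the search depth are arbitrary choices):

```python
import math

def preimages(y):
    """Real solutions x of 2 - x**2 == y, i.e. preimages of y under the map."""
    if y > 2:
        return []
    r = math.sqrt(2 - y)
    return [r, -r] if r > 0 else [0.0]

def basin_candidates(fixed_point, depth):
    """Initial values whose orbit reaches `fixed_point` in at most `depth` steps."""
    level, found = {fixed_point}, {fixed_point}
    for _ in range(depth):
        level = {x for y in level for x in preimages(y)}
        found |= level
    return sorted(found)

print(basin_candidates(-2.0, 4))  # -2, 2, 0, ±√2, ±√(2±√2), ...
print(basin_candidates(1.0, 4))   # 1, -1, ±√3, ±√(2±√3), ...
```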

  • Suppose the sequence converges to a limit $a$. Then by basic calculation rules for limits we have $a = 2-a^2$. Hence the limit is either $1$ or $-2$ if it exists (you already mentioned these in your post). Moreover, a straightforward strategy is to find initial values such that the sequence is monotonic and bounded (which yields existence of a limit). I think those are the tools one has to use. – Joseph Expo Jul 25 '24 at 12:23
  • By fixed point iteration, since the gradients of $y=2-x^2$ at the points $x=-2$ and $x=1$ have absolute value greater than $1,$ the sequence will not converge for any $x$ (other than starting values $x=1$ and $x=-2$). Indeed, you can prove that, whenever $x_n$ is close to one of the fixed points, the process ($x_{n+1}, x_{n+2},...$) then moves us away from that fixed point. Therefore, it will not converge to that root. – Adam Rubinson Jul 25 '24 at 12:54
  • @AdamRubinson what you claim is not true... The sequence will still converge if $x_p$ coincides with one of the fixed points, for some $p$. The OP provided several examples, for instance $x_0=\sqrt{2}$. – PierreCarre Jul 25 '24 at 12:58

2 Answers


A priori, $a_0\in\Bbb C$. You can check by induction that $$a_n=-\beta^{2^n}-\frac1{\beta^{2^n}},$$ where $\beta\in\Bbb C$ is such that $a_0=-\beta-\frac1\beta$. For this sequence to converge, one needs $|\beta|=1$, i.e. $\beta=e^{i\theta}$, which gives $$a_n=-2\cos(2^n\theta)$$ for some $\theta\in[0,\pi]$. With arguments similar to those for $\sin(2^n\theta)$, one easily concludes that $(a_n)$ is convergent iff $$\exists N,k\in\Bbb N_0\quad\theta=\frac{k\pi}{3\cdot2^N},$$ i.e. $$\exists N,k\in\Bbb N_0\quad a_0=-2\cos\left(\frac{k\pi}{3\cdot2^N}\right).$$ The sequence is then eventually constant, converging to $-2$ if $3$ divides $k$, and to $1$ otherwise.

Your initial values $-2,2,0,-\sqrt{2},\sqrt{2},\sqrt{2+\sqrt{2}},\sqrt{2-\sqrt{2}}$ correspond respectively to $\theta=0,\pi,\frac\pi2,\frac\pi4,\frac{3\pi}4,\frac{7\pi}8,\frac{5\pi}8$, and your $1,-1,-\sqrt3,\sqrt3,\sqrt{2+\sqrt3},\sqrt{2-\sqrt3}$ to $\frac{2\pi}3,\frac\pi3,\frac\pi6,\frac{5\pi}6,\frac{11\pi}{12},\frac{7\pi}{12}$.
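A quick numerical sanity check of this criterion (just a sketch: the helper name, step count and rounding are arbitrary choices, and floating-point error limits how many iterations are meaningful, since errors are amplified near the repelling fixed points):

```python
import math

def orbit(a0, steps=12):
    """Iterate a_{n+1} = 2 - a_n**2 starting from a0 and return a_steps."""
    a = a0
    for _ in range(steps):
        a = 2 - a * a
    return a

# theta = k*pi/(3*2^N) should give an eventually constant sequence:
# limit -2 when 3 divides k, limit 1 otherwise.
for N in range(3):
    for k in range(1, 7):
        a0 = -2 * math.cos(k * math.pi / (3 * 2 ** N))
        print(f"N={N}, k={k}, a0={a0:+.4f}, a_12={orbit(a0):+.6f}")
```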

Anne Bauval

As you mention, if $(a_n)$ converges, then the limit is $\ell = -2$ or $\ell=1$, the fixed points of $g(x)=2-x^2$. So choosing $x_0=-2$ or $x_0=1$ produces convergent (constant) sequences. Since $|g'(-2)| > 1$ and $|g'(1)| > 1$, both fixed points are repelling, so the only way to obtain a convergent sequence is for it to attain $-2$ or $1$ after finitely many steps. You can therefore characterize the sets of initial conditions leading to convergent sequences as:

$$ S_{-2} = \{x_0 \in \mathbb{R} : \exists\, p \in \mathbb{N},\ x_p = -2 \} $$

$$ S_{1} = \{x_0 \in \mathbb{R} : \exists\, p \in \mathbb{N},\ x_p = 1 \} $$

A separate question is whether you can obtain a more useful characterization, but the values you presented are exactly what you get by methodically taking successive preimages of the fixed points; a sketch of a membership test is given below.
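As a sketch of how one could test membership in these sets with exact arithmetic (the function name and the step cap are arbitrary; sympy is only used to avoid floating-point issues):

```python
import sympy as sp

def hits_fixed_point(x0, max_steps=20):
    """Check whether the orbit of x0 under x -> 2 - x**2 reaches -2 or 1
    exactly within max_steps iterations (so that x0 lies in S_{-2} or S_1)."""
    x = sp.sympify(x0)
    for n in range(max_steps + 1):
        if x == -2 or x == 1:
            return x, n
        x = sp.simplify(2 - x**2)
    return None  # inconclusive: no hit within max_steps

print(hits_fixed_point(sp.sqrt(2)))               # (-2, 3): sqrt(2) -> 0 -> 2 -> -2
print(hits_fixed_point(sp.sqrt(2 + sp.sqrt(3))))  # (1, 3)
print(hits_fixed_point(sp.Rational(1, 2)))        # None
```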

PierreCarre