
This question really has two parts:

  1. Is it possible to find a closed, non-recursive form of $f(x)$, where $f(x) = \sin(x + f(x))$?

  2. If the answer to 1 is no, is it possible to approximate this function less expensively than by iterating $f_k(x) = \sin(x + f_{k-1}(x))$?

Using $f_k(x)$ at a depth of $k > 10000$, I find that $f_k(x)$ appears to converge as $k \to \infty$ to the following graph, where the blue line is $f_{10000}(x)$ and the orange line is $\sin(x)$.

[plot of $f_{10000}(x)$ (blue) against $\sin(x)$ (orange)]

I have tried searching Google and this site, but I can't find any information on techniques for finding a non-recursive form, or a better approximation, of an infinitely recursive function like $f(x)$.
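For reference, the recursion in part 2 is a fixed-point iteration; here is a minimal sketch, assuming the starting value $f_0 = 0$ and a fixed depth. (The iteration contracts wherever $|\cos(x + f(x))| < 1$, so it converges quickly away from the jumps and stalls near them.)

```python
import math

def f_k(x, k=100):
    """Approximate f(x) by iterating f_k(x) = sin(x + f_{k-1}(x)), f_0 = 0."""
    y = 0.0
    for _ in range(k):
        y = math.sin(x + y)
    return y

# The limit should satisfy the defining equation y = sin(x + y):
y = f_k(1.0)
residual = abs(y - math.sin(1.0 + y))
```

At $x = 1$ the contraction factor is about $0.36$, so 100 iterations are ample there; the cost grows near the discontinuities, which is what motivates part 2.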

Yuriy S
Sam
  • There's a Fourier series. See the answer to this question: https://math.stackexchange.com/q/1652612/269624 – Yuriy S Feb 10 '18 at 20:14
  • However, the Fourier series contains Bessel functions, so it might not offer any advantage when it comes to computation, though it is much better to work with analytically. – Yuriy S Feb 10 '18 at 20:58

4 Answers


[More of an extended comment than an actual answer.]

This can sort of be done in closed form. By differentiating $f(x) = \sin(x+f(x))$ twice, one obtains a differential equation $$f''(x)=\frac{f'(x) f''(x)}{f'(x)+1}-f(x) \left(f'(x)+1\right)^2$$ subject to $f(0) = 0$, which Mathematica tells me is solved by letting $f$ be an inverse of the function $$y \mapsto \pm \tan^{-1}\left(\frac{y\sqrt{1+2a-y^2}}{y^2-2a-1}\right) - y$$ for some constant $a$. It should be possible to fix the value of $a$ using another value of $f$ (perhaps at $x=\pi$); note, however, that your function looks very likely to be discontinuous, so it may be necessary to stitch together pieces of the solution on different intervals; and the boundary condition I applied, namely $f(0) = 0$, does not constrain the solution if we move past a discontinuity.
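As a quick sanity check on this closed form: taking $a = 0$, the expression appears to reduce (via $\tan^{-1}\bigl(y\sqrt{1-y^2}/(y^2-1)\bigr) = -\sin^{-1} y$) to $x = \sin^{-1}(y) - y$, which is just the defining equation $y = \sin(x+y)$ solved for $x$ on the principal branch. A sketch inverting that branch numerically by bisection ($\sin^{-1}(y) - y$ is increasing on $[-1,1]$, and this branch only covers $|x| \le \pi/2 - 1$):

```python
import math

def f_principal(x, prec=1e-12):
    """Invert x = asin(y) - y for y by bisection.

    asin(y) - y is increasing on [-1, 1], and this principal branch
    is only valid for |x| <= pi/2 - 1.
    """
    lo, hi = -1.0, 1.0
    while hi - lo > prec:
        mid = (lo + hi) / 2
        if math.asin(mid) - mid < x:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# The result should satisfy the defining equation y = sin(x + y):
y = f_principal(0.3)
residual = abs(y - math.sin(0.3 + y))
```

Outside $|x| \le \pi/2 - 1$ one would need the other branches (and the constant $a$), consistent with the stitching caveat above.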


You can efficiently approximate $f(c)$ by solving $\sin(x + c) - x = 0$ for $x$. This equation always has a unique solution, the solution always lies in $[-1, 1]$, and the left-hand side is strictly decreasing in $x$. Therefore we can simply use binary search, gaining one fractional bit of precision per iteration.

E.g. in Python:

import math

def f(x, prec=1e-10):
    # Bisect for the root of sin(x + t) - t = 0; the root lies in
    # [-1, 1], so pad the initial bracket slightly on each side.
    lo = -1.1
    hi = 1.1
    while hi - lo > prec:
        mid = (lo + hi) / 2
        # sin(x + mid) - mid is strictly decreasing in mid.
        if math.sin(x + mid) - mid > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

And to show the graph:

import matplotlib.pyplot as plt
import numpy as np
x = np.linspace(-2*math.pi, 2*math.pi, 1000)
y = list(map(f, x))
plt.grid(True)
plt.plot(x, y)
plt.show()

[plot of $f(x)$ on $[-2\pi, 2\pi]$]

orlp

Yes, there is a non-recursive form as a power series in $\;y=x^{1/3}.\;$ We have the result $$f(x) \!=\! 6^{1/3}y^1 -\frac{9}{10}y^3 + \frac{3}{350}(9/2)^{1/3}y^5 + \frac1{350}(3/4)^{1/3}y^7 + \frac{1161}{2156000}y^9 + O(y^{11}) .$$ Of course, this is only the power series around $0$. The function $f(x)$ is an odd function with period $2\pi$, just like $\sin(x)$. You can use Newton's method to approximate the function to arbitrary accuracy given an initial approximation.

The method for finding the power series result is fairly standard. You first need to realize that $\,f(x)\,$ is a power series in $\,y=x^{1/3}\,$ and try the Ansatz $\,f(x) = a_1 y^1 + a_3 y^3 + a_5 y^5 +\cdots\,$ which is supposed to satisfy $\,f(x)=\sin(y^3 + f(x)).\,$ By substitution you quickly find that $\,f(x)=6^{1/3}y^1-\frac{9}{10}y^3 +a_5y^5+a_7y^7+O(y^9). $ By substitution you solve for $\,a_5\,$ and so on. For example, you can use the Wolfram Mathematica code

 With[{fx = 6^(1/3)*y^1 - (9/10)*y^3 + (a5 + O[y]^4)*y^5},
 Solve[Sin[y^3 + fx] == fx, a5]]

to solve for $\,a_5\,$ and similarly for other coefficients.
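The expansion can be checked numerically; a sketch, where the real cube root is used so that negative $x$ also works, and the residual in the defining equation near $0$ should be of order $y^{11}$:

```python
import math

def f_series(x):
    """Leading terms of the power series of f in y = x**(1/3)."""
    y = math.copysign(abs(x) ** (1 / 3), x)  # real cube root of x
    return (6 ** (1 / 3) * y
            - (9 / 10) * y ** 3
            + (3 / 350) * (9 / 2) ** (1 / 3) * y ** 5
            + (1 / 350) * (3 / 4) ** (1 / 3) * y ** 7
            + (1161 / 2156000) * y ** 9)

# Near 0 the truncated series nearly satisfies f = sin(x + f):
fx = f_series(0.01)
residual = abs(fx - math.sin(0.01 + fx))
```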

Somos

Following the idea of the link in @Yuriy S's comment, notice that the graph of $y = f(x)$ can be parametrized by

$$ (x, y) = (t - \sin t, \sin t). \tag{1}$$

This already provides one way of computing $y = f(x)$: solve $x = t - \sin t$ for $t$ and then compute $y = \sin t$. Various numerical methods apply; for instance, Newton's method will work.
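A sketch of this approach, using bisection rather than Newton for robustness near $t \equiv 0 \pmod{2\pi}$, where the derivative $1 - \cos t$ vanishes (the bracket and tolerance are assumptions):

```python
import math

def f_param(x, prec=1e-12):
    """Compute y = f(x) from the parametrization x = t - sin t, y = sin t.

    t - sin t is nondecreasing, and t = x + sin t lies in [x-1, x+1],
    so bisection on that bracket finds t.
    """
    lo, hi = x - 1.0, x + 1.0
    while hi - lo > prec:
        mid = (lo + hi) / 2
        if mid - math.sin(mid) < x:
            lo = mid
        else:
            hi = mid
    t = (lo + hi) / 2
    return math.sin(t)

# With x + y = t, the result satisfies y = sin(x + y) automatically:
y = f_param(1.0)
residual = abs(y - math.sin(1.0 + y))
```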

Another consequence of $\text{(1)}$ is that we obtain the following Fourier series

$$ f(x) = \sum_{n=1}^{\infty} \frac{2J_n(n)}{n} \sin (nx). $$

However, this series converges only polynomially fast (and hence is computationally inefficient), since we have the asymptotics

$$ \frac{2J_n(n)}{n} \sim \frac{1}{\Gamma(\frac{2}{3})} \left( \frac{2}{\sqrt{3}\,n} \right)^{4/3} \quad \text{as } n\to\infty.$$
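A sketch of the partial sums: to keep it self-contained, $J_n(n)$ is computed from the integral representation $J_n(z) = \frac{1}{\pi}\int_0^{\pi} \cos(n\theta - z\sin\theta)\,d\theta$ by the trapezoid rule (with SciPy one would use `scipy.special.jv(n, n)` instead); the truncation level and step count are assumptions:

```python
import math

def bessel_jnn(n, steps=4000):
    """J_n(n) via J_n(z) = (1/pi) * int_0^pi cos(n*t - z*sin(t)) dt."""
    h = math.pi / steps
    total = 0.5 * (1.0 + math.cos(n * math.pi))  # trapezoid endpoints
    for i in range(1, steps):
        t = i * h
        total += math.cos(n * (t - math.sin(t)))
    return total * h / math.pi

def f_fourier(x, terms=50):
    """Truncated Fourier series f(x) ~ sum 2*J_n(n)/n * sin(n*x)."""
    return sum(2.0 * bessel_jnn(n) / n * math.sin(n * x)
               for n in range(1, terms + 1))

# Compare against the fixed-point solution of y = sin(x + y) at x = 1;
# agreement is only modest, since the coefficients decay like n^(-4/3).
y = 0.0
for _ in range(200):
    y = math.sin(1.0 + y)
err = abs(f_fourier(1.0) - y)
```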

Alternatively, if $\langle x \rangle = x \text{ mod } 1$ denotes the fractional part of $x$, then

$$ f(x) = \int_{0}^{\pi} \left( \left< \frac{t - \sin t - x}{2\pi} \right> - \left< \frac{t - \sin t + x}{2\pi} \right> \right) \, dt. $$

Due to the jump discontinuity, this formula is not terribly nice. Indeed, it requires knowledge of the jumps of the integrand, which boils down to solving the equation $x = t - \sin t$ anyway. Not to mention, it is simply easier to work with $\text{(1)}$.

Sangchul Lee