Why do cobweb diagrams converge to a point when the absolute value of the gradient at that point is less than one, and why do they diverge otherwise? I find it easy to see why a cobweb diagram does not converge when the gradient is > 1, but I struggle to understand why a diagram would converge in the other case. For example, in this situation, if the cobweb diagram begins at the point x=1, is there any intuitive explanation for why it would converge to the point of intersection, despite the gradient at that point being greater than 1?
-
If you start at $x=1$ in that diagram, you will go off to $-\infty$, won't you? – Hans Lundmark Oct 26 '24 at 12:45
1 Answer
As Hans Lundmark noted in a comment, I think you are misunderstanding what happens in this particular example -- for all $x < 1.7$ the iterates go off to $-\infty$, and for all $x > 3.7$ they go off to $+\infty$.
But your initial question is: why is a fixed point attracting when the gradient there has absolute value smaller than 1? This is a consequence of the mean value theorem. Suppose $p$ is fixed and $|f'(p)| < 1.$ Then (assuming $f'$ is continuous) there is an interval around $p$ on which $|f'| \le k$ for some constant $k < 1$. For any $x$ in that interval, the mean value theorem gives a $c$ between $x$ and $p$ with $$\left|\frac{f(x)-f(p)}{x-p}\right| = |f'(c)| \le k < 1,$$ and since $f(p) = p$ this means $$|f(x)-p| \le k\,|x-p| < |x-p|,$$ so each successive iterate of $x$ under $f$ stays in the interval and its distance to the fixed point $p$ shrinks at least geometrically, forcing convergence to $p$.
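If it helps to see this contraction numerically, here is a small Python sketch (the functions $\cos x$ and $x^2$ are my own choices for illustration, not the function in your diagram): the iterates of $\cos$ close in on its fixed point, where $|f'| \approx 0.67 < 1$, while the iterates of $x^2$ started just above its fixed point at $1$, where $f' = 2 > 1$, move away.

```python
import math

def iterate(f, x0, n):
    """Return x0, f(x0), f(f(x0)), ... -- the points a cobweb diagram visits."""
    xs = [x0]
    for _ in range(n):
        xs.append(f(xs[-1]))
    return xs

# Attracting case: f(x) = cos(x) has a fixed point p ~ 0.739085 with |f'(p)| = sin(p) ~ 0.674 < 1.
p = 0.7390851332151607
for x in iterate(math.cos, 1.0, 8):
    print(f"x = {x:.6f}   |x - p| = {abs(x - p):.6f}")   # distances shrink each step

# Repelling case: f(x) = x**2 has a fixed point at 1 with f'(1) = 2 > 1.
for x in iterate(lambda t: t * t, 1.01, 8):
    print(f"x = {x:.6f}")                                # iterates move away from 1
```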
Intuitively, the gradient at a point $p$ measures (approximately, unless the function is linear) how far apart $f(x)$ and $f(p)$ are, relative to the difference between $x$ and $p$, so long as $x$ is close to $p$, since $$f(x) - f(p) \approx f'(p)(x-p).$$ In the particular case that $p$ is a fixed point of $f$, the gradient approximates how close $f(x)$ is to $p$ relative to how close $x$ was to $p$ to begin with -- when the gradient is smaller than 1 in absolute value, $f(x)$ is closer to $p$ than $x$ was originally.
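To see that approximation in action, the following sketch (again using $\cos x$ as a stand-in example) prints the ratio $|f(x)-p|/|x-p|$ for $x$ closer and closer to the fixed point $p$; it settles down to $|f'(p)| \approx 0.674$, which is the per-step shrinking factor near $p$.

```python
import math

# f(x) = cos(x), fixed point p (the Dottie number); f'(x) = -sin(x), so |f'(p)| = sin(p) ~ 0.674.
p = 0.7390851332151607
for h in (0.5, 0.1, 0.01, 0.001):
    x = p + h
    ratio = abs(math.cos(x) - p) / abs(x - p)
    print(f"h = {h:<6} |f(x) - p| / |x - p| = {ratio:.6f}")
print(f"|f'(p)| = {math.sin(p):.6f}")
```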
-
Hi, thank you for the answer. So in this example where there are two intersection points, the iteration reaches the right-hand point even though the gradient between the starting point and the right intersection point is greater than 1. Would this be because it is pushed away from the left intersection point and then moves right? In my original question I forgot to mention that I am only looking at cobweb diagrams for finding intersections with the line y=x. – coban Oct 29 '24 at 05:25
-
So, that gets complicated -- in the particular example from your comment, all points (other than 0 and any point whose orbit contains 0) do end up at the attracting fixed point. But in general, you cannot know. There are lots of examples of functions with a single attracting fixed point but very complicated dynamics for inputs outside the region where the gradient is < 1. – Lars Seme Oct 29 '24 at 15:58