25

Say I was trying to find the derivative of $x^2$ using differentiation from first principles. The usual argument would go something like this:

If $f(x)=x^2$, then \begin{align} f'(x) &= \lim_{h \to 0}\frac{(x+h)^2-x^2}{h} \\ &= \lim_{h \to 0}\frac{2hx+h^2}{h} \\ &= \lim_{h \to 0} 2x+h \end{align} As $h$ approaches $0$, $2x+h$ approaches $2x$, so $f'(x)=2x$.

Throughout this argument, I assumed that $$ \lim_{h \to 0}\frac{f(x+h)-f(x)}{h} $$ was actually a meaningful object—that the limit actually existed. I don't really understand what justifies this assumption. To me, the assumption that an object is well-defined can sometimes lead you to draw incorrect conclusions. For example, assuming that $\log(0)$ makes any sense, we can conclude that $$ \log(0)=\log(0 \cdot 0)=\log(0)+\log(0) \implies \log(0)=0 \, . $$ So the assumption that $\log(0)$ represented anything meaningful led us to incorrectly conclude that it was equal to $0$. Often, to prove that a limit exists, we manipulate it until we can write it in a familiar form. This can be seen in the proofs of the chain rule and product rule. But it often seems that this manipulation can only be justified if we know the limit exists in the first place! So what is really going on here?


For another example, the chain rule is often stated as:

Suppose that $g$ is differentiable at $x$, and $f$ is differentiable at $g(x)$. Then, $(f \circ g)$ is differentiable at $x$, and $$ (f \circ g)'(x) = f'(g(x))g'(x) $$

If the proof that $(f \circ g)$ is differentiable at $x$ simply amounts to computing the derivative using the limit definition, then again I feel unsatisfied. Doesn't this computation again make the assumption that $(f \circ g)'(x)$ makes sense in the first place?

Joe
  • 22,603
  • I'm not sure why you feel that we're making this assumption? Had the derivative of $x^2$ not existed, we wouldn't be able to derive the limit as we did in the example you cite. Think of the thousands of non-differentiable functions that are out there. Our assumptions, whatever they may be, don't really prevent us from claiming and proving their non-differentiability. – Chubby Chef Jan 08 '21 at 23:04
  • 1
    @ChubbyChef But at the start of the argument, when we wrote $$\lim_{h \to 0}\frac{(x+h)^2-x^2}{h} \, ,$$ we didn't know at that point whether $x^2$ could even be differentiated. With this kind of reasoning, I could 'prove' $\log(0)=0$ by simply manipulating it using known log laws. Then, I could say $\log(0)$ must be equal to $0$, since it is always true that $\log(a)+\log(b)=\log(ab)$. The reason that my argument was faulty was that I assumed that $\log(0)$ represented something meaningful from the outset. – Joe Jan 08 '21 at 23:10
  • 5
    @Chubby Chef: I think what Joe is referring to is what is actually written, namely an equality such as $\lim\limits_{h \to 0}\frac{(x+h)^2-x^2}{h} = \lim\limits_{h \to 0}\frac{2hx+h^2}{h}$ when neither the left side nor the right side has yet been shown to exist. I agree (with Joe), but this is a frequently used short-hand and it's easy to see how to make it rigorous -- perform the algebraic rewriting without the limit operator, THEN include the limit operator at the end when the existence and value of the limit are obvious. – Dave L. Renfro Jan 08 '21 at 23:12
  • @DaveL.Renfro I understand your perspective. I'm not sure, however, that just writing down $\lim_{h\to 0} \frac{(x+h)^2-x^2}{h}$ necessarily asserts its existence... For instance, I can write something like $\lim_{x\to 0}\frac{1}{x} = \lim_{x\to 0}\frac{x}{x^2}$, which is a perfectly valid, albeit completely pointless, mathematical statement. It just so happens that this object isn't equivalent to any numerical value. – Chubby Chef Jan 08 '21 at 23:17
  • 1
    Joe, here's an example of how I usually handled this in class. One advantage is that there's less blackboard writing when you aren't putting the limit operator on at every step. – Dave L. Renfro Jan 08 '21 at 23:19
  • @DaveL.Renfro Thanks for responding. I agree that the example I gave can be remedied by avoiding taking the limit until the very last step. However, I think the problem I have is with more sophisticated proofs, such as the proof of the chain rule, where it seems pretty difficult to establish that $(f \circ g)'(x)$ makes any sense. – Joe Jan 08 '21 at 23:20
  • I agree with @DaveL.Renfro. To clarify the argument, you could rephrase it like, "Let $x \in \mathbb R$. Notice that if $h \neq 0$ then $\frac{f(x+h) - f(x)}{h} = 2x + h$. Because $\lim_{h \to 0} 2x = 2x$ and $\lim_{h \to 0} h = 0$, we see that $\lim_{h \to 0} \frac{f(x+h) - f(x)}{h} = 2x +0 = 2x$." Or something like that. The proof of the chain rule can be handled in a similar way. – littleO Jan 08 '21 at 23:21
  • It might help to recall (or be aware of, if you're not there yet) that in rigorous treatments one is usually working with epsilons and deltas, or sequences, and so this type of "abuse of notation" doesn't arise. – Dave L. Renfro Jan 08 '21 at 23:23
  • 1
    In calculating the limit you are proving its existence. If the limit didn't exist you wouldn't have been able to calculate it. .... Had they done anything other than evaluate it, the argument would not be valid. – fleablood Jan 08 '21 at 23:26
  • I found it extremely useful/helpful to notice that if $f:U\to\mathbb{R}$ is differentiable at some $x_0\in U$ for some open set $U$ with the usual topology on $\mathbb{R}$, then in the $\varepsilon$-$\delta$ definition of the limit using $h$, we must have $$\forall\varepsilon>0\,\exists\delta>0\,\forall h\in U-x_0:0<|h|<\delta \Longrightarrow \bigg|\frac{f(x_0+h)-f(x_0)}{h}-f'(x_0)\bigg|<\varepsilon$$ where $U-x_0=\{x-x_0:x\in U\}$. – C Squared Jan 08 '21 at 23:47
  • Possibly related: https://math.stackexchange.com/questions/3530539/is-taking-a-limit-a-function-is-it-a-procedure-a-ternary-operation - in brief, the limit is technically a relation between functions, input values, and prospective limit values. In many cases (in particular any cases you'll see in real analysis classes), the limit is in fact a partial function from functions and input values to limit values, making it a bit less bad to abuse notation in pretending $\lim$ is a function from functions and input values to limit values. – Daniel Schepler Jan 09 '21 at 00:54
  • Note that your example has missing brackets around the expression under the limit, which make your statements actually wrong according to standard conventions for operator precedence. – user21820 Jan 09 '21 at 09:26
  • 2
    The process of evaluating a limit in step by step manner using limit laws (or other theorems related to evaluation of limits) ensures that if the process is successful the limit exists. If not successful the limit does not exist. However textbooks don't emphasize this about the limit evaluation process and laws. – Paramanand Singh Jan 09 '21 at 12:16
  • For more details you can have a look at the limit laws I discussed in this thread: https://math.stackexchange.com/q/2971122/72031 – Paramanand Singh Jan 09 '21 at 12:18
  • @ParamanandSingh Thank you, that looks very helpful. I'll take a look. – Joe Jan 09 '21 at 12:19
  • Note however the use of L'Hospital's Rule works only in one direction and usually in reverse. Thus $\lim_{x\to a} f(x)/g(x)=\lim_{x\to a} f'(x) / g'(x) =L$ has to be understood in backward direction. And since this is not a reversible step one must take care when ratio $f'/g'$ does not have a limit. One can't conclude in this case that $f/g$ does not have a limit. – Paramanand Singh Jan 09 '21 at 14:58
  • @ParamanandSingh, re, probably you did not mean to say that, if the process is not successful, then the limit definitely does not exist, only that it might not exist (EDIT: as I see now you clarified in a different comment). For example, $\lim_{n \to \infty} (\sin(1/n) - \sin(1/n)) = \lim_{n \to \infty} \sin(1/n) - \lim_{n \to \infty} \sin(1/n) = 0$ is an unsuccessful process, but the limit exists anyway. – LSpice Jan 09 '21 at 21:46
  • @LSpice: I definitely meant that if the process is not successful the limit does not exist. Also you should have seen the question I had linked in my comments. – Paramanand Singh Jan 10 '21 at 01:55
  • You may also have a look at a similar discussion in this thread. – Paramanand Singh Jan 10 '21 at 04:19
  • The example about limits and logs is missing the implicit assumption, ever present in mathematics, that when one calculates or solves, we are actually saying: IF THIS CAN BE DONE (whether solved, calculated, simplified, whatever), THEN THE ONLY POSSIBLE SOLUTION/ VALUE IS .... So, in the case of using the log of a product, what you have actually shown is, IF THE LOG OF A PRODUCT RULE HOLDS, THEN THE ONLY POSSIBLE VALUE OF LOG(0) IS 0. Think of those equations with extraneous roots as a practical example of this, and thus the need for the so-called answer check. – Dr. Michael W. Ecker Jan 12 '21 at 20:59

10 Answers

27

You're correct that it doesn't really make sense to write $\lim\limits_{h\to 0}\frac{f(x+h)-f(x)}{h}$ unless we already know the limit exists, but it's really just a grammar issue. To be precise, you could first say that the difference quotient can be re-written as $\frac{f(x+h)-f(x)}{h}=2x+h$ for $h \neq 0$, and then use the fact that $\lim\limits_{h\to 0}x=x$ and $\lim\limits_{h\to 0}h=0$ as well as the constant-multiple law and the sum law for limits.

Adding to the last sentence: most of the familiar properties of limits are written "backwards" like this. I.e., the "limit sum law" says $$\lim\limits_{x\to c}(f(x)+g(x))=\lim\limits_{x\to c}f(x)+\lim\limits_{x\to c}g(x)$$ as long as $\lim\limits_{x\to c}f(x)$ and $\lim\limits_{x\to c}g(x)$ exist. Of course, if they don't exist, then the equation we just wrote is meaningless, so really we should begin with that assertion.

In practice, one can usually be a bit casual here, if for no other reason than to save word count. In an intro analysis class, though, you would probably want to be as careful as you reasonably can.
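As a quick numerical illustration of why the casual manipulation is harmless here (a plain-Python sketch; the function names are mine, not from the thread): for $h \neq 0$ the difference quotient of $f(x)=x^2$ literally equals $2x+h$, and both settle on $2x$ as $h$ shrinks.

```python
# Compare the difference quotient of f(x) = x^2 with its
# algebraic rewriting 2x + h, at the sample point x = 3.
def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda t: t * t
x = 3.0

for h in [1e-1, 1e-3, 1e-6]:
    dq = diff_quotient(f, x, h)
    # The two expressions agree whenever h != 0 ...
    assert abs(dq - (2 * x + h)) < 1e-6
    # ... and both approach 2x = 6 as h shrinks.
    assert abs(dq - 2 * x) <= h + 1e-9
```

Of course, the numerics only illustrate the value; the existence of the limit is exactly what the careful ordering of the limit laws establishes.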

pancini
  • 20,030
  • 22
    By the way, math is full of little things like this. One notable example is when people "define" a function, and then proceed to show it is well-defined. – pancini Jan 08 '21 at 23:21
  • 1
    This answer omits to mention that in some instances $\lim_{x\to c} \big( f(x) +g(x)\big)$ exists even though one or both of those separate limits fails to exist. – Michael Hardy Jan 09 '21 at 18:40
  • @ElliotG Thanks for the answer. I suppose the thing I was wondering about was if this way people handled limits was more than a grammar issue. For example, $\lim_{x \to \infty}\sin(x)$ clearly does not exist. But if we continued to manipulate the expression, then would it be possible to incorrectly find that it is equal to $5$, say? With the logarithm example, we concluded that $\log(0)$ was equal to $0$. There is no doubt that $0$ is well-defined. But this does not mean that $\log(0)$ is well-defined. Can something similar happen with limits? – Joe Jan 09 '21 at 20:16
  • 1
    @Joe: the limit laws used in evaluating limits must be reversible so that each step is equivalent to next one. If this is guaranteed then you can't have a situation described in your comment. – Paramanand Singh Jan 09 '21 at 20:36
  • 3
    @Joe: The limit laws are designed in such a manner that they can be used to reduce the question of existence of limit of a complicated expression to that of a simpler expression. After a finite number of steps we reach an expression where the question of existence of limit and its value becomes trivial. – Paramanand Singh Jan 09 '21 at 20:58
  • @ElliotG yes, they should define a relation and prove it is a function (for all $x$ one and only one $y$ exists such that $(x,y) \in R$) – lalala Jan 10 '21 at 10:18
5

The other answers are perfectly fine; just a perspective that can save your day in situations in which the existence of the limit is actually a critical point.

The crucial definition is the one of limsup and liminf: these are always well defined, and all you have to know at the moment are the following two properties:

  1. $\liminf_{x \to x_0} f(x) \le \limsup_{x\to x_0} f(x) $
  2. The limit of $f$ exists if and only if $\liminf_{x \to x_0} f(x) = \limsup_{x\to x_0} f(x) $, and in this case the limit agrees with this common value.

Now imagine you do your computation twice: first, you compute the liminf; then you compute the limsup. In both computations, as soon as you arrive at something that actually has a limit (like $2x+h$), because of property (2) you can forget about the inf/sup story and just compute the limit.

Since with some manipulations you arrive at something that actually has a limit, both calculations will give the same result and, because of property (2) again, the limit exists and coincides with the value you just computed.
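Concretely, for the running example $f(x)=x^2$, the two passes collapse as soon as the rewriting reaches $2x+h$ (using that the quotient equals $2x+h$ for every $h\neq 0$):

```latex
\liminf_{h \to 0} \frac{(x+h)^2 - x^2}{h}
  = \liminf_{h \to 0}\,(2x+h) = 2x
  = \limsup_{h \to 0}\,(2x+h)
  = \limsup_{h \to 0} \frac{(x+h)^2 - x^2}{h},
```

so by property (2) the limit exists and equals $2x$.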

Now, this is not really what you should do if you are doing introductory analysis and you don't yet know liminf and limsup: the formal properties of these two are slightly different from the formal properties of lim, and you could end up with an error. But as long as you don't "touch" the limit, and you just make some manipulation inside the limit, the same argument will carry over: if you end up with a well-defined result, it is the limit :)

  • This is just a lucky break with $\mathbb R$, though: it doesn't work so well if we're trying to compute limits in $\mathbb C$. OK, then we can take lim(sup/inf)s of real and imaginary parts; but eventually we have to face situations involving limits in arbitrary topological spaces, where there is (I think?) no finite family of always-defined "potential limits" that collapses to a point if and only if there is an "actual limit". So at some point we do need the sort of 'sociological' explanation in @ElliotG.'s answer. – LSpice Jan 09 '21 at 20:16
  • 1
    Well, yes, you could take the accumulation points; this would work in arbitrary topological spaces. Specifically, this is the sub-sub-sequence criterion. Let $x_n \in X$ be a sequence. Then the limit of $x_n$ exists and equals $x$ iff for every subsequence $x_{n_k}$, there exist a convergent sub-subsequence $x_{n_{k_m}}$ that converges to $x$. So you can repeat the liminf/sup story by taking arbitrary subsequence that converges to some accumulation point: if you always get the same result, you have your limit – Andrea Marino Jan 09 '21 at 21:06
  • Right, but I said 'finite'. (This is admittedly an artificial restriction, but I perceive, in a non-rigorous sense, a difference between speaking broadly of limit points rather than just of lim(sup/inf)'s.) – LSpice Jan 09 '21 at 21:19
  • I don't see any obstruction with finiteness. You can make the argument with convergent subsequences perfectly formal. I think it's even better than the liminf/sup story; I am tempted to change it :) in a way when you find written "lim" it is intended along a convergent subsequence; if you always get the same result - in particular if at some point the limit manifestly exist - the limit has just one value. Or you can think as $\lim$ being set valued. – Andrea Marino Jan 10 '21 at 11:27
5

What we have here should really be interpreted as multiple statements:

(1.) If $ \lim_{h \to 0} \frac{2hx + h^2}{h} $ exists then $ \lim_{h \to 0} \frac{(x+h)^2 - x^2}{h}$ exists and is equal to $\lim_{h \to 0} \frac{2hx + h^2}{h} $.

(2.) If $ \lim_{h \to 0} [2x + h] $ exists then $ \lim_{h \to 0} \frac{2hx + h^2}{h}$ exists and is equal to $\lim_{h \to 0} [2x + h]$.

(3.) If $ \lim_{h \to 0} 2x$ exists then $ \lim_{h \to 0} [2x + h]$ exists and is equal to $ \lim_{h \to 0} 2x$.

(4.) $ \lim_{h \to 0} 2x$ exists and is equal to $ 2x $.

Note that once we have (4.), the "if" (conditional) part of (3.) is satisfied, and so on all the way up to (1.). You can see that assuming that the limit exists in statements 1 to 3 is not a problem because you haven't used that assumption to prove that it actually does. That would be circular logic and no good.

Your log example is different from this in that you don't have a statement that takes the role of statement (4.) above, which would allow you to escape the conditional. You have only proven that $\log(0) = 0$ IF $\log(0)$ exists, not that $\log(0)$ exists! This in itself is not an incorrect conclusion.

Dark
  • 223
4

If you want to be more precise you could write:

$f'(x) = \lim_{h→0} \frac{(x+h)^2-x^2}{h}$ if the limit exists

    $= \lim_{h→0} (2x+h)$ if the limit exists

    $= 2x$.

Meaning that each line only holds "if the limit exists". But we do not have to actually bother to do so in most cases for two reasons:

  1. It is usually easy enough to mentally add such conditions and check that we did not at any point rely on the existence of the limit.

  2. If we allow expressions to attain an "undefined value", and define that every expression with an "undefined" subexpression is itself undefined, then we do not even have to write the condition "if the limit exists"! If the limit is not defined, then the "$\lim \cdots$" expression would simply have value "undefined", which will not lead to any incorrect conclusions.

user21820
  • 60,745
  • 1
    I would even add at the end "which exists!". But more seriously it is exactly this sort of care that needs to be exercised when using L'Hospital - and usually isn't. – ancient mathematician Jan 09 '21 at 15:25
  • 1
    @ancientmathematician: You're absolutely right (about L'Hopital abuse)! The reason I didn't add a "which exists" at the end was that it was not needed. The final line meant "$\lim_{h→0} (2x+h) = 2x$", and it does already state that the limit on the left of the equality is equal to the expression on the right, so of course it exists since $2x$ exists! =) – user21820 Jan 09 '21 at 15:39
2

Proposition: Let $c \in \mathbb{R}$. Suppose $f$ and $g$ are defined and equal to each other on some punctured open ball $(c - \delta, c) \cup (c, c + \delta)$ around $c$, where $\delta > 0$. Then $\lim_{x \to c} f(x)$ exists if and only if $\lim_{x \to c} g(x)$ exists, and when either limit exists, the two are equal.

Sketch of proof: Observe that the definition of limit at a point $c$ concerns itself only with points close to $c$ but not equal to $c$. So whatever the value of $f$ or $g$ at $c$, or for that matter whether or not they are defined there, does not matter. Since $f$ and $g$ are equal at points close to $c$ but not equal to $c$, our limit statement about either function at $c$ must therefore also hold for the other. $\square$

This justifies the various limit calculations that we often do, such as the one you showed. In fact, let us go through your example step by step.

If $f(x)=x^2$, then \begin{align} f'(x) &= \lim_{h \to 0}\frac{(x+h)^2-x^2}{h} \\ &= \lim_{h \to 0}\frac{2hx+h^2}{h} \\ &= \lim_{h \to 0} 2x+h \end{align} As $h$ approaches $0$, $2x+h$ approaches $2x$, so $f'(x)=2x$.

What do these sequences of calculations really mean or imply? Well, in the final step/equality, we computed $\displaystyle \lim_{h \to 0} 2x + h$, which we agree exists and equals $2x$. Since the function $\displaystyle \frac{2hx + h^2}{h}$ equals $2x + h$ in some punctured neighborhood of $0$, we can now use the proposition to conclude that $\displaystyle \lim_{h \to 0} \frac{2hx + h^2}{h}$ equals $\displaystyle \lim_{h \to 0} 2x + h$, which equals $2x$. So going from line (3) to line (2) is justified. Next, the function $\displaystyle \frac{(x+h)^2 - x^2}{h}$ equals $\displaystyle \frac{2hx + h^2}{h}$ in some punctured neighborhood of $0$, so again we can use the proposition to justify going from line (2) to line (1).

So we have sort of reasoned backwards, but practically speaking this is not necessary in ordinary limit calculations. Our reasoning also "works" even when the limit does not exist. If at the end we arrive at a limit that exists, then necessarily we can work backwards and guarantee that the initial limit exists; and if at the end we arrive at a limit that does not exist, then necessarily the initial limit cannot exist, for otherwise we could go down the series of equivalences guaranteed by the proposition to conclude that the final limit exists.

So in all cases things "work out fine". The important thing to note is simply that we have certain logical equivalences at each step: the limit exists at some step if and only if it exists at any earlier or later step.
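A quick numeric spot-check of the proposition's hypothesis in this example (plain Python; the names `g1` and `g2` are my own): the two functions agree at every sampled $h \neq 0$, even though one of them is undefined at $h = 0$.

```python
def g1(h, x=3.0):
    # the original quotient; raises ZeroDivisionError at h = 0
    return (2 * h * x + h * h) / h

def g2(h, x=3.0):
    # the rewritten form, defined everywhere
    return 2 * x + h

# equal on a punctured neighborhood of 0 ...
for h in [-1e-3, -1e-6, 1e-6, 1e-3]:
    assert abs(g1(h) - g2(h)) < 1e-9

# ... while g1 is genuinely undefined at h = 0
try:
    g1(0.0)
    raise AssertionError("expected division by zero")
except ZeroDivisionError:
    pass
```

The proposition then transfers the (obvious) limit of `g2` back to `g1`, which is exactly the step the calculation leaves implicit.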

twosigma
  • 3,332
2

The derivative does not exist unless the limit of the difference quotient exists.

The "limit law" that says the limit of a sum of two functions is equal to the sum of the two separate limits is not applicable unless the two separate limits exist. Notice that

  • There are no cases where the two separate limits exist and the limit of the sum does not. If the two separate limits exist, then so does the limit of the sum.

  • However, there are cases in which the two separate limits do not exist and the limit of the sum does. A similar situation applying to products rather than sums arose in something I posted here recently (I can't find it right now). For one of the two factors the limit did not exist, but that factor was bounded, and therefore the limit of the product could be found by squeezing.
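That product situation is easy to see numerically (a toy example of my own choosing, not necessarily the post being recalled): $\sin(1/x)$ has no limit as $x \to 0$, yet $x\sin(1/x)$ is squeezed between $-|x|$ and $|x|$, so the limit of the product exists and is $0$.

```python
import math

# sin(1/x) keeps oscillating between -1 and 1 as x -> 0 ...
samples = [1 / (math.pi * (n + 0.5)) for n in range(1, 200)]
values = [math.sin(1 / x) for x in samples]
assert max(values) > 0.9 and min(values) < -0.9

# ... but x * sin(1/x) is squeezed: |x * sin(1/x)| <= |x| -> 0.
for x in [1e-2, 1e-4, 1e-8]:
    assert abs(x * math.sin(1 / x)) <= abs(x)
```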

1

No property of the limit is used in the first argument before the last step, so what we have done inside the limit is just rewriting. When we reach the last step, we can show existence using the epsilon-delta definition, which settles the existence issue. The same applies to the chain rule: everything in the proof before the final steps is just rewriting, and the final steps use the properties of limits, which is justified because the epsilon-delta definition deals with the existence issue. Hope this helps.

1

The issue largely vanishes if we just consider $\lim$ and $\log$ explicitly as partial functions. A partial function can be seen as a function whose codomain contains one extra (distinguishable!) element, basically the “error value”. $$\begin{align} \log :&& \mathbb{R} \not\to \mathbb{R} \\ \lim_0 :&& ((\mathbb{R}\setminus\{0\})\to\mathbb{R}) \not\to \mathbb{R} \end{align}$$ where we have for example $$\begin{align} \log(1) =& \text{OK}(0) \\ \log(0) =& \text{ERR} \\ \lim_0( h\mapsto \tfrac{\sin h}{h}) =& \text{OK}(1) \\ \lim_0( h\mapsto \tfrac1{h}) =& \text{ERR} \end{align}$$

Now, the logarithm law $$ \log(a\cdot b) = \log a + \log b $$ is to be understood with a “lifted” $+$ operator that simply propagates failure from either side. But that means that for this operator, we can't infer from $p+q=p$ that $q=0$, because $\text{ERR}+q$ is always $\text{ERR}$ regardless! Instead, only from $\text{OK}(p)+q = \text{OK}(p)$ can we infer $q = \text{OK}(0)$. Thus we don't get to the wrong conclusion about $\log(0)$, because that is not an $\text{OK}$ value.

Applied to the limits in the differentiation, we can immediately write $$ f'(x) = \lim_0\left(h\mapsto\frac{f(x+h)-f(x)}{h}\right) $$ just noting that the result might be $\text{ERR}$. What we can also do without any problem is rewrite the expression inside the limit with anything that – as a function $h\mapsto\ldots$ – really is (extensionally) the same. This is in particular no problem for $$\begin{align} f'(x) =& \lim_0\left(h\mapsto\frac{(x+h)^2-x^2}{h}\right) \\ =& \lim_0\left(h\mapsto\frac{2\cdot h\cdot x+h^2}{h}\right) \end{align}$$ because $h\mapsto\frac{(x+h)^2-x^2}{h}$ and $h\mapsto\frac{2\cdot h\cdot x+h^2}{h}$ really are the same for all $h\in\mathbb{R}$. Still, at this point we don't know if either of the limits actually exist – they might be both $\text{ERR}$, or both $\text{OK}$, but at any rate equal.

For the next step we need the fact that the limit considers its argument as only a function with nonzero numbers as the domain, because only considered as a function on that domain is $h\mapsto\frac{2\cdot h\cdot x+h^2}{h}$ the same function as $h\mapsto 2\cdot x+h$.

And that's it, at this point we can read off that the limit is indeed $\text{OK}(2\cdot x)$ and going back we see that the other limits must also have been $\text{OK}$ with that same value.
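The OK/ERR bookkeeping above can be sketched in a few lines of code (a toy model; `plog`, `padd`, and the use of `None` for $\text{ERR}$ are my own choices): the lifted $+$ propagates $\text{ERR}$, so the bogus cancellation that “proved” $\log 0 = 0$ never gets off the ground.

```python
import math

ERR = None  # the extra "error" element adjoined to the codomain

def plog(x):
    # partial logarithm: log(x) for x > 0, ERR otherwise
    return math.log(x) if x > 0 else ERR

def padd(p, q):
    # lifted +: an ERR on either side propagates to the result
    return ERR if p is ERR or q is ERR else p + q

# The law log(a*b) = log(a) + log(b), read with the lifted +:
assert math.isclose(plog(2 * 3), padd(plog(2), plog(3)))
assert plog(0 * 5) is ERR and padd(plog(0), plog(5)) is ERR

# padd(p, p) == p does NOT force p == 0: with p = plog(0) = ERR,
# both sides are ERR, so "log(0) = 0" cannot be extracted.
assert padd(plog(0), plog(0)) is plog(0) is ERR
assert plog(0) != 0
```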

1

Note that $\dfrac{(x+h)^2-x^2}{h}$ is undefined at $h=0$ and that, when $h \ne 0$,

$$\dfrac{(x+h)^2-x^2}{h} = \frac{2hx+h^2}{h} = 2x+h$$

However, the function $h \mapsto 2x+h$ is defined and continuous at $h=0$, where it has the value $2x$.

We also need to use

$$\lim_{h \to 0}\frac{2hx+h^2}{h} = \lim_{h \to 0}\frac hh \; \lim_{h \to 0}\frac{2x+h}{1} = \lim_{h \to 0} (2x+h) = 2x$$

The rest follows.

0

If we want to be absolutely clear, then the argument for the derivative should be the following: $\lim\limits_{h\to0}\frac{(x+h)^2-x^2}{h}$ and $\lim\limits_{h\to0}2x+h$ both exist and are equal if and only if at least one of them exists. Since $\lim\limits_{h\to0}2x+h$ does in fact exist and is $2x$, so too must the other limit (that's $\lim_{h\to0}\frac{(x+h)^2-x^2}{h}$) exist and be $2x$.

This does not work for your logarithm example: You can argue that $\log0$ and $\log0+\log0$ exist and are the same if at least one of the two exists. But neither exists, so the point is moot.