25

For instance, the absolute value function is defined and continuous on the whole real line, but its derivative behaves like a step function, with a jump discontinuity at the origin.

For some nice functions, though, such as $e^x$ or $\sin(x)$, the derivatives are of course no "worse" than the original function.

Can I say something about the typical behaviour of the derivative? Is it typically not as nice as the original function?

jwodder
  • 8
    Worse in what sense? At the least, the derivative of an analytic function is not only no worse; I can say it is sometimes better than the original one! – Red shoes Oct 08 '17 at 00:47
  • 1
    @Redshoes in what sense is it better? I think they have the exact same radius of convergence. – Vim Oct 08 '17 at 02:17
  • @Vim: It's more well-behaved, right? A line is better-behaved than a parabola, etc. – user541686 Oct 08 '17 at 06:09
  • @Mehrdad most analytic functions aren't finite polynomials, so no, I don't think this is what Red Shoes meant. – Vim Oct 08 '17 at 06:10
  • @Vim: I didn't mean to imply that's always true, yeah. But I'm pretty sure that's what he means. He says it's not only not worse, but sometimes actually better -- which corresponds with the polynomial case. Not sure what else you think he could mean. – user541686 Oct 08 '17 at 06:13
  • @Mehrdad that's why I don't know why he made that comment. Maybe you're right. But, after all, in terms of regularity I don't see how linear is better than quadratic or cubic etc. – Vim Oct 08 '17 at 06:15
  • 3
    In systems and control theory, there is a general rule of avoiding derivatives and differentiators as much as possible, because they inject noise and high-frequency signals into the system, which are hard to control/detect/predict/stabilize, and so on (a numerical sketch of this effect follows this comment thread). – polfosol Oct 08 '17 at 07:28
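
To make polfosol's point concrete, here is a minimal numerical sketch (assuming NumPy; the test signal, noise level, and step size are arbitrary choices for illustration): differentiating a noisy signal amplifies the noise by roughly $1/\Delta x$, while integrating averages it out.

```python
import numpy as np

rng = np.random.default_rng(0)
dx = 1e-3
x = np.arange(0.0, 1.0, dx)
clean = np.sin(2 * np.pi * x)
noisy = clean + 1e-3 * rng.standard_normal(x.size)  # small measurement noise

# Differentiating amplifies the noise by roughly 1/dx ...
d_noisy = np.gradient(noisy, dx)
d_exact = 2 * np.pi * np.cos(2 * np.pi * x)
print(np.max(np.abs(d_noisy - d_exact)))  # order 1: the 1e-3 noise is blown up

# ... while integrating averages it away.
I_noisy = np.cumsum(noisy) * dx
I_exact = (1 - np.cos(2 * np.pi * x)) / (2 * np.pi)
print(np.max(np.abs(I_noisy - I_exact)))  # order 1e-3: no worse than the data
```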

4 Answers

41

Yes, that is completely right. And conversely, integration makes functions nicer.

One way of measuring how "nice" a function is, is by how many derivatives it has. We say a function $f\in C^k$ if it is $k$ times continuously differentiable. The more times differentiable a function is, the nicer it is: it is "smoother". So if a function is $k$ times differentiable, then its derivative is $k-1$ times differentiable. A function is "as nice" as its derivative if and only if it is smooth (infinitely differentiable). These are functions like $\sin(x)$, $e^x$, polynomials, etc.
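
As a concrete instance of this hierarchy (a standard example, not from the answer itself): take $f(x) = x\,|x|$. Then

$$f(x) = x\,|x| \in C^1, \qquad f'(x) = 2|x| \in C^0, \qquad f''(x) = 2\operatorname{sgn}(x) \text{ for } x \neq 0,$$

and $f''$ does not exist at $0$, so $f$ is $C^1$ but not $C^2$: each differentiation costs one degree of smoothness, and each integration buys one back.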

Conversely, integration makes things nicer. For example, integrating even a discontinuous function results in a continuous function: Is an integral always continuous?
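
Here is a quick numerical illustration (a sketch assuming NumPy; the grid and step size are arbitrary): the running integral of the discontinuous $\operatorname{sgn}(x)$ is continuous, even though the integrand jumps.

```python
import numpy as np

dx = 1e-3
x = np.arange(-1.0, 1.0, dx)
f = np.sign(x)                     # discontinuous: jumps at x = 0

F = np.cumsum(f) * dx              # running integral, approximately |x| - 1

# Largest change between adjacent samples:
print(np.max(np.abs(np.diff(f))))  # order 1: the jump survives any refinement
print(np.max(np.abs(np.diff(F))))  # order dx: vanishes as dx -> 0, F is continuous
```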

  • except in instances where the function doesn't have a nice integral and you're left with some nasty sum or product or something. – tox123 Oct 08 '17 at 15:56
  • 7
    @tox123 See also this MathOverflow answer. As noted in a comment there, if you're thinking about it in terms of formulas then integration can certainly lead to nastier results. But the function itself will still be nicer than, or at least as nice as, the function you started with, in the way described in this answer. – Carmeister Oct 08 '17 at 17:18
  • 4
    It's worth pointing out, for the unwary reader, that the MathOverflow post @Carmeister linked to is an answer to a question asking for harmful heuristics, and as such the bolded sentence there is not bolded for emphasis but to highlight it as a dangerous belief. –  Oct 08 '17 at 21:04
14

The concept you're talking about is called smoothness (Wikipedia, MathWorld).

Functions like $e^x$ and $\sin(x)$ and polynomials are called "smooth" because their derivatives of every order are continuous. Smooth functions have derivatives all the way down, so they're as nice as their derivatives.

But functions like $\operatorname{abs}(x)$ and $\operatorname{sgn}(x)$ aren't smooth, since there are discontinuities in either the function or its derivatives: $\operatorname{sgn}(x)$ jumps at the origin, and the derivative of $\operatorname{abs}(x)$ does. They're nicer than their derivatives.

A function is in class $C^k$ if its derivatives up to and including order $k$ are continuous. So the number of levels of niceness depends on $k$. Think about how integrating $\operatorname{sgn}(x)$ gives you $\operatorname{abs}(x)$, which in turn gives $\frac{1}{2}\operatorname{sgn}(x)x^2$, and so on, each integral nicer than the last. As Zachary Selk points out, you can make functions nicer by integrating them.

In fact, integration is possible for far more functions than differentiation: every continuous function has an antiderivative, but a typical continuous function is nowhere differentiable. Not only is being "nice" a rare trait; being differentiable at all is too.
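
Here is a numerical sketch of that ladder (assuming NumPy; the grid and the closed-form antiderivatives on $[-1, 1]$ are worked out by hand): integrating $\operatorname{sgn}(x)$ recovers $|x|$ up to a constant, and integrating again recovers $\frac{1}{2}\operatorname{sgn}(x)x^2$ up to a linear term.

```python
import numpy as np

dx = 1e-4
x = np.arange(-1.0, 1.0, dx)

sgn = np.sign(x)           # discontinuous
I1 = np.cumsum(sgn) * dx   # int_{-1}^{x} sgn(t) dt    = |x| - 1               (continuous)
I2 = np.cumsum(I1) * dx    # int_{-1}^{x} (|t| - 1) dt = sgn(x)x^2/2 - x - 1/2 (C^1)

print(np.max(np.abs(I1 - (np.abs(x) - 1))))                    # ~ dx
print(np.max(np.abs(I2 - (np.sign(x) * x**2 / 2 - x - 0.5))))  # ~ dx
```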

Jam
  • 2
    In numerical methods, integration is more accurate than differentiation, for a given effort. – Philip Roe Oct 08 '17 at 01:53
  • @PhilipRoe So? How does that relate to anything I said? The accuracy of integration/differentiation says nothing about whether a solution exists. – Jam Oct 08 '17 at 01:55
  • 1
    It's an aspect of niceness, and in its own way quite important. – Philip Roe Oct 08 '17 at 02:02
  • @PhilipRoe It's irrelevant to the question. OP's asking about the continuity and differentiability of functions. The accuracy of numerical analysis algorithms isn't pertinent. – Jam Oct 08 '17 at 02:02
  • 1
    Actually, that is your interpretation of nice. But mine is related. It's all about the sources of error. – Philip Roe Oct 08 '17 at 02:06
  • 1
    I'm ending this discussion since I don't think you're really listening to me. The question, as framed, is about which functions are continuous and differentiable. Your point seems to be about the error terms of algorithms, which has no relevance whatsoever. OP wants to know when a function will have a jump in it, like $y=\operatorname{sgn}(x)$. You're making the case that numerical integration has more rapid convergence than numerical differentiation. – Jam Oct 08 '17 at 02:13
  • 2
    Actually, the derivatives of the functions $|\cdot|$ and $\operatorname{sgn}(\cdot)$ are continuous on their domains! – Red shoes Oct 08 '17 at 02:44
  • 1
    @PhilipRoe Depends on what you mean by "numerical method". Automatic differentiation usually is accurate and free of numerical issues. – Federico Poloni Oct 08 '17 at 10:14
  • @Redshoes $\operatorname{sgn}(0)$ is commonly defined as $0$, leading to a discontinuity. https://en.wikipedia.org/wiki/Sign_function#Definition – Jam Oct 08 '17 at 14:28
  • @Federico Interesting variant on "nice". If you are given $f(x)$ as an expression, then differentiating is easier than integrating, whether automatically or "by hand", but if $f(x)$ is only defined numerically it is the other way round. – Philip Roe Oct 08 '17 at 16:09
  • @Jam Red shoes spoke of the derivative of $\operatorname{sgn}$ being continuous (and he's right), not $\operatorname{sgn}$ itself. – Adayah Oct 08 '17 at 16:50
  • @Adayah Right you are, I've edited that :) – Jam Oct 08 '17 at 20:49
2

You can view it as a consequence of convolution being at least as nice as the functions involved, because the integral of a function $f$ up to $t$ is exactly that, a convolution:

$$\int_{-\infty}^t f(\tau)d\tau = \int_{-\infty}^\infty f(\tau) H(t-\tau)d\tau = (f*H)(t)$$

where $H(t)$ is the Heaviside step function:

$$H(t) = \begin{cases} 0, & t < 0 \\ 0.5, & t = 0 \\ 1, & t > 0 \end{cases}$$
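
A quick numerical check of this identity (a sketch assuming NumPy; the grids are arbitrary, and the lag grid is made twice as wide so the discrete convolution is never truncated): convolving sampled $\operatorname{sgn}$ with a sampled Heaviside kernel reproduces the running integral, and the result is continuous even though the integrand jumps.

```python
import numpy as np

dx = 0.01
x = np.arange(-5.0, 5.0, dx)   # N sample points for f
N = len(x)
f = np.sign(x)                 # integrand with a jump at 0

# Heaviside kernel on a lag grid wide enough that nothing is truncated;
# its zero lag sits at index N.
lags = np.arange(-10.0, 10.0, dx)
H = np.where(lags >= 0.0, 1.0, 0.0)

# Discrete (f * H)(t), read off on the x grid:
conv = np.convolve(f, H)[N:2 * N] * dx

# Direct running integral int_{-5}^{x} f(tau) dtau, for comparison:
running = np.cumsum(f) * dx

print(np.max(np.abs(conv - running)))          # ~ 0: the same sums, reordered
print(np.max(np.abs(conv - (np.abs(x) - 5))))  # ~ dx: conv is |x| - 5, continuous
```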

mathreadler
1

As others have pointed out, derivative operators generally decrease the smoothness of a function. This is true for many classes of function spaces: Sobolev, Besov, Triebel-Lizorkin, etc. In fact, most of these spaces have a smoothness parameter that decreases when differential operators are applied. Terence Tao has a nice diagram on his blog illustrating the relationships between these spaces.

One thing that I did not see mentioned in the other answers is that derivatives can improve the decay of a function. For instance, $\log(1+x^2)$ grows without bound, while its derivative $2x/(1+x^2)$ tends to zero at infinity. So if you start from a smooth function that grows, its derivative could be considered nicer than the original.

So I would argue that differentiation creates a trade-off between smoothness and decay. This is most clearly seen in the Fourier domain, where differentiation is basically multiplication by a polynomial. Suppose the original function is $f$ and its Fourier transform is $\widehat{f}$. In the Fourier domain, the $n$th derivative of $f$ is given by $P(\omega)\widehat{f}(\omega)$, where $P(\omega) = (i\omega)^n$. If $\widehat{f}$ has a singularity, it can be killed by the zeros of the polynomial, thus causing the derivative of $f$ to have a reduced rate of growth. At the same time, $P$ grows at infinity, which means the derivative of $f$ will be less smooth.
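
Here is a small numerical illustration of this (a sketch assuming NumPy; the Gaussian test function, grid, and FFT convention are arbitrary choices): multiplying $\widehat{f}$ by $i\omega$ and transforming back recovers $f'$, and the growing factor $|\omega|$ is exactly the high-frequency amplification that costs smoothness.

```python
import numpy as np

N, L = 1024, 20.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
f = np.exp(-x**2)                  # smooth, rapidly decaying test function

# Differentiation in the Fourier domain: multiply f-hat by P(omega) = i*omega
omega = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
df = np.fft.ifft(1j * omega * np.fft.fft(f)).real

exact = -2 * x * np.exp(-x**2)     # analytic derivative, for comparison
print(np.max(np.abs(df - exact)))  # ~ 1e-12: spectral accuracy for smooth f
```

Repeating the experiment with a kinked function such as a periodized $|x|$ (a triangle wave) makes the trade-off visible: the recovered derivative is a square wave with Gibbs oscillations at the kink, the loss of smoothness in action.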

Dunham