
I have recently discovered the relation

\begin{equation} \frac{\mathrm d^2}{\mathrm dx^2} \big| x \big| = 2\delta (x). \end{equation}

I was very intrigued when I found this expression, and it makes sense to me intuitively. However, I am aware that Dirac's delta "function" is defined as a distribution, not a function in the conventional sense, so there is probably more going on here than meets the untrained eye.

So my question is: what caveats are necessary for the above equation to be formally correct? Can it be taken literally?

I would have presented a definition of the delta function for completeness, but I am also not certain which definition is appropriate in this context. Am I correct in my understanding that the piecewise definition

\begin{equation} \delta (x) \equiv \begin{cases} \infty, & \text{if } x = 0 \\ 0, & \text{if } x \neq 0 \end{cases} \end{equation}

is only an informal, intuitive description, and actually an "abuse of notation" rather than a proper function?

I have knowledge up to multivariable/vector calculus and differential equations, but no formal training in distribution theory, analysis, number theory, etc. Please accommodate this if appropriate.

[I apologize for the formatting issues; I am genuinely trying to learn. Can someone please point me in the direction of a crash course for LaTeX (particularly for math)? I've looked around a lot on SE and elsewhere on the web, and I'm having trouble finding tutorials that explain the simplest things, like when to use single versus double dollar signs, how to write a literal dollar sign without it taking effect, when to use a package in the preamble, and whether the preamble is a specific location or just anything before the expression to be formatted. Sorry for the meta question.]

  • That piecewise expression isn't even an intuitive definition; all it conveys is the idea that $\delta$ is only supported at zero and "blows up" in some fashion there, and doesn't give any suggestion of the property that $\delta$ has when integrated. –  Dec 15 '16 at 13:59
  • Excellent point, I have edited. – electronpusher Dec 15 '16 at 15:22

2 Answers


The intuitive definition is just that: intuitive. Mathematically it is wrong, since it cannot distinguish between $\delta$ and $5 \delta$.
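For instance (a small illustration of the problem, written informally in the usual integral notation): both $\delta$ and $5\delta$ would be described by the same piecewise picture, "infinite at $0$, zero elsewhere", yet they act differently when integrated against a function:
$$\int_{-\infty}^\infty \delta(x) f(x) \, dx = f(0), \qquad \int_{-\infty}^\infty 5\delta(x) f(x) \, dx = 5 f(0).$$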

Formally, $\delta$ is understood as a function of functions: $$\delta(f)=f(0)$$

Your relation \begin{equation} \frac{\mathrm d^2}{\mathrm dx^2} \big| x \big| = 2 \delta . \end{equation} means that for all functions $f$ which are infinitely differentiable and have compact support we have $$ \int_{-\infty}^\infty f''(x) |x| \, dx = 2 f(0) = 2 \delta(f) $$
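As a sanity check, here is that integral computed directly (a sketch: split at the origin and integrate by parts once on each half; the boundary terms vanish because $f$ has compact support):
\begin{align*}
\int_{-\infty}^\infty |x| \, f''(x) \, dx
&= \int_0^\infty x f''(x) \, dx + \int_{-\infty}^0 (-x) f''(x) \, dx \\
&= \Big[ x f'(x) \Big]_0^\infty - \int_0^\infty f'(x) \, dx + \Big[ -x f'(x) \Big]_{-\infty}^0 + \int_{-\infty}^0 f'(x) \, dx \\
&= 0 - \big( 0 - f(0) \big) + 0 + \big( f(0) - 0 \big) \\
&= 2 f(0).
\end{align*}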

Added clarification: If $g$ is a twice differentiable function with $g''$ continuous, then for all functions $f$ which are infinitely differentiable and have compact support we have $$ \int_{-\infty}^\infty g''(x) f(x) \, dx =\int_{-\infty}^\infty g(x) f''(x) \, dx $$
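A sketch of why this holds: integrate by parts twice; both boundary terms vanish because $f$ (and hence $f'$) is zero outside a bounded set:
$$\int_{-\infty}^\infty g''(x) f(x) \, dx = \underbrace{\Big[ g'(x) f(x) \Big]_{-\infty}^\infty}_{=\,0} - \int_{-\infty}^\infty g'(x) f'(x) \, dx = -\underbrace{\Big[ g(x) f'(x) \Big]_{-\infty}^\infty}_{=\,0} + \int_{-\infty}^\infty g(x) f''(x) \, dx.$$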

Note that this identity doesn't necessarily hold if $f$ is not compactly supported. This is the reason why we always test distributions on compactly supported functions.

The idea of distributions is the following:

Consider a map $$u : C_c^\infty(\mathbb R ^d) \to \mathbb C$$ which is linear and continuous (we will not worry here about which topology is used). Such a map is called a distribution.

$\delta$ is one such distribution. If we have a locally integrable function $g$ (for instance, any continuous function), we can interpret it as a distribution in the following way: $$u_g(f) = \int_{-\infty}^\infty f(x) g(x) \, dx $$

This definition makes sense because $f$ has compact support.
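For example, with the function from the question, $g(x) = |x|$ is continuous (hence locally integrable), so it defines the distribution
$$u_{|x|}(f) = \int_{-\infty}^\infty |x| \, f(x) \, dx,$$
which is a finite number for every compactly supported $f$, because the integrand vanishes outside a bounded interval. The equation in the question is really a statement about this distribution, not about the pointwise values of $|x|$.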

Now, if $g$ is a function which is differentiable everywhere, integration by parts (using that $f$ has compact support) gives: $$\int_{-\infty}^\infty f(x)g'(x) \, dx =- \int_{-\infty}^\infty f'(x)g(x) \, dx$$ or $$u_{g'}(f)=-u_g(f')$$

This suggests that we can differentiate any distribution: $$u'(f):= -u(f')$$

With this definition, $\delta$ is the derivative of the Heaviside function $H$, where $H(x)=1$ when $x \geq 0$ and $H(x)=0$ when $x <0$. Moreover, the derivative of $\delta$ is the distribution $$f \mapsto -f'(0)$$
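Spelling this out in the notation above (a short check; $u_H$ is the distribution associated to $H$ as described earlier):
$$u_H'(f) = -u_H(f') = -\int_0^\infty f'(x) \, dx = -\big( 0 - f(0) \big) = f(0) = \delta(f),$$
and similarly $$\delta'(f) = -\delta(f') = -f'(0).$$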

Differentiability in the distributional sense coincides with the classical concept for differentiable functions. But in this more general sense, many new functions become differentiable.

Note that much of what I wrote only makes sense if we use compactly supported test functions.

Finally, there is a theorem which says that if $g$ has a continuous derivative on $(0, \infty)$ and on $(- \infty, 0)$ and has (at worst) a jump discontinuity at $x=0$, then in the distributional sense the derivative is $$g'+C \delta$$ where $g'$ is the classical derivative on $\mathbb R \setminus \{ 0\}$ and $C$ is the size of the jump at $0$. Applying this result twice to $|x|$ gives the formula above.
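Concretely, for the function in the question (a sketch, writing $\operatorname{sgn}$ for the sign function):
\begin{align*}
\frac{\mathrm d}{\mathrm dx} \big| x \big| &= \operatorname{sgn}(x) + 0 \cdot \delta && \text{($|x|$ is continuous, so the jump is $0$),} \\
\frac{\mathrm d}{\mathrm dx} \operatorname{sgn}(x) &= 0 + \big( 1 - (-1) \big) \delta = 2 \delta && \text{($\operatorname{sgn}$ jumps by $2$ at $x = 0$),}
\end{align*}
and combining the two steps gives $\frac{\mathrm d^2}{\mathrm dx^2} \big| x \big| = 2 \delta$.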

N. S.
  • Thank you for your excellent explanation. I'm still not seeing the whole picture: if I let $f(x)$, then aren't the integral and the middle term both zero for all $x$... but $2\delta(x)$ is not equal to zero for /all/ $x$? – electronpusher Dec 15 '16 at 16:35
  • Correction, let $f(x) = x$ – electronpusher Dec 15 '16 at 16:36
  • @electronpusher That's why we use compactly supported functions, and $f(x)=x$ is not. – N. S. Dec 15 '16 at 20:50
  • @electronpusher Check the extra comments. – N. S. Dec 15 '16 at 21:08
  • Thank you for the additional clarification. I will do more research on distributions and reflect on your explanation. – electronpusher Dec 15 '16 at 22:03
  • @N.S. Hello, could you please give me some reference for the proof of the theorem you mention at the end of your post? Some link where I can find it, perhaps. Thanks – la flaca Sep 15 '17 at 19:05
  • Can the first derivative of $|x|$ be defined in a similar way? – a06e Sep 24 '19 at 17:08
  • @becko Yes. It is an easy exercise to check that the first derivative of $|x|$ is the distribution $u(f)=\int_0^\infty f(t) dt- \int_{-\infty}^0 f(t) dt$, i.e. the regular distribution given by the function $g(x)=1$ for $x>0$ and $g(x)=-1$ when $x<0$. – N. S. Sep 24 '19 at 22:42
  • @N.S. So $d|x|/dx = g(x) = \mathrm{sgn}(x)$. I was expecting something like $g(x) = \mathrm{sgn}(x) + \delta(x)$, to account for the discontinuity at $x=0$. – a06e Sep 25 '19 at 06:01
  • @becko The discontinuity at zero is irrelevant, since the interpretation as a distribution is via integration, and $\operatorname{sgn}(x)$ is integrable... $\delta(x)$ has an "infinite mass" at $0$; it can only appear when the limit in the definition of the derivative is infinite. – N. S. Sep 25 '19 at 19:27
  • @becko If $f$ is continuous and differentiable almost everywhere, then the derivative as a distribution is just the derivative at those points (and you can ignore the zero measure set; it is irrelevant for differentiation). Also, if $f$ has a jump discontinuity and is differentiable almost everywhere, the derivative is $f'+C \delta_a$ where $a$ is the position of the jump, and $C$ is the jump. – N. S. Sep 25 '19 at 19:29

Notice that the first derivative of $|x|$ is the sign function. This is why the second derivative is $2\delta(x)$: differentiating the sign function, which jumps by $2$ at the origin, produces twice the Dirac delta.

ViHdzP
  • You're kind of restating the question. Yes, it makes some sort of sense that the second derivative of $|x|$ is related to the Dirac delta function, but the OP is more precisely asking whether this can be formalized in some way. – ViHdzP Nov 24 '19 at 05:22