3

My book shows with some steps that $$\int_{-L}^L f(x)^2\,dx=\int_{-L}^L\left\{\frac{1}{2}a_0+\sum_{n=1}^\infty\left[a_n\cos{\left(\frac{n\pi x}{L}\right)}+b_n\sin{\left(\frac{n\pi x}{L}\right)}\right]\right\}^2dx=L\left[\frac{1}{2}a_0^2+\sum_{n=1}^\infty\left(a_n^2+b_n^2\right)\right]$$

and hence it says

This can be restated as $$\lVert f \rVert^2=\sum\text{squares of components of $f$ on basis vectors}.$$ The norm of $f$ is often called the power contained in the function.

However, this doesn't make sense to me (not the individual steps, but the statement itself). What is the norm of a function? And what does this identity or theorem actually say?

Looking here and here I could only find different formulations. However, in all these sources they seem to forget that $\sum\text{squares of components of $f$ on basis vectors}$ is multiplied by $L$.
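
For what it's worth, a concrete check with $f(x) = x$ on $[-L,L]$ (where $a_n = 0$ and $b_n = \frac{2L}{n\pi}(-1)^{n+1}$) suggests the factor of $L$ really is needed: $$\int_{-L}^L x^2\,dx = \frac{2L^3}{3}, \qquad L\sum_{n=1}^\infty b_n^2 = L\sum_{n=1}^\infty \frac{4L^2}{n^2\pi^2} = \frac{4L^3}{\pi^2}\cdot\frac{\pi^2}{6} = \frac{2L^3}{3}.$$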

Euler_Salter
  • 5,547
  • Do you mean you think there is an $L$ where there should not be or that nothing makes sense? There do exist different definitions of Fourier transformations. Some are normed and some are not. – mathreadler Apr 19 '17 at 14:54
  • @mathreadler well we shouldn't be learning the derivation, however it says that using orthogonality and various properties we get the final result that I wrote, i.e. $\int_{-L}^L f(x)^2 dx = L \left[\frac{1}{2}a_0^2+\sum_{n=1}^\infty a_n^2+b_n^2\right]$. I don't know whether the computation (and hence the result) is correct, however I hope so, being in a book. Anyway, yes either there should be no $L$ or I don't understand. But in any case, can you please give me an insight into what the theorem is saying? – Euler_Salter Apr 19 '17 at 14:57
  • @mathreadler Also because $\frac{1}{2}a_0$ is not really a component of $f$, or is it? – Euler_Salter Apr 19 '17 at 14:58
  • Yes it is. It is one of the basis functions, and it is orthogonal to all the others, since any sine or cosine that completes a whole number of revolutions integrates to $0$. – mathreadler Apr 19 '17 at 15:12
  • @mathreadler see I don't know anything about basis functions... – Euler_Salter Apr 19 '17 at 15:14
  • @mathreadler are they like basis vectors in linear algebra? – Euler_Salter Apr 19 '17 at 15:14
  • The basis functions are the functions you use to make linear combinations with to build functions. Just as basis vectors are what you use to make linear combinations to build vectors. – mathreadler Apr 19 '17 at 15:30
  • You're missing a square on your braces $\{\ \}$ on the right side of the $=$ of the first line. – Disintegrating By Parts Apr 19 '17 at 20:57
  • @TrialAndError thank you, missed that typo! – Euler_Salter Apr 19 '17 at 23:23

2 Answers

2

It's easier to understand what's going on here if we talk in more generality, so as to avoid getting bamboozled by all the trig functions (plus, it's useful to know about these ideas for other bases).

The norm (or at least, the one it means) is defined as $$ \lVert f \rVert_2 = \sqrt{\int_{-L}^L f(x)^2 \, dx}, $$ and there is an associated inner product $\langle f,g \rangle = \int_{-L}^L f(x) g(x) \, dx$, so $\langle f,f \rangle = \lVert f \rVert_2^2 $.
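
For example, with this inner product, $$\langle \cos{(\pi x/L)}, \sin{(\pi x/L)} \rangle = \int_{-L}^L \cos{\left(\frac{\pi x}{L}\right)}\sin{\left(\frac{\pi x}{L}\right)} \, dx = 0,$$ since the integrand is odd, so those two functions are "perpendicular" in exactly the sense that orthogonal vectors are.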

Suppose we want to have an expansion of $f$ in terms of an orthogonal basis of functions $ (e_n)_{n=0}^{\infty}$ (orthogonal means $\langle e_m,e_n \rangle=0$ if $n \neq m$), $$f(x) = \sum_{n=0}^{\infty} A_n e_n(x)$$ (this is what a Fourier series is, if we choose $e_n$ to include $\frac{1}{2}$, $\cos{(k\pi x/L)}$ and $\sin{(k\pi x/L)}$). We find $A_n$ using the orthogonality: $$ \langle f, e_n \rangle = \sum_{m=0}^{\infty} A_m\langle e_m,e_n \rangle = A_n \langle e_n , e_n \rangle = A_n \lVert e_n \rVert_2^2 $$ Thus $$f = \sum_{n=0}^{\infty} \frac{\langle f, e_n \rangle}{\lVert e_n \rVert_2^2} e_n. $$
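
As a sanity check that this matches the familiar formulas: taking $e_n(x) = \cos{(n\pi x/L)}$, whose squared norm is $L$ (computed below), the coefficient is $$A_n = \frac{\langle f, e_n \rangle}{\lVert e_n \rVert_2^2} = \frac{1}{L}\int_{-L}^L f(x)\cos{\left(\frac{n\pi x}{L}\right)} \, dx,$$ which is exactly the usual Euler formula for $a_n$.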

Parseval's identity gives the norm of $f$ in terms of the coefficients $A_n$: \begin{align} \lVert f \rVert_2^2 &= \langle f , f \rangle \\ &= \left\langle \sum_{n=0}^{\infty} A_n e_n , \sum_{m=0}^{\infty} A_m e_m \right\rangle \\ &= \sum_{n=0}^{\infty} \sum_{m=0}^{\infty} A_n A_m \langle e_n , e_m \rangle \\ &= \sum_{n=0}^{\infty} A_n^2 \lVert e_n \rVert_2^2, \end{align} where the last step uses orthogonality to kill every term with $m \neq n$. To get back to your problem, the important thing is that this includes the norm of the basis functions $e_n$. Therefore if the basis functions are not normalised to have $\lVert e_n \rVert_2^2=1$, you'll get an extra factor. Going back to Fourier series explicitly, the basis functions are not normalised in the usual formulation: $$ \int_{-L}^L \cos^2{\left( \frac{n\pi x}{L} \right)} \, dx = \int_{-L}^L \sin^2{\left( \frac{n\pi x}{L} \right)} \, dx = L, \qquad \int_{-L}^L \left(\frac{1}{2}\right)^2 dx = \frac{L}{2}. $$ Hence the extra factor of $L$ (the constant term contributes $\frac{L}{2}a_0^2 = L\cdot\frac{1}{2}a_0^2$).
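
If a numerical sanity check is useful, here is a rough sketch (the test function, the interval length and the truncation level are arbitrary choices, not anything forced by the theory):

```python
import numpy as np

# Check the Fourier-series Parseval identity with the *unnormalised* basis:
#     int_{-L}^{L} f(x)^2 dx  =  L * [ a_0^2 / 2 + sum_n (a_n^2 + b_n^2) ]
# using an arbitrary smooth 2L-periodic test function and a truncated series.
L = 2.0
N = 50                                   # number of harmonics kept
x = np.linspace(-L, L, 20001)
dx = x[1] - x[0]
f = np.exp(np.sin(np.pi * x / L))        # arbitrary test function

def integrate(y):
    # trapezoidal rule on the uniform grid
    return np.sum((y[:-1] + y[1:]) / 2) * dx

a0 = integrate(f) / L
a = np.array([integrate(f * np.cos(n * np.pi * x / L)) / L for n in range(1, N + 1)])
b = np.array([integrate(f * np.sin(n * np.pi * x / L)) / L for n in range(1, N + 1)])

lhs = integrate(f ** 2)                                  # ||f||_2^2
rhs = L * (0.5 * a0 ** 2 + np.sum(a ** 2 + b ** 2))      # note the factor of L
print(lhs, rhs)                                          # agree to several decimals
```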

Chappers
  • 69,099
  • how come the notation is $||f(x)||_2$ and not just $||f(x)||$? In general then, what would $||f(x)||_2$ be? – Euler_Salter Apr 19 '17 at 23:24
  • The $2$ is the power on the function (so its square is integrated). The $p$-norm is $$ \lVert f \rVert_p = \left( \int_E \lvert f(x) \rvert^p \, dx \right)^{1/p}, $$ where $E$ is some set (in your case, the interval $[-L,L]$). – Chappers Apr 19 '17 at 23:28
  • oh now it makes sense! Is this an infinite-dimensional generalization of Pythagoras then? – Euler_Salter Apr 19 '17 at 23:37
  • Yes, exactly! One has to make some restrictions on what sort of functions are allowed, but when this is done, Fourier series can be treated as living in an infinite-dimensional linear inner-product space called a Hilbert space. (One has to worry about convergence with infinitely many things, of course.) The "geometry" of Hilbert space is very like the geometry of ordinary Euclidean space. – Chappers Apr 19 '17 at 23:47
1

Fourier transforms are often a bit tough to digest. There are many subjects and concepts coming together.


You first have an inner product space. The vectors are not just lists of numbers like in $\mathbb R^n$, with some matrix defining lengths; here the functions themselves are the vectors.

The meaning of a function being orthogonal to another is just as in linear algebra: the inner product of the two vectors is $0$. The only difference is that the inner product in our case is an integral.

So the squared norm of $f$ (which is a linear combination of sines and cosines) is $f$ multiplied by itself and then integrated.

$$\begin{array}{|c|c|c|}\hline \text{Subject}&\text{Linear Algebra}& \text{Fourier Analysis} \\\hline \text{vector } f& {\bf f} = [f_1,f_2,\cdots,f_n]^T&x\to f(x)\\\hline \text{Inner product } \langle f,g \rangle&{\bf f}^T{\bf Gg}&\int f(x)g(x)\,dx\\\hline\text{Squared norm } \|f\|^2&{\bf f}^T{\bf Gf}&\int f(x)^2\,dx\\\hline\end{array}$$
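
To make the correspondence concrete, here is a rough discretisation sketch (the grid, the metric matrix $\mathbf G = \Delta x\, I$ and the two sample functions are just illustrative choices):

```python
import numpy as np

# Discretise two functions on [-L, L]: the samples play the role of the
# vectors f and g, and the integral inner product is approximated by the
# weighted dot product f^T G g with G = dx * I on a uniform grid.
L = 2.0
x = np.linspace(-L, L, 1001)
dx = x[1] - x[0]

f = np.cos(np.pi * x / L)            # sampled "vector" of cos(pi x / L)
g = np.sin(2 * np.pi * x / L)        # sampled "vector" of sin(2 pi x / L)

G = dx * np.eye(x.size)              # the metric matrix from the table

print(f @ G @ g)    # ~ 0 : the two basis functions are orthogonal
print(f @ G @ f)    # ~ L : squared norm of cos(pi x / L), i.e. not normalised to 1
```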


Parseval's identity basically says that after changing from the standard basis to the Fourier basis, the "lengths of the vectors", or the "energy of the functions", stay the same as before.

mathreadler
  • 26,534