8

Let $P_3$ be the vector space of polynomials of degree at most three. It is known that a basis for $P_3$ is $\{1, x, x^2, x^3\}$

and that $\langle p, q\rangle = \int_{0}^{1} p(x)q(x)\, dx$ is a valid inner product on $P_3$.

I am trying to use the Gram-Schmidt method to get a basis for $P_3$ which is orthonormal with respect to the above inner product.

Even though I found partial solutions to similar problems, the explanations are limited.

PS. I read the rules before posting my first question. Even though I found similar problems, I didn't entirely understand the method and the calculations.

Additional sources:
  1. The exercise below, which has a partial solution, but I am not sure how to calculate the remaining values.
  2. This question, which is similar but in $P_2$: Finding an orthonormal basis for the space $P_2$ with respect to a given inner product

I hope I did not violate any rules. Asking here was my last resort, since due to current conditions I can't ask my teacher face to face.


3 Answers


Gram-Schmidt.

Pick a vector to be the candidate for your first basis vector.

$w_0 = 1$

Normalize it. Since $\|w_0\| = 1$, that step is already done.

$e_0 = w_0 = 1$

Your second basis vector.

$w_1 = x$

Subtract the projection of $w_1$ onto $e_0$:

$e_1^* = x - \langle e_0,x\rangle e_0$

$e_1^* = x - \int_0^1 x \ dx = x-\frac 12$

Normalize it...

$e_1 = \frac {e_1^*}{\|e_1^*\|}$

$\|e_1^*\|^2 = \langle e_1^*,e_1^*\rangle = \int_0^1 \left(x-\frac 12\right)^2 \ dx = \int_0^1 x^2 -x + \frac 14\ dx = \frac 13 - \frac 12 + \frac 14 = \frac 1{12}$

$e_1 = \sqrt {12}\, x - \sqrt 3$
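As a quick sanity check on this normalization, a short sympy sketch (assuming sympy is available) confirms that $e_1$ has unit norm and is orthogonal to $e_0$ under this inner product:

```python
import sympy as sp

x = sp.symbols('x')

def inner(p, q):
    # <p, q> = integral of p(x) q(x) over [0, 1]
    return sp.integrate(p * q, (x, 0, 1))

e0 = sp.Integer(1)
e1 = sp.sqrt(12) * x - sp.sqrt(3)  # the normalized e_1 found above

print(sp.simplify(inner(e1, e1)))  # 1 (unit norm)
print(sp.simplify(inner(e0, e1)))  # 0 (orthogonal to e_0)
```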

$w_2 = x^2\\ e_2^* = w_2 - \langle e_0,w_2\rangle e_0 - \langle e_1,w_2\rangle e_1$

Normalize it...

lather, rinse, repeat.
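The whole lather-rinse-repeat loop can be sketched in sympy (assuming it is available); the loop below is a direct transcription of the steps in this answer, not the only way to implement it:

```python
import sympy as sp

x = sp.symbols('x')

def inner(p, q):
    # <p, q> = integral of p(x) q(x) over [0, 1]
    return sp.integrate(p * q, (x, 0, 1))

basis = [sp.Integer(1), x, x**2, x**3]
ortho = []
for w in basis:
    # subtract the projections onto the vectors found so far...
    v = w - sum(inner(e, w) * e for e in ortho)
    # ...then normalize
    ortho.append(sp.simplify(v / sp.sqrt(inner(v, v))))

for e in ortho:
    print(sp.expand(e))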

Doug M
  • There is often less work if we orthogonalize at each step without normalizing, to get a pair-wise orthogonal set $S$, and then normalize by replacing each $t\in S$ with $t/|t|.$ – DanielWainfleet Sep 08 '20 at 05:21

SUMMARY: Given an (ordered) basis, we can create the Gram matrix $G$ of inner products of basis vectors. An orthonormal basis is given by a square matrix $W$ such that $W^T GW = I.$ That is, the coefficients (in the original basis) of an orthonormal basis are the columns of $W.$

ORIGINAL: Given a symmetric matrix $H,$ there are methods for finding an invertible matrix $P$ such that $P^T HP = D$ is diagonal. In your case, the matrix is the Gram matrix of inner products of basis vectors.

$$ \left( \begin{array}{rrrr} 1 & \frac{1}{2} & \frac{1}{3} & \frac{1}{4} \\ \frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} \\ \frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} \\ \frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} \\ \end{array} \right) $$

This is Hilbert's matrix, or at least the square upper-left corner of the infinite matrix, constructed in precisely Hilbert's manner. https://en.wikipedia.org/wiki/Hilbert_matrix

I multiplied by $420$ to get a matrix of integers, then went through the method I asked about in reference for linear algebra books that teach reverse Hermite method for symmetric matrices.

$$ P^T H P = D $$ $$\left( \begin{array}{rrrr} 1 & 0 & 0 & 0 \\ - \frac{ 1 }{ 2 } & 1 & 0 & 0 \\ \frac{ 1 }{ 6 } & - 1 & 1 & 0 \\ - \frac{ 1 }{ 20 } & \frac{ 3 }{ 5 } & - \frac{ 3 }{ 2 } & 1 \\ \end{array} \right) \left( \begin{array}{rrrr} 420 & 210 & 140 & 105 \\ 210 & 140 & 105 & 84 \\ 140 & 105 & 84 & 70 \\ 105 & 84 & 70 & 60 \\ \end{array} \right) \left( \begin{array}{rrrr} 1 & - \frac{ 1 }{ 2 } & \frac{ 1 }{ 6 } & - \frac{ 1 }{ 20 } \\ 0 & 1 & - 1 & \frac{ 3 }{ 5 } \\ 0 & 0 & 1 & - \frac{ 3 }{ 2 } \\ 0 & 0 & 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rrrr} 420 & 0 & 0 & 0 \\ 0 & 35 & 0 & 0 \\ 0 & 0 & \frac{ 7 }{ 3 } & 0 \\ 0 & 0 & 0 & \frac{ 3 }{ 20 } \\ \end{array} \right) $$

When we divide back again by the same 420, we find $$\left( \begin{array}{rrrr} 1 & 0 & 0 & 0 \\ - \frac{ 1 }{ 2 } & 1 & 0 & 0 \\ \frac{ 1 }{ 6 } & - 1 & 1 & 0 \\ - \frac{ 1 }{ 20 } & \frac{ 3 }{ 5 } & - \frac{ 3 }{ 2 } & 1 \\ \end{array} \right) \left( \begin{array}{rrrr} 1 & \frac{1}{2} & \frac{1}{3} & \frac{1}{4} \\ \frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} \\ \frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} \\ \frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} \\ \end{array} \right) \left( \begin{array}{rrrr} 1 & - \frac{ 1 }{ 2 } & \frac{ 1 }{ 6 } & - \frac{ 1 }{ 20 } \\ 0 & 1 & - 1 & \frac{ 3 }{ 5 } \\ 0 & 0 & 1 & - \frac{ 3 }{ 2 } \\ 0 & 0 & 0 & 1 \\ \end{array} \right) = \left( \begin{array}{rrrr} 1 & 0 & 0 & 0 \\ 0 & \frac{ 1 }{ 12 } & 0 & 0 \\ 0 & 0 & \frac{ 1 }{ 180 } & 0 \\ 0 & 0 & 0 & \frac{ 1 }{ 2800 } \\ \end{array} \right) $$
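For readers who want to verify the congruence numerically, here is a small numpy sketch (assuming numpy is available) of $P^T H P = D$ for the un-scaled Hilbert matrix:

```python
import numpy as np

# 4x4 Hilbert matrix: H[i][j] = 1 / (i + j + 1)
H = np.array([[1.0 / (i + j + 1) for j in range(4)] for i in range(4)])

# the congruence matrix P from the answer above
P = np.array([[1, -1/2,  1/6, -1/20],
              [0,    1,   -1,   3/5],
              [0,    0,    1,  -3/2],
              [0,    0,    0,     1]])

D = P.T @ H @ P
print(np.round(D, 10))  # diag(1, 1/12, 1/180, 1/2800)
```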

To get the identity matrix, we now multiply on the far left and far right by the diagonal matrix

$$ \left( \begin{array}{rrrr} 1 & 0 & 0 & 0 \\ 0 & 2 \sqrt 3 & 0 & 0 \\ 0 & 0 & 6 \sqrt 5 & 0 \\ 0 & 0 & 0 & 20 \sqrt 7 \\ \end{array} \right) $$

Finally, the coefficients of the desired orthonormal basis are the COLUMNS of

$$ \left( \begin{array}{rrrr} 1 & - \frac{ 1 }{ 2 } & \frac{ 1 }{ 6 } & - \frac{ 1 }{ 20 } \\ 0 & 1 & - 1 & \frac{ 3 }{ 5 } \\ 0 & 0 & 1 & - \frac{ 3 }{ 2 } \\ 0 & 0 & 0 & 1 \\ \end{array} \right) \left( \begin{array}{rrrr} 1 & 0 & 0 & 0 \\ 0 & 2 \sqrt 3 & 0 & 0 \\ 0 & 0 & 6 \sqrt 5 & 0 \\ 0 & 0 & 0 & 20 \sqrt 7 \\ \end{array} \right) $$

as coefficients for the original ordered basis $(1,x,x^2, x^3).$

These give $$ \color{red}{ 1,} \; \; \color{blue}{ \sqrt 3 \cdot (2x-1) ,} \; \; \color{green}{ \sqrt 5 \cdot (6 x^2 -6x+1),} \; \; \color{magenta}{ \sqrt 7 \cdot (20 x^3 - 30 x^2 + 12 x -1)} $$
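A numerical check (numpy assumed available) that the scaled columns really satisfy $W^T G W = I$, where $G$ is the Hilbert/Gram matrix and $W = P\Delta$ with $\Delta$ the diagonal scaling above:

```python
import numpy as np

# Gram (Hilbert) matrix of the basis 1, x, x^2, x^3
G = np.array([[1.0 / (i + j + 1) for j in range(4)] for i in range(4)])

P = np.array([[1, -1/2,  1/6, -1/20],
              [0,    1,   -1,   3/5],
              [0,    0,    1,  -3/2],
              [0,    0,    0,     1]])

Delta = np.diag([1, 2 * np.sqrt(3), 6 * np.sqrt(5), 20 * np.sqrt(7)])

W = P @ Delta                     # columns: coefficients of the orthonormal basis
print(np.round(W.T @ G @ W, 10))  # identity matrix
```

The columns of $W$ reproduce the four polynomials displayed above, e.g. column 2 is $(-\sqrt 3, 2\sqrt 3, 0, 0)$, which is $\sqrt 3(2x-1)$ in the basis $(1, x, x^2, x^3)$.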

Will Jagy

What is Gram-Schmidt?

It is a way of converting a given basis to an orthonormal basis.

What is an orthonormal basis?

If the basis is described as $\{b_1, b_2, b_3,\dots, b_n\}$, then the basis is orthonormal if and only if $$\langle b_i, b_j\rangle = \begin{cases}0 & i \neq j\\ 1 & i = j\end{cases}$$

Motivation for this?

It is an elegant way of representing the vector space; it can help draw parallels to a rectangular coordinate system, and it helps in things like Fourier series expansions.

The process

The basic process hinges on starting with a base vector and then adding new vectors to the set which are orthonormal to the ones already added, so we construct this set element by element.

Starting point: Any vector can be chosen as the starting point. Let it be $v_1 = \frac{b_1}{||b_1||}$

Now if you take the next vector in the set, $b_2$, how do you get a vector orthogonal to $v_1$?

The vector $v_2 = b_2 - \langle v_1,b_2\rangle v_1$ will be orthogonal to $v_1$, as we are essentially removing the component of $b_2$ parallel to $v_1$, so we are only left with the perpendicular component. We also have to normalise $v_2$ by dividing by its norm so that we get orthonormality.

Now, let us take $b_3$. We need to remove the components that are parallel to both $v_1$ and $v_2$, and then normalise the result

Hence $v_3' = b_3 - \langle b_3, v_1 \rangle v_1 - \langle b_3, v_2 \rangle v_2$

$v_3 = \frac{v_3'}{||v_3'||}$

You can continue this process until all the vectors are converted to orthonormal vectors.

TLDR

  1. Pick a base vector $v_1$ as any normalised vector of your current basis

  2. $$v_k' = b_k - \sum_{i=1}^{k-1} \langle b_k, v_i \rangle v_i$$

  3. $$v_k = \frac{v_k'}{||v_k'||}$$
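The TLDR above translates almost line for line into code. Here is a minimal numpy sketch (the function name and the $\mathbb{R}^3$ example are my own choices, not from the answer):

```python
import numpy as np

def gram_schmidt(vectors, inner):
    """Orthonormalize `vectors` with respect to the inner product `inner`."""
    ortho = []
    for b in vectors:
        # step 2: subtract projections onto the vectors already found
        v = b - sum(inner(b, e) * e for e in ortho)
        # step 3: normalize
        ortho.append(v / np.sqrt(inner(v, v)))
    return ortho

# example: the standard dot product on R^3
basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0]),
         np.array([0.0, 1.0, 1.0])]
E = gram_schmidt(basis, np.dot)
```

Swapping in a different `inner` reuses the same loop unchanged; e.g. for the polynomial problem, represent each polynomial by its coefficient vector $c$ and use `inner = lambda c, d: c @ H @ d`, where $H$ is the Gram (Hilbert) matrix of the monomial basis.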