8

I don't understand this from a textbook.

"... dual space could have an inner product that is induced from the vector space."

Suppose there is a vector space $V$. The inner product is determined by $\langle \mathbf{v}, \mathbf{v} \rangle = v^i v^i$ (summing over $i$), or in matrix form

$$\begin{bmatrix} v^1 & v^2 & v^3\\ \end{bmatrix} \begin{bmatrix} v^1 \\ v^2 \\ v^3 \\ \end{bmatrix}$$

Are the dual space inner products constructed in a similar way?

Tursinbay
  • 307
  • Without telling us which textbook you are quoting from, it is impossible to be sure you've constructed the dual inner product "in a similar way." Many important vector spaces occur in applications with more than three or finitely many dimensions. Some mention of this possibility appears in some of the answers posted, but I would refrain from going into details without knowing which textbook you read. – hardmath Feb 13 '25 at 21:52

3 Answers

23

Let $V$ be a finite-dimensional vector space over the field $\Bbb{R}$, and let $g:V \times V \to \Bbb{R}$ be an inner product on $V$. (I write $g$ rather than $\langle \cdot, \cdot\rangle$ simply for convenience.)

Then, recall that the dual space $V^*$ is by definition the set of all linear transformations from $V$ into $\Bbb{R}$. Now, using the inner product $g$ on $V$, we can construct the following map: $g^{\flat}:V \to V^*$ defined by \begin{align} g^{\flat}(x) = g(x, \cdot). \end{align} In other words, $g^{\flat}$ assigns to each vector $x \in V$ that element of $V^*$ such that for all $y \in V$, $\left(g^{\flat}(x) \right)(y) = g(x, y)$.

Now, using the fact that $g$ is an inner product, it is easy enough to verify (just unwind all the definitions) that $g^{\flat}$ is a linear map, and it is also injective. Hence, by the rank–nullity theorem, it is also surjective, so $g^{\flat}:V \to V^*$ is an isomorphism of finite-dimensional vector spaces. Let us denote the inverse of $g^{\flat}$ by $(g^{\flat})^{-1} \equiv g^{\sharp}:V^* \to V$.

Now, you can define the function $h$ on $V^*$ as follows: define $h:V^* \times V^* \to \Bbb{R}$ by \begin{align} h(\phi, \psi) &:= g \left( g^{\sharp}(\phi), g^{\sharp}(\psi)\right) \end{align} I'll leave it to you to verify that $h$ satisfies all the properties of an inner product.

Note that while this is a lot of constructions, the idea is actually very simple. To define an inner product on $V^*$ means you need to tell me how to construct a number from two elements $\phi, \psi \in V^*$. Well, the above recipe says first "convert" $\phi, \psi$ from "dual vectors" into vectors by applying $g^{\sharp}$ to them. Then, since $g^{\sharp}(\phi)$ and $g^{\sharp}(\psi)$ are vectors in $V$, we can take their inner product using $g$ to get a number.
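Concretely, once a basis is fixed, this recipe becomes matrix algebra: if $G$ is the (symmetric positive-definite) Gram matrix of $g$, then $g^{\flat}$ acts as $x \mapsto Gx$, $g^{\sharp}$ as $\phi \mapsto G^{-1}\phi$, and the matrix of $h$ works out to $G^{-1}$. A minimal numerical sketch (the particular matrix $G$ below is an arbitrary example, not taken from the answer):

```python
import numpy as np

# An arbitrary symmetric positive-definite Gram matrix representing g.
G = np.array([[2.0, 1.0, 0.0],
              [1.0, 2.0, 1.0],
              [0.0, 1.0, 3.0]])
G_inv = np.linalg.inv(G)

def g(x, y):
    return x @ G @ y          # the inner product on V

def g_flat(x):
    return G @ x              # vector -> covector (component tuple)

def g_sharp(phi):
    return G_inv @ phi        # covector -> vector

def h(phi, psi):
    return g(g_sharp(phi), g_sharp(psi))   # induced inner product on V*

phi = np.array([1.0, -2.0, 0.5])
psi = np.array([0.0, 3.0, 1.0])

# h is represented by G^{-1}: h(phi, psi) = phi^T G^{-1} psi.
assert np.isclose(h(phi, psi), phi @ G_inv @ psi)
# g_sharp really inverts g_flat.
x = np.array([1.0, 2.0, 3.0])
assert np.allclose(g_sharp(g_flat(x)), x)
```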


The above answer is the completely basis-free construction of how to get an inner product on $V^*$ from the one on $V$. Now, if we resort to a basis, the computation is actually very simple. Let $\{e_1, \dots, e_n\}$ be a basis of $V$ which is orthonormal with respect to the inner product $g$ (i.e., $g(e_i, e_j) = \delta_{ij}$). Also, let $\{\epsilon^1, \dots, \epsilon^n\}$ be the unique dual basis of $V^*$, which means that by definition, for all $i,j$, we have that $\epsilon^i(e_j) = \delta_{ij}$. It is easy to verify that $\epsilon^i = g^{\flat}(e_i)$, and hence $\{\epsilon^1, \dots, \epsilon^n\}$ will be an orthonormal basis of $V^*$ with respect to the inner product $h$.
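The claim $\epsilon^i = g^{\flat}(e_i)$ is easy to check numerically. In the sketch below, a $g$-orthonormal basis for an arbitrary example Gram matrix $G$ is built from a Cholesky factorization (a choice of convenience, not from the answer); the dual basis covectors are the rows of $E^{-1}$:

```python
import numpy as np

G = np.array([[2.0, 1.0],
              [1.0, 3.0]])          # an arbitrary SPD Gram matrix for g
L = np.linalg.cholesky(G)           # G = L L^T
E = np.linalg.inv(L).T              # columns e_i satisfy E^T G E = I (g-orthonormal)
assert np.allclose(E.T @ G @ E, np.eye(2))

# Dual basis: epsilon^i(e_j) = delta_ij, so the epsilon^i are the rows of E^{-1}.
Eps = np.linalg.inv(E)
assert np.allclose(Eps @ E, np.eye(2))

# Claim from the answer: epsilon^i = g_flat(e_i), i.e. the covector G e_i.
assert np.allclose(Eps, (G @ E).T)
```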


So, for computations, if $\{e_1, \dots, e_n\}$ is an orthonormal basis of $V$, then to compute $g(x,y)$ what we can do is first expand $x,y$ in terms of the basis: \begin{align} x = \sum_{i=1}^n x^i e_i \quad \text{and} \quad y = \sum_{i=1}^n y^i e_i \end{align} ($x^i, y^i \in \Bbb{R}$ being scalars). Then, (by orthonormality) \begin{align} g(x,y) = \sum_{i=1}^n x^i y^i \end{align}

Now, the inner product on the dual space is not that much different: given $\phi, \psi \in V^*$, to compute $h(\phi, \psi)$, what you can do is first expand them in terms of the dual basis $\{\epsilon^1, \dots, \epsilon^n\}$: \begin{align} \phi = \sum_{i=1}^n \phi_i \epsilon^i \quad \text{and} \quad \psi = \sum_{j=1}^n \psi_j \epsilon^j \end{align} ($\phi_i, \psi_j \in \Bbb{R}$ being scalars). Then, (by the orthonormality) \begin{align} h(\phi, \psi) = \sum_{i=1}^n \phi_i \psi_i \end{align}
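In components, the two formulas are literally the same dot product of coefficient tuples; a quick sketch with made-up numbers:

```python
import numpy as np

# Components relative to an orthonormal basis of V and its dual basis of V*.
x = np.array([1.0, 2.0, 3.0])     # x^i
y = np.array([4.0, 0.0, -1.0])    # y^i
phi = np.array([1.0, 2.0, 3.0])   # phi_i
psi = np.array([4.0, 0.0, -1.0])  # psi_i

g_xy = np.sum(x * y)              # g(x, y)     = sum_i x^i y^i
h_phipsi = np.sum(phi * psi)      # h(phi, psi) = sum_i phi_i psi_i
assert g_xy == h_phipsi == 1.0    # 1*4 + 2*0 + 3*(-1) = 1
```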

peek-a-boo
  • 65,833
  • 1
Excellent explanation. You are denoting the field where the inner product takes its values as $F$, not $\Bbb{R}$. – Tursinbay Dec 24 '19 at 19:54
  • 1
    @Tursinbay oh yes I should probably change the field to $\Bbb{R}$ because over general fields, inner products do not make sense (and also, since if we work over $\Bbb{C}$, then $g^{\flat}$ wouldn't be an isomorphism anymore so I would have to re-word a lot of stuff) – peek-a-boo Dec 24 '19 at 19:59
0

The dual space $V^*$ to a given vector space $V$ is the set of all linear functions from $V$ to its field of scalars (typically the real numbers or complex numbers). That set becomes a vector space itself by defining addition and scalar multiplication by $(f + g)(v) = f(v) + g(v)$ and $(af)(v) = a(f(v))$. If $V$ is $n$-dimensional then $V^*$ is also $n$-dimensional. Given a basis $\{v_1, v_2, \dots, v_n\}$ for $V$, there is a dual basis $\{f_1, f_2, \dots, f_n\}$ for $V^*$, where $f_i$ is defined by $f_i(v_i) = 1$, $f_i(v_j) = 0$ for $j \neq i$, and extended to all vectors "by linearity": $f_i(a_1 v_1 + a_2 v_2 + \dots + a_n v_n) = a_1 f_i(v_1) + a_2 f_i(v_2) + \dots + a_n f_i(v_n) = a_i$. So given a vector $v \in V$, this associates a unique function $v^* \in V^*$: write $v$ as a linear combination of the basis vectors of $V$, then define $v^*$ to be the linear combination of the corresponding dual basis vectors of $V^*$, with the same scalar coefficients.

Once we have a dual space, we can define a "dot product" $u \cdot v$ on $V$ by taking the function $u^*$ associated to the vector $u$ and applying it to $v$: $u \cdot v := u^*(v)$.

Similarly, given two functions $u^*$ and $v^*$ in $V^*$, we can define a "dot product" $u^* \cdot v^*$ on $V^*$ by taking the vector $v$ associated to the function $v^*$ and applying $u^*$ to it: $u^* \cdot v^* := u^*(v)$.
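A small sketch of this pairing, assuming the standard basis of $\Bbb{R}^3$ and its dual basis (so the functional $v^*$ has the same coefficient tuple as the vector $v$):

```python
import numpy as np

# With the standard basis and its dual basis, v -> v* keeps the same
# coefficients, and evaluating a functional at a vector is a dot product.
def star(v):
    return v.copy()           # v -> v*: same coefficients, in the dual basis

def apply(phi, v):
    return float(phi @ v)     # evaluate the functional phi at the vector v

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 0.0, -1.0])

dot_V = apply(star(u), v)               # u . v   := u*(v)
dot_Vstar = apply(star(u), star(v))     # u* . v* := u*(v), v the vector for v*
assert dot_V == dot_Vstar == 1.0
```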

user247327
  • 19,020
0

Given a norm $|\cdot|$ on a vector space $V$, there is the operator norm on $V^*$, where for each $\xi \in V^*$, $$ |\xi| = \sup\left\{ \frac{\langle\xi,v\rangle}{|v|}:\ v \in V\backslash\{0\}\right\}. $$ If the norm on $V$ is an inner product norm (i.e., it satisfies the parallelogram law), the operator norm is also an inner product norm.

You can verify that it is the same inner product as the one defined by @peek-a-boo.
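For the Euclidean inner product this is easy to see numerically: by Cauchy–Schwarz the supremum is attained at $v = \xi$, so the operator norm of $\xi$ equals the Euclidean norm of its component tuple, matching the induced inner product with $G = I$. A sketch (the covector $\xi$ is an arbitrary example):

```python
import numpy as np

rng = np.random.default_rng(0)
xi = np.array([3.0, -4.0, 0.0])   # a covector on R^3 with the Euclidean norm

# Approximate sup_{v != 0} xi(v)/|v| by sampling directions; include v = xi
# itself, where Cauchy-Schwarz says the supremum is attained.
samples = np.vstack([rng.standard_normal((10_000, 3)), xi])
ratios = (samples @ xi) / np.linalg.norm(samples, axis=1)
op_norm = ratios.max()

# The operator norm of xi equals its Euclidean norm |xi| = 5.
assert np.isclose(op_norm, np.linalg.norm(xi))
```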

Deane
  • 10,298
  • Do you have a reference for what you ask the Reader to verify? It is pretty straightforward and would be almost as convenient for you to inline in your Answer as to pass through as a claim to verify. Cf. this older post. – hardmath Feb 13 '25 at 21:57