9

Theorem: Let $V$ and $W$ be finite-dimensional vector spaces over the same field $F$, with dimensions $n$ and $m$ respectively. Suppose also that $\beta$ and $\gamma$ are ordered bases for $V$ and $W$, respectively. Then the function $\Psi : \mathcal{L}(V, W) \rightarrow M_{m \times n}(F)$, defined by $\Psi(T) = [T]_{\beta}^{\gamma}$ for an arbitrary $T \in \mathcal{L}(V,W)$, is an isomorphism.

Attempt at proof: We need to show that it is bijective, and hence an isomorphism. This means that for every $m \times n$ matrix $A$ we need to find a unique linear map $T: V \rightarrow W$ such that $\Psi(T)=A.$ So let $\beta = \left\{v_1, v_2, \ldots, v_n\right\}$ and $\gamma = \left\{w_1, w_2, \ldots, w_m\right\}$ be ordered bases for $V$ and $W$. Then we already know that there exists a unique linear map $T: V \rightarrow W$ such that for $1 \leq j \leq n$ \begin{align*} T(v_j) = \sum_{i=1}^{m} a_{ij} w_i. \end{align*} But this means that $[T]_{\beta}^{\gamma} = A$, so $\Psi(T) = A$. Hence $\Psi$ is an isomorphism.
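
For concreteness (just an illustration of this construction, not part of the argument): take $F = \mathbb{R}$, $n = m = 2$, and $$A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}.$$ The unique linear map $T$ with $T(v_1) = 1\,w_1 + 3\,w_2$ and $T(v_2) = 2\,w_1 + 4\,w_2$ then satisfies $[T]_\beta^\gamma = A$, since by definition the $j$-th column of $[T]_\beta^\gamma$ consists of the coordinates of $T(v_j)$ with respect to $\gamma$.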

Can someone check if my proof is sound and valid? If not, where did I go wrong? Thanks in advance.

Kamil
  • 5,309

1 Answer

11

Seems good to me. There is only one thing I want to point out (but perhaps you know this): you only proved that $\Psi$ is surjective with this. However, since $\mathcal{L}(V,W)$ and $M_{m \times n}(F)$ are finite-dimensional vector spaces of the same dimension, $\Psi$ is an isomorphism $\iff \Psi$ is injective $\iff \Psi$ is surjective.
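
(For reference, here is the dimension count behind that equivalence, assuming the standard facts that $\dim \mathcal{L}(V,W) = nm = \dim M_{m \times n}(F)$ and the rank–nullity theorem: once $\Psi$ is known to be linear, $$\dim \ker \Psi + \dim \operatorname{im} \Psi = nm,$$ so $\Psi$ is surjective $\iff \dim \operatorname{im} \Psi = nm \iff \dim \ker \Psi = 0 \iff \Psi$ is injective.)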


To prove linearity, we must show that given $\lambda \in F$ and linear maps $T,S: V \to W$, it is true that: $$[T+\lambda S]_\beta^\gamma = [T]_\beta^\gamma+\lambda[S]_\beta^\gamma.$$ Write $c_{ij} = \left([T+\lambda S]_\beta^\gamma\right)_{ij}$, $a_{ij} = ([T]_\beta^\gamma)_{ij}$ and $b_{ij} = ([S]_\beta^\gamma)_{ij}$, and verify that $c_{ij}=a_{ij}+\lambda b_{ij}$, by taking an arbitrary column vector ${\bf x} = [x_1,\cdots,x_n]_\beta^T$ and using the definitions of $[T+\lambda S]_\beta^\gamma$, $[T]_\beta^\gamma$ and $[S]_\beta^\gamma$ to compute $[T+\lambda S]_\beta^\gamma{\bf x}$, and similarly for the other ones.


It will suffice to check this on the vectors of the basis $\beta$ instead of a general vector $\bf x$. In the notation above, we know that: $$T({\bf v}_j) = \sum_{i=1}^m a_{ij}{\bf w}_i, \quad S({\bf v}_j) = \sum_{i=1}^m b_{ij}{\bf w}_i, \quad (T+\lambda S)({\bf v}_j) = \sum_{i=1}^m c_{ij}{\bf w}_i.$$ Going from the last identity: $$\begin{align}\sum_{i=1}^m c_{ij}{\bf w}_i &= (T+\lambda S)({\bf v}_j) \\ &= T{\bf v}_j + \lambda S{\bf v}_j \\ &= \sum_{i=1}^ma_{ij}{\bf w}_i + \lambda \sum_{i=1}^mb_{ij}{\bf w}_i \\ &= \sum_{i=1}^ma_{ij}{\bf w}_i + \sum_{i=1}^m\lambda b_{ij}{\bf w}_i \\&= \sum_{i=1}^m(a_{ij}+\lambda b_{ij}){\bf w}_i \end{align}$$

Since $\{ {\bf w}_i \}_{i=1}^m$ is linearly independent, $c_{ij} = a_{ij}+\lambda b_{ij}$. Hence $[T+\lambda S]_\beta^\gamma = [T]_\beta^\gamma+\lambda[S]_\beta^\gamma.$
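
(A quick numerical sanity check of this identity, purely illustrative: take $m = n = 2$, $\lambda = 3$, $[T]_\beta^\gamma = \begin{pmatrix} 1 & 0 \\ 2 & 1 \end{pmatrix}$ and $[S]_\beta^\gamma = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}$. Then $(T + 3S)({\bf v}_1) = (1 + 3\cdot 0){\bf w}_1 + (2 + 3\cdot 1){\bf w}_2$ and $(T + 3S)({\bf v}_2) = (0 + 3\cdot 1){\bf w}_1 + (1 + 3\cdot 1){\bf w}_2$, so $$[T + 3S]_\beta^\gamma = \begin{pmatrix} 1 & 3 \\ 5 & 4 \end{pmatrix} = [T]_\beta^\gamma + 3\,[S]_\beta^\gamma,$$ as claimed.)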

Ivo Terek
  • 80,301
  • What about linearity? It seems I didn't prove this; and the definition of an isomorphism is: a linear and bijective map. My textbook also gives a very long (complicated) proof, so I figured there must be something wrong with mine. – Kamil Mar 22 '15 at 23:21
  • I thought you already had the result, my bad. I'll edit my answer a bit. – Ivo Terek Mar 22 '15 at 23:22
  • Thanks for your help so far, but I'm not sure how to do that. I'm confused about what to do with this column vector. Why do I have to compute $[T + \lambda S]_{\beta}^{\gamma} x$? – Kamil Mar 22 '15 at 23:40
  • Added more details. Hope this clarifies things. – Ivo Terek Mar 22 '15 at 23:55
  • That was so clear! Thanks a lot. – Kamil Mar 23 '15 at 00:06
  • @IvoTerek Can it further be shown that there exists an isomorphism between $\mathcal{L}$ and $M_{m \times n}$ by considering them as normed Banach spaces? Hence, does there exist a linear surjective isometry? –  Aug 02 '15 at 14:52
  • They're finite dimensional so all norms are equivalent on them. Use the isomorphism as vector spaces to define a norm on one of them from the other. Then it'll be an isometry. – Ivo Terek Aug 02 '15 at 16:09
  • @IvoTerek Since $\mathcal{L}(V,W)$ and $M_{m \times n}(F)$ are finite dimensional vector spaces, $\Psi$ is an isomorphism $\iff \Psi$ is injective $\iff \Psi$ is surjective. For this to happen, both $\mathcal{L}(V,W)$ and $M_{m \times n}(F)$ should have the same dimension, right (which they do)? – manifold Mar 01 '18 at 06:17
  • @IvoTerek That is, if $T: V \to W$ is a linear transformation such that $\dim V = \dim W$, then surjectivity of $T$ $\iff$ injectivity of $T$? – manifold Mar 01 '18 at 06:29
  • Yes, correct. – Ivo Terek Mar 01 '18 at 14:01