Problem:
Define $C=\left(\begin{matrix} a & -b \\ b & a \end{matrix}\right)$ and $\Delta=\left(\begin{matrix} 1 & 0 \\ 0 & 1 \end{matrix}\right)$.
Define $L=\left(\begin{matrix} C & \Delta & & & \\ & C & \Delta & & \\ & & \ddots & \ddots & \\ & & & C & \Delta \\ & & & & C \end{matrix}\right) \in \mathscr{M}(n \times n,\mathbb{R})$, where $\mathscr{M}(n \times n,\mathbb{R})$ denotes the space of real $n \times n$ matrices (here $n$ is even, since the blocks are $2 \times 2$).
Recalling that we define $e^B=\sum_{k \geq 0}\frac{B^k}{k!}$ for $B \in \mathscr{M}(n \times n,\mathbb{R})$, how can we conclude that there exists $M \in \mathscr{M}(n \times n,\mathbb{R})$ such that $e^M=L^2$?
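For concreteness, here is a short numerical sketch (an illustration, not a proof). It builds $L$ for the hypothetical choices $a=1$, $b=2$ and three diagonal blocks, then uses `scipy.linalg.logm` and `scipy.linalg.expm` to check that $L^2$ admits a real matrix $M$ with $e^M=L^2$:

```python
import numpy as np
from scipy.linalg import expm, logm

# Hypothetical values for a and b; any (a, b) with b != 0 gives C
# complex eigenvalues a +/- bi, so L^2 has no negative real eigenvalues.
a, b = 1.0, 2.0
m = 3            # number of diagonal C-blocks (illustrative), so n = 2*m
n = 2 * m

C = np.array([[a, -b], [b, a]])
Delta = np.eye(2)

# Build L: C-blocks on the diagonal, identity blocks on the superdiagonal.
L = np.zeros((n, n))
for i in range(m):
    L[2*i:2*i+2, 2*i:2*i+2] = C
    if i < m - 1:
        L[2*i:2*i+2, 2*i+2:2*i+4] = Delta

L2 = L @ L
M = logm(L2)     # scipy returns a real logarithm here (no negative real eigenvalues)

print(np.max(np.abs(np.imag(M))))          # ~0: M is (numerically) real
print(np.max(np.abs(expm(np.real(M)) - L2)))  # ~0: e^M recovers L^2
```

Both printed residuals should be on the order of machine precision, consistent with the claim that a real logarithm of $L^2$ exists.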
I came across this problem while reading this answer, and I would like to know whether we can prove the statement above without using arguments from algebra that are advanced for me (i.e., Lie groups, Lie algebras, and so on).
Remark: I am not sure whether this post really needs the "lie-algebras" tag.