Taylor's theorem holds in very general circumstances: you can take the domain and target space to be any Banach space. For this, of course, you need to understand how differentiation works in Banach spaces. Take a look at this answer for a definition of differentiability, and this answer for a statement (and an outline of a proof) of Taylor's theorem.
If $V$ is a Banach space (for example, the space of matrices), $I \subset \Bbb{R}$ is an open interval around the origin, and $A: I \to V$ is $k$-times differentiable at the origin, then by Taylor's theorem, we have
\begin{align}
A(t) &= A(0) + A'(0)t + \dots + \dfrac{A^{(k)}(0)}{k!}t^k+ o(|t|^k) \qquad \text{as $t \to 0$}
\end{align}
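For a concrete illustration (the matrix exponential is my own choice of example here), take $V = \Bbb{C}^{n\times n}$ and $A(t) = e^{tB}$ for a fixed matrix $B$. Since $A^{(j)}(t) = B^j e^{tB}$, we have $A^{(j)}(0) = B^j$, and the expansion reads
\begin{align}
e^{tB} &= I + tB + \dfrac{t^2}{2!}B^2 + \dots + \dfrac{t^k}{k!}B^k + o(|t|^k) \qquad \text{as $t \to 0$.}
\end{align}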
In particular, if you choose $k=1$, then
\begin{align}
A(t) &= A(0) + A'(0)t + o(|t|).
\end{align}
(Note that the formula you wrote isn't quite right.)
By the way, you should note that the case $k=1$ is exactly the definition of $A$ being differentiable at the origin; there's no need to invoke Taylor's theorem at all.
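To spell that out (this is just unpacking the standard definition), $A$ is differentiable at $0$ with derivative $A'(0)$ precisely when
\begin{align}
\lim_{t \to 0} \left\| \frac{A(t) - A(0)}{t} - A'(0) \right\|_V = 0,
\end{align}
which, after multiplying through by $|t|$, is exactly the statement $A(t) = A(0) + A'(0)t + o(|t|)$.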
Suppose we specialize this further, very explicitly, to the case where $V = \Bbb{C}^{n\times n}$ is the space of matrices. Then one can prove that if $A(t) = (a_{ij}(t))$, then $A'(t) = (a_{ij}'(t))$. In other words, the derivatives are easy to calculate; just do it component by component. Hence,
\begin{align}
(a_{ij}(t)) &= (a_{ij}(0)) + (a_{ij}'(0)) \cdot t + o(|t|).
\end{align}
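As a quick numerical sanity check (my own sketch, not part of the argument above), one can verify the first-order expansion for a smooth matrix-valued function such as $A(t) = e^{tB}$, where the particular matrix $B$ below is an arbitrary choice:

```python
import numpy as np
from scipy.linalg import expm

# Fixed matrix B (arbitrary choice, just for illustration)
B = np.array([[0.0, 1.0],
              [-2.0, 0.5]])

def A(t):
    # A(t) = exp(t B), a smooth matrix-valued function of the real variable t
    return expm(t * B)

A0 = A(0.0)       # A(0) = identity matrix
Aprime0 = B       # A'(0) = B, computed entry by entry

for t in [1e-1, 1e-2, 1e-3, 1e-4]:
    # The remainder A(t) - A(0) - A'(0) t should be o(|t|):
    # its norm divided by |t| should tend to 0 as t -> 0.
    remainder = A(t) - A0 - t * Aprime0
    print(t, np.linalg.norm(remainder) / t)
```

The printed ratios shrink roughly in proportion to $t$, consistent with the remainder actually being of order $t^2$ here.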
So, in essence, if the target space is a (finite-dimensional normed) vector space, you just perform a Taylor expansion of the given function about the given point (in my answer I chose $t=0$) "component by component".