Assume that a matrix $B$ (real or complex, it does not matter) has an eigenvector $x \neq 0$ with eigenvalue zero, i.e. $Bx = 0$. Consequently, $B^{-1}$ does not exist. Define $A = \alpha I + B$, where $I$ is the identity and $\alpha$ is a scalar to be chosen so that $A^{-1}$ exists.
Is it true, in general, that $A^{-1}x = \alpha^{-1}x$? It seems so: since $Bx = 0$ we have $Ax = (\alpha I + B)x = \alpha x$, and therefore $$ x = A^{-1} A x = A^{-1} (Ax) = A^{-1} (\alpha x) = \alpha \, ( A^{-1}x ) \, , $$ which gives $A^{-1}x = \alpha^{-1}x$, because $\alpha \neq 0$ (if $\alpha = 0$, then $A = B$ would not be invertible).
Question: Is it possible to find a more "explicit/direct" proof based on some known structural properties of $A^{-1}$? Answer: Yes, see the very useful comments below!
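For what it's worth, here is one sketch in that direction (my own, not necessarily the argument given in the comments), valid under the extra assumption that $|\alpha|$ is larger than the spectral radius of $B$, so that the Neumann series for $A^{-1}$ converges: $$ A^{-1} = (\alpha I + B)^{-1} = \alpha^{-1}\Bigl(I + \tfrac{1}{\alpha}B\Bigr)^{-1} = \alpha^{-1}\sum_{k=0}^{\infty}\Bigl(-\tfrac{1}{\alpha}B\Bigr)^{k} \, . $$ Applying this to $x$ kills every term with $k \ge 1$, since $B^{k}x = B^{k-1}(Bx) = 0$, and only the $k = 0$ term survives, giving $A^{-1}x = \alpha^{-1}x$ directly from the structure of $A^{-1}$.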
Note: it is always possible to find a whole interval of values of $\alpha$ such that $A^{-1}$ exists; see this question about a slightly different case (here I add $\alpha$ only to the diagonal of $B$, not to every entry).
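To spell out the note (a standard observation, stated here assuming $B$ is $n \times n$): $A = \alpha I + B$ fails to be invertible exactly when $-\alpha$ is an eigenvalue of $B$, $$ \det(\alpha I + B) = 0 \iff -\alpha \in \operatorname{spec}(B) \, , $$ and since $\operatorname{spec}(B)$ contains at most $n$ points, every $\alpha$ outside this finite set works; in particular, all but finitely many values of $\alpha$ (hence whole intervals) are admissible.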