Given a real square matrix $A \in \Re^{n \times n}$, suppose its eigenvalues have a lovely property: $Re(\lambda^A_i) < 0$ for all $i$. We also have a real diagonal matrix $K \in \Re^{n \times n}$ whose diagonal entries are all positive. Now, if we define $B = KA$, what lower bound on the entries $k_{ii}$ guarantees that all $Re(\lambda^{B}_{i}) < 0$ too?
If $K = kI$, there is no problem. If $A$ is a diagonal matrix, there is again no problem. The trouble seems to arise when neither of these holds.
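To make the difficulty concrete, here is a minimal numerical sketch (the specific matrices are just an illustrative choice, not part of the original problem): $A$ below is Hurwitz, yet a positive diagonal $K$ pushes the eigenvalues of $KA$ into the right half-plane.

```python
import numpy as np

# A is Hurwitz: trace = -2 < 0, det = 1 > 0, so both eigenvalues equal -1.
A = np.array([[ 1.0, -2.0],
              [ 2.0, -3.0]])
print(np.linalg.eigvals(A))      # approximately [-1., -1.]

# A positive diagonal gain that destabilizes the product KA.
K = np.diag([10.0, 1.0])
B = K @ A
print(np.linalg.eigvals(B))      # [2., 5.] -> positive real parts, unstable
```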
Motivation
Given a continuous-time system $\tau \frac{dh}{dt} = W_{hh}h + W_{hi}x$, I was thinking of left-multiplying the right-hand side by a gain matrix $K$. That way, $k_{ii}$ scales all the inputs to the $i$th node, i.e. the weights $W_{ij}$ in row $i$.
However, if $K$ is not chosen properly, the system $\tau \frac{dh}{dt} = KW_{hh}h + KW_{hi}x$ can become unstable: the real parts of the eigenvalues of $KW_{hh}$ can become positive even when $Re(\lambda^{W_{hh}}_i) < 0$ for all $i$.
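As a rough illustration of that instability (the matrices, time constant, and step size below are hypothetical placeholders, not values from the actual model), integrating $\tau \frac{dh}{dt} = KW_{hh}h$ with a destabilizing gain makes the state blow up, while the ungained system decays:

```python
import numpy as np

tau, dt, steps = 1.0, 1e-3, 5000
W_hh = np.array([[ 1.0, -2.0],
                 [ 2.0, -3.0]])   # Hurwitz on its own: eigenvalues -1, -1
K = np.diag([10.0, 1.0])          # positive diagonal gain

def simulate(M, h0=np.array([0.1, 0.1])):
    """Forward-Euler integration of tau * dh/dt = M h (input x set to zero)."""
    h = h0.copy()
    for _ in range(steps):
        h = h + (dt / tau) * (M @ h)
    return np.linalg.norm(h)

print(simulate(W_hh))       # decays toward 0: stable
print(simulate(K @ W_hh))   # grows without bound: K W_hh has eigenvalues 2 and 5
```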
EDIT 1
Constraining $K$ is an interesting idea. As Robert suggested, can we think of a way to constrain $K$ so that the real parts of the eigenvalues of $KA$ stay negative?
Another boring example is $A$ being block-diagonal and $K$ having a single gain factor $k_i$ for each block. This, though, seems to be too special a case.
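For completeness, a quick numerical check of that block-diagonal case (the blocks below are made-up examples): with one positive gain per block, the eigenvalues of $KA$ are just each block's eigenvalues scaled by its $k_i$, so their real parts stay negative.

```python
import numpy as np
from scipy.linalg import block_diag

# Two made-up Hurwitz blocks.
A1 = np.array([[-1.0,  4.0],
               [-2.0, -1.0]])     # eigenvalues -1 +/- i*sqrt(8)
A2 = np.array([[-3.0]])
A = block_diag(A1, A2)

# One positive gain per block, repeated over that block's rows.
k1, k2 = 5.0, 0.5
K = np.diag([k1, k1, k2])

print(np.linalg.eigvals(A))       # real parts all negative
print(np.linalg.eigvals(K @ A))   # each block's eigenvalues scaled by its k_i
```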