Multivariable asymptotics can be defined in terms of limits involving the infinity norm, which is defined as follows:
$$\Vert x \Vert_{\infty} = \max_i \vert x_i \vert.$$
Then, assuming $f, g : \mathbb{R}^n \rightarrow [0, \infty)$ are continuous, $f(x) = O(g(x))$ if there exists $c > 0$ such that
$$\liminf_{\Vert x \Vert_{\infty} \rightarrow \infty} \left( c\, g(x) - f(x) \right) \geq 0.$$
The $\liminf$ is needed here because the limit itself need not exist.
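For concreteness, here is a simple example of the definition in action (my own illustration, not from your question): take $f(x) = \vert x_1 \vert + \cdots + \vert x_n \vert$ and $g(x) = \Vert x \Vert_{\infty}$. With $c = n$,
$$c\, g(x) - f(x) = n \max_j \vert x_j \vert - \sum_{j=1}^{n} \vert x_j \vert \geq 0$$
for every $x \in \mathbb{R}^n$, so the condition above holds and $f(x) = O(g(x))$.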
Equivalently, this means that for every sequence $(a_i)_{i = 1}^{\infty}$ with each $a_i \in \mathbb{R}^n$, whose largest coordinate becomes arbitrarily large in absolute value (i.e. $\lim_{i \rightarrow \infty} \Vert a_i \Vert_{\infty} = \infty$), there exists $c > 0$ such that $f(a_i) \leq c\, g(a_i)$ for all sufficiently large $i$.
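If it helps to see the sequence formulation concretely, here is a small numerical sketch (purely illustrative; the functions and the sequence below are my own choices, not from your question):

```python
# Purely illustrative check of the sequence formulation:
# along a sequence whose infinity norm blows up, f = O(g) means the
# ratio f(a_i) / g(a_i) is eventually bounded by some constant c.

def f(x):
    # f(x) = |x_1| + ... + |x_n|
    return sum(abs(t) for t in x)

def g(x):
    # g(x) = ||x||_inf = max_j |x_j|
    return max(abs(t) for t in x)

# A sample sequence a_i = (i, 2i), for which ||a_i||_inf = 2i -> infinity.
seq = [(i, 2 * i) for i in range(1, 1001)]

# Here f(a_i) = 3i and g(a_i) = 2i, so f(a_i) <= c * g(a_i) with c = 2
# (in general c = n, the number of coordinates, works for this f and g).
ratios = [f(a) / g(a) for a in seq]
print(max(ratios))  # 1.5, bounded, consistent with f = O(g)
```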
It's not clear to me that your case-by-case analysis works in general. However, your argument does work with the above definition if you apply it to the elements of an arbitrary sequence $(a_i)_{i = 1}^{\infty}$ with $\lim_{i \rightarrow \infty} \Vert a_i \Vert_{\infty} = \infty$.
You may also find the Wikipedia article useful; it has a section on multivariable asymptotics. The definition given there is more general, but it is not stated in terms of an explicit limit (this does not appear to be possible in general).