So, I don't see how I can measure the complexity of the Chinese Remainder Theorem in terms of the number of multiplications in $G$ of order $N$.
Any ideas?
Well, the standard way to solve for the $x$ satisfying $x \equiv a_0 \pmod{n_0}$ and $x \equiv a_1 \pmod{n_1}$ (given that $n_0$ and $n_1$ are relatively prime) is to compute:
$$x = n_1 ((a_0 - a_1) n_1^{-1} \bmod n_0) + a_1$$
(where $n_1^{-1}$ is computed modulo $n_0$)
It is straightforward to verify that:
$$ 0 \le x < n_0n_1$$
$$ x \equiv a_0 \pmod{ n_0 }$$
$$ x \equiv a_1 \pmod{ n_1 }$$
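For concreteness, here is a minimal Python sketch of that combination step (the function name `crt_combine` and the sample values are my own, purely for illustration; `pow(n1, -1, n0)` needs Python 3.8+):

```python
def crt_combine(a0, n0, a1, n1):
    """Combine x = a0 (mod n0), x = a1 (mod n1) into the unique x in [0, n0*n1).

    Assumes gcd(n0, n1) == 1."""
    inv = pow(n1, -1, n0)                       # n1^{-1} mod n0
    return n1 * (((a0 - a1) * inv) % n0) + a1   # the formula above

# Example: x = 2 (mod 5), x = 3 (mod 7)  ->  x = 17
x = crt_combine(2, 5, 3, 7)
assert 0 <= x < 35 and x % 5 == 2 and x % 7 == 3
print(x)   # 17
```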
As for the cost of the computation: $n_1^{-1}$ can be computed with $O(\log N)$ multiplications; the rest of the computation is two multiplications (one of them reduced modulo $n_0$) and two additions/subtractions.
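For instance, if $n_0$ happens to be prime (as it is when the group order splits into distinct primes), one way to get the $O(\log N)$ multiplication count is Fermat's little theorem, $n_1^{-1} \equiv n_1^{n_0-2} \pmod{n_0}$, evaluated by square-and-multiply. A rough Python sketch (the function name and the multiplication counter are just for illustration):

```python
def inverse_by_exponentiation(n1, n0):
    """n1^{-1} mod n0 for prime n0, via Fermat: n1^(n0-2) mod n0.

    Square-and-multiply uses at most about 2*log2(n0) modular multiplications."""
    result, base, e, mults = 1, n1 % n0, n0 - 2, 0
    while e:
        if e & 1:
            result = (result * base) % n0   # multiply step
            mults += 1
        base = (base * base) % n0           # square step
        mults += 1
        e >>= 1
    return result, mults

inv, count = inverse_by_exponentiation(7, 101)   # 101 is prime
assert (7 * inv) % 101 == 1                      # inv == 29
print(count)                                     # 11 multiplications, i.e. O(log n0)
```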
> So, I don't see how I can measure the complexity of the Chinese Remainder Theorem in terms of the number of multiplications in $G$ of order $N$.
So, in Pohlig-Hellman, you compute $a_0$ and $a_1$ as the discrete logs satisfying $n_1H = a_0(n_1G)$ and $n_0H = a_1(n_0G)$, and combine $a_0$ and $a_1$ as above to form the value $x$ s.t. $H = xG$. Computing $a_0, a_1$ takes $O(\sqrt{\max(n_0, n_1)})$ multiplications (assuming the curve does not have any special form that allows us to compute the discrete logs faster); combining $a_0, a_1$ doesn't take any, because those are computed using integer computations, not elliptic curve operations.
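To make the whole flow concrete, here is a toy Python sketch of two-prime Pohlig-Hellman in the multiplicative group modulo a small prime, where exponentiation plays the role of the curve's scalar multiplication. The parameters ($p = 23$, $g = 5$, the secret $x = 17$) and the helper names are illustrative choices, not anything from a real system:

```python
from math import isqrt

def bsgs(base, target, order, p):
    """Baby-step giant-step: find k in [0, order) with base^k = target (mod p).

    Uses O(sqrt(order)) group multiplications."""
    m = isqrt(order) + 1
    baby = {pow(base, j, p): j for j in range(m)}   # baby steps: base^j
    giant_step = pow(base, -m, p)                   # base^{-m}  (Python 3.8+)
    gamma = target
    for i in range(m):
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * giant_step) % p            # giant steps: target * base^{-im}
    raise ValueError("log not found")

def crt_combine(a0, n0, a1, n1):
    # same combination step as in the CRT sketch above
    inv = pow(n1, -1, n0)
    return n1 * (((a0 - a1) * inv) % n0) + a1

p, g = 23, 5                 # g generates a group of order N = 22 = 11 * 2
n0, n1 = 11, 2
x_secret = 17
H = pow(g, x_secret, p)      # the "public" element; we want to recover x

# Project onto the subgroups: solve (g^n1)^a0 = H^n1 and (g^n0)^a1 = H^n0.
a0 = bsgs(pow(g, n1, p), pow(H, n1, p), n0, p)   # x mod n0
a1 = bsgs(pow(g, n0, p), pow(H, n0, p), n1, p)   # x mod n1

x = crt_combine(a0, n0, a1, n1)                  # integer arithmetic only
assert pow(g, x, p) == H
print(x)   # 17
```

The two `bsgs` calls are the only places where group multiplications happen, which is where the $O(\sqrt{\max(n_0, n_1)})$ count comes from; `crt_combine` is pure integer arithmetic, so it contributes no multiplications in $G$.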