In the univariate case, if $p(x_0) = 0$, then $p(x) = (x - x_0)\, q(x)$, so you can continue by looking for roots of the quotient polynomial $q(x)$. I'm not sure anything like this is possible for multivariate polynomial systems. I suppose something similar could be built from resultants and Groebner bases, but I'm no expert in those.
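The univariate deflation step can be sketched with synthetic division; the polynomial and root below are my own illustrative example:

```python
# Sketch: after finding a root x0 of p, deflate p(x) = (x - x0) q(x)
# via synthetic division, then continue root-finding on q.
# (In floating point, deflation accumulates error; deflating from the
# smallest-magnitude root first is the usual advice.)

def deflate(coeffs, x0):
    """Divide p by (x - x0). coeffs are highest-degree first."""
    q = [coeffs[0]]
    for c in coeffs[1:-1]:          # the last coefficient only feeds the remainder
        q.append(c + x0 * q[-1])
    return q                        # coefficients of q(x), degree one less

# p(x) = x^3 - 6x^2 + 11x - 6 has root x0 = 1;
# deflating gives q(x) = x^2 - 5x + 6 = (x - 2)(x - 3).
print(deflate([1, -6, 11, -6], 1))  # → [1, -5, 6]
```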
To find all roots by Newton-Raphson iteration you can simply run it from many starting points. The more starting points you try, the better the chance of detecting all the roots. But this way you will never be sure that you have found all of them (unless you invoke some theorem, e.g. the Newton-Kantorovich theorem).
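A minimal multistart sketch, on a small 2D polynomial system of my own choosing (it has exactly two real roots), with a hand-rolled 2x2 Newton step:

```python
import itertools, math

# Multistart Newton for F(x, y) = (x^2 + y^2 - 1, x^2 - y).
# Starting from a grid of points raises the chance of hitting every
# root, but gives no guarantee of completeness.

def F(x, y):
    return (x*x + y*y - 1.0, x*x - y)

def J(x, y):
    return ((2*x, 2*y), (2*x, -1.0))

def newton(x, y, iters=50, tol=1e-12):
    for _ in range(iters):
        f1, f2 = F(x, y)
        (a, b), (c, d) = J(x, y)
        det = a*d - b*c
        if abs(det) < 1e-14:
            return None                       # singular Jacobian, give up
        dx = (-f1*d + f2*b) / det             # Cramer's rule for J @ (dx, dy) = -F
        dy = (-a*f2 + c*f1) / det
        x, y = x + dx, y + dy
        if abs(dx) + abs(dy) < tol:
            return (x, y)
    return None                               # did not converge from this start

roots = []
for x0, y0 in itertools.product([-1.0, -0.25, 0.25, 1.0], repeat=2):
    r = newton(x0, y0)
    if r and all(math.hypot(r[0]-u, r[1]-v) > 1e-6 for u, v in roots):
        roots.append(r)                       # keep only distinct roots
print(sorted(roots))  # two real roots: (±0.786..., 0.618...)
```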
There are several practical algorithms for solving systems of polynomial equations. I'll list only a few reliable ones, those guaranteed to enclose all the zeros in small sets.
Suppose that you want to find all zeros of a multivariate polynomial function $F(x)$ in the domain $D = [0; 1]^n$.
- Subdivision method. Suppose that, given a simple set $A$, you can compute a simple set $B$ such that $F(A) \subseteq B$. Such a set $B$ can be found methodically using interval arithmetic.
If the enclosure shows $0 \notin F(D)$, then your system has no solutions in the domain $D$. Otherwise, split $D$ into several subdomains and solve the problem on each of them recursively.
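Here is a bare-bones 2D sketch of that subdivision loop using naive interval arithmetic (the test system is my own illustrative example; a real implementation would also round outward to account for floating-point error):

```python
# Subdivision with interval enclosures for
# F(x, y) = (x^2 + y^2 - 1, x^2 - y) on [0, 1]^2.
# A box is discarded as soon as one component's enclosure excludes 0.

def isq(a, b):
    """Interval enclosure of x^2 for x in [a, b]."""
    if a <= 0.0 <= b:
        return (0.0, max(a*a, b*b))
    return (min(a*a, b*b), max(a*a, b*b))

def contains_zero(lo, hi):
    return lo <= 0.0 <= hi

def F_box(x, y):
    """Enclosures of f1 = x^2 + y^2 - 1 and f2 = x^2 - y on a box."""
    x2, y2 = isq(*x), isq(*y)
    f1 = (x2[0] + y2[0] - 1.0, x2[1] + y2[1] - 1.0)
    f2 = (x2[0] - y[1], x2[1] - y[0])
    return f1, f2

def subdivide(x, y, eps=1e-3, out=None):
    out = [] if out is None else out
    f1, f2 = F_box(x, y)
    if not (contains_zero(*f1) and contains_zero(*f2)):
        return out                      # box provably contains no root
    if x[1] - x[0] < eps and y[1] - y[0] < eps:
        out.append((x, y))              # small box that may contain a root
        return out
    xm, ym = (x[0] + x[1]) / 2, (y[0] + y[1]) / 2
    for xs in ((x[0], xm), (xm, x[1])):
        for ys in ((y[0], ym), (ym, y[1])):
            subdivide(xs, ys, eps, out)
    return out

boxes = subdivide((0.0, 1.0), (0.0, 1.0))
# surviving boxes cluster around the root near (0.786, 0.618)
```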
- Interval Newton method. Similar to Newton's method, but an interval version of the Jacobian matrix is evaluated. It retains the quadratic convergence of Newton's method without losing reliability. It is somewhat hard to use, since you have to solve an interval linear system at each step. The interval Krawczyk method does not require this, so it may be easier to implement.
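For flavor, here is a one-dimensional Krawczyk step on an example of my own (the $n$-dimensional version replaces $1/f'(m)$ with an approximate inverse of the Jacobian, and a real implementation would use outward rounding):

```python
# Krawczyk operator in 1D:
#   K(X) = m - c*f(m) + (1 - c*f'(X)) * (X - m),   c ≈ 1/f'(m), m = midpoint(X).
# If K(X) lies strictly inside X, then X contains exactly one root.

def krawczyk(f, fprime_interval, lo, hi):
    m = (lo + hi) / 2
    c = 1.0 / fprime_interval(m, m)[0]      # 1/f'(m), from the point interval [m, m]
    dlo, dhi = fprime_interval(lo, hi)      # enclosure of f'(X)
    t = sorted((1 - c*dlo, 1 - c*dhi))      # interval 1 - c*f'(X)
    r = (lo - m, hi - m)                    # interval X - m
    prods = [a*b for a in t for b in r]     # interval product (either may straddle 0)
    return (m - c*f(m) + min(prods), m - c*f(m) + max(prods))

# f(x) = x^2 - 2 on X = [1, 2]; f'(X) = [2, 4]
f = lambda x: x*x - 2.0
fp = lambda a, b: (2*a, 2*b)
K = krawczyk(f, fp, 1.0, 2.0)
print(K)  # contained in (1, 2), so X holds exactly one root: sqrt(2)
```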
- (Interval) Projected Polyhedron. This method uses the Bernstein basis to represent polynomials. It is quite involved to explain, but it is the preferred method for solving polynomial systems in computational geometry, which says a lot. It relies on 2D convex hulls and line-convex polygon intersections.
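The property the method builds on is that the Bernstein coefficients of a polynomial on $[0, 1]$ bound its range, so all coefficients having one sign rules out a root. A univariate sketch (the example polynomial is my own):

```python
from math import comb

# Convert power-basis coefficients to Bernstein coefficients on [0, 1]:
#   b_k = sum_{i<=k} C(k, i) / C(n, i) * a_i.
# Then min(b) <= p(x) <= max(b) on [0, 1], so a sign change among the
# b_k is a necessary condition for a root in the box.

def bernstein_coeffs(a):
    """a[i] are power-basis coeffs (a0 + a1*x + ...)."""
    n = len(a) - 1
    return [sum(comb(k, i) / comb(n, i) * a[i] for i in range(k + 1))
            for k in range(n + 1)]

# p(x) = 2x^2 - 3x + 1 = (2x - 1)(x - 1), roots 0.5 and 1
b = bernstein_coeffs([1.0, -3.0, 2.0])
print(b)  # → [1.0, -0.5, 0.0]; sign change, so roots in [0, 1] are possible
```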