In general, one almost always uses one of two methods: either you proceed through Lehmann–Scheffé directly, or you find an unbiased estimator and then condition on a complete sufficient statistic. See Chapter 2 of Lehmann and Casella's *Theory of Point Estimation* for details (for example; most math stat books will touch on material like this).
In either case, we first need to find a complete sufficient statistic. There are many ways to do that, but in this case (and many cases) we may use the fact that we are dealing with an exponential family.
Start by abbreviating $$T_j(\mathbf X) = \sum_{i=1}^n X_{ij}.$$ The joint density is (up to a normalizing constant that I can never remember right) $$f(\mathbf x) = \prod_{i=1}^n \exp\left(- \frac{1}{2}\sum_{j=1}^3 (x_{ij} - \theta_j)^2\right) = \exp \left( -\frac{1}{2} \sum_{i=1}^n\sum_{j=1}^3 \left(x_{ij}^2 - 2x_{ij}\theta_j + \theta_j^2\right) \right)$$
which simplifies to $$\exp \left(\theta_1 T_1(\mathbf x) + \theta_2 T_2(\mathbf x) + \theta_3 T_3(\mathbf x) - \frac{1}{2} \sum_{i=1}^n\sum_{j=1}^3 x_{ij}^2 - \frac{1}{2}\sum_{i=1}^n \sum_{j=1}^3 \theta_j^2 \right).$$
Recalling that $\sum_{j=1}^3 \theta_j = 1$, so that $\theta_1 = 1 - \theta_2 - \theta_3$, this reduces to $$\exp \left( \theta_2 (T_2(\mathbf x) - T_1(\mathbf x)) + \theta_3 (T_3(\mathbf x) - T_1(\mathbf x)) + T_1(\mathbf x) - \frac{1}{2} \sum_{i=1}^n\sum_{j=1}^3 x_{ij}^2 - \frac{1}{2}\sum_{i=1}^n \sum_{j=1}^3 \theta_j^2 \right).$$
Now considering only $\theta_2, \theta_3$ as free parameters, the parameter space becomes $\theta_2 + \theta_3 \leq 1$, which contains an open subset of $\mathbb R^2$; thus this is an exponential family of full rank and $(T_2(\mathbf X) - T_1(\mathbf X), T_3(\mathbf X) - T_1(\mathbf X))$ is a complete sufficient statistic. Put $$S_j = \frac{T_j(\mathbf X) - T_1(\mathbf X)}{n}$$ so that $(S_2, S_3)$ is still complete sufficient.
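As a quick sanity check, here is a small simulation of these statistics (the particular $\theta$ is a made-up example satisfying the constraint, and `numpy` is assumed):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical truth with theta_1 + theta_2 + theta_3 = 1.
theta = np.array([0.5, 0.3, 0.2])
n = 1000

# X has n rows, each distributed N(theta, I_3).
X = rng.normal(loc=theta, scale=1.0, size=(n, 3))

# T_j = column sums; S_j = (T_j - T_1) / n for j = 2, 3.
T = X.sum(axis=0)
S2 = (T[1] - T[0]) / n
S3 = (T[2] - T[0]) / n

# E[S2] = theta_2 - theta_1 = -0.2 and E[S3] = theta_3 - theta_1 = -0.3,
# so the printed values should be close, up to Monte Carlo error.
print(S2, S3)
```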
Now recall Lehmann–Scheffé: if we can find a function $f(S_2, S_3)$ such that $$E[f(S_2, S_3)] = \theta_1,$$ we win (and in fact such a function is unique). But this is easy: $$E[S_2] = \theta_2 - \theta_1 \text{ and } E[S_3] = \theta_3 - \theta_1$$ so $$E\left[\frac{1 - S_2 - S_3}{3}\right] = \frac{1 - (\theta_2 - \theta_1) - (\theta_3 - \theta_1)}{3} = \frac{1 - \theta_2 - \theta_3 + 2\theta_1}{3}= \theta_1$$ as desired. Thus by Lehmann–Scheffé the UMVUE is
$$
\frac{1}{3}\left(\frac{1}{n}\sum_{i=1}^n (1 - X_{i2} - X_{i3}) + \frac{2}{n} \sum_{i=1}^n X_{i1}\right).
$$
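A Monte Carlo check of this estimator (a sketch with made-up parameter values, `numpy` assumed): simulate many datasets, confirm the average of the estimator is near $\theta_1$, and compare its variance with that of the naive estimator $\frac1n \sum_i X_{i1}$, which has variance $1/n$ versus $2/(3n)$ for the UMVUE.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = np.array([0.5, 0.3, 0.2])  # made-up truth summing to 1
n, M = 20, 20000

# M independent datasets, each with n rows of N(theta, I_3).
X = rng.normal(loc=theta, scale=1.0, size=(M, n, 3))

# The UMVUE above, applied to each dataset.
umvue = ((1 - X[:, :, 1] - X[:, :, 2]).mean(axis=1)
         + 2 * X[:, :, 0].mean(axis=1)) / 3
# The naive estimator: mean of the first coordinate alone.
naive = X[:, :, 0].mean(axis=1)

print(umvue.mean())  # close to theta_1 = 0.5 (both are unbiased)
print(naive.var())   # about 1/n = 0.05
print(umvue.var())   # about 2/(3n) ~ 0.033, strictly smaller
```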
Note that in some sense, this is what we expect: there are, naively, two obvious ways to estimate $\theta_1$. Either you look at the average of the $X_{i1} \sim N(\theta_1, 1)$, or you look at the average of the $1 - X_{i2} - X_{i3} \sim N(\theta_1, 2)$. The above is the average of those two approaches, weighted inversely to their variances.
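To make that concrete, a short sketch (with hypothetical $\theta$ values) showing that the inverse-variance-weighted combination of the two naive estimators reproduces the closed form above:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = np.array([0.5, 0.3, 0.2])  # hypothetical truth, sums to 1
n = 500
X = rng.normal(loc=theta, scale=1.0, size=(n, 3))

# Estimator 1: mean of X_{i1}, variance 1/n.
est1 = X[:, 0].mean()
# Estimator 2: mean of 1 - X_{i2} - X_{i3}, variance 2/n.
est2 = (1 - X[:, 1] - X[:, 2]).mean()

# Inverse-variance weights 1 and 1/2 normalize to 2/3 and 1/3.
weighted = (2 / 3) * est1 + (1 / 3) * est2

# The UMVUE from the closed form above.
umvue = ((1 - X[:, 1] - X[:, 2]).mean() + 2 * X[:, 0].mean()) / 3

print(weighted, umvue)  # identical up to floating point
```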
An exercise for you: compute $E\left[ \frac{1}{n}\sum_{i=1}^n X_{i1} \mid S_2, S_3\right]$ (and verify that it is the same as the above).