Let $n\ge2$ be an integer and let $\Sigma$ be a symmetric, positive semidefinite $n\times n$ real matrix, partitioned as $$\Sigma=\begin{pmatrix}\Sigma_{a,a}&\Sigma_{a,b}\\\Sigma_{b,a}&\Sigma_{b,b}\end{pmatrix},$$ where $\Sigma_{a,a}$ is $1\times1$ and $\Sigma_{b,b}$ is $(n-1)\times(n-1).$ Assume $\Sigma_{b,b}$ is positive definite (equivalently, invertible, since it is symmetric positive semidefinite), and let $X=(X_1,\dots,X_n)\sim N(0,\Sigma),$ normal with mean zero and covariance matrix $\Sigma.$ I wish to find $E(X_1\mid X_2,\dots,X_n).$ In addition, I am using the Radon–Nikodym-derivative definition of conditional expectation, so I would prefer not to compute conditional densities $f_{X_a\mid X_b}(x_a\mid x_b)=f_{X_a,X_b}(x_a,x_b)/f_{X_b}(x_b).$
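To keep a concrete instance of this setup at hand, here is a small NumPy sketch (the particular $\Sigma$ is an arbitrary toy example of my own, built as $AA^T$ so it is automatically symmetric positive semidefinite):

```python
import numpy as np

# Toy instance of the setup, n = 3. Sigma = A A^T is symmetric PSD by
# construction (here even positive definite, since A is invertible).
A = np.array([[1.0, 0.5, 0.2],
              [0.0, 1.0, 0.3],
              [0.0, 0.0, 1.0]])
Sigma = A @ A.T

# The block partition from the question: a = {1}, b = {2, ..., n}.
Sigma_aa = Sigma[:1, :1]   # 1 x 1
Sigma_ab = Sigma[:1, 1:]   # 1 x (n-1)
Sigma_ba = Sigma[1:, :1]   # (n-1) x 1
Sigma_bb = Sigma[1:, 1:]   # (n-1) x (n-1)

assert np.allclose(Sigma, Sigma.T)                  # symmetric
assert np.all(np.linalg.eigvalsh(Sigma) >= -1e-12)  # positive semidefinite
assert np.all(np.linalg.eigvalsh(Sigma_bb) > 0)     # Sigma_bb positive definite
```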
From "Conditional Expectation Multivariate Normal," I can guess that $E(X_1\mid X_2,\dots,X_n)=\Sigma_{a,b}\Sigma_{b,b}^{-1}(X_2,\dots,X_n)^T.$ To prove this result, I tried reasoning as follows, similar to user357269's answer to "Conditional expectation of a joint normal distribution": if $X_1-\Sigma_{a,b}\Sigma_{b,b}^{-1}(X_2,\dots,X_n)^T$ is independent of $\sigma(X_2,\dots,X_n),$ then we have $$E(X_1\mid X_2,\dots,X_n)$$ $$=E(X_1-\Sigma_{a,b}\Sigma_{b,b}^{-1}(X_2,\dots,X_n)^T+\Sigma_{a,b}\Sigma_{b,b}^{-1}(X_2,\dots,X_n)^T\mid X_2,\dots,X_n)$$ $$=E(X_1-\Sigma_{a,b}\Sigma_{b,b}^{-1}(X_2,\dots,X_n)^T)+\Sigma_{a,b}\Sigma_{b,b}^{-1}(X_2,\dots,X_n)^T=\Sigma_{a,b}\Sigma_{b,b}^{-1}(X_2,\dots,X_n)^T,$$ where the second equality uses the assumed independence for the first term and $\sigma(X_2,\dots,X_n)$-measurability for the second, and the last equality follows from $EX_1=0$ and $E((X_2,\dots,X_n))=0.$
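As a quick numerical sanity check of the guessed formula (a NumPy sketch with an arbitrary toy $\Sigma$ of my own): since the means are zero, $\Sigma_{a,b}\Sigma_{b,b}^{-1}$ is exactly the population least-squares coefficient vector for regressing $X_1$ on $(X_2,\dots,X_n)$ without an intercept, so empirical OLS coefficients on a large sample should approach it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary symmetric positive definite covariance matrix, n = 3.
Sigma = np.array([[2.0, 0.8, 0.3],
                  [0.8, 1.5, 0.4],
                  [0.3, 0.4, 1.0]])
Sigma_ab = Sigma[:1, 1:]
Sigma_bb = Sigma[1:, 1:]

# The conjectured coefficient vector Sigma_ab Sigma_bb^{-1}.
coef = Sigma_ab @ np.linalg.inv(Sigma_bb)

# Draw a large sample from N(0, Sigma) and regress X_1 on (X_2, X_3).
X = rng.multivariate_normal(np.zeros(3), Sigma, size=200_000)
x1, xb = X[:, 0], X[:, 1:]
ols, *_ = np.linalg.lstsq(xb, x1, rcond=None)  # no intercept: means are zero

print(coef.ravel())  # exact population coefficients
print(ols)           # empirical estimate, close for a sample this large
```

This of course only supports the guess; it says nothing about the independence argument itself.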
However, I am stuck on showing this independence. For the case $n=2,$ we can compute the covariance $\text{Cov}(X_1-\Sigma_{a,b}\Sigma_{b,b}^{-1}X_2,X_2)=0$ and appeal to the theorem that jointly normal, uncorrelated random variables are independent. However, I am unsure what to do for larger $n,$ since $(X_2,\dots,X_n)$ is vector-valued rather than real-valued.
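For what it's worth, the covariance part of the computation does extend verbatim to general $n$: writing $c=\Sigma_{a,b}\Sigma_{b,b}^{-1}$ and $X_b=(X_2,\dots,X_n)^T,$ bilinearity gives $$\text{Cov}(X_1-cX_b,\,X_b)=\Sigma_{a,b}-c\,\Sigma_{b,b}=\Sigma_{a,b}-\Sigma_{a,b}\Sigma_{b,b}^{-1}\Sigma_{b,b}=0,$$ a $1\times(n-1)$ row of zeros, so $X_1-cX_b$ is uncorrelated with every component of $X_b.$ What I am missing is only the independence step for the vector-valued case. A NumPy sketch of that identity on a random positive definite $\Sigma$ with $n=5$ (my own example):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5

# Random symmetric positive definite covariance matrix (A A^T with A invertible
# almost surely).
A = rng.standard_normal((n, n))
Sigma = A @ A.T

Sigma_ab = Sigma[:1, 1:]
Sigma_bb = Sigma[1:, 1:]
c = Sigma_ab @ np.linalg.inv(Sigma_bb)

# Cov(X_1 - c X_b, X_b) = Sigma_ab - c Sigma_bb, which should vanish.
cross_cov = Sigma_ab - c @ Sigma_bb
print(cross_cov)  # numerically a 1 x (n-1) row of (near-)zeros
```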