Do not hesitate to comment on answers if you need more explanation :)
For the sake of accuracy, the conclusion is not that $X$ is $\mathcal G$-measurable but a.s. equal to a $\mathcal G$-measurable random variable. Note that this makes no difference if $\mathcal G$ is complete.
Let us go with Daniel Fischer's method. For a strictly convex function $\phi:\mathbb R\to\mathbb R$ we have
$$
\forall c\in\mathbb R,\quad\exists\kappa\in\mathbb R,\quad\forall t\in\mathbb R,\quad\phi(t)\ge\phi(c)+\kappa\cdot(t-c),
$$
with equality iff $t=c$. (Mind the quantifier order: $\kappa$ depends on $c$ but not on $t$.)
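For concreteness, here is a quick numerical sanity check of this supporting-line inequality. The choices $\phi(t)=e^t$ and $\kappa=\phi'(c)=e^c$ are purely for illustration ($\phi$ is differentiable here, so the ordinary derivative is a valid slope):

```python
import math

# Illustration only: phi(t) = exp(t) is strictly convex and differentiable,
# so kappa = phi'(c) = exp(c) is a valid supporting-line slope at c.
phi = math.exp
c = 0.7
kappa = math.exp(c)  # phi'(c)

for t in [x / 10 for x in range(-30, 31)]:
    gap = phi(t) - (phi(c) + kappa * (t - c))
    # Supporting-line inequality: gap >= 0, with equality exactly at t = c.
    assert gap >= 0
    assert (gap == 0) == (t == c)
print("supporting-line inequality verified on the grid")
```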
As you wrote, we would like to apply the above inequality with $c=\mathbb E[X\vert\mathcal G]$. However, as you pointed out, $c$ is then a random variable, so we face the following problems:
- $\kappa$ depends on $c$, so here $\kappa$ would depend on the value of $\mathbb E[X\vert\mathcal G]$; thus $\kappa$ should be replaced with a function of $\mathbb E[X\vert\mathcal G]$, say $K(\mathbb E[X\vert\mathcal G])$.
- The function $K$ has no reason to be measurable if $\kappa$ is chosen without further specification. So $K(\mathbb E[X\vert\mathcal G])$ might not be a random variable, which means we cannot take expectations of it.
Both problems can actually be solved at once: just choose for example $\kappa$ as the right-derivative of $\phi$ at $c$, denoted $\phi'_+(c)$:
$$
\forall t,c\in\mathbb R,\quad\phi(t)\ge\phi(c)+\phi'_+(c)(t-c),
$$
with equality iff $t=c$.
We deduce that
$$
\phi(X)\ge\phi(\mathbb E[X\vert\mathcal G])+\phi'_+(\mathbb E[X\vert\mathcal G])(X-\mathbb E[X\vert\mathcal G]),
$$
with a.s. equality iff $X=\mathbb E[X\vert\mathcal G]$ a.s.
As $\phi$ and $\phi'_+$ are Borel measurable ($\phi$ is convex hence continuous, and $\phi'_+$ is nondecreasing), all terms in the inequality above are random variables. Therefore
$$Y:=\phi(X)-\phi(\mathbb E[X\vert\mathcal G])-\phi'_+(\mathbb E[X\vert\mathcal G])(X-\mathbb E[X\vert\mathcal G])$$
is a nonnegative random variable.
For technical convenience, suppose first that $\phi(\mathbb E[X\vert\mathcal G])$ and $\phi'_+(\mathbb E[X\vert\mathcal G])$ are bounded. Since $\phi'_+(\mathbb E[X\vert\mathcal G])$ is $\mathcal G$-measurable and bounded, it can be pulled out of the conditional expectation: $\mathbb E[\phi'_+(\mathbb E[X\vert\mathcal G])(X-\mathbb E[X\vert\mathcal G])\vert\mathcal G]=\phi'_+(\mathbb E[X\vert\mathcal G])\,\mathbb E[X-\mathbb E[X\vert\mathcal G]\vert\mathcal G]=0$. Hence $\mathbb E[Y\vert\mathcal G]=\mathbb E[\phi(X)\vert\mathcal G]-\phi(\mathbb E[X\vert\mathcal G])=0$ (this is our starting assumption), and therefore $\mathbb E[Y]=0$. As $Y$ is nonnegative, we get $Y=0$ a.s. So $X=\mathbb E[X\vert\mathcal G]$ a.s., hence $X$ is a.s. equal to a $\mathcal G$-measurable random variable.
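As a sanity check (not part of the proof), here is a small Monte Carlo experiment with the illustrative choices $\phi(t)=t^2$ and $\mathcal G$ trivial, so that $\mathbb E[X\vert\mathcal G]=\mathbb E[X]$: the Jensen gap $\mathbb E[\phi(X)]-\phi(\mathbb E[X])$ is strictly positive for a non-degenerate $X$, and vanishes when $X$ is a.s. constant, i.e. a.s. equal to a $\mathcal G$-measurable random variable.

```python
import random

random.seed(0)

def jensen_gap(xs):
    """Empirical E[phi(X)] - phi(E[X]) for phi(t) = t^2 (illustrative choice)."""
    n = len(xs)
    m = sum(xs) / n                      # empirical E[X] = E[X | trivial G]
    return sum(x * x for x in xs) / n - m * m

# Non-degenerate X: standard normal; the gap is the empirical variance, ~1 > 0.
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]
print(f"gap for N(0,1) sample: {jensen_gap(xs):.4f}")   # strictly positive

# X a.s. constant (hence G-measurable): the gap is (numerically) zero.
print(f"gap for constant X:   {jensen_gap([1.3] * 1000):.2e}")
```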
If $\phi(\mathbb E[X\vert\mathcal G])$ or $\phi'_+(\mathbb E[X\vert\mathcal G])$ is not bounded, then define for instance the set $A_n:=\{\vert\phi(\mathbb E[X\vert\mathcal G])\vert\le n,\ \vert\phi'_+(\mathbb E[X\vert\mathcal G])\vert\le n\}$, which belongs to $\mathcal G$. With the same method as above, you show that $Y1_{A_n}=0$ a.s. You then readily deduce $Y=0$ a.s. by letting $n$ go to $+\infty$.