I know that if $g''(x)>0$ for all $x$, then $g$ is strictly convex. By Taylor's theorem we have:
$$ g(x) = g(E(X)) + [x - E(X)]\,g'(E(X)) + \frac{[x-E(X)]^2}{2}\,g''(\epsilon_x) $$ for some $\epsilon_x$ between $x$ and $E(X)$. So if $g''>0$ everywhere, then for every $x \neq E(X)$ $$ g(x) > g(E(X)) + [x - E(X)]\,g'(E(X)), $$ with equality exactly when $x = E(X)$.
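Substituting the random variable $X$ for $x$ and taking expectations on both sides (assuming $E(g(X))$ exists), the linear term vanishes because $E[X - E(X)] = 0$, which recovers Jensen's inequality:
$$ E(g(X)) \ge g(E(X)) + g'(E(X))\,E[X - E(X)] = g(E(X)). $$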
But I have no idea how to prove that if $E(g(X)) = g(E(X))$, then $X$ is (almost surely) a constant random variable. Any help or hint would be appreciated.