There are a few things going on here.
First: the entropy definition for discrete r.v.'s does not apply to continuous r.v.'s. Instead, you can use the differential entropy $h(X)$, which is defined for continuous r.v.'s as
$$h(X)=-\int_{\mathcal{X}}f_X(x)\log\big(f_X(x)\big)\,dx\,,$$
with $\mathcal{X}$ being the support of $X$ and $f_X(x)$ the PDF of $X$.
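If you want to sanity-check this definition numerically, here is a minimal sketch (my own illustration, not part of the question) that integrates $-f_X\log f_X$ for a Gaussian and compares the result with the known closed form $\tfrac12\log(2\pi e\sigma^2)$, using `scipy`:

```python
import numpy as np
from scipy import integrate, stats

# Differential entropy of N(0, sigma^2) via numerical integration of -f log f.
sigma = 2.0
f = stats.norm(loc=0.0, scale=sigma).pdf

# Finite bounds avoid log(0) from PDF underflow in the far tails;
# the mass beyond 30 sigma is negligible.
h_numeric, _ = integrate.quad(lambda x: -f(x) * np.log(f(x)),
                              -30 * sigma, 30 * sigma)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # known Gaussian closed form

print(h_numeric, h_closed)  # both ≈ 2.1121 nats
```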
In the same way, the conditional differential entropy $h(X|Y)$ is given by
$$h(X|Y)=-\int_{\mathcal{X}\times\mathcal{Y}}f_{X,Y}(x,y)\log\big(f_{X|Y}(x|y)\big)\,dx\,dy=h(X,Y)-h(Y)\,,$$
with the corresponding definitions of the support, joint PDF, and conditional PDF. These quantities are also used to define the mutual information of continuous r.v.'s, e.g. $I(X;Y)=h(X)-h(X|Y)=h(X)+h(Y)-h(X,Y)$.
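For instance (a self-contained check I am adding, with a bivariate Gaussian chosen because all three entropies have closed forms), $I(X;Y)=h(X)+h(Y)-h(X,Y)$ recovers the known value $-\tfrac12\log(1-\rho^2)$:

```python
import numpy as np

# Bivariate Gaussian with unit variances and correlation rho:
# check I(X;Y) = h(X) + h(Y) - h(X,Y) against -0.5*log(1 - rho^2).
rho = 0.8
Sigma = np.array([[1.0, rho], [rho, 1.0]])

h_x = 0.5 * np.log(2 * np.pi * np.e)                                 # h(X)
h_y = 0.5 * np.log(2 * np.pi * np.e)                                 # h(Y)
h_xy = 0.5 * np.log((2 * np.pi * np.e) ** 2 * np.linalg.det(Sigma))  # h(X,Y)

mi = h_x + h_y - h_xy
print(mi, -0.5 * np.log(1 - rho**2))  # both ≈ 0.5108 nats
```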
Also, note that differential entropy can be negative (because PDFs can take values larger than $1$, making $\log f_X(x)$ positive), and that it is not scale-invariant: $h(cX)=h(X)+\log|c|$ for $c\neq 0$.
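To make both points concrete (again my own numerical illustration): $\mathrm{Uniform}(0,a)$ has $h=\log a$, which is negative for $a<1$ and shifts by $\log c$ when the support is rescaled by $c$:

```python
import numpy as np
from scipy import integrate, stats

# h(Uniform(0, a)) = log(a): negative for a < 1, zero at a = 1,
# and rescaling the support by c shifts the entropy by log(c).
for a in (0.5, 1.0, 2.0):
    f = stats.uniform(loc=0.0, scale=a).pdf
    h, _ = integrate.quad(lambda x: -f(x) * np.log(f(x)), 0.0, a)
    print(f"a={a}: h={h:.4f}, log(a)={np.log(a):.4f}")
```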
Second: unlike the discrete case, where $H(g(X)|X)=0$ always holds, the conditional differential entropy $h(g(X)|X)$ is not necessarily zero. Thus, the statement you are trying to prove is not true in general. See this MSE question for a better exposition and discussion. Hope this helps!
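One heuristic way to see what goes wrong (a sketch based on a smoothing construction I am introducing for illustration, not part of the original argument): replace the deterministic relation $Y=g(X)$ with $Y\mid X=x\sim\mathrm{Uniform}\big(g(x),\,g(x)+\varepsilon\big)$. Since $h\big(\mathrm{Uniform}(b,b+\varepsilon)\big)=\log\varepsilon$,
$$h(Y\mid X)=\int_{\mathcal{X}} f_X(x)\,h(Y\mid X=x)\,dx=\int_{\mathcal{X}} f_X(x)\log\varepsilon\,dx=\log\varepsilon\;\xrightarrow{\;\varepsilon\to 0\;}\;-\infty\,,$$
so in the deterministic limit the conditional differential entropy diverges to $-\infty$ rather than equaling $0$.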