Let $Y=|X|$. When $X$ has a PDF $f$, you might convince yourself that $$E(X\mid Y)=g(Y)$$ where $g(y)$ is defined on $y\geqslant0$ by
$$g(y)=y\cdot\frac{f(y)-f(-y)}{f(y)+f(-y)}$$
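A quick Monte Carlo sketch of this formula, if you want to see it in action; the choice $X\sim N(\mu,1)$ (for which the ratio simplifies and $g(y)=y\tanh(\mu y)$), the sample size and the bin width are only illustrative assumptions:

```python
# Monte Carlo sketch: compare empirical conditional means E[X | Y close to y0]
# with g(y0) = y0 * tanh(mu * y0), the value of the formula above for the
# illustrative choice X ~ N(mu, 1). Sample size and bin width are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
mu = 1.0
x = rng.normal(mu, 1.0, size=1_000_000)
y = np.abs(x)

for y0 in (0.5, 1.0, 2.0):
    in_bin = np.abs(y - y0) < 0.05        # crude conditioning on Y close to y0
    empirical = x[in_bin].mean()
    formula = y0 * np.tanh(mu * y0)       # g(y0) for the N(mu, 1) density
    print(f"y0={y0}: empirical {empirical:.3f}  vs  g(y0) {formula:.3f}")
```

For the symmetric case $\mu=0$ the same check returns conditional means near $0$, consistent with $g\equiv0$.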
In the general case, some more measure-theoretic machinery is needed. Consider the measures defined by $$\mu_+(B)=P(X\in B)\quad \mu_-(B)=P(-X\in B)\quad \mu(B)=P(Y\in B)$$ for every Borel set $B\subset\mathbb R_+$. Then $\mu_\pm\leqslant\mu$, hence $\mu_\pm\ll\mu$, hence there exist measurable functions $f_\pm$ such that $$\mu_\pm(B)=\int_Bf_\pm\,d\mu$$ for every Borel set $B$. Then $E(X\mid Y)=g(Y)$ where $g(y)$ is defined on $y\geqslant0$ by
$$g(y)=y\cdot\frac{f_+(y)-f_-(y)}{f_+(y)+f_-(y)}$$
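For instance, when $X$ has a PDF $f$, the measure $\mu$ has density $f(y)+f(-y)$ on $y>0$, hence one can take $$f_+(y)=\frac{f(y)}{f(y)+f(-y)}\qquad f_-(y)=\frac{f(-y)}{f(y)+f(-y)}$$ and the general formula reduces to the formula given first.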
Edit 1: This conditional expectation reflects the fact that, for every $y>0$, the conditional distribution of $X$ given $Y=y$ is $$P(X=y\mid Y=y)=\frac{f_+(y)}{f_+(y)+f_-(y)}$$ and $$P(X=-y\mid Y=y)=\frac{f_-(y)}{f_+(y)+f_-(y)}$$ which can be summarized as $$P(X=Y\mid Y)=\frac{f_+(Y)}{f_+(Y)+f_-(Y)}=1-P(X=-Y\mid Y)$$
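Taking the expectation of this two-point conditional distribution indeed recovers $g$: $$E(X\mid Y=y)=y\,P(X=y\mid Y=y)-y\,P(X=-y\mid Y=y)=y\cdot\frac{f_+(y)-f_-(y)}{f_+(y)+f_-(y)}=g(y)$$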
Edit 2: As often on this site, one meets a specific problem when answering questions about conditional expectations: every nontrivial such question requires a solid definition of the concept. Why the OPs interested in these questions regularly lack such a definition would require a separate analysis; for the time being, we concentrate on some sketchy explanations of this definition, referring the interested reader to an accessible and rigorous source (such as the small blue textbook Probability with martingales, by David Williams).
Thus, in full generality, one considers random variables $X$ and $Y$ defined on the same probability space with $X$ integrable; then $E(X\mid Y)$ is defined as the (class of) random variable(s) $u(Y)$, for some measurable function $u$ such that $u(Y)$ is integrable and, for every bounded measurable function $w$, $$E(Xw(Y))=E(u(Y)w(Y))$$ Equivalently, one can ask that $u(Y)$ is integrable and that, for every event $A$ in $\sigma(Y)$, $$E(X\mathbf 1_A)=E(u(Y)\mathbf 1_A)$$
In the present case, to prove the formulas proposed in this post when $X$ has PDF $f$, one should find $g$ such that, for every bounded measurable function $w$, $$E(Xw(Y))=E(g(Y)w(Y))$$ that is, since $Y=|X|$ and $X$ has PDF $f$, $$\int xw(|x|)f(x)dx=\int g(|x|)w(|x|)f(x)dx$$ which is equivalent to $$\int_{y>0}yw(y)(f(y)-f(-y))dy=\int_{y>0}g(y)w(y)(f(y)+f(-y))dy$$ This identity holds for every bounded measurable function $w$ if and only if $$y\cdot(f(y)-f(-y))=g(y)\cdot(f(y)+f(-y))$$ almost everywhere for $y>0$, and $g$ follows.
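As a numerical sanity check of the defining identity $E(Xw(Y))=E(g(Y)w(Y))$, here is a small quadrature sketch; the density (a normal with mean $1$) and the handful of test functions $w$ are arbitrary illustrative choices:

```python
# Numerical sketch of the defining identity E(X w(Y)) = E(g(Y) w(Y)),
# with f an arbitrary illustrative density (here N(1, 1)) and a few test functions w.
import numpy as np
from scipy import integrate, stats

f = stats.norm(loc=1.0, scale=1.0).pdf

def g(y):
    # g(y) = y * (f(y) - f(-y)) / (f(y) + f(-y)), as derived above
    return y * (f(y) - f(-y)) / (f(y) + f(-y))

for w in (np.cos, np.tanh, lambda y: y ** 2):
    lhs, _ = integrate.quad(lambda x: x * w(abs(x)) * f(x), -np.inf, np.inf)
    rhs, _ = integrate.quad(lambda x: g(abs(x)) * w(abs(x)) * f(x), -np.inf, np.inf)
    print(f"E(X w(Y)) = {lhs:.6f}   E(g(Y) w(Y)) = {rhs:.6f}")
```

The two columns agree up to quadrature error, whichever bounded measurable $w$ is plugged in.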