I really like the above answer, but I spotted another one in a different thread that derives this result as a corollary, so I'll write up that argument as an answer.
The basic idea is that one can not only prove the inequality, but in fact find the exact difference.
Suppose that $P(X>Y) > 0$ (so that one can divide by it). With some elementary justification, we can write $$
E[X \mid X>Y] = \frac{E[X1_{X>Y}]}{P(X>Y)}
$$
Some excellent formal justifications for this statement can be found here. Note that, read as conditioning on the sigma-algebra generated by the event, this is an equality of random variables on that event. The basic idea is that if one conditions with respect to the sigma-algebra generated by a single event, the definition of conditional expectation reduces to some very easily checked conditions.
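Concretely, for $A = \{X>Y\}$ with $0 < P(A) < 1$, the sigma-algebra is $\sigma(A) = \{\emptyset, A, A^c, \Omega\}$, and one checks directly that $$
E[X \mid \sigma(A)] = \frac{E[X1_A]}{P(A)}1_A + \frac{E[X1_{A^c}]}{P(A^c)}1_{A^c},
$$
since the right-hand side is $\sigma(A)$-measurable and integrates to $E[X1_B]$ over each $B \in \sigma(A)$. Its constant value on $A$ is exactly what $E[X \mid X>Y]$ denotes above.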
Anyway, by the tower property of conditional expectation, $$
\frac{E[X1_{X>Y}]}{P(X>Y)} = \frac{E[XE[1_{X>Y}|X]]}{P(X>Y)}
$$
However, note that $X$ is independent of $Y$. So what is $E[1_{X>Y} \mid X]$? Informally, given that $X=a$, we get $E[1_{X>Y} \mid X=a] = E[1_{a>Y}] = P(Y<a) = F_Y^{-}(a)$, where $F_Y^{-}(a) = P(Y<a)$ is the left-hand limit of the CDF of $Y$ at the point $a$; hence $E[1_{X>Y} \mid X] = F_Y^{-}(X)$. This can be verified formally from the definition, and we have
$$
\frac{E[XE[1_{X>Y}|X]]}{P(X>Y)} = \frac{E[XF_Y^{-}(X)]}{P(X>Y)}
$$
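If you want a quick numerical sanity check of this step, here is a minimal Monte Carlo sketch (assuming NumPy and SciPy are available), taking $X$ and $Y$ to be independent standard normals, so that $F_Y^{-} = \Phi$ (the standard normal CDF, since $Y$ is continuous):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
X = rng.standard_normal(10**6)
Y = rng.standard_normal(10**6)    # independent of X

# Tower-property step: E[X 1_{X>Y}] should equal E[X F_Y^-(X)],
# with F_Y^- = Phi here since Y ~ N(0,1) is continuous.
print((X * (X > Y)).mean())       # ~ 1/(2*sqrt(pi)) ≈ 0.282
print((X * norm.cdf(X)).mean())   # ~ the same value
```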
Finally, by the definition of covariance, $$
\frac{E[XF_Y^{-}(X)]}{P(X>Y)} = \frac{E[X]E[F_{Y}^-(X)]}{P(X>Y)} + \frac{Cov(X, F_{Y}^{-}(X))}{P(X>Y)} = E[X] +\frac{Cov(X, F_{Y}^{-}(X))}{P(X>Y)}
$$
because $E[F_Y^{-}(X)] = E[E[1_{X>Y} \mid X]] = P(X>Y)$. Thus, we have the identity $$\boxed{
E[X | X>Y] - E[X] = \frac{Cov(X, F_{Y}^{-}(X))}{P(X>Y)}
}
$$
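To sanity-check the boxed identity numerically, here is a minimal simulation sketch (assuming NumPy; the particular distributions are an arbitrary choice), with $X \sim \mathrm{Exp}(1)$ and $Y \sim U(0,1)$ independent, so that $F_Y^{-}(x) = \min(\max(x,0),1)$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
X = rng.exponential(size=n)       # X ~ Exp(1)
Y = rng.uniform(size=n)           # Y ~ U(0,1), independent of X
F = np.clip(X, 0.0, 1.0)          # F_Y^-(x) = P(Y < x) for this Y

mask = X > Y
lhs = X[mask].mean() - X.mean()          # E[X | X>Y] - E[X]
rhs = np.cov(X, F)[0, 1] / mask.mean()   # Cov(X, F_Y^-(X)) / P(X>Y)
print(lhs, rhs)                          # agree up to Monte Carlo error
```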
All we need to do now is to see why the RHS is non-negative. The denominator $P(X>Y)$ is positive by assumption. What about the numerator, which is a fixed number? The intuition is clear: as the value of $X$ increases, so does the value of $F_Y^{-}(X)$, because $F_Y^{-}$ is non-decreasing. If one random variable is a non-decreasing function of another, the two are positively correlated, and their covariance should be non-negative.
This is reminiscent of the famous Fortuin-Kasteleyn-Ginibre (FKG) inequality, whose proof proceeds essentially the same way from here on. (This is basically this question, whose statement immediately yields the given result anyway.)
Namely, suppose that $X_1,X_2$ are independent copies of $X$. Then, $$
E_{X_1,X_2}[(X_1-X_2)(F_{Y}^{-}(X_1) - F_Y^{-}(X_2))] \geq 0
$$
simply because, on the sample space, the function $(X_1-X_2)(F_{Y}^{-}(X_1) - F_Y^{-}(X_2))$ is non-negative: since $F_Y^{-}$ is non-decreasing, the two factors always have the same sign. Now use linearity of expectation and independence to deduce
$$
E_{X_1,X_2}[(X_1-X_2)(F_{Y}^{-}(X_1) - F_Y^{-}(X_2))] = E_{X_1,X_2}[X_1F_{Y}^{-}(X_1)] - E_{X_1,X_2}[X_1F_Y^{-}(X_2)] - E_{X_1,X_2}[X_2F_{Y}^{-}(X_1)] + E_{X_1,X_2}[X_2F_{Y}^{-}(X_2)] \geq 0
$$
By independence (hence exchangeability) of $X_1$ and $X_2$, the first and fourth terms (those with a positive sign) are equal; similarly, the second and third terms (those with a negative sign) are equal. In fact, we have $$
E_{X_1,X_2}(X_1F_{Y}^{-}(X_1)) = E[XF_{Y}^{-}(X)]
$$
and by independence $$
E_{X_1,X_2}(X_1F_{Y}^{-}(X_2)) = E[X] E[F_{Y}^{-}(X)]
$$
Substituting these back into the expression above tells you that $$
E_{X_1,X_2}[(X_1-X_2)(F_{Y}^{-}(X_1) - F_Y^{-}(X_2))] = 2 Cov(X, F_{Y}^{-}(X)) \geq 0
$$
which completes the proof.
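The symmetrization step is also easy to check numerically for any non-decreasing $f$ standing in for $F_Y^{-}$; a minimal sketch (assuming NumPy; the distribution and the choice $f = \tanh$ are arbitrary):

```python
import numpy as np

def symmetrized(x1, x2, f):
    """Estimate E[(X1 - X2)(f(X1) - f(X2))], which equals 2 Cov(X, f(X))."""
    return ((x1 - x2) * (f(x1) - f(x2))).mean()

rng = np.random.default_rng(1)
x1 = rng.exponential(size=10**6)    # two independent copies of X
x2 = rng.exponential(size=10**6)
f = np.tanh                         # any non-decreasing function

print(symmetrized(x1, x2, f))       # non-negative, ~ 2 Cov(X, f(X))
print(2 * np.cov(x1, f(x1))[0, 1])  # direct estimate for comparison
```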
Note that while the identity in the box is derived without invoking the sample space, the FKG computation is essentially a pointwise (pathwise) inequality translated into an inequality in expectation.
The statement can fail when $X,Y$ are dependent: the tower-property step now produces $E[1_{X>Y} \mid X]$ rather than $F_Y^{-}(X)$, and the same manipulation gives $E[X \mid X>Y] - E[X] = Cov(X, E[1_{X>Y} \mid X])/P(X>Y)$. What's nice, however, is that this tells you where to look for counterexamples: the covariance will be negative if $E[1_{X>Y} \mid X]$ decreases as $X$ increases!
For example, let $X \sim N(0,1)$ and $Y = 2X$ (so $Y$ depends on $X$). Then $E[1_{X>Y} \mid X] = 1_{X>2X} = 1_{X<0}$, which decreases as $X$ increases (with a strict drop at the point $0$). Indeed, $E[X \mid X>Y] = E[X \mid X<0] = -\sqrt{2/\pi} < 0 = E[X]$.
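A minimal simulation sketch of this counterexample (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal(10**6)
Y = 2 * X                      # Y is fully dependent on X

mask = X > Y                   # same event as {X < 0}
print(X[mask].mean())          # ~ -sqrt(2/pi) ≈ -0.798
print(X.mean())                # ~ 0, so E[X | X>Y] < E[X]
```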