Let $X$ and $Y$ be two random variables such that for all $a, b \in \mathbb{R}$, \begin{equation} E[(X-a)(Y-b)] = E(X-a)E(Y-b). \end{equation}
Does this imply that $X$ and $Y$ are independent random variables?
$E((X-a)(Y-b))=E(XY)-aE(Y)-bE(X)+ab$
$E(X-a)E(Y-b)=E(X)E(Y)-aE(Y)-bE(X)+ab$
These are equal for all $a$ and $b$ iff $E(XY)=E(X)E(Y)$, i.e. iff $X$ and $Y$ are uncorrelated. Since uncorrelated random variables need not be independent, independence of $X$ and $Y$ cannot be inferred.
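A minimal numeric sketch of a standard counterexample (the specific variables here are my choice, not from the answer above): take $X$ uniform on $\{-1, 0, 1\}$ and $Y = X^2$. Then $E(XY) = E(X)E(Y)$, yet $Y$ is a function of $X$, so they are certainly not independent.

```python
from fractions import Fraction

# X uniform on {-1, 0, 1}, Y = X^2: uncorrelated but clearly dependent.
support = [-1, 0, 1]
p = Fraction(1, 3)  # P(X = x) for each x in the support

E_X  = sum(p * x for x in support)         # E[X]   = 0
E_Y  = sum(p * x**2 for x in support)      # E[Y]   = 2/3
E_XY = sum(p * x * x**2 for x in support)  # E[XY]  = E[X^3] = 0

print(E_XY == E_X * E_Y)  # True: X and Y are uncorrelated

# But independence fails: P(X=1, Y=1) = 1/3 while P(X=1)P(Y=1) = 2/9.
p_X1, p_Y1, p_X1_Y1 = p, 2 * p, p
print(p_X1_Y1 == p_X1 * p_Y1)  # False: not independent
```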
Whenever $X,Y$ are integrable, we always have $$E[(X-a)(Y-b)] = E[XY - aY - bX + ab] = E[XY] - aE[Y] - bE[X] + ab \\ E(X-a)E(Y-b) = (E[X] - a)(E[Y] - b) = E[X]E[Y] - aE[Y] - bE[X] + ab$$
Compare the two lines to see that this just means $E[XY] = E[X]E[Y]$. That is, if we assume further that $X,Y$ are square-integrable, then all of this is just equivalent to the statement $X,Y$ are uncorrelated.
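A quick sanity check of this algebra (a sketch; the joint pmf below is an arbitrary made-up example): the difference of the two sides should equal $\operatorname{Cov}(X,Y)$ no matter which $a$ and $b$ we plug in.

```python
from fractions import Fraction
from itertools import product

# Hypothetical joint pmf for (X, Y) on {0, 1}^2; values chosen arbitrarily.
pmf = {(0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
       (1, 0): Fraction(1, 6), (1, 1): Fraction(1, 3)}
assert sum(pmf.values()) == 1

def E(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)

# E[(X-a)(Y-b)] - E(X-a)E(Y-b) collapses to Cov(X, Y) for every a, b.
for a, b in product([Fraction(-2), Fraction(0), Fraction(5, 7)], repeat=2):
    lhs = E(lambda x, y: (x - a) * (y - b))
    rhs = E(lambda x, y: x - a) * E(lambda x, y: y - b)
    assert lhs - rhs == cov

print("lhs - rhs == Cov(X, Y) for all tested a, b:", cov)
```

So the condition in the question holds for all $a, b$ exactly when $\operatorname{Cov}(X,Y)=0$, confirming the comparison of the two lines above.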
For some square-integrable random variables this can imply independence (for instance, if $X,Y$ are jointly normal and uncorrelated, then they are independent), but in general, no, uncorrelated does not mean independent. [see for instance, this question]