Prove that for any distinct positive integers $a$ and $b$, the number $2a(a^2 + 3b^2)$ is not a perfect cube.
Note that $2a(a^2 + 3b^2) = (a+b)^3 + (a-b)^3$. If $2a(a^2 + 3b^2)$ were a perfect cube $z^3$, the equation $x^3 + y^3 = z^3$ would have a solution in positive integers: when $a > b$ take $(x, y) = (a+b, a-b)$, and when $b > a$ move the negative cube to the other side to get $z^3 + (b-a)^3 = (a+b)^3$. However, this uses a highly nontrivial theorem, namely Fermat's Last Theorem for the exponent $n = 3$. I was wondering if there is a shorter and more elementary solution to this problem (e.g. infinite descent, Fermat's little theorem, the Euler-Fermat theorem, etc.).

Maybe one can assume $a$ and $b$ are coprime? Indeed, write $a = da_1$, $b = db_1$ where $d = \gcd(a,b)$ and $\gcd(a_1, b_1) = 1$ (in fact, the next step only needs $d$ to be a common divisor; taking $d$ to be the gcd is what guarantees $\gcd(a_1, b_1) = 1$). Then $2a(a^2 + 3b^2) = d^3 \cdot 2a_1(a_1^2 + 3b_1^2)$, so $2a(a^2 + 3b^2)$ is a perfect cube iff $2a_1(a_1^2 + 3b_1^2)$ is a perfect cube: multiplying a perfect cube by the perfect cube $d^3$ yields a perfect cube, and conversely if $d^3 m = c^3$ then $d \mid c$, so $m = (c/d)^3$ is a perfect cube. Then it might be useful to consider $\gcd(2a_1, a_1^2 + 3b_1^2)$; we may need additional information on whether $a_1$ and $b_1$ are odd. A quick check of the two algebraic steps used above is sketched below.
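For completeness, here is a minimal verification of the two identities; both are direct binomial expansions, so nothing here is specific to the problem. The opening identity:
$$(a+b)^3 + (a-b)^3 = \left(a^3 + 3a^2b + 3ab^2 + b^3\right) + \left(a^3 - 3a^2b + 3ab^2 - b^3\right) = 2a^3 + 6ab^2 = 2a(a^2 + 3b^2).$$
The scaling step, with $a = da_1$ and $b = db_1$ as above:
$$2a(a^2 + 3b^2) = 2da_1\left(d^2a_1^2 + 3d^2b_1^2\right) = d^3 \cdot 2a_1(a_1^2 + 3b_1^2),$$
so the two quantities differ exactly by the cube factor $d^3$.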