The accepted answer is 100% correct, but to a less experienced reader it might not be immediately obvious that the easiest way to solve the problem is to write down the equality $0 = E[X(t)(X(s)-X(t))]$ and work from there.
I think it is more intuitive to start from the definition of covariance, and it is also worth emphasizing why the independence-of-increments property is useful inside the expectation.
$$Cov(X,Y):=\mathbb{E}\left[XY\right]-\mathbb{E}\left[X\right]\mathbb{E}\left[Y\right]$$
In the context of Brownian motion, the autocovariance is nothing more than $Cov(W_s,W_t)$, therefore:
$$Cov(W_s,W_t)=\mathbb{E}\left[W_sW_t\right]-\mathbb{E}\left[W_s\right]\mathbb{E}\left[W_t\right]=\mathbb{E}\left[W_sW_t\right]$$
Above we used the fact that Brownian motion has zero mean, i.e. $\mathbb{E}\left[W_t\right]=0$ for all $t$.
Now assume that $t>s$ and write $t=s+h$ with $h:=t-s>0$. Then:
$$\mathbb{E}\left[W_sW_t\right]=\mathbb{E}\left[W(s)W(s+h)\right]$$
Now, in my view, the critical step is to realize that although $W(s+h)$ does not equal $W(s)+W(h)$ pathwise ($W(s+h)$ is a single random variable, not a sum of two independent ones), the independence of increments gives us something just as good: the increment $W(s+h)-W(s)$ is independent of $W(s)$ and has the same distribution as $W(h)$. Consequently, the pair $\left(W(s),\,W(s+h)\right)$ has the same joint distribution as $\left(W(s),\,W(s)+\widetilde W(h)\right)$, where $\widetilde W$ is a Brownian motion independent of $W$. Since an expectation depends only on the joint distribution of the variables inside it (equality of the marginal distributions alone would not justify the substitution), we can replace one expression with the other:
$$Cov(W_s,W_t)=\mathbb{E}\left[W(s)\left(W(s)+\widetilde W(h)\right)\right]=\mathbb{E}\left[W(s)^2+W(s)\widetilde W(h)\right]$$
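(As a side note, not part of the original argument: if you want to convince yourself of this substitution numerically, here is a minimal Monte Carlo sketch; the sample size, seed and variable names such as `w_s` and `w_tilde_h` are my own arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(0)
s, h, n = 1.0, 0.5, 1_000_000

# Draw W(s) ~ N(0, s) and the increment W(s+h) - W(s) ~ N(0, h),
# which is independent of W(s) by independence of increments.
w_s = rng.normal(0.0, np.sqrt(s), n)
incr = rng.normal(0.0, np.sqrt(h), n)
w_s_plus_h = w_s + incr                    # samples of W(s+h)

# An independent copy playing the role of W~(h) in the substitution.
w_tilde_h = rng.normal(0.0, np.sqrt(h), n)

print(np.mean(w_s * w_s_plus_h))           # ~ E[W(s) W(s+h)] = s = 1.0
print(np.mean(w_s * (w_s + w_tilde_h)))    # ~ E[W(s)(W(s) + W~(h))] = s = 1.0
```

Both printed averages should come out close to $s=1.0$, up to Monte Carlo error.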
Using the linearity of the expectation operator, and then the independence of $W(s)$ and $\widetilde W(h)$ to factor the expectation of the product, we finally get:
$$Cov(W_s,W_t)=\mathbb{E}\left[W(s)^2\right]+\mathbb{E}\left[W(s)\right]\mathbb{E}\left[\widetilde W(h)\right]=\mathbb{E}\left[W(s)^2\right]=Var(W(s))=s$$
Since we assumed $t>s$, this is exactly $Cov(W_s,W_t)=\min(s,t)$; the case $s\geq t$ follows by swapping the roles of $s$ and $t$.
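For completeness, and again as my own addition rather than part of the original answer, here is a quick simulation sketch that estimates $Cov(W_s,W_t)$ from sampled Brownian paths and compares it with $\min(s,t)$ (the grid size, number of paths and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps, T = 100_000, 100, 2.0
dt = T / n_steps

# Build Brownian paths as cumulative sums of independent N(0, dt) increments.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.cumsum(increments, axis=1)      # paths[:, k] ~ W((k + 1) * dt)

# Compare the empirical covariance with min(s, t) at a few (s, t) pairs.
for s, t in [(0.5, 1.5), (1.0, 2.0), (1.2, 0.6)]:
    i, j = round(s / dt) - 1, round(t / dt) - 1
    emp = np.mean(paths[:, i] * paths[:, j])   # zero mean, so Cov = E[W_s W_t]
    print(f"s={s}, t={t}: empirical {emp:.3f} vs min(s,t) = {min(s, t)}")
```

The empirical values should land within Monte Carlo error of $\min(s,t)$ for every pair.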
The above "lengthy" reasoning might be tedious and not the most elegant route for a talented pure mathematician, but for the "average Joe" (i.e. me) it is easier to understand and replicate.