If I have a random vector $X$ with expectation $\mathbb{E} (X) =: \mu \in \mathbb{R}^n$, it is not that difficult to show that its covariance matrix is positive semidefinite: given
$$ \operatorname{Cov} (X) = \mathbb{E} \left[ (X-\mu) (X-\mu)^T \right] $$
for any vector $z\in \mathbb{R}^n$, by linearity of expectation we have
$$ z^T \operatorname{Cov} (X) z = z^T \mathbb{E} \left[ (X-\mu) (X-\mu)^T \right] z = \mathbb{E} \left[ z^T (X-\mu) (X-\mu)^T z \right] $$
which is just the expectation of a squared inner product,
$$ \mathbb{E} \left[ z^T(X-\mu) (X-\mu)^T z \right] = \mathbb{E}\left[ \langle z, X - \mu \rangle^2 \right] \geq 0 $$
and hence always greater than or equal to $0$. However, I don't really understand on an intuitive level why a non-PSD matrix cannot serve as a covariance matrix. Suppose you have a matrix that is not positive semidefinite. Can you prove by contradiction that it cannot be the covariance matrix of any random vector $X$?
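For concreteness, here is a minimal numerical sketch (assuming NumPy is available; the variable names are mine) illustrating the identity above: the quadratic form $z^T \operatorname{Cov}(X)\, z$ equals the variance of the scalar projection $z^T X$, which is nonnegative by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 10_000                                        # dimension of X, number of samples
X = rng.normal(size=(m, n)) @ rng.normal(size=(n, n))   # correlated samples of X

cov = np.cov(X, rowvar=False)      # sample covariance matrix (n x n)
z = rng.normal(size=n)             # arbitrary test vector

quad_form = z @ cov @ z            # z^T Cov(X) z
proj_var = np.var(X @ z, ddof=1)   # sample variance of the projection z^T X

print(quad_form, proj_var)         # the two agree (up to floating point), and both are >= 0
assert quad_form >= 0
```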