You have to think carefully about the definition of $O$-notation:
$f(n) = O(g(n))$ if and only if there exist a constant $c>0$ and an $N \in \mathbb{N}$ such that
$$
f(n) \le c\cdot g(n), \quad \forall n \ge N.
$$
Note that you need two things: the constant $c$ and the threshold $N$. The definition does not require that the inequality $f(n) \le c\cdot g(n)$ hold for all $n$; it only requires that it hold for all sufficiently large $n$.
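To see why $N$ matters, here is a toy illustration of my own (not taken from your problem): take $f(n)=n+10$ and $g(n)=n$. With $c=1$ the inequality $n+10 \le n$ never holds, but with $c=2$ we get
$$
n + 10 \le 2n \quad \forall n \ge 10,
$$
so the pair $c=2$, $N=10$ witnesses $n+10 = O(n)$, even though that inequality fails for $n<10$.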
In your example, it is easy to see that $f_{1}(n) = O(f_{2}(n))$, because $f_{1}(n) \le 1\cdot f_{2}(n)$, which happens to hold for all $n$ (so $c=1$ works, together with any $N$, say $N=1$).
What about the second relation? Can you find appropriate values of $c>0$ and $N$?
(The statement is indeed correct, but you should be able to justify it by finding a good pair of $c$ and $N$.)
Additional remark: when $f(n)$ is a polynomial in $n$, for example $f(n)=n^{2}+1000n$, its growth rate is determined/dominated by the highest-degree term. In this example, the highest-degree term is $n^{2}$, and hence $f(n) = O(n^{2})$. But again, this general observation ultimately rests on the fact that we can find appropriate $c>0$ and $N$ and apply the definition.
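To make that concrete for $f(n)=n^{2}+1000n$, one possible choice is $c=2$ and $N=1000$: for every $n \ge 1000$ we have $1000n \le n\cdot n = n^{2}$, hence
$$
n^{2}+1000n \le n^{2}+n^{2} = 2n^{2} \quad \forall n \ge 1000,
$$
which is exactly the definition with $c=2$ and $N=1000$, confirming $f(n)=O(n^{2})$.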