Apologies for my first hasty attempt, perhaps this will be more satisfactory. As you noted, the problem is equivalent to showing that $\sum_i \log(X_i) = -\infty$ a.s.
Firstly, since the logarithm is concave and not affine and $X_i$ is not a.s. constant, one has by Jensen's inequality that
$$0 = \log(1) \geq \log q = \log(E[X_i]) > E[\log(X_i)].$$
See, e.g., Convexity and equality in Jensen inequality for when Jensen's inequality can be an equality. In our case the inequality must be strict, which is what the last step above uses.
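(For a concrete illustration of this strict gap, with a toy distribution of my own choosing: if $X_i$ takes the values $\tfrac{1}{2}$ and $\tfrac{3}{2}$ with probability $\tfrac{1}{2}$ each, then $E[X_i] = 1$ while $E[\log X_i] = \tfrac{1}{2}\log\tfrac{1}{2} + \tfrac{1}{2}\log\tfrac{3}{2} = \tfrac{1}{2}\log\tfrac{3}{4} < 0$.)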
Thus, setting $Y_i := -\log(X_i)$, it suffices to show the following (note that $E[Y_i^-] = E[(\log X_i)^+] \leq E[X_i] < \infty$ since $\log x \leq x$, so $E[Y_i]$ is well defined and, by the display above, lies in $(0, \infty]$):
Theorem. If the $Y_i$ are iid with well-defined mean $\mu := E[Y_i] \in (0, \infty]$, then $\sum_i Y_i = \infty$ a.s.
Lemma. For any real-valued sequence $(y_i)_i$, if $\frac{1}{n} \sum_{i=0}^n y_i \to \mu$ for some finite $\mu > 0$, then $\sum_i y_i = \infty$.
Proof of lemma. Since $\frac{1}{n}\sum_{i=0}^{n} y_i \to \mu > 0$ and $n \to \infty$, we get
$$\sum_{i=0}^{n} y_i = \left(\frac{1}{n}\sum_{i=0}^{n} y_i\right) n \to \mu \times \infty = \infty,$$
i.e. the partial sums diverge to $\infty$, so $\sum_i y_i = \infty$. QED.
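(As a quick illustration of the lemma with an example of my own: if $y_i \to c$ for some constant $c > 0$, the averages $\frac{1}{n}\sum_{i=0}^{n} y_i$ also converge to $c$, and the lemma recovers the obvious fact that $\sum_i y_i = \infty$.)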
Proof of theorem. If $\mu < \infty$, then by the strong law of large numbers $\frac{1}{n}\sum_{i=0}^n Y_i \to \mu > 0$ a.s.
By the lemma, $\sum_i Y_i = \infty$ a.s. in this case.
Now consider the case $\mu = \infty$. Write $Y_i = Y_i^+ - Y_i^-$ with $Y_i^+ = \max(Y_i, 0)$ and $Y_i^- = \max(-Y_i, 0)$, the usual decomposition of $Y_i$ into its positive and negative parts. Since $E[Y_i]$ is well defined and equal to $+\infty$ (in particular $E[Y_i] > -\infty$), it must be that $E[Y_i^-] \in [0,\infty)$ and $E[Y_i^+] = \infty$.
In particular, for $n > 0$,
$$
E[\min(Y_i, n)] = E[\min(Y_i^+, n) - Y_i^-] = E[\min(Y_i^+, n)] - E[Y_i^-] \to \infty
$$
by monotone convergence, since $\min(Y_i^+, n) \uparrow Y_i^+$ as $n \to \infty$ and $E[Y_i^+] = \infty$. Hence we may choose $N$ large enough that $0 < E[\min(Y_i, N)] \leq N < \infty$. The random variables $(\min(Y_i, N))_i$ are iid with mean in $(0,\infty)$, so by the finite-mean case $\sum_i \min(Y_i, N) = \infty$ a.s. Since $\min(Y_i, N) \leq Y_i$, it follows that
$$
\infty = \sum_i \min(Y_i, N) \leq \sum_i Y_i
$$
a.s., QED.
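If a numerical sanity check is of any use, here is a minimal simulation sketch. It is my own illustration and not part of the argument: it assumes NumPy and two toy choices for the distribution of $X_i$ (not taken from the question), one with $E[-\log X_i]$ finite (the law-of-large-numbers branch) and one with $E[-\log X_i] = \infty$ (the truncation branch). In both cases $E[X_i] \leq 1$ and the partial sums $\sum_{i \leq n} \log X_i$ should drift to $-\infty$, i.e. the running product collapses to $0$.

```python
# Rough sanity check only; the distributions below are my own toy choices.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
checkpoints = [100, 1_000, 10_000, 100_000]

# (a) X_i uniform on {1/2, 3/2}: E[X_i] = 1 and E[log X_i] = (1/2) log(3/4) < 0 is finite,
#     so by the SLLN the partial sums of log X_i grow like n * (1/2) * log(3/4) ~ -0.144 n.
x_a = rng.choice([0.5, 1.5], size=n)
s_a = np.cumsum(np.log(x_a))

# (b) X_i = exp(-Z_i) with Z_i classical Pareto on [1, inf) with tail index 1:
#     E[X_i] <= exp(-1) < 1, but E[-log X_i] = E[Z_i] = infinity (the truncation branch).
z = rng.pareto(1.0, size=n) + 1.0
s_b = np.cumsum(-z)  # partial sums of log X_i = -Z_i

for k in checkpoints:
    print(f"n={k:>6}   case (a): {s_a[k - 1]:14.1f}   case (b): {s_b[k - 1]:14.1f}")
```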