In this section of the Wikipedia article on coprime integers, it is stated that:
More generally, the probability of $k$ randomly chosen integers being coprime is $1/\zeta(k)$.
where $\zeta$ is the Riemann zeta function.
Although no proof is given there, the statement follows a fleshed-out heuristic argument for the probability of two random integers being coprime. Briefly, it proceeds as follows:
- The probability of a random integer being divisible by a fixed integer $p$ is $1/p$ (because every $p^\mathrm{th}$ integer is).
- The probability that two random integers (chosen independently) are both divisible by $p$ is $1/p^2$.
- For a given integer, divisibility by distinct primes $p$ and $q$ are independent events, and hence the probability of being divisible by both is $1/(pq)$.
- Therefore the probability that two random integers are coprime (i.e. that no prime divides both of them) is:
$$ \prod_{\text{prime } p} \left( 1 - \frac{1}{p^2} \right) = \left( \prod_{\text{prime } p} \frac{1}{1 - p^{-2}} \right)^{-1} = \frac{1}{\zeta(2)} \approx 0.61 $$
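As a sanity check (my own, not from the article), the two-integer case is easy to simulate; I approximate a "random integer" by a uniform draw from $1$ to $10^6$:

```python
import random
from math import gcd, pi

# Estimate P(gcd(a, b) = 1) for two uniform draws from 1..10**6
# (a stand-in for "random integers"); it should be near 1/zeta(2) = 6/pi^2.
random.seed(0)
N = 200_000
hits = sum(
    gcd(random.randrange(1, 10**6 + 1), random.randrange(1, 10**6 + 1)) == 1
    for _ in range(N)
)
print(hits / N)   # empirical estimate
print(6 / pi**2)  # ~0.6079
```

The empirical fraction does land close to $6/\pi^2 \approx 0.6079$, so the two-integer claim at least checks out numerically.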
And it seems straightforward to apply the same reasoning for $k$ integers:
- The probability that $k$ independently chosen random integers are all divisible by $p$ is $1/p^k$.
- Hence the probability that they are coprime (i.e. that no prime divides all of them) is:
$$ \prod_{\text{prime } p} \left( 1 - \frac{1}{p^k} \right) = \frac{1}{\zeta(k)} $$
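The same simulation extends to $k$ integers (again my own check, with uniform draws from $1$ to $10^6$ standing in for "random integers", and $\zeta$ approximated by a truncated sum):

```python
import random
from functools import reduce
from math import gcd

random.seed(1)

def zeta(s, terms=10**6):
    # Truncated Dirichlet series; the tail is negligible at this precision for s >= 2.
    return sum(n ** -s for n in range(1, terms + 1))

N = 100_000
estimate = {}
for k in range(2, 6):
    # "Coprime" here means gcd of all k draws is 1.
    hits = sum(
        reduce(gcd, (random.randrange(1, 10**6 + 1) for _ in range(k))) == 1
        for _ in range(N)
    )
    estimate[k] = hits / N
    print(k, estimate[k], 1 / zeta(k))
```

The empirical fractions track $1/\zeta(k)$ and do increase with $k$, so the formula itself seems fine; my issue is with the intuition.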
My problem with this is one of intuition: considering more integers should, it seems, increase the chance that two of them share a prime factor, so the probability of $k$ random integers being coprime should decrease as $k$ grows. However, $1/\zeta$ is increasing on $[1, \infty)$ and converges quickly to $1$. Where am I going wrong? And what is the right way to think about this?
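To make the question concrete, I also compared two events my intuition might be conflating (my own experiment, with the same uniform-draw assumption): the gcd of all $k$ integers being $1$, versus every pair of them being coprime:

```python
import random
from functools import reduce
from itertools import combinations
from math import gcd

random.seed(2)
N = 50_000
setwise_frac, pairwise_frac = {}, {}
for k in (2, 3, 4, 5):
    s = p = 0
    for _ in range(N):
        xs = [random.randrange(1, 10**6 + 1) for _ in range(k)]
        s += reduce(gcd, xs) == 1                                   # gcd of all k is 1
        p += all(gcd(a, b) == 1 for a, b in combinations(xs, 2))    # every pair coprime
    setwise_frac[k], pairwise_frac[k] = s / N, p / N
    print(k, setwise_frac[k], pairwise_frac[k])
```

Empirically the first probability grows with $k$ while the second shrinks, so the two notions clearly behave very differently; is that the source of my confusion?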