[This is from "Introduction to Mathematical Statistics" by Hogg, McKean, Craig]
Let ${ X _1, \ldots, X _n }$ be iid ${ N (\mu, \sigma ^2) }$ random variables. Consider the sample mean and sample variance
$${ \overline{X} = \frac{1}{n} \sum _{i = 1} ^{n} X _i, \quad S ^2 = \frac{1}{n-1} \sum _{i = 1} ^{n} (X _i - \overline{X}) ^2 . }$$
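(As a quick numerical sketch, not from the book: assuming NumPy is available, the definitions can be checked against the library's built-ins, with arbitrary illustrative values of ${ \mu, \sigma, n }$.)

```python
import numpy as np

# Hypothetical parameters, just for illustration.
rng = np.random.default_rng(0)
mu, sigma, n = 2.0, 3.0, 10
x = rng.normal(mu, sigma, size=n)        # one sample X_1, ..., X_n

xbar = x.sum() / n                        # sample mean
s2 = ((x - xbar) ** 2).sum() / (n - 1)    # sample variance (divisor n - 1)

# Agrees with NumPy's built-ins; ddof=1 gives the n - 1 divisor.
assert np.isclose(xbar, x.mean())
assert np.isclose(s2, x.var(ddof=1))
```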
Thm: The sample mean ${ \overline{X} }$ and the sample variance ${ S ^2 }$ are independent. Further, the random variable
$${ T = \sqrt{n} \frac{\overline{X} - \mu}{S} }$$
has Student's ${ t }$-distribution with ${ n-1 }$ degrees of freedom.
Pf: Write ${ X = (X _1, \ldots, X _n) ^T }$, let ${ \mathbb{1} = (1, \ldots, 1) ^T }$ be the vector of ${ 1 }$s, and consider the transformation
$${ W = \begin{pmatrix} \overline{X} \\ X _1 - \overline{X} \\ \vdots \\ X _n - \overline{X} \end{pmatrix} = \begin{pmatrix} \frac{1}{n} \mathbb{1} ^{T} \\ I - \frac{1}{n} \mathbb{1} \mathbb{1} ^T \end{pmatrix} X . }$$
Since ${ X \sim N _n (\mu \mathbb{1}, \sigma ^2 I) }$ and ${ W }$ is a linear function of ${ X }$, the random vector ${ W }$ is distributed as
$${ N _{n +1} \left( \begin{pmatrix} \frac{1}{n} \mathbb{1} ^{T} \\ I - \frac{1}{n} \mathbb{1} \mathbb{1} ^T \end{pmatrix} \mu \mathbb{1} , \, \begin{pmatrix} \frac{1}{n} \mathbb{1} ^{T} \\ I - \frac{1}{n} \mathbb{1} \mathbb{1} ^T \end{pmatrix} \sigma ^2 I \begin{pmatrix} \frac{1}{n} \mathbb{1} ^{T} \\ I - \frac{1}{n} \mathbb{1} \mathbb{1} ^T \end{pmatrix} ^T \right) }$$
that is
$${ N _{n + 1} \left( \begin{pmatrix} \mu \\ 0 _{n} \end{pmatrix} , \sigma ^2 \begin{pmatrix} \frac{1}{n} &0 _n ^T \\ 0 _n &I - \frac{1}{n} \mathbb{1} \mathbb{1} ^T \end{pmatrix} \right) . }$$
In particular ${ \overline{X} }$ and ${ (X _1 - \overline{X}, \ldots, X _n - \overline{X}) }$ are independent. Since ${ S ^2 = \frac{1}{n-1} \sum _{i = 1} ^{n} (X _i - \overline{X}) ^2 }$ is a function of the deviations ${ (X _1 - \overline{X}, \ldots, X _n - \overline{X}) }$ alone, ${ \overline{X} }$ and ${ S ^2 }$ are independent, as needed.
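A quick sanity check of this independence (again only a sketch, with made-up parameters): over many simulated samples the sample correlation between ${ \overline{X} }$ and ${ S ^2 }$ should be near zero. Zero correlation is of course only a necessary condition, not a proof.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 2.0, 3.0, 10, 200_000
x = rng.normal(mu, sigma, size=(reps, n))   # reps independent samples of size n

xbar = x.mean(axis=1)                       # sample mean of each sample
s2 = x.var(axis=1, ddof=1)                  # sample variance of each sample

# Independence implies zero correlation; empirically this prints a value near 0.
print(np.corrcoef(xbar, s2)[0, 1])
```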
Consider the normalised sums of squares
$${ \sum _{i = 1} ^{n} \left( \frac{X _i - \mu}{\sigma} \right) ^2 \, \, \text{ and } \, \, \sum _{i = 1} ^{n} \left( \frac{X _i - \overline{X}}{\sigma} \right) ^2 . }$$
We have
$${ {\begin{align} \sum _{i = 1} ^{n} \left( \frac{X _i - \mu}{\sigma} \right) ^2 = &\, \sum _{i = 1} ^{n} \left( \frac{(X _i - \overline{X}) + (\overline{X} - \mu) }{\sigma} \right) ^2 \\ = &\, \sum _{i = 1} ^{n} \left( \frac{X _i - \overline{X}}{\sigma} \right) ^2 + \left( \frac{\overline{X} - \mu}{\sigma / \sqrt{n}} \right) ^2 . \end{align}} }$$
The cross term vanishes since ${ \sum _{i = 1} ^{n} (X _i - \overline{X}) = 0 }$. The first term on the right equals ${ (n-1) S ^2 / \sigma ^2 }$ and, by the independence just shown, is independent of the last term. The left side is a sum of squares of ${ n }$ iid ${ N(0,1) }$ variables, hence ${ \chi ^2 (n) }$, while the last term is ${ \chi ^2 (1) }$. Taking MGFs (for ${ t < 1/2 }$),
$${ (1 - 2t) ^{-n/2} = E\left[\exp(t \, (n-1) S ^2 / \sigma ^2) \right] (1 - 2t) ^{-1/2} }$$
so ${ E[\exp(t \, (n-1) S ^2 / \sigma ^2)] = (1 - 2t) ^{-(n-1)/2} }$, the MGF of a ${ \chi ^2 (n-1) }$ distribution. That is,
$${ \sum _{i = 1} ^{n} \left( \frac{X _i - \overline{X}}{\sigma} \right) ^2 \, \sim \, \chi ^2 (n-1) . }$$
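As a sketch of this conclusion (not part of the proof, and assuming NumPy and SciPy are available): simulated values of ${ (n-1) S ^2 / \sigma ^2 }$ should match the ${ \chi ^2 (n-1) }$ quantiles.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
mu, sigma, n, reps = 2.0, 3.0, 10, 100_000
x = rng.normal(mu, sigma, size=(reps, n))
q = (n - 1) * x.var(axis=1, ddof=1) / sigma**2   # (n-1) S^2 / sigma^2 per sample

# Empirical quantiles vs. theoretical chi^2(n-1) quantiles.
probs = [0.05, 0.25, 0.50, 0.75, 0.95]
print(np.quantile(q, probs))
print(stats.chi2.ppf(probs, df=n - 1))
```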
The random variable
$${ T = \sqrt{n} \frac{\overline{X} - \mu}{S} = \frac{ (\overline{X} - \mu) / (\sigma / \sqrt{n}) }{\sqrt{ \left( (n -1) S ^2 / \sigma ^2 \right) / (n-1) }} }$$
is of the form
$${ \frac{N(0, 1) \text{ variable}}{\sqrt{(\chi ^2 (n-1) \text{ variable}) / (n-1)} } }$$
where the normal random variable and the ${ \chi ^2 (n-1) }$ random variable are independent. Hence ${ T }$ has Student's ${ t }$-distribution with ${ n-1 }$ degrees of freedom, as needed. ${ \blacksquare }$
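The whole theorem can also be sanity-checked numerically. A minimal sketch with arbitrary ${ \mu, \sigma, n }$, assuming NumPy and SciPy: simulated values of ${ T }$ should match the quantiles of the ${ t }$-distribution with ${ n-1 }$ degrees of freedom.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
mu, sigma, n, reps = 2.0, 3.0, 10, 100_000
x = rng.normal(mu, sigma, size=(reps, n))

# T = sqrt(n) (Xbar - mu) / S for each simulated sample.
t_vals = np.sqrt(n) * (x.mean(axis=1) - mu) / x.std(axis=1, ddof=1)

# Empirical quantiles of T vs. Student's t with n-1 degrees of freedom.
probs = [0.05, 0.25, 0.50, 0.75, 0.95]
print(np.quantile(t_vals, probs))
print(stats.t.ppf(probs, df=n - 1))
```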