Expanding on Henry's comment, following his answer.
$\underline{\text{Preliminary Results}}$
PR-1
$\sum_{i=0}^n x^i = \frac{1 - x^{n+1}}{1-x}, \qquad x \neq 1.~$
Proof
Proceed by induction.
$~n=1.~$
$(1 + x^1)(1 - x) = 1 - x^2.$
Assume true for $~n = N.$
Then $~\displaystyle \left( ~1 + x^1 + \cdots + x^N + x^{N+1} ~\right) \times (1 - x) $
$\displaystyle = \left[ ~\left( ~1 + x^1 + \cdots + x^N ~\right) \times (1 - x) ~\right] + \left[ ~x^{N+1} \times (1 - x) ~\right]$
$\displaystyle = \left[ ~1 - x^{N+1} ~\right] + \left[ ~x^{N+1} - x^{N+2} ~\right]$
$\displaystyle = 1 - x^{N+2}.$
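As a quick numerical sanity check of PR-1 (a Python sketch; the function names are mine, not part of the answer), the closed form can be compared against the term-by-term sum:

```python
# Sanity check of PR-1: sum_{i=0}^{n} x^i == (1 - x^(n+1)) / (1 - x), for x != 1.
def geometric_partial_sum(x, n):
    """Closed form from PR-1 (valid for x != 1)."""
    return (1 - x ** (n + 1)) / (1 - x)

def direct_sum(x, n):
    """Term-by-term sum of the same series."""
    return sum(x ** i for i in range(n + 1))

# The identity holds for any x != 1, including |x| >= 1.
for x in (0.5, -0.3, 2.0):
    for n in (1, 5, 10):
        assert abs(geometric_partial_sum(x, n) - direct_sum(x, n)) < 1e-9
```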
PR-2
For $~\displaystyle |x| < 1, \lim_{n\to\infty}x^n = 0.$
Proof
Immediately seen to be true when $~x = 0.~$
Now suppose that $~0 < x < 1,~$ so that $~\ln(x)~$ is some fixed negative constant.
Then $~\ln(x^n) = n\ln(x),~$
and as $~n \to \infty,~$ you have that $~n\ln(x) \to -\infty.~$
Therefore, (somewhat informally)
as $~n \to \infty, ~\displaystyle x^n = e^{n\ln(x)} \to e^{-\infty} = 0.$
Now suppose that $~-1 < x < 0,~$ and let $~y = -x \implies 0 < y < 1.~$
Then, by the first part of this proof, you have that
$~\displaystyle \lim_{n \to \infty} y^n = 0.~$
You also have that the sequence $~x^1, ~x^2, ~x^3, \cdots~$ is the same as the sequence
$-y^1, ~+y^2, ~-y^3, ~+y^4, \cdots, ~$
with $~|\pm y^n| = y^n~$ going towards $~0.~$
Therefore, when $~-1 < x < 0,~$
you also have that $~\displaystyle \lim_{n \to \infty} x^n = 0.~$
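The $\ln$ argument in PR-2 is easy to check numerically (again just a sketch, with illustrative values of $x$):

```python
import math

# PR-2, numerically: for 0 < x < 1, ln(x) < 0, so n*ln(x) -> -infinity
# and x^n = exp(n*ln(x)) -> 0.
x = 0.9
assert math.log(x) < 0
terms = [x ** n for n in (10, 100, 1000)]
assert terms[0] > terms[1] > terms[2]   # strictly decreasing
assert terms[2] < 1e-12                 # 0.9**1000 is effectively zero

# Negative x: the magnitudes |x^n| = |x|^n still vanish.
y = -0.9
assert abs(y ** 1000) < 1e-12
```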
PR-3
For $~\displaystyle |x| < 1, ~\sum_{i=0}^\infty x^i = \frac{1}{1-x}.~$
Proof
Using the result in PR-1, for a specific positive integer $~n,~$
you can regard the estimate $~1 + x + x^2 + \cdots + x^n \approx \dfrac{1}{1 - x}~$
as having an error of exactly $~\displaystyle \frac{-x^{n+1}}{1 - x}.~$
Then, by PR-2, for $~|x| < 1,~$ you have that the error goes to $~0.$
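A numerical illustration of PR-3 (a sketch; the helper name is mine): the partial sums approach $\frac{1}{1-x}$, and the error matches the PR-1 prediction term for term.

```python
# PR-3 check: partial sums of sum x^i approach 1/(1-x), with error -x^(n+1)/(1-x).
def partial_sum(x, n):
    return sum(x ** i for i in range(n + 1))

x = 0.5
limit = 1 / (1 - x)
for n in (5, 20, 50):
    error = partial_sum(x, n) - limit
    # PR-1 predicts the error exactly: -x^(n+1) / (1 - x)
    assert abs(error - (-(x ** (n + 1)) / (1 - x))) < 1e-12

# By n = 50 the error is already negligible.
assert abs(partial_sum(x, 50) - limit) < 1e-9
```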
PR-4
For $~\displaystyle |x| < 1, ~\sum_{i=0}^\infty ix^i = \frac{x}{(1-x)^2}.~$
Proof
See the preliminary results in this answer.
See also the Addendum, at the end of this answer, for an alternative proof of PR-4.
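PR-4 can likewise be checked numerically before being used below (a sketch; the truncation depth 2000 is arbitrary but more than enough for these values of $x$):

```python
# PR-4 check: sum_{i>=0} i*x^i versus the closed form x/(1-x)^2, truncated at large N.
def weighted_partial_sum(x, N):
    return sum(i * x ** i for i in range(N + 1))

# Includes x = 5/6, the value needed for the die problem below.
for x in (0.5, -0.5, 5 / 6):
    closed_form = x / (1 - x) ** 2
    assert abs(weighted_partial_sum(x, 2000) - closed_form) < 1e-9
```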
Inelegant solution based on geometric series.
Let $~f(n)~$ denote the probability that it takes exactly $~n~$ die throws to get the first $~6.~$
Then
$$f(n) = \left[ ~\frac{5}{6} ~\right]^{n-1} \times \frac{1}{6}.$$
Therefore, the expected number of throws is
$$\sum_{n=1}^\infty [ ~n \times f(n) ~]$$
$$= \frac{1}{6} \times\sum_{n=1}^\infty \left\{ ~n \times \left[ ~\frac{5}{6} ~\right]^{n-1} ~\right\}$$
$$= \frac{1}{6} \times \frac{6}{5} \times\sum_{n=1}^\infty \left\{ ~n \times \left[ ~\frac{5}{6} ~\right]^n ~\right\}. \tag1 $$
Note that the expression in (1) above can harmlessly start at $~n = 0,~$ because of the factor of $~n.~$
Therefore, the expected number of die throws is
$$= \frac{1}{5} \times \sum_{n=0}^\infty \left\{ ~n \times \left[ ~\frac{5}{6} ~\right]^n ~\right\}. \tag2 $$
All that remains is to plug the formula from PR-4 into the expression in (2) above.
This gives
$$\frac{1}{5} \times \frac{5/6}{\left[ ~1 - 5/6 ~\right]^2}$$
$$= \frac{1}{5} \times \frac{5/6}{1/36} = \frac{1}{5} \times [ ~5 \times 6 ~] = 6.$$
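The answer $6$ can be corroborated two independent ways (a sketch; the trial count and seed are arbitrary choices of mine): by evaluating the truncated series (2) directly, and by a Monte Carlo simulation of rolling a fair die until the first $6$.

```python
import random

# Check 1: the truncated series (2), (1/5) * sum n*(5/6)^n, should give 6.
series_value = (1 / 5) * sum(n * (5 / 6) ** n for n in range(2000))
assert abs(series_value - 6) < 1e-6

# Check 2: simulate throws until the first 6 and average.
random.seed(0)  # fixed seed for reproducibility

def throws_until_six():
    n = 0
    while True:
        n += 1
        if random.randint(1, 6) == 6:
            return n

trials = 100_000
mean = sum(throws_until_six() for _ in range(trials)) / trials
assert abs(mean - 6) < 0.1  # within sampling error of the exact answer 6
```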
$\underline{\text{Addendum}}$
For what it's worth, assuming that PR-3 is true, PR-4 may also be proven, inelegantly, as follows:
For $~|~x~| < 1,~$
let $~S~$ denote $~1 + x + x^2 + \cdots = \dfrac{1}{1 - x}.~$
Then,
$$1x + 2x^2 + 3x^3 + \cdots$$
may be alternatively expressed as
$$\begin{array}{rcl}
x + x^2 + x^3 + x^4 + \cdots & = & x \times S \\
x^2 + x^3 + x^4 + \cdots & = & x^2 \times S \\
x^3 + x^4 + \cdots & = & x^3 \times S \\
x^4 + \cdots & = & x^4 \times S \\
\vdots & &
\end{array}$$
So, summing the rows, you end up with
$$S \times \left[ ~x + x^2 + x^3 + x^4 + \cdots ~\right]$$
$$ = \frac{1}{1 - x} \times \frac{x}{1 - x} = \frac{x}{(1-x)^2}.$$
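The row decomposition above can also be checked numerically (a sketch; the truncation depth $N$ is arbitrary but ample for $x = 1/2$): each row is a tail $x^k + x^{k+1} + \cdots = x^k \times S$, and the rows sum to $\sum_{n \ge 1} n x^n$.

```python
# Addendum check: rows x^k + x^(k+1) + ... sum to sum_{n>=1} n*x^n = x/(1-x)^2.
x = 0.5
S = 1 / (1 - x)   # PR-3
N = 200           # truncation depth; tails beyond N are negligible for x = 0.5

def row(k):
    """Row k of the triangular array: x^k + x^(k+1) + ... (truncated at N)."""
    return sum(x ** j for j in range(k, N + 1))

assert abs(row(1) - x * S) < 1e-12          # first row equals x * S, as claimed

rows_total = sum(row(k) for k in range(1, N + 1))
direct = sum(n * x ** n for n in range(1, N + 1))
assert abs(rows_total - direct) < 1e-12      # rearrangement changes nothing
assert abs(rows_total - x / (1 - x) ** 2) < 1e-9
```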