Say I have a table of numbers $\{1,2,3,4,5,6\}$. Every time I throw a fair die, if the position in the table corresponding to the outcome is unchecked, it becomes checked, and if it is already checked, it becomes unchecked. How long (how many die throws) can I expect to go on until the whole table is checked?
My first approach to solving this numerically would be to write down a state machine (state = number of checked cells, $0$ through $6$, with the full state made absorbing) and its matrix $\bf P$ of transition probabilities, take the initial state ${\bf b} = (1\,0\,0\,0\,0\,0\,0)^T$, and approximate the series $$\mathbb{E}[T] \;=\; \sum_{k=1}^\infty k\left(({\bf P}^k{\bf b})_7-({\bf P}^{k-1}{\bf b})_7\right) \;=\; \sum_{k=0}^\infty \left(1-({\bf P}^k{\bf b})_7\right)$$ by truncating it to finitely many terms. (With the full state absorbing, $({\bf P}^k{\bf b})_7$ is the probability that the table has been completely checked within $k$ throws.)
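A minimal sketch of this truncation (the indexing of states by the number of checked cells, with the full state absorbing, is an assumption matching the $7$-entry vector above):

```python
import numpy as np

N = 6
# States 0..N = number of checked cells; the full state N is made
# absorbing, so (P^k b)[N] = P(table completely checked within k throws).
P = np.zeros((N + 1, N + 1))
for i in range(N):
    P[i + 1, i] = (N - i) / N      # die hits an unchecked cell
    if i > 0:
        P[i - 1, i] = i / N        # die hits an already-checked cell
P[N, N] = 1.0

b = np.zeros(N + 1)
b[0] = 1.0

# E[T] = sum_{k>=0} P(T > k) = sum_{k>=0} (1 - (P^k b)[N])
expected = 0.0
v = b.copy()
for _ in range(10_000):            # truncation; the tail is negligible here
    expected += 1.0 - v[N]
    v = P @ v
print(expected)                    # ≈ 83.2
```

The expected number of throws comes out to about $83.2$.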
Updated approach: find the smallest $n$ such that $$\prod_{k=0}^n \left(1-({\bf P}^k{\bf b})_7\right) < 0.50,$$ i.e. estimate after how many throws the probability of never yet having had the whole table checked first drops below $50\%$. (Here $\bf P$ has no absorbing state, so $({\bf P}^k{\bf b})_7$ is the probability of the table being full at step $k$; the product treats the throws as independent, which is only an approximation, since consecutive states are strongly correlated.) This turns out to be $50$.
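The product above implicitly treats successive throws as independent. The exact probability of never yet having had the table full can instead be read off a chain in which the full state is made absorbing, since then $1-({\bf P}^n{\bf b})_7$ is exactly $P(T > n)$. A sketch:

```python
import numpy as np

N = 6
# State = number of checked cells; full state absorbing, so after k
# steps v[N] is exactly P(table has been fully checked within k throws).
P = np.zeros((N + 1, N + 1))
for i in range(N):
    P[i + 1, i] = (N - i) / N
    if i > 0:
        P[i - 1, i] = i / N
P[N, N] = 1.0

v = np.zeros(N + 1)
v[0] = 1.0
n = 0
while 1.0 - v[N] >= 0.5:           # P(never yet full) still at least 50%
    v = P @ v
    n += 1
print(n)                           # exact median number of throws
```

Note that the table can only be full after an even number of throws (each throw flips the parity of the count), so the median is necessarily even.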
If we calculate a column of $({\bf P}^{10000}+{\bf P}^{10001})/2$ (the columns are all identical; two consecutive powers must be averaged because the chain is periodic with period $2$), we see that in the long run the table is fully checked about $1.56\%$ of the time, i.e. on average once every $64$ throws. This is the stationary distribution, which is $\mathrm{Binomial}(6,\tfrac12)$ with mass $1/64$ at the full state. It also makes sense that this recurrence figure of $64$ throws is larger than the $50$ found above.
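The long-run figure can be checked against the binomial stationary distribution directly; a sketch (same indexing of states by number of checked cells, no absorbing state here):

```python
import numpy as np
from math import comb

N = 6
P = np.zeros((N + 1, N + 1))       # no absorbing state in this version
for i in range(N + 1):
    if i < N:
        P[i + 1, i] = (N - i) / N
    if i > 0:
        P[i - 1, i] = i / N

# The chain has period 2 (the parity of the count alternates), so two
# consecutive powers are averaged to obtain the long-run distribution.
Pk = np.linalg.matrix_power(P, 10_000)
avg = (Pk + P @ Pk) / 2
pi = avg[:, 0]                     # every column is the same

binom = np.array([comb(N, k) / 2**N for k in range(N + 1)])
print(pi[N])                       # ≈ 1/64 ≈ 0.0156
```

The averaged columns agree with $\mathrm{Binomial}(6,\tfrac12)$ to machine precision.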
However, I am curious about any theoretical approaches to this problem.
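One standard theoretical route, sketched here: the number of checked cells is exactly the Ehrenfest urn chain (a birth–death chain with $p_k=(6-k)/6$ up and $q_k=k/6$ down), and first-passage times of birth–death chains decompose into one-step crossing times $h_k$ (expected time to go from $k$ to $k+1$ checked cells) satisfying the recurrence $h_k = (1 + q_k h_{k-1})/p_k$:

```python
from fractions import Fraction

N = 6
# h = expected number of throws to go from k to k+1 checked cells.
# One-step analysis: h_k = 1 + q_k * (h_{k-1} + h_k), which rearranges
# to h_k = (1 + q_k * h_{k-1}) / p_k with p_k = (N-k)/N, q_k = k/N.
h = Fraction(0)
total = Fraction(0)
for k in range(N):
    p = Fraction(N - k, N)
    q = Fraction(k, N)
    h = (1 + q * h) / p
    total += h
print(total)                       # 416/5, i.e. exactly 83.2 throws
```

As a consistency check, the last crossing time is $h_5 = 63$, so the mean return time to the full state is $1 + h_5 = 64$, matching the one-in-$64$ long-run frequency above.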