Suppose we have an algorithm that returns one of the two correct answers to a problem with probability $1-q = \frac{8}{\pi^2}$. I want to show that if we run this algorithm $O(\log_2 r)$ times and take the most frequently occurring result as our answer, then this answer is correct with probability at least $1-\frac{1}{2^r}$.
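For concreteness, here is a minimal Python sketch of the repeat-and-take-majority procedure I have in mind. The function `noisy_algorithm` and its return values are hypothetical stand-ins (not part of the actual problem): it returns one of two correct answers with probability $8/\pi^2$ and some wrong answer otherwise, and the majority vote is taken over repeated runs.

```python
import math
import random
from collections import Counter

P_CORRECT = 8 / math.pi**2  # probability that a single run returns a correct answer

def noisy_algorithm():
    """Hypothetical stand-in: returns one of two correct answers with
    probability 8/pi^2, and otherwise some wrong answer."""
    if random.random() < P_CORRECT:
        return random.choice(["correct_A", "correct_B"])
    return random.choice(["wrong_1", "wrong_2", "wrong_3"])

def majority_vote(num_runs):
    """Run the algorithm num_runs times and return the most frequent result."""
    counts = Counter(noisy_algorithm() for _ in range(num_runs))
    return counts.most_common(1)[0][0]

# Empirical error probability of the majority vote for a few repetition counts.
trials = 10_000
for num_runs in (3, 9, 27, 81):
    errors = sum(majority_vote(num_runs).startswith("wrong") for _ in range(trials))
    print(f"{num_runs:3d} runs: empirical error ~ {errors / trials:.4f}")
```

Empirically the error probability of the majority vote does drop quickly with the number of runs; the question is how to prove the stated bound.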
My initial thought was to bound the probability of error by the probability that the algorithm returns a wrong answer in at least one-third of the runs. The reason for the one-third threshold: since there are two acceptable answers, if fewer than a third of the runs are wrong, then more than two-thirds are correct, so one of the two correct answers must occur more often than all wrong answers combined, and the most frequent result is correct. So we can model the algorithm runs as independent random variables $X_i \sim \text{Be}(q)$ and say that the probability of error in $3k$ runs is at most $$\mathbb{P}\left(\sum_{i=1}^{3k} X_i\geq k\right).$$
The sum of the $X_i$ follows the binomial distribution $\text{Bin}(3k, q)$. I have asked this question related to my problem, but the answer given there suggests this approach won't yield the bound I'm looking for.
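To get a feel for how this binomial tail behaves, here is a small sketch (standard library only; the helper `binom_tail` is just for illustration) that evaluates $\mathbb{P}\left(\text{Bin}(3k, q) \geq k\right)$ exactly for a few values of $k$, with $q = 1 - \frac{8}{\pi^2}$:

```python
import math

q = 1 - 8 / math.pi**2  # per-run error probability

def binom_tail(n, p, k):
    """Exact P(Bin(n, p) >= k), summed directly."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

for k in (1, 2, 5, 10, 20):
    n = 3 * k  # total number of runs
    print(f"k = {k:2d} (n = {n:2d} runs): P(at least k errors) = {binom_tail(n, q, k):.3e}")
```

This only gives numerical values of the tail, of course; what I am missing is how to turn it into the claimed $1-\frac{1}{2^r}$ guarantee with $O(\log_2 r)$ runs.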
Any help would be appreciated!