Let $\land_p$ be a gate with error $p$ only when the inputs are $1$ and $0$. What can we say about
$$
(x \land_p y) \land_p (x \land_p y)?
$$
If $x=y=1$ then we always get $1$, and if $x = 0$ then we always get $0$, since the gate is only faulty on input $(1,0)$. The interesting case is $x = 1$, $y = 0$: each inner gate independently errs (outputs $1$) with probability $p$. The outer gate then outputs the wrong answer $1$ either when both inner gates err, so it sees $(1,1)$ and correctly computes $1$, or when only the first inner gate errs, so it sees $(1,0)$ and then itself errs; the case $(0,1)$ contributes nothing, since the gate is reliable there. Overall, we get the wrong answer $1$ with probability
$$
p \cdot p + p \cdot (1-p) \cdot p = p^2(2-p).
$$
Call that function $f(p)$. We conclude that this construction results in an $\land_{f(p)}$ gate.
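As a sanity check, the case analysis above can be verified by enumerating which gates err on input $(1,0)$ (a small Python sketch; the function name is mine):

```python
from itertools import product

def gadget_error(p):
    """Exact probability that the 3-gate gadget outputs the wrong
    answer on input x=1, y=0, by enumerating which gates err.

    Each inner gate sees (1,0), so it errs (outputs 1) with prob p.
    The outer gate errs only when its own inputs are (1,0).
    """
    total = 0.0
    # e1, e2: whether the first / second inner gate errs
    for e1, e2 in product([True, False], repeat=2):
        prob = (p if e1 else 1 - p) * (p if e2 else 1 - p)
        a, b = int(e1), int(e2)      # inner outputs on (1,0)
        if (a, b) == (1, 0):
            total += prob * p        # outer sees (1,0) and errs w.p. p
        elif a & b == 1:
            total += prob            # outer correctly computes AND = 1 (wrong answer)
    return total

# Matches f(p) = p^2 (2 - p)
assert abs(gadget_error(1/3) - (1/3)**2 * (2 - 1/3)) < 1e-12
```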
The function $f(p)$ is monotone increasing over $[0,1]$, and satisfies $f(p) \leq 2p^2 = (2p)^2/2$. Applying this bound twice, $f(f(p)) \leq 2f(p)^2 \leq 2\bigl((2p)^2/2\bigr)^2 = (2p)^4/2$. More generally, by induction, $f^{(t)}(p) \leq (2p)^{2^t}/2$.
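The doubly exponential decay is easy to observe by iterating $f$ numerically and comparing against the bound $(2p)^{2^t}/2$ (a quick sketch; the variable names are mine):

```python
def f(p):
    """Error probability of the three-gate gadget: f(p) = p^2 (2 - p)."""
    return p**2 * (2 - p)

p0 = 1/3
q = p0
for t in range(1, 8):
    q = f(q)                             # q = f^{(t)}(p0)
    bound = (2 * p0) ** (2 ** t) / 2     # claimed bound (2 p0)^{2^t} / 2
    assert q <= bound, (t, q, bound)
```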
Consequently, starting from your $\land_{1/3}$ gate (where $2p = 2/3 < 1$), applying this construction recursively $t = O(\log\log(1/\epsilon))$ times yields an $\land_q$ gate with error $q \leq \epsilon$. Since each application triples the number of gates, the resulting gadget has size $3^t = 2^{O(\log\log(1/\epsilon))} = \operatorname{polylog}(1/\epsilon)$.
To handle a circuit of size $S$, you need to choose $\epsilon = 1/(2S)$: by a union bound, the whole circuit then errs with probability at most $S\epsilon = 1/2$. This results in a blowup of $\operatorname{polylog}(S)$.
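To get a feel for the numbers, a short sketch (the circuit size $S$ and the names are illustrative) computes how many recursive applications are needed for $\epsilon = 1/(2S)$:

```python
def iterations_needed(p0, eps):
    """Number of recursive applications of the gadget needed to
    drive the error from p0 below eps, by direct iteration of f."""
    t, q = 0, p0
    while q > eps:
        q = q**2 * (2 - q)   # q <- f(q)
        t += 1
    return t

S = 10**6                    # hypothetical circuit size
eps = 1 / (2 * S)            # per-gate error target from the text
t = iterations_needed(1/3, eps)
size = 3**t                  # each application triples the gadget
# t == 5 here, so the gadget has 3**5 == 243 gates
```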