Let $X_{0}=1$ and define the Markov chain $(X_{n})_{n\geq 0}$ by the transition probabilities $p_{01}=1$, $p_{k,k+1}=p$, and $p_{k,k-1}=1-p=q$ for $k\geq 1$, where $p$ is some fixed number in $(0,1)$.
I want to show that if $p\leq\frac{1}{2}$, then $\frac{X_{n}}{n}\xrightarrow{a.s.} 0$ and if $p>\frac{1}{2}$, then $\frac{X_{n}}{n}\xrightarrow{a.s.}p-q$.
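As a quick sanity check (not a proof), here is a small simulation of the chain as defined above that one can run to eyeball the claimed limits; `simulate` is just an illustrative helper, and the parameter choices are arbitrary.

```python
import random

def simulate(p, n_steps, seed=0):
    """Run the reflected walk: from 0 jump to 1 surely;
    from k >= 1 go to k+1 w.p. p and to k-1 w.p. q = 1 - p."""
    rng = random.Random(seed)
    x = 1  # X_0 = 1
    for _ in range(n_steps):
        if x == 0:
            x = 1  # p_{01} = 1
        else:
            x += 1 if rng.random() < p else -1
    return x

n = 10**6
for p in (0.3, 0.5, 0.7):
    print(f"p = {p}: X_n/n = {simulate(p, n) / n:.4f} (p - q = {2 * p - 1:.1f})")
```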
Intuitively, I understand why this must be the case. By comparison with the simple asymmetric random walk, I have shown that the chain is transient for $p>\frac{1}{2}$, positive recurrent for $p<\frac{1}{2}$, and null recurrent for $p=\frac{1}{2}$.
Therefore, if $p>\frac{1}{2}$, the chain $X_{n}$ hits $0$ only finitely many times almost surely.
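For completeness, the standard gambler's-ruin fact I am using here: for the walk with up-probability $p$ started at $1$,
$$P_{1}(\text{hit } 0)=\min\left(1,\frac{q}{p}\right),$$
so for $p>\frac{1}{2}$ each visit to $0$ is followed by a forced jump to $1$, from which the chain returns to $0$ with probability $\frac{q}{p}<1$; the total number of visits to $0$ is therefore geometric, and in particular finite almost surely.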
For $p>\frac{1}{2}$, I tried writing $X_{n}=X_{n}-X_{T_{\mathrm{last}}^{0}}$, where $T_{\mathrm{last}}^{0}$ denotes the last time $X_{n}$ hits $0$ (so that $X_{T_{\mathrm{last}}^{0}}=0$), and then studying $\frac{X_{n}-X_{T_{\mathrm{last}}^{0}}}{n}$. The problem is that $T_{\mathrm{last}}^{0}$ is not a stopping time, so I cannot apply the strong Markov property.
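One reformulation I considered instead, which avoids $T_{\mathrm{last}}^{0}$ entirely (I am not sure it is the intended route): realize the chain from i.i.d. steps $\xi_{1},\xi_{2},\dots$ with $P(\xi_{i}=1)=p$ and $P(\xi_{i}=-1)=q$, setting $X_{n}=X_{n-1}+\xi_{n}$ when $X_{n-1}\geq 1$ and $X_{n}=1$ when $X_{n-1}=0$. Induction on $n$ then gives
$$X_{n}=1+\sum_{i=1}^{n}\xi_{i}+\sum_{i=1}^{n}(1-\xi_{i})\mathbf{1}_{\{X_{i-1}=0\}},$$
and the strong law of large numbers yields $\frac{1}{n}\sum_{i=1}^{n}\xi_{i}\to p-q$ a.s., so everything reduces to the boundary term, which is at most $\frac{2}{n}\#\{i\leq n:X_{i-1}=0\}$. Since the number of visits to $0$ is finite a.s. when $p>\frac{1}{2}$, this would settle the transient case, but I do not see how to control the boundary term when $p\leq\frac{1}{2}$.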
Similarly, for $p\leq\frac{1}{2}$, I can see that the chain hits $0$ infinitely often, and hence $X_{n}(\omega)$ should grow sublinearly almost surely (it cannot actually stay bounded: the chain is irreducible and recurrent, so it visits every state infinitely often). But I am failing to prove this rigorously.
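In the positive recurrent case $p<\frac{1}{2}$, unless I have miscomputed, the detailed-balance equations $\pi_{0}=q\pi_{1}$ and $p\pi_{k}=q\pi_{k+1}$ for $k\geq 1$ give the stationary distribution
$$\pi_{0}=\frac{q-p}{2q},\qquad \pi_{k}=\frac{\pi_{0}}{q}\left(\frac{p}{q}\right)^{k-1},\quad k\geq 1,$$
so the ergodic theorem at least controls the fraction of time spent at $0$; but I do not see how to turn this into a statement about $\frac{X_{n}}{n}$, and for $p=\frac{1}{2}$ there is no stationary distribution at all.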
Any help is appreciated.