Let $B$ be a standard Brownian motion and let $L$ be the local time of $B$ at $0$, defined via Tanaka's formula \begin{equation} L_t = |B_t| - \int_0^t \operatorname{sgn}(B_s) \, dB_s \ . \end{equation} I want to prove that $L_t$ increases only on the zero set of $B$. I was given the hint that it suffices to show that, for fixed rationals $r < r'$, $B \neq 0$ on $[r,r']$ implies $L_r = L_{r'}$. I don't quite see how restricting to rationals helps here.
Edit: What I tried: Let $p<q$ be rationals and suppose $B_s \neq 0$ for all $s \in [p,q]$. By continuity of $B$ (intermediate value theorem), either $B_s > 0$ for all $s \in [p,q]$ or $B_s < 0$ for all $s \in [p,q]$, so $\operatorname{sgn}(B_s)$ is constant on $[p,q]$. Then \begin{equation} L_{q} - L_p = | B_{q} | - | B_p |- \int_p^q \operatorname{sgn}(B_s) \, dB_s = 0, \end{equation} since the integral equals $\pm(B_q - B_p) = |B_q| - |B_p|$.
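To make the last step explicit, here is the case $B_s > 0$ on $[p,q]$ spelled out (the negative case is symmetric, with both signs flipped):

```latex
% If B_s > 0 for all s in [p,q], then sgn(B_s) = 1 there, so
\[
\int_p^q \operatorname{sgn}(B_s)\, dB_s
  = \int_p^q dB_s
  = B_q - B_p
  = |B_q| - |B_p|,
\]
% and therefore
\[
L_q - L_p = |B_q| - |B_p| - \bigl(|B_q| - |B_p|\bigr) = 0.
\]
```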
Is this approach correct? And why wouldn't the same argument work directly for arbitrary reals $u < v$?