Let $g\in \mathcal C^\infty_c (\mathbb R)$ be a test function. Set $$f(t,x)=g(x)\,\text{sgn}(t-x),$$ where $$\text{sgn}(x)=\begin{cases}1&x>0\\0&x=0\\-1&x<0.\end{cases}$$
I have a theorem that says: if
1) $x\mapsto f(t,x)$ is in $L^1(\mathbb R)$ for every $t$,
2) for a.e. $(t,x)\in \mathbb R^+\times \mathbb R$, $\frac{\partial }{\partial t}f(t,x)$ exists and $\left|\frac{\partial f}{\partial t}(t,x)\right|\leq h(x)$ for some $h\in L^1$,
then
$$F(t)=\int_{\mathbb R}f(t,x)\,dx$$ is differentiable and $$F'(t)=\int_{\mathbb R}\frac{\partial }{\partial t}f(t,x)\,dx.$$
Now, $\frac{\partial f}{\partial t}(t,x)$ certainly exists for almost all $t$ and almost all $x$: indeed, $f$ is differentiable in $t$ on $\mathbb R^2\setminus \{(x,x)\mid x\in\mathbb R\}$, and there the derivative is $0$. So, by the theorem, we should have $$F'(t)=0.$$ However, in my solution they write $F'(t)=2g(t)$, and I really don't see how (well, I do see that they used the fact that the derivative of the sign function in the sense of distributions is $2\delta_0$). But then the theorem above cannot be correct as stated, since it would give derivative $0$. So what is wrong with the hypotheses here?
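For completeness, the value $2g(t)$ can also be seen directly, without distributions: writing
$$F(t)=\int_{\mathbb R}g(x)\,\text{sgn}(t-x)\,dx=\int_{-\infty}^{t}g(x)\,dx-\int_{t}^{+\infty}g(x)\,dx,$$
the fundamental theorem of calculus (with $g$ continuous) gives
$$F'(t)=g(t)-(-g(t))=2g(t),$$
so the value $2g(t)$ is certainly correct, which only makes the contradiction with the theorem more puzzling to me.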