In the regression setting, the hat matrix $\bf{H}_{n \times n}=\bf{X} (\bf{X}^T\bf{X})^{-1}\bf{X}^T$ is an idempotent matrix.
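(Idempotency here is just the direct computation
$$\bf{H}^2 = \bf{X}(\bf{X}^T\bf{X})^{-1}\bf{X}^T\bf{X}(\bf{X}^T\bf{X})^{-1}\bf{X}^T = \bf{X}(\bf{X}^T\bf{X})^{-1}\bf{X}^T = \bf{H},$$
assuming $\bf{X}^T\bf{X}$ is invertible, which I take for granted throughout.)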
Why is $\boxed{\bf{H} \bf{P}= \bf{P}}$ true, where $\bf{P}$ is an $n \times n$ matrix with every entry equal to the same constant $p \in\mathbb{R}$, i.e., $P_{ij} = p$ for all $i,j$?
To be clear, an example of $\bf{P}$ (here with $p=1$) is the following matrix:
$\begin{pmatrix} 1&1&\ldots&1 \\ 1&1&\ldots&1 \\ \vdots&\vdots&\ddots&\vdots \\ 1&1&\ldots&1 \end{pmatrix}_{n \times n}$
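To make explicit the connection I am hoping to see: if $\bf{u}$ denotes the all-ones $n$-vector, then $\bf{P} = p\,\bf{u}\bf{u}^T$, so (if it is indeed true that $\bf{Hu}=\bf{u}$, as in the explanation quoted below) the boxed identity would follow from
$$\bf{H}\bf{P} = p\,\bf{H}\bf{u}\bf{u}^T = p\,(\bf{H}\bf{u})\bf{u}^T = p\,\bf{u}\bf{u}^T = \bf{P},$$
so my question really reduces to why $\bf{Hu}=\bf{u}$ holds.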
The motivation for this question is that I'm trying to prove the independence between SSE and SSR by showing
$\frac{1}{\sigma^2}SSE \equiv \left(\frac{\bf{y}}{\sigma}\right)^{T}(\bf{I}-\bf{H})\left(\frac{\bf{y}}{\sigma}\right)$ and $\frac{1}{\sigma^2}SSR \equiv \left(\frac{\bf{y}}{\sigma}\right)^{T}\left(\bf{H}-\frac{1}{n}\bf{J}\right)\left(\frac{\bf{y}}{\sigma}\right)$ are independent, since
$(\bf{I}-\bf{H})\,\bf{I}\,\left(\bf{H}-\frac{1}{n}\bf{J}\right) = \bf{0}$; proving this reduces (see the expansion below) to proving $\frac{1}{n}\bf{H}\bf{J} - \frac{1}{n}\bf{J}=\bf{0}$, i.e., $\bf{HJ}=\bf{J}$.
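For completeness, the expansion I have in mind uses $\bf{H}^2=\bf{H}$:
$$(\bf{I}-\bf{H})\,\bf{I}\,\left(\bf{H}-\tfrac{1}{n}\bf{J}\right) = \bf{H}-\tfrac{1}{n}\bf{J}-\bf{H}^2+\tfrac{1}{n}\bf{H}\bf{J} = \tfrac{1}{n}\left(\bf{H}\bf{J}-\bf{J}\right),$$
so the product vanishes exactly when $\bf{HJ}=\bf{J}$ (here $\bf{J}$ is the $n \times n$ all-ones matrix).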
I just replaced $\bf{J}$ with $\bf{P}$ in my question. I know that the columns of $\bf{J}$ lie in the column space of $\bf{H}$, but I still don't see how that answers my question.
Could anyone please elaborate on the following explanation for proving that the elements of any column (or row) of $\bf{H}$ sum up to 1?
The matrix $\bf{H}$ is the projection matrix onto the column space of $\bf{X}$. But the first column of $\bf{X}$ is all ones; denote it by $\bf{u}$. This implies that $\bf{Hu} = \bf{u}$: since $\bf{u}$ lies in the column space of $\bf{X}$, which equals the column space of $\bf{H}$, we can write $\bf{u} = \bf{Hw}$ for some $\bf{w}$, and then $\bf{Hu} = \bf{H}^2\bf{w} = \bf{Hw} = \bf{u}$ because a projection matrix is idempotent. The $i$-th coordinate of $\bf{Hu}$ is the sum of the elements of the $i$-th row of $\bf{H}$, so the claim is true for rows. By the symmetry of $\bf{H}$, it must also be true for columns. We are done.
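Writing out the coordinate step in that argument explicitly (my own unpacking, with $\bf{u}$ the all-ones vector as above):
$$(\bf{H}\bf{u})_i = \sum_{j=1}^{n} h_{ij}\,u_j = \sum_{j=1}^{n} h_{ij}, \qquad i=1,\dots,n,$$
so $\bf{Hu}=\bf{u}$ says precisely that every row of $\bf{H}$ sums to $1$; by the symmetry $\bf{H}^T=\bf{H}$ the same holds for columns, and $\bf{HJ}=\bf{J}$ (hence $\bf{HP}=\bf{P}$) would follow because every column of $\bf{J}$ equals $\bf{u}$.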