
I've read this article, where it is explained that the first $k$ eigenvectors of a Laplacian matrix give an approximation (a relaxation) of RatioCut for $k$ clusters. To prove this, a matrix named $H$ is defined as follows: \begin{equation} h_{ij} = \begin{cases} \frac{1}{\sqrt{|A_j|}} & \text{if $v_i\in A_j$} \\ 0 & \text{otherwise} \end{cases} \end{equation}
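For concreteness, here is a small sketch of what $H$ looks like for a hypothetical partition (the number of nodes and the cluster labels below are made up for illustration). The $1/\sqrt{|A_j|}$ normalization makes the columns of $H$ orthonormal, i.e. $H^\top H = I$, which the RatioCut relaxation argument relies on:

```python
import numpy as np

# A hypothetical partition of n = 5 nodes into k = 2 clusters;
# labels[i] = j means node v_i belongs to cluster A_j.
labels = np.array([0, 0, 1, 1, 1])
n, k = len(labels), 2

# Build H with h_{ij} = 1/sqrt(|A_j|) if v_i is in A_j, else 0.
H = np.zeros((n, k))
for j in range(k):
    members = labels == j
    H[members, j] = 1.0 / np.sqrt(members.sum())

# Each row has exactly one non-zero entry, and the columns
# are orthonormal indicator vectors: H^T H = I_k.
print(np.allclose(H.T @ H, np.eye(k)))  # True
```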

It's shown that the matrix $U$ of the $k$ eigenvectors associated with the $k$ smallest eigenvalues is an approximation of the matrix $H$. To find the $k$ clusters of the graph, the article says we can treat every row of $U$ as a point in $\mathbb{R}^k$ and apply $k$-means to those points. My question is this: the matrix $H$ is very specific, since every row of it has exactly one non-zero value and all the others are zeros. To approximate the clusters, why don't we simply find the index of the maximum value in every row of $U$ and take that index as the cluster label for that node?
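To make the two procedures concrete, here is a minimal sketch on a made-up toy graph (the graph, the edge weights, and the fixed $k$-means initialization are my own assumptions, not from the article):

```python
import numpy as np

# Hypothetical toy graph: two triangles (nodes 0-2 and 3-5) joined by a weak edge.
W = np.zeros((6, 6))
for a, b in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5)]:
    W[a, b] = W[b, a] = 1.0
W[2, 3] = W[3, 2] = 0.1  # weak bridge between the two triangles

L = np.diag(W.sum(axis=1)) - W  # unnormalized graph Laplacian
vals, vecs = np.linalg.eigh(L)  # eigenvalues returned in ascending order
U = vecs[:, :2]                 # eigenvectors of the 2 smallest eigenvalues

# Rule proposed in the question: index of the row-wise maximum of U.
argmax_labels = U.argmax(axis=1)

# Procedure from the article: k-means on the rows of U
# (tiny hand-rolled version with fixed initial centers, to stay self-contained).
centers = U[[0, 5]].copy()
for _ in range(20):
    km_labels = ((U[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(1)
    centers = np.array([U[km_labels == j].mean(axis=0) for j in range(2)])

print(km_labels)      # separates the two triangles
print(argmax_labels)  # depends on the arbitrary signs of the eigenvectors
```

One relevant detail: each eigenvector is only determined up to sign (and, for repeated eigenvalues, the eigenbasis only up to rotation), so `argmax_labels` can change if a solver flips an eigenvector, whereas $k$-means on the rows is unaffected by such sign flips.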
