4

I learned from this lecture that for the PageRank algorithm the following equation holds:

$$r^{i+1}=L r^{i}$$

I thought that when the vector $r$ converges we have $r^{i+1}=r^{i}$, and hence the equation becomes:

$$r=L r$$

which means that $r$ is just an eigenvector of $L$ with eigenvalue one, if I am not wrong. So to compute the result of PageRank we just need to find the eigenvector of a matrix with eigenvalue one. And since $L$ can be any matrix, does that imply that any matrix has an eigenvector with eigenvalue one?
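To check my understanding numerically, here is a minimal sketch of that iteration on a made-up $3$-page column-stochastic matrix (the matrix is just an example of mine, not the one from the lecture):

```python
import numpy as np

# A made-up 3-page link matrix: entry L[i, j] is the probability of jumping
# from page j to page i, so every column sums to 1.
L = np.array([
    [0.0, 0.5, 0.5],
    [1.0, 0.0, 0.5],
    [0.0, 0.5, 0.0],
])

# Power iteration: start with all rank on page 0 and repeat r <- L r.
r = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    r_next = L @ r
    if np.allclose(r_next, r, atol=1e-12):
        break
    r = r_next

print(r)       # converged ranking vector, roughly [0.333, 0.444, 0.222]
print(L @ r)   # the same vector again: L r = 1 * r, i.e. eigenvalue 1
```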

Alp Uzman
  • 12,209

4 Answers

5

This is true if $L$ is a positive Markov matrix, which is what we assume in PageRank (all entries positive, columns sum to 1). See this answer for a more in-depth explanation.
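One quick way to see why such a matrix always has $1$ as an eigenvalue (a standard argument, spelled out here for completeness): since every column of $L$ sums to $1$, the all-ones vector $\mathbf{1}$ satisfies

$$\mathbf{1}^{\top} L = \mathbf{1}^{\top},$$

so $1$ is an eigenvalue of $L^{\top}$; and since $L$ and $L^{\top}$ have the same eigenvalues, $1$ is an eigenvalue of $L$ as well.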

rb612
  • 3,710
2

I don't know the PageRank algorithm, but the answer is absolutely not. For example, the matrix $$ \begin{pmatrix} 3 & 0\\ 0 & 5 \end{pmatrix} $$ has, of course, eigenvalues 3 and 5. Its characteristic polynomial is $(x-3)(x-5)$ which, of course, doesn't have 1 as a root.
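If you want to verify this numerically, a quick sketch with NumPy:

```python
import numpy as np

# The diagonal counterexample above: its eigenvalues are 3 and 5, not 1.
A = np.array([[3.0, 0.0],
              [0.0, 5.0]])
print(np.linalg.eigvals(A))   # [3. 5.] -- no eigenvalue equal to 1
```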

I guess you misunderstood something. Definitely, the matrix $L$ cannot be "any" matrix.

0

The existence of a positive eigenvalue with an eigenvector whose entries are all positive is, in this case, guaranteed by the Perron-Frobenius theorem.

Perron-Frobenius theorem: If all entries of an $n \times n$ matrix $A$ are positive, then it has a unique maximal eigenvalue, and the corresponding eigenvector has positive entries.

Though this is not directly applicable here, in this case one assumes the matrix is irreducible, that is, from any page we can go to any other page via a sequence of links. Another way of saying this is that the graph must be (strongly) connected. If this holds, then there will be a power of the matrix with all entries positive, so the Perron-Frobenius theorem applies to that power of $A$, which in turn implies the result for $A$.
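As a concrete illustration (a small sketch; the $3$-page link matrix below is made up for the purpose):

```python
import numpy as np

# Column-stochastic link matrix of a strongly connected 3-page graph:
# every page can reach every other page through some chain of links.
A = np.array([
    [0.0, 0.5, 1.0],
    [1.0, 0.0, 0.0],
    [0.0, 0.5, 0.0],
])

# A itself has zero entries, but some power of it is entrywise positive
# (here the 5th power works), so Perron-Frobenius applies to that power.
print(np.linalg.matrix_power(A, 5) > 0)   # all True

# The Perron eigenvector of A: eigenvalue 1 with all-positive entries.
vals, vecs = np.linalg.eig(A)
k = np.argmin(np.abs(vals - 1.0))
v = np.real(vecs[:, k])
v = v / v.sum()
print(np.real(vals[k]), v)   # eigenvalue ~1, eigenvector with positive entries
```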

By the way, this is connected with Markov chain theory in probability, except that you have to take the transpose of the matrix to get the transition matrix of the Markov chain. The general theory of Markov chains then implies that there is a stationary distribution (the analogue of the eigenvector) under the same assumption of a connected graph. In fact, there is a fairly simple expression for the eigenvector in the case of an undirected graph.
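For completeness, the simple expression alluded to is (as far as I recall from Markov chain theory) the degree-proportional one: for a simple random walk on a connected undirected graph with edge set $E$, the stationary distribution is

$$\pi_i = \frac{\deg(i)}{2|E|},$$

and one checks directly that $\pi = L\pi$ when column $j$ of $L$ puts weight $1/\deg(j)$ on each neighbor of $j$.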

Rana
  • 582
0

Even for general matrices coming from networks of webpages it is not necessarily true that $1$ is an eigenvalue. Consider for instance the very simple network consisting of two webpages, $A$ and $B$, and there is only one link, namely $A\to B$. Then the associated matrix is

$$L = \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix}, $$

with $0$ as a repeated eigenvalue. This is where the "damping factor", mentioned toward the end of the video, comes into play; indeed the patented version of PageRank also has this detail (see https://patents.google.com/patent/US6285999B1/en). Ultimately, as others have mentioned, the adjustment with the damping factor is introduced so that Perron-Frobenius theory kicks in.
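Here is a small sketch of that adjustment on the two-page network above (the damping value $0.85$ and the uniform handling of the dangling page $B$ are standard choices, added here for illustration):

```python
import numpy as np

# Link matrix of the two-page network A -> B from above.
L = np.array([[0.0, 0.0],
              [1.0, 0.0]])
n = L.shape[1]
d = 0.85                       # damping factor (the usual choice)

# Page B has no outgoing links, so its column is all zeros; a common fix
# is to pretend it links to every page uniformly before damping.
L_fixed = L.copy()
for j in range(n):
    if L_fixed[:, j].sum() == 0:
        L_fixed[:, j] = 1.0 / n

# Damped matrix: every entry is now strictly positive and columns still sum
# to 1, so Perron-Frobenius gives a unique positive eigenvector for eigenvalue 1.
G = d * L_fixed + (1.0 - d) / n * np.ones((n, n))

r = np.full(n, 1.0 / n)
for _ in range(100):
    r = G @ r

print(r, G @ r)   # r is (approximately) fixed by G: G r = r
```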

Alp Uzman
  • 12,209