
I understand that with a low damping factor some nodes will rarely be reached, while a high damping factor slows the algorithm down and causes the random walk to get stuck in 'sinks', so a middle ground is preferable. However, is there a precise reason for the value 0.85? Would 0.8, 0.9, or 0.75 work just as well?
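To make the convergence part of that trade-off concrete, here is a minimal sketch (a made-up 4-page graph of my own, nothing standard) that runs power iteration for a few damping factors and counts the iterations needed to converge; since the error shrinks roughly by a factor of d per step, a larger d means more iterations:

```python
# Toy illustration (a made-up 4-page graph, not a standard example): power
# iteration for PageRank at several damping factors, counting how many
# iterations are needed to converge. Larger d converges more slowly.
import numpy as np

# Column-stochastic link matrix: entry [i, j] is the probability of moving
# from page j to page i by following an out-link of j.
P = np.array([
    [0.0, 0.5, 0.0, 0.0],
    [1.0, 0.0, 0.0, 1.0],
    [0.0, 0.5, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
])

def pagerank(P, d, tol=1e-10, max_iter=10_000):
    n = P.shape[0]
    r = np.full(n, 1.0 / n)          # start from the uniform distribution
    teleport = np.full(n, 1.0 / n)   # uniform teleportation vector
    for k in range(1, max_iter + 1):
        r_next = d * (P @ r) + (1 - d) * teleport
        if np.abs(r_next - r).sum() < tol:
            return r_next, k
        r = r_next
    return r, max_iter

for d in (0.75, 0.85, 0.9):
    ranks, iters = pagerank(P, d)
    print(f"d = {d}: {iters} iterations, PageRank = {np.round(ranks, 3)}")
```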

Alp Uzman
justworks

1 Answer


As mentioned in the PageRank article:

This residual probability, d, is usually set to 0.85, estimated from the frequency that an average surfer uses his or her browser's bookmark feature.

So it is based on human behaviour, which is quite random to begin with. But there is a mathematical way to study this: in the paper PageRank as a Function of the Damping Factor (which also has accompanying slides), the authors give a mathematical analysis:

In this paper, we give the first mathematical analysis of PageRank when α changes.

They do this by studying the dominant eigenvector for PageRank,

$$r(\alpha) = (1 - \alpha)\, v\, (I - \alpha P)^{-1},$$

where $\alpha \in [0,1)$ is the damping factor, $v$ is the preference (teleportation) vector, and $P$ is the row-stochastic transition matrix of the web graph, and they state the open question

Is there a (deep and yet undiscovered) analytical reason behind the magical value 0.85?
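As a rough illustration of how $r(\alpha)$ behaves as $\alpha$ varies (a minimal sketch with a made-up 4-node row-stochastic $P$ and a uniform preference vector $v$, not an example from the paper), one can evaluate the closed form directly for $\alpha = 0.75, 0.85, 0.9$ and compare the resulting vectors and rankings:

```python
# Minimal sketch (my own toy graph, not from the paper): evaluate
# r(alpha) = (1 - alpha) * v * (I - alpha * P)^(-1) directly and compare
# the resulting PageRank vectors for several damping factors.
import numpy as np

# Row-stochastic transition matrix: row i holds the out-link probabilities of page i.
P = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.0, 0.0, 1.0],
    [0.0, 1.0, 0.0, 0.0],
])
n = P.shape[0]
v = np.full(n, 1.0 / n)   # uniform preference (teleportation) vector

def pagerank_closed_form(alpha):
    """r(alpha) as a row vector; the inverse exists for alpha < 1."""
    return (1 - alpha) * v @ np.linalg.inv(np.eye(n) - alpha * P)

for alpha in (0.75, 0.85, 0.90):
    r = pagerank_closed_form(alpha)
    print(f"alpha = {alpha}: r = {np.round(r, 3)}, ranking = {np.argsort(-r).tolist()}")
```

The explicit inverse is of course only feasible for tiny examples; real implementations compute PageRank iteratively, e.g. by power iteration.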

Thomas Kojar