
I have implemented the PageRank convergence check in the following way: summing the PageRank scores of all pages and comparing that sum to the previous iteration's sum. A friend of mine gave me this explanation: "If you look at the PageRank paper, they state that their implementation results in all scores summing to 1, and this sum will not change over multiple iterations. This is from the paper: 'Note that the PageRanks form a probability distribution over web pages, so the sum of all web pages' PageRanks will be one.'"

What is the right approach for determining whether the PageRank algorithm has converged? Please note that I am implementing PageRank in both Hadoop and Spark. Kindly advise; I am confused.

Vinayak

1 Answer


You can define convergence as in any iterative algorithm: if the change in the estimated parameters between iterations falls below a threshold, the algorithm has converged. For PageRank, the "parameters" are the individual page scores, so compare the scores page by page (e.g. via the L1 distance between successive score vectors), not their total. As your friend's quote points out, the total stays at 1 on every iteration, so comparing sums will never tell you anything.
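As an illustration, here is a minimal plain-Python sketch (not your Hadoop or Spark code; the graph, damping factor, and tolerance are made-up example values) of power-iteration PageRank that stops when the per-page L1 change drops below a threshold:

```python
# Sketch of a PageRank convergence check: compare scores page-by-page
# between iterations, since the sum of all scores stays ~1 throughout.

def pagerank(links, damping=0.85, tol=1e-6, max_iter=200):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}

    for _ in range(max_iter):
        new_ranks = {p: (1.0 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * ranks[p] / len(outs)
                for q in outs:
                    new_ranks[q] += share
            else:
                # dangling page: distribute its rank uniformly
                for q in pages:
                    new_ranks[q] += damping * ranks[p] / n

        # L1 distance between successive iterations -- NOT the raw sums:
        # sum(ranks) is ~1 every iteration, so it never signals convergence.
        delta = sum(abs(new_ranks[p] - ranks[p]) for p in pages)
        ranks = new_ranks
        if delta < tol:
            break
    return ranks

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
```

The same idea carries over to Spark or Hadoop: after each iteration, join the new and old rank RDDs (or compare the two score files) and sum the absolute per-page differences; stop when that sum is below your tolerance.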

  • For the Google PageRank algorithm, will taking the total PageRank of all pages in the second iteration and comparing it to the first iteration work? – Vinayak Dec 10 '16 at 06:26