What algorithms exist for processing a large graph on a regular desktop computer (say, 8-16 GB of RAM)?
I need to process a fairly large graph (to compute PageRank), and under these constraints it does not fit entirely into RAM.
I would like to know what algorithms exist for this, or at least which direction would be best to start studying. As I understand it, graph partitioning algorithms could help me here, but it is not clear how, given that the entire graph cannot be built in the program at once.
Perhaps there are algorithms that compute PageRank on each part of the graph separately and then combine the results.
UPD:
To be more specific: the task is to compute PageRank on a large graph, and the computation is done in a Python program. The graph is built from the data with networkx, and the PageRank calculation is performed with networkx as well. The problem is the RAM limit: the entire graph does not fit into memory.
So I wonder whether there are algorithms that would let me compute PageRank on pieces smaller than the original graph (subgraphs?) and still obtain the result for the whole graph.