I downloaded a graph via osmnx and I want to get the distance between all pairs of nodes.
I call nx.all_pairs_dijkstra_path_length(G, weight='length'), which returns a generator of (node, dict) pairs. I then store the result in a DataFrame and write it to .csv.
How do I cope with huge graphs (Stockholm, DC, Paris, etc.)?
I did this, which worked up to the size of Amsterdam:

import osmnx as ox
import networkx as nx
import pandas as pd

G = ox.graph_from_place('Amsterdam, Netherlands', network_type='drive')
skim_generator = nx.all_pairs_dijkstra_path_length(G, weight='length')  # generator of (node, dict) pairs
skim_dict = dict(skim_generator)  # materializes the full all-pairs result (this is what eats RAM)
skim = pd.DataFrame(skim_dict).fillna(_params.dist_threshold).T.astype(int)  # unreachable pairs -> threshold
skim.to_csv(_params.paths.skim, chunksize=20000)
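As far as I understand, the dense skim matrix is N × N, so memory grows quadratically with the node count. A rough back-of-the-envelope check (a sketch; the 8 bytes per entry assumes int64):

n = len(G.nodes)
approx_gb = n ** 2 * 8 / 1e9  # N x N entries, 8 bytes each (int64 assumption)
print(f'{n} nodes -> roughly {approx_gb:.1f} GB for a dense skim matrix')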
Yet with bigger networks this blew up my RAM. So I tried to tweak it to fit into memory by writing in chunks, but it is painstakingly slow. How can I improve this?
ret = dict()
first = True
j = 0
for source, lengths in nx.all_pairs_dijkstra_path_length(_inData.G, weight='length'):
    ret[source] = lengths
    j += 1
    if j % chunk == 0:  # flush every `chunk` source nodes to keep memory bounded
        print(j, _inData.nodes.shape[0])
        df = pd.DataFrame(ret).reindex(_inData.nodes.index).fillna(999999).astype(int)
        df.T.to_csv(path, mode='w' if first else 'a', header=first)
        first = False
        ret = dict()
if ret:  # flush the final partial chunk (otherwise the last sources are lost)
    df = pd.DataFrame(ret).reindex(_inData.nodes.index).fillna(999999).astype(int)
    df.T.to_csv(path, mode='w' if first else 'a', header=first)
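One direction I am considering (an untested sketch, not my current code) is to take the Dijkstra runs out of networkx and do them in batches with scipy.sparse.csgraph, streaming each batch straight to disk. The batch size of 1000 is an arbitrary assumption, and nx.to_scipy_sparse_array needs a newer networkx:

import networkx as nx
import numpy as np
from scipy.sparse.csgraph import dijkstra

# osmnx returns a MultiDiGraph; keep only the shortest parallel edge, since
# to_scipy_sparse_array would otherwise sum the weights of parallel edges
DG = nx.DiGraph()
for u, v, data in G.edges(data=True):
    w = data['length']
    if not DG.has_edge(u, v) or w < DG[u][v]['length']:
        DG.add_edge(u, v, length=w)

A = nx.to_scipy_sparse_array(DG, weight='length', format='csr')
n = A.shape[0]
batch = 1000  # assumed batch size; tune to available memory
with open(path, 'w') as f:
    for start in range(0, n, batch):
        idx = list(range(start, min(start + batch, n)))
        D = dijkstra(A, directed=True, indices=idx)  # shape (len(idx), n)
        D[np.isinf(D)] = 999999  # unreachable pairs, same sentinel as above
        np.savetxt(f, D, delimiter=',', fmt='%d')  # row i is source list(DG.nodes)[start + i]

This avoids building any Python dicts at all and only ever holds one batch of rows in memory, but I have not verified how it compares speed-wise on a city-sized network.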