
I have some fairly large .graphml files (~7GB) and I would like to run some algorithms on these files using NetworkX. Whenever I try to read these graphml files with:

print "Reading in the Data...\n"
G = nx.read_graphml('%s' % path_string)
plt.title('%s Network' % name_string)
nx.draw(G)
plt.show()

I get the following output:

/usr/bin/python2.7 /home/user/PycharmProjects/G_Project/Graph.py
Reading in the Data...


Process finished with exit code 139

I'm assuming this happens because my computer runs out of memory when trying to open the file, but I was wondering, is there a way to work with large .graphml files and still use NetworkX?

I've gotten pretty used to NetworkX and find it useful, so if there is some sort of workaround for large graphml files I'd appreciate it.

user3708902
  • [This answer](http://stackoverflow.com/a/21661627/3001761) states that exit code 139 means fatal invalid memory access. Not sure that's necessarily an out-of-memory issue. – jonrsharpe Aug 02 '14 at 20:09
  • @jonrsharpe, I've tried this on many .graphml files that were around the same size, and also on much smaller .graphml files. For the small graphml files (just a few MB) I was able to read & plot them, but for the larger files (~7GB) I can't do so, so I'm led to believe that it isn't an issue with one specific .graphml file. – user3708902 Aug 03 '14 at 17:03
  • Following up on @jonrsharpe's suggestion, have you tried running a hardware memory check? – Back2Basics Aug 18 '14 at 10:59

1 Answer


I realize this is not a NetworkX answer, but I would suggest considering graph-tool. It supports the GraphML format as well.
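Loading a GraphML file with graph-tool looks roughly like the sketch below (the file path is just a placeholder, and this assumes graph-tool is installed and the graph still fits in memory once parsed):

import graph_tool.all as gt

# load_graph infers the format from the .graphml extension
g = gt.load_graph('/path/to/your_graph.graphml')

# basic sanity checks on what was loaded
print g.num_vertices()
print g.num_edges()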

You can check the performance comparison between graph-tool and NetworkX on the graph-tool website.

Note: I don't have enough reputation to comment, hence posting this as an answer.

learner