
I am interested in running Newman's modularity clustering algorithm on a large graph. If you can point me to a library (or R package, etc) that implements it I would be most grateful.

best ~lara

  • You may also try on stats.stackexchange.com . – mbq Aug 19 '10 at 15:57
  • For those interested in answer: http://stats.stackexchange.com/questions/1915/newmans-modularity-clustering-for-graphs – mbq Aug 24 '10 at 21:50

3 Answers


Use the igraph package for R: http://igraph.sourceforge.net/doc/R/fastgreedy.community.html . It implements a fast greedy algorithm for community detection using the Newman-Girvan modularity maximization method.

Your code will look like this:

library(igraph)
# Read the graph from an edge-list file
G <- read.graph("edgelist.txt", format = "ncol")
fgreedy <- fastgreedy.community(G, merges = TRUE, modularity = TRUE)
# Cut the dendrogram at the merge step of maximum modularity
memberships <- community.to.membership(G, fgreedy$merges,
                                       steps = which.max(fgreedy$modularity) - 1)
print(paste('Number of detected communities =', length(memberships$csize)))
# Community sizes:
print(memberships$csize)
# Maximum modularity:
max(fgreedy$modularity)
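If you would rather work in Python, the same fast greedy modularity approach (Clauset-Newman-Moore) is available in networkx. A minimal sketch, using the built-in karate club graph as a stand-in for your own edge list:

```python
# Sketch: fast greedy modularity clustering in Python via networkx.
# greedy_modularity_communities implements the Clauset-Newman-Moore
# fast greedy method, the same approach as igraph's fastgreedy.community.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.karate_club_graph()  # stand-in for a graph read from your edge list
communities = greedy_modularity_communities(G)
print("Number of detected communities =", len(communities))
print("Community sizes:", sorted(len(c) for c in communities))
```

For a real data set you would replace the karate club graph with `nx.read_edgelist("edgelist.txt")`.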
Roja

I'm not quite sure whether the open-source data visualization tool Gephi implements Newman's algorithm. As far as I know, it uses the algorithm from the paper "Fast unfolding of communities in large networks" (the Louvain method).

That is also a modularity-based method.
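For reference, the Louvain method from that paper is also available outside Gephi; for example, recent versions of networkx expose it. A sketch, assuming networkx 2.8 or newer:

```python
# Sketch: Louvain modularity clustering ("Fast unfolding of communities
# in large networks") via networkx. Requires networkx >= 2.8.
import networkx as nx
from networkx.algorithms.community import louvain_communities

G = nx.karate_club_graph()  # stand-in for a real graph
partition = louvain_communities(G, seed=42)  # seed makes the result repeatable
print("Number of communities:", len(partition))
```

The result is a list of node sets, one per detected community.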

zihaolucky

There's a method in the excellent networkx package that returns a Newman-Watts-Strogatz small-world graph.
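Note that this is a graph generator rather than a community-detection routine: it builds a ring of n nodes, connects each node to its k nearest neighbours, and then adds shortcut edges with probability p. A minimal sketch with illustrative parameters:

```python
# Sketch: generating a Newman-Watts-Strogatz small-world graph with networkx.
# Unlike the Watts-Strogatz model, edges are only added, never rewired,
# so the underlying ring lattice (and connectivity) is preserved.
import networkx as nx

G = nx.newman_watts_strogatz_graph(n=100, k=4, p=0.1, seed=42)
print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```

It does not address the original modularity-clustering question directly, though you could feed the generated graph into a community-detection algorithm.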

Richard Careaga