I am trying to create a simple DiGraph in Python's NetworkX from a nested dictionary, but it looks like the built-in initialization doesn't build the final leaf nodes.
Toy example:
class_hierarchy = {-1: ["A", "B"],
                   "A": {"A1": [1], "A2": [3, 4]},
                   "B": {"B1": [5, 6], "B2": [7, 8]}}
Building the graph:
from networkx import DiGraph
G = DiGraph(class_hierarchy)
Now let's see what we have in it:
G.nodes
Out[86]: NodeView((-1, 'A', 'B', 'A1', 'A2', 'B1', 'B2'))
It looks like the final leaf nodes (1, 3, 4, 5, 6, 7, 8) are not added.
Checking it:
list(G.successors('A'))
Out[88]: ['A1', 'A2']
That looks reasonable.
But:
list(G.successors('A1'))
Out[89]: []
I am not sure why this is the case. The NetworkX documentation for DiGraph says:
incoming_graph_data (input graph (optional, default: None)) – Data to initialize graph. If None (default) an empty graph is created. The data can be any format that is supported by the to_networkx_graph() function, currently including edge list, dict of dicts, dict of lists, etc...
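If I understand the "dict of lists" format mentioned there correctly, a fully flattened version of the same data (just a manual reshaping for testing, not the structure I actually want to maintain) should pick up the leaves:
flat_hierarchy = {-1: ["A", "B"],
                  "A": ["A1", "A2"], "A1": [1], "A2": [3, 4],
                  "B": ["B1", "B2"], "B1": [5, 6], "B2": [7, 8]}
G2 = DiGraph(flat_hierarchy)
list(G2.successors("A1"))  # I expect [1] here
So my guess is that the nested form is not being interpreted the way I assumed.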
Any idea what I am doing wrong?
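For completeness, this is the workaround I am using in the meantime: a small recursive helper (add_hierarchy is just a name I made up) that walks the nested dict and adds the edges explicitly.
def add_hierarchy(G, parent, children):
    # children is either a dict of sub-hierarchies or a list of leaf values
    if isinstance(children, dict):
        for child, grandchildren in children.items():
            G.add_edge(parent, child)
            add_hierarchy(G, child, grandchildren)
    else:
        for leaf in children:
            G.add_edge(parent, leaf)

G = DiGraph()
for parent, children in class_hierarchy.items():
    add_hierarchy(G, parent, children)

list(G.successors("A1"))  # this gives [1], as I expected
It works, but I would still like to understand why the constructor doesn't do this for me.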