
I'm using a web crawler (Scrapy) in Python to continuously download data (words) and save each word as a node in Neo4j via py2neo, in real time. My code looks roughly like:

from py2neo import Graph

graph = Graph("http://localhost:7474/db/data/")

def create(graph, word):
    # MERGE semantics: return the existing :WORD node with this name,
    # or create it if it doesn't exist yet
    node = graph.merge_one("WORD", "name", word)
    return node

My web spider calls this create function every time it gets a word. However, I've found that the Java memory used by Neo4j keeps increasing as my spider crawls. Because the spider will run for weeks, I'm afraid it will run out of memory one day. I only create a node each time and never use that node afterwards, so why does the Java memory keep increasing? I suspect the created node objects stay alive on the Java side, but how can I release these unnecessary objects? Is there any way to keep the Java memory from growing?
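For context, this is roughly how such a create call is usually wired into a Scrapy item pipeline. This is only a sketch; the pipeline class and the `"word"` item field are hypothetical names, not from the original code:

```python
class WordPipeline:
    """Hypothetical Scrapy item pipeline that merges each scraped
    word into Neo4j via an injected create function."""

    def __init__(self, create_fn, graph):
        self.create_fn = create_fn  # e.g. the create() function above
        self.graph = graph

    def process_item(self, item, spider):
        # One MERGE round-trip per scraped word. The Python-side node
        # object returned by create_fn goes out of scope right away,
        # so nothing accumulates on the Python side.
        self.create_fn(self.graph, item["word"])
        return item
```

Note that any memory growth inside the Neo4j server process is independent of this Python-side code: the returned node objects are released by Python's garbage collector as soon as `process_item` returns.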

Thanks in advance

1 Answer


Sorry for this.

I just found that when the Java memory reaches a certain value, it automatically decreases and then starts increasing again, which looks like the normal sawtooth pattern of JVM garbage collection. Sorry for the question; I'm new to Neo4j and Java.
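For reference, the maximum heap the Neo4j JVM is allowed to grow to can also be capped explicitly in its configuration, which bounds the top of that sawtooth. A sketch, assuming a Neo4j 2.x install with the wrapper-based config (the exact file and key names vary by Neo4j version, so check your version's documentation):

```
# conf/neo4j-wrapper.conf (Neo4j 2.x style; values in MB)
wrapper.java.initmemory=512
wrapper.java.maxmemory=1024
```

With a cap like this, the JVM will garbage-collect more aggressively rather than let the heap grow past the configured maximum.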