I have SonarQube up and running with a Postgres database. I am using sonar-scanner to scan multiple pieces of source code for errors, which I have collected in another Postgres database. After scanning about 5 or 6 of them (each around 10 lines at most), the scanner starts failing with an "insufficient memory for JRE to continue" error, being unable to malloc around 300 MB of memory.
Is there a way to optimize SonarQube to minimize its memory usage, perhaps by clearing its caches each time I run the scanner? I believe it is eating up all my memory very quickly.
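From what I can tell, the scanner's JVM heap can be capped through the SONAR_SCANNER_OPTS environment variable and the server-side processes through conf/sonar.properties. I'm guessing at the exact property names for my SonarQube version, so treat the lines below as a sketch rather than something I know to be correct:

# cap the scanner JVM heap before running sonar-scanner (bash)
export SONAR_SCANNER_OPTS="-Xmx512m"

# server-side JVM options in conf/sonar.properties
# (property names may vary between SonarQube versions)
sonar.web.javaOpts=-Xmx512m -Xms128m
sonar.search.javaOpts=-Xmx512m -Xms128m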
UPDATE
I updated my Java from version "1.7.0_95" to version "1.8.0_77" and now I get a different error saying "Cannot allocate memory", along with the following:
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 65536 bytes for committing reserved memory.