
I am using Elasticsearch and FSCrawler to index and search about 7 TB of data. The process starts well but stalls after some time. I suspect it is running out of memory, so I am trying to increase the heap size following https://fscrawler.readthedocs.io/en/latest/admin/jvm-settings.html, but I keep getting the error "invalid maximum heap size".


Is that the right way of setting up the heap? What am I missing?
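For reference, the FSCrawler docs linked above set the heap through the `FS_JAVA_OPTS` environment variable. A minimal sketch (the `4g` value and the `my_job` job name are just examples, not from the question):

```shell
# Set the JVM heap for FSCrawler via FS_JAVA_OPTS (per the linked docs).
# -Xms/-Xmx are standard JVM flags for initial/maximum heap size.
# 4g is illustrative only - size it to your machine's available RAM.
export FS_JAVA_OPTS="-Xms4g -Xmx4g"
bin/fscrawler my_job   # my_job is a hypothetical job name
```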

dadoonet
Denn

1 Answer


I think you are using the 32-bit version of Java, which cannot allocate a large heap. If that's the case, you need to install the 64-bit JVM and make sure to update your JAVA_HOME to point to the new installation.
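One quick way to check (a sketch, assuming `java` is on your PATH): a 64-bit HotSpot/OpenJDK build includes "64-Bit" in its version banner, while a 32-bit build does not and typically caps `-Xmx` well below 4 GB.

```shell
# Print the JVM version banner and report the bitness.
# A 64-bit build prints e.g. "OpenJDK 64-Bit Server VM";
# a 32-bit build omits the "64-Bit" marker.
if java -version 2>&1 | grep -q "64-Bit"; then
    echo "64-bit JVM"
else
    echo "32-bit JVM - large -Xmx values will fail"
fi
```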

More detailed info can be found here.

Val