I am using Elasticsearch and FSCrawler to index and search about 7 TB of data. The process starts well but then stalls after some time. I suspect it is running out of memory, so I am trying to increase the heap size following https://fscrawler.readthedocs.io/en/latest/admin/jvm-settings.html, but I keep getting the error "invalid maximum heap size".
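For reference, this is roughly what I am running, a minimal sketch based on that docs page (the 16g heap value and the job name `my_job` are just placeholders for my actual setup):

```
# Set the JVM options FSCrawler reads before launching (per the linked docs)
export FS_JAVA_OPTS="-Xms16g -Xmx16g"

# Start the crawler for the job
bin/fscrawler my_job
```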
Is this the right way to set the heap size? What am I missing?