I have a single-node Elasticsearch cluster with only one index (I know, my bad) into which I inserted about 2B documents.
I did not know it was a best practice to split data across multiple indices, and mine grew to 400GB before it crashed.
I tried splitting the index with the split index API (https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-split-index.html), but I keep getting java.lang.OutOfMemoryError no matter what I do. I have already maxed out the physical memory, and the threads just get stuck in the _split operation.
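
For reference, this is roughly the kind of request I have been running (shown here as a small Python sketch using the requests library; the host, index names, and shard count are placeholders, not my real setup):

```python
import requests

ES = "http://localhost:9200"          # placeholder host
SOURCE = "my-big-index"               # placeholder: the 400GB index
TARGET = "my-big-index-split"         # placeholder: the split target

# 1. Block writes on the source index (the split API requires this)
requests.put(
    f"{ES}/{SOURCE}/_settings",
    json={"settings": {"index.blocks.write": True}},
).raise_for_status()

# 2. Split into more primary shards (the target count must be a
#    multiple of the source index's primary shard count)
resp = requests.post(
    f"{ES}/{SOURCE}/_split/{TARGET}",
    json={"settings": {"index.number_of_shards": 4}},
)
resp.raise_for_status()
print(resp.json())
```

The split request itself is accepted, but the node runs out of heap while the operation is in progress.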
The source files were deleted by Logstash once they were successfully indexed, so re-ingesting the data from the original files is not an option.
Any suggestions?