What happens when we try to ingest more documents into a Lucene index past its maximum limit of 2,147,483,519 documents?
I have read that performance starts to degrade as we approach 2 billion documents, but does Lucene simply stop accepting new documents once the limit is reached?
Also, how does Elasticsearch handle the same scenario when one of its shards reaches its document limit?