In our Sitecore 8.2 installation we use Solr 5.1.0 as the search indexing system. Recently we have been seeing errors like the following:
    [sitecore_analytics_index] org.apache.solr.common.SolrException; org.apache.solr.common.SolrException: Error opening new searcher
    Caused by: org.apache.lucene.store.AlreadyClosedException: this IndexWriter is closed
    Caused by: java.lang.OutOfMemoryError: Java heap space
What is the correct way to choose the heap size to give to Solr?
At the moment the only core that exceeds a few hundred megabytes is sitecore_analytics_index, which is 32.67 GB in size and has the following statistics (also retrievable via the Core Admin API, as shown after the list):
- Num Docs: 102015908
- Max Doc: 105114766
- Heap Memory Usage: -1
- Deleted Docs: 3098858
- Version: 5563749
- Impl: org.apache.solr.core.NRTCachingDirectoryFactory
- org.apache.lucene.store.NRTCachingDirectory: NRTCachingDirectory(lockFactory=org.apache.lucene.store.NativeFSLockFactory@2e51764c;maxCacheMB=48.0 maxMergeSizeMB=4.0)
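These figures were read from the core's page in the Solr admin UI. For completeness, the same numbers (numDocs, maxDoc, deletedDocs, index size) can be pulled from the Core Admin STATUS API; this is a sketch assuming the default standalone port 8983, adjust host/port for your install:

    # Fetch index statistics for the analytics core from the Core Admin API
    # (assumes Solr is reachable on localhost:8983; change as needed).
    curl "http://localhost:8983/solr/admin/cores?action=STATUS&core=sitecore_analytics_index&wt=json&indent=true"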
The server has 6 GB of RAM, 4 GB of which are dedicated to the Java heap. Below are some of the JVM arguments:
    -XX:+CMSParallelRemarkEnabled -XX:+CMSScavengeBeforeRemark -XX:+ParallelRefProcEnabled
    -XX:+PrintGCApplicationStoppedTime -XX:+PrintGCDateStamps -XX:+PrintGCDetails
    -XX:+PrintGCTimeStamps -XX:+PrintHeapAtGC -XX:+PrintTenuringDistribution
    -XX:+UseCMSInitiatingOccupancyOnly -XX:+UseConcMarkSweepGC -XX:+UseParNewGC
    -XX:CMSInitiatingOccupancyFraction=50 -XX:CMSMaxAbortablePrecleanTime=6000
    -XX:ConcGCThreads=4 -XX:MaxTenuringThreshold=8 -XX:NewRatio=3 -XX:ParallelGCThreads=4
    -XX:PretenureSizeThreshold=64m -XX:SurvivorRatio=4 -XX:TargetSurvivorRatio=90
    -Xms4G -Xmx4G -Xss256k -verbose:gc
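For reference, in a stock standalone Solr 5.x install these values would typically be set in solr.in.sh and picked up by bin/solr at startup; the snippet below is only a sketch of that wiring (our actual startup script may differ, and a Windows install would use solr.in.cmd instead):

    # solr.in.sh (sketch): where the heap and GC flags above would normally live.
    # SOLR_JAVA_MEM sets -Xms/-Xmx; GC_TUNE carries the CMS tuning flags.
    SOLR_JAVA_MEM="-Xms4g -Xmx4g"
    GC_TUNE="-XX:+UseConcMarkSweepGC -XX:+UseParNewGC \
             -XX:CMSInitiatingOccupancyFraction=50 -XX:+UseCMSInitiatingOccupancyOnly"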
Given this amount of data, what is the correct heap configuration for Solr?