
Does anyone know why this happens and how to resolve it? I have very frequent updates and searches running at the same time.

Error opening new searcher.

Exceeded limit of maxWarmingSearchers=2, try again later

d4v1dv00

3 Answers


As per the Solr FAQ: What does "exceeded limit of maxWarmingSearchers=X" mean?

If you encounter this error a lot, you can (in theory) increase the number in your maxWarmingSearchers, but that is risky to do unless you are confident you have the system resources (RAM, CPU, etc...) to do it safely. A more correct way to deal with the situation is to reduce how frequently you send commits.

What this error means is that you are making commits too often and the internal caches can't keep up with the frequency at which you are saying "clear the cache and let me search with the new data". You need to decrease the frequency of your commits. You can find more information about this problem in Near Realtime Search Tuning, but the basic idea is: the more facets you use, the greater the interval you will need between commits.

One way I got around this was to stop making manual commits (i.e. having my application submit data to Solr and then execute a commit request) and to turn on Solr's autocommit.

Here's an example:

<!-- solrconfig.xml -->
<autoCommit>
  <maxDocs>10000</maxDocs> <!-- maximum uncommitted docs before autocommit triggered -->
  <maxTime>15000</maxTime> <!-- maximum time (in MS) after adding a doc before an autocommit is triggered -->
  <openSearcher>false</openSearcher> <!-- SOLR 4.0.  Optionally don't open a searcher on hard commit.  This is useful to minimize the size of transaction logs that keep track of uncommitted updates. -->
</autoCommit>

You will have to figure out how large an interval you need (i.e. maxTime), but in practice, every time I add more faceted search to my application (or more indexes, or what have you) I have to increase the interval.

If you need more real-time search than the frequency of these commits will allow, you can look into Solr soft commits.
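If you go that route, soft commits are enabled alongside the hard autoCommit above; a minimal sketch (the interval value here is illustrative, tune it for your workload):

```xml
<!-- solrconfig.xml: a soft commit opens a cheap searcher frequently for
     near-real-time visibility, while hard commits (above) flush to disk
     less often. -->
<autoSoftCommit>
  <maxTime>1000</maxTime> <!-- soft-commit at most once per second; adjust to taste -->
</autoSoftCommit>
```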

jcroll
  • Great answer. I also read from `solrconfig.xml` that instead of enabling autoCommit, consider using "commitWithin" when adding documents. Read more [here](http://wiki.apache.org/solr/UpdateXmlMessages#Passing_commit_and_commitWithin_parameters_as_part_of_the_URL) – Stanley Sep 10 '15 at 04:00

As is well explained here, you should reduce the number of commits you make, or change the value of maxWarmingSearchers in solrconfig.xml (which is not good practice).

Genjo
Samuele Mattiuzzo
  • They say: "that is risky to do unless you are confident you have the system resources (RAM, CPU, etc...) to do it safely" okay but how do you know you have enough resources?? – lizzie Aug 31 '12 at 10:07
  • Okay. I've been also looking into this and the user's guide says: "If you only encounter this error infrequently because of fluke situations, you'll probably be ok just ignoring it." – TTT Aug 31 '12 at 10:38
  • Getting this error in CF trying to index 35K documents. I found breaking down to adding the files individually with a `` gave my CPU enough time to finish its commits. Obviously 35K docs at an extra 1.5 seconds made for a very long indexing. I also changed the maxWarmingSearchers to 8 in the solrconfig.xml and in CF admin set Solr Server>Show Adv Set>Buffer Limit from 40 to 80. Monitored my Task Manager>Performance go from 58% to 10-28%. – gordon Sep 10 '13 at 21:37
  • Additionally, I found a static sleep() delay eventually became not long enough, so I had to count processed files and mathematically increase the delay as more files were added. hth – gordon Sep 12 '13 at 20:08
  • I'm doing load testing, does it mean that Solr can only handle a number proportional to `maxWarmingSearchers` number of concurrent requests? – WoLfPwNeR May 17 '16 at 00:34

As per https://wiki.apache.org/solr/CommitWithin:

There are multiple commit strategies in Solr. The most known is explicit commits from the client. Then you have AutoCommit, configured in solrconfig.xml, which lets Solr automatically commit adds after a certain time or number of documents, and finally there may be a behind-the-scenes commit when the input buffer gets full.

You can use CommitWithin to handle this problem. In Solr 3.5 and later, it can be as simple as:

server.add(mySolrInputDocument, 10000);

In earlier versions, you can use the code below:

UpdateRequest req = new UpdateRequest();
req.add(mySolrInputDocument);
req.setCommitWithin(10000);
req.process(server);

This reduces how frequently you send commits.
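commitWithin can also be passed as a URL parameter on a plain update request, if you are not using SolrJ; a sketch (the host, port, and core name "mycore" are placeholders for your own setup):

```shell
# Ask Solr to commit this document within 10 seconds, letting it batch
# the commit with other pending updates instead of committing per request.
curl "http://localhost:8983/solr/mycore/update?commitWithin=10000" \
  -H "Content-Type: text/xml" \
  --data-binary '<add><doc><field name="id">1</field></doc></add>'
```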

Chao Jiang