
Just want to get some ideas from anyone who has encountered a similar problem and how you came up with a solution.

Basically, we have around 10K documents stored in RavenDB, and we need the ability to let users filter and search those documents. I am aware that RavenDB has a maximum page size of 1024, so for the filter and search to work I need to do my own paging. But my solution gives me the following error:

The maximum number of requests (30) allowed for this session has been reached.

I have tried many different ways of disposing the session, by wrapping it in a using block and also by explicitly calling Dispose after every call to RavenDB, with no success.

Does anyone know how to get around this issue? What's the best practice for this kind of scenario?

var pageSize = 1024;   // RavenDB's maximum page size
var skipSize = 0;
var maxSize = 0;

// First request: count the documents so we know when to stop paging.
using (_documentSession)
{
    maxSize = _documentSession.Query<LogEvent>().Count();
}

while (skipSize < maxSize)
{
    using (_documentSession)
    {
        // One request per page of up to 1024 documents.
        var events = _documentSession.Query<LogEvent>().Skip(skipSize).Take(pageSize).ToList();

        // Also tried disposing explicitly, as described above.
        _documentSession.Dispose();

        // building finalPredicate code... which I am not providing here...

        // Apply the compiled predicate in memory against this page.
        results.AddRange(events.Where(finalPredicate.Compile()).ToList());

        skipSize += pageSize;
    }
}
– developer

  • andrew do your users need to be able to see more than 1024 documents in a single view? Can you post the code which shows how your `_documentSession` is created? – wal Nov 06 '14 at 02:17
  • I also noticed you are calling `ToList()` on this line: `var events = _documentSession...` <-- this is going to pull out 1024 documents *then* apply your conditions. You should try to pass the filters down to the query. – wal Nov 06 '14 at 02:21
  • Good point, that will optimize the query, but it won't actually solve the root problem, I think. – developer Nov 06 '14 at 02:43
  • There are ways to get around the request limit (30), however it exists simply to encourage the developer to think more about the issue at hand. You still did not reply as to whether you require more than 1024 docs in a single view to the client (i.e. same session)? Basically the framework is encouraging you to paginate your results to the client (in page sizes of 128 by default, but up to 1024 if required). – wal Nov 06 '14 at 02:47
  • For the client view, we don't need 1024 docs to be shown at once. If Raven wants to encourage developers to paginate, then it shouldn't just apply our predicates against the first 1024 documents and ignore the rest. Example: `_documentSession.Query().Where(finalPredicate.Compile()).ToList()`. If records that match the predicate are located at, say, position 2045, they won't be returned by that query. – developer Nov 06 '14 at 03:10
  • It *does* work that way. The predicate will be applied against ALL documents, not just the first 1024. As explained before, the fact that you are using `ToList()` prematurely breaks this. – wal Nov 06 '14 at 03:12
  • Hmm, let me try that again. I am pretty sure it didn't work, but I might be wrong. brb – developer Nov 06 '14 at 03:13
  • wal, I have tried it and I don't think Raven works that way. I have used this query: `_documentSession.Query().Where(finalPredicate.Compile()).ToList();` and that didn't give me anything back. I have to skip through a few 1024-document pages before I can get the query to return. If you have time, just try it out yourself, but make sure you have more than 1024 pages. Thanks – developer Nov 06 '14 at 03:25
  • Trust me, it does work the way I stated; something must be wrong with what you are doing. Try something more basic in your query, i.e. instead of `finalPredicate.Compile()` do `Where(e => e.Field == foo)` or similar, to return results you know are 'past' 1024. – wal Nov 06 '14 at 03:27
  • You should put the Skip/Take stuff after your Where clause, e.g. `var events = _documentSession.Query().Where(e => e.Field1 == foo).Skip(skipSize).Take(pageSize)` – wal Nov 06 '14 at 03:29
  • If Raven will scan all the documents, then I don't need the Skip and Take, because the matching results I have at the moment number fewer than 1024, but they are located quite a few 1024-document pages in. – developer Nov 06 '14 at 03:42
  • Remove your entire while loop and the Skip as well (but leave Take, as the default is 128); you will get results past the first 1,024 if all else is done properly. This is the way it works. – wal Nov 06 '14 at 03:46
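
Concretely, the approach wal describes in the last comment would look something like this (a sketch; `e.Level == "Error"` stands in for whatever filter your users build, and the predicate goes in as an expression rather than a Compile()d delegate so Raven can evaluate it server-side):

// No while loop and no Skip: the Where clause is translated and run
// against the index on the server, so it considers ALL documents,
// not just the first 1024 pulled to the client.
var matches = _documentSession.Query<LogEvent>()
    .Where(e => e.Level == "Error")   // assumed example filter
    .Take(128)                        // Raven's default page size
    .ToList();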

1 Answer


Raven limits the number of requests (Load, Query, ...) to 30 per session. This behavior is documented.

I can see that you dispose the session in your code, but I don't see where you recreate it. Anyway, loading data the way you intend to is not a good idea.
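
If you do want to keep your batching approach, each batch needs its own session, opened from the document store. A minimal sketch, assuming `_documentStore` is a reference to your singleton IDocumentStore:

while (skipSize < maxSize)
{
    // Every OpenSession() call starts a fresh session with its own
    // budget of 30 requests, so the limit is never reached.
    using (var session = _documentStore.OpenSession())
    {
        var events = session.Query<LogEvent>()
            .Skip(skipSize)
            .Take(pageSize)
            .ToList();

        results.AddRange(events.Where(finalPredicate.Compile()));
    }

    skipSize += pageSize;
}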

We're using indexes and paging, and never load more than 1024 documents at a time.
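
For illustration, that combination might look like the following (a sketch; the index name and the Level/Timestamp fields are assumed examples, not part of the question):

// Assumed example index over the fields users filter on.
public class LogEvents_ByLevel : AbstractIndexCreationTask<LogEvent>
{
    public LogEvents_ByLevel()
    {
        Map = events => from e in events
                        select new { e.Level, e.Timestamp };
    }
}

// Query the index and page server-side; the filter runs against the
// index, so only matching documents are ever transferred.
var page = session.Query<LogEvent, LogEvents_ByLevel>()
    .Where(e => e.Level == "Error")
    .OrderByDescending(e => e.Timestamp)
    .Skip(pageNumber * pageSize)
    .Take(pageSize)
    .ToList();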

If you're expecting thousands of documents, or your predicate logic can't be expressed as an index, and you don't care how long your query will take, use the unbounded results (streaming) API.

var results = new List<LogEvent>();
var query = session.Query<LogEvent>();

// Stream the query results in a single request; streamed documents
// are not tracked by the session, so neither the 1024 page-size cap
// nor the 30-request limit gets in the way.
using (var enumerator = session.Advanced.Stream(query))
{
    while (enumerator.MoveNext())
    {
        // Apply the predicate in memory, one document at a time.
        if (predicate(enumerator.Current.Document))
        {
            results.Add(enumerator.Current.Document);
        }
    }
}

Depending on the number of documents, this can use a lot of RAM.
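
Note that the stream itself holds only one document at a time; the memory pressure comes from accumulating every match in results. If you can process each document inside the loop instead of collecting them all, memory usage stays flat regardless of how many documents you stream.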

– dusky