I want to get the 10,001st document from Elasticsearch.
How can I overcome the 10k result window limit?
http://localhost:9200/_search?size=2&from=9999
This request returns the following error:
{
  "error" : {
    "root_cause" : [
      {
        "type" : "query_phase_execution_exception",
        "reason" : "Result window is too large, from + size must be less than or equal to: [10000] but was [10001]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."
      }
    ],
    "type" : "search_phase_execution_exception",
    "reason" : "all shards failed",
    "phase" : "query",
    "grouped" : true,
    "failed_shards" : [
      {
        "shard" : 0,
        "index" : ".kibana",
        "node" : "UWl8qQL8QomaoALoHI3BUw",
        "reason" : {
          "type" : "query_phase_execution_exception",
          "reason" : "Result window is too large, from + size must be less than or equal to: [10000] but was [10001]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."
        }
      }
    ],
    "caused_by" : {
      "type" : "query_phase_execution_exception",
      "reason" : "Result window is too large, from + size must be less than or equal to: [10000] but was [10001]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window] index level setting."
    }
  },
  "status" : 500
}
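
The error message itself names two ways forward: use the scroll API (or search_after) for deep paging, or raise the index.max_result_window index setting. As a minimal sketch, assuming an index named my-index (substitute your own index name) and that the extra heap cost of deep from/size paging is acceptable, the limit can be raised per index like this:

PUT http://localhost:9200/my-index/_settings
{
  "index" : {
    "max_result_window" : 20000
  }
}

After that, the original from=9999&size=2 request against that index should no longer hit the window check. For genuinely large result sets, the scroll API or search_after (which requires a sort) is the more efficient route the error suggests.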