
I have an API (via hug) that sits between a UI and Elasticsearch.

The API uses elasticsearch-py to run searches, for example:

from elasticsearch import Elasticsearch
import hug

es = Elasticsearch([URL], http_compress=True)

@hug.post('/search')
def search(body):
    return es.search(index='index', body=body)

This works fine; however, I cannot figure out how to obtain a compressed JSON result.

Elasticsearch is capable of this because a curl test checks out — the following returns a mess of characters to the console instead of JSON and this is what I want to emulate:

curl -X GET -H 'Accept-Encoding: gzip' <URL>/<INDEX>/_search
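
For reference, here is a minimal stdlib-only sketch (not specific to Elasticsearch; the JSON payload is made up for illustration) of why the compressed response prints as a mess of characters: gzip output is a binary stream that always begins with the magic bytes 0x1f 0x8b.

```python
import gzip
import json

# A made-up stand-in for a JSON search response body.
body = json.dumps({"hits": {"total": 1}}).encode("utf-8")
compressed = gzip.compress(body)

# A gzip stream starts with the magic bytes 0x1f 0x8b -- this binary data is
# the "mess of characters" a terminal shows when the response stays compressed.
print(compressed[:2])                       # b'\x1f\x8b'
print(gzip.decompress(compressed) == body)  # True
```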

I've tried the approach here to modify the HTTP headers, but interestingly the "Accept-Encoding": "gzip" header is already present; it just doesn't appear to be passed through to Elasticsearch, because the result is always uncompressed.

Lastly, I'm passing http_compress=True when creating the Elasticsearch instance; however, this only compresses the request payload, not the response.

Has anyone had a similar struggle and figured it out?

Chris
  • Hello Chris! Any chance you have found the solution to this? Thank you! – Yibin Lin Mar 05 '20 at 18:36
  • @YibinLin I'm afraid not :( – Chris Mar 06 '20 at 20:37
  • I used the following custom header in the Elasticsearch constructor: Elasticsearch([es_address], headers={"Accept-Encoding": "gzip,deflate"}) And it seems like it's automatically de-compressing the response from the ES server. May I ask how you determine whether the response is un-compressed? I am using elasticsearch==7.1.0 in case it is useful. – Yibin Lin Mar 06 '20 at 20:59
  • Frankly, I don't recall the details — this was over 6 months ago and we were able to move forward without this. Is your approach working? – Chris Mar 11 '20 at 13:11
  • It seems that by changing the headers as mentioned in my previous comment, the response is gzipped. However, the performance/response time was even slower than with plain (non-zipped) JSON, so we ended up using the plain JSON response. – Yibin Lin Mar 12 '20 at 16:57
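
Regarding the question in the comments about telling whether a response body is actually compressed: one stdlib-only heuristic is to check for the gzip magic bytes. Note that elasticsearch-py transparently decompresses responses before handing them back, so this check only makes sense against the raw bytes on the wire (captured with a proxy or packet capture); the raw variable below is a made-up stand-in for such captured bytes.

```python
import gzip

def looks_gzipped(payload: bytes) -> bool:
    """Heuristic check: a gzip stream starts with the magic bytes 0x1f 0x8b."""
    return payload[:2] == b"\x1f\x8b"

# Stand-in for bytes captured off the wire (e.g. via a proxy or tcpdump).
raw = gzip.compress(b'{"took": 3}')

print(looks_gzipped(raw))             # True  (compressed wire bytes)
print(looks_gzipped(b'{"took": 3}'))  # False (plain JSON bytes)
```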
