I am building a Rails web app with the elasticsearch-model gem and I would like to build a search query with filtering for my Place model. I succeeded in writing search by name; however, I would also like to be able to filter the search by city (in this example, London).

For now my query looks like this:

"query": {
    "bool": {
      "must": {
        "match": {
          "name": {
            "query": term,
            "operator": "and",
            "fuzziness": 1
          }
        }
      },
      "filter": {
          "term": {
              "city": "London"
          }
      }
    }
  }

and then I simply invoke Place.search("here goes query").records.to_a

Without the filter part the search works fine, but when I add the filter I don't get any results.
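
Roughly, this is how the query hash and the Place.search call fit together in my model (a minimal sketch; the class method name search_by_name_in_city and the way term and city get passed in are just placeholders, not my actual code):

class Place < ActiveRecord::Base
  include Elasticsearch::Model
  include Elasticsearch::Model::Callbacks

  # Placeholder wrapper around the query hash shown above.
  def self.search_by_name_in_city(term, city)
    search(
      query: {
        bool: {
          must: {
            match: {
              name: {
                query:     term,
                operator:  "and",
                fuzziness: 1
              }
            }
          },
          filter: {
            term: { city: city }
          }
        }
      }
    )
  end
end

# e.g. Place.search_by_name_in_city("coffee", "London").records.to_a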

This is the mapping for my Place search:

settings analysis: {
    filter: {
        ngram_filter: {
            type: "nGram",
            min_gram: 2,
            max_gram: 20
        }
    },
    analyzer: {
        ngram_analyzer: {
            type: "custom",
            tokenizer: "standard",
            filter: [
                "lowercase",
                "asciifolding",
                "ngram_filter"
            ]
        },
        whitespace_analyzer: {
            type: "custom",
            tokenizer: "whitespace",
            filter: [
                "lowercase",
                "asciifolding"
            ]
        }
    }
} do
  mappings dynamic: 'false' do
    indexes :name,
            type: "string",
            analyzer: "ngram_analyzer",
            search_analyzer: "whitespace_analyzer"
  end
end

Here is the link I was using to see how to filter: Elasticsearch doc

  • One common pitfall is that the data gets broken down into what Elasticsearch calls "tokens", based on the tokenizer in place, and the filter has to match the tokens that were created, not the data as it was originally indexed. For instance, after indexing a field with a value of "AD-13" I had tokens of "ad" and "13", so filtering on "AD-13" yielded no results. – bkunzi01 Oct 07 '16 at 16:48
  • I think that I fixed it a little bit; I added `indexes :city, type: "string"`. However, it only works when the city is a single lowercase word, so I think I have to work on the analyzer to fix it. – Marek Michalik Oct 07 '16 at 17:01
  • The analyzer is definitely the issue. You can specify not to analyze fields, which is what I did to avoid breaking values with spaces into separate tokens that search incorrectly. – bkunzi01 Oct 07 '16 at 17:04
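
As a follow-up to the tokenization point in the comments above, the _analyze API can be used to check exactly which tokens the index produces for a value (a sketch using the low-level client that elasticsearch-model exposes; parameter placement may differ slightly between Elasticsearch versions):

# Show the tokens the custom analyzer produces for a given value.
Place.__elasticsearch__.client.indices.analyze(
  index: Place.index_name,
  body:  { analyzer: "ngram_analyzer", text: "London" }
)
# If "London" gets split into n-grams, an exact term filter on "London" will not match.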

1 Answer

I had to define the city field in the mapping as a string and not perform any analysis on it when the tokens were built.

So in the mapping I added:

indexes :city,
        type: "string",
        index: "not_analyzed"