I'm using this mapping:
settings index: { number_of_shards: 1, number_of_replicas: 1 },
         analysis: {
           analyzer: {
             custom_analyzer: {
               type: "custom",
               tokenizer: "standard",
               filter: ["lowercase", "asciifolding", "custom_unique_token", "custom_tokenizer"]
             }
           },
           filter: {
             custom_word_delimiter: {
               type: "word_delimiter",
               preserve_original: "true"
             },
             custom_unique_token: {
               type: "unique",
               only_on_same_position: "false"
             },
             custom_tokenizer: {
               type: "nGram",
               min_gram: "3",
               max_gram: "10",
               token_chars: [ "letter", "digit" ]
             }
           }
         } do
  mappings dynamic: 'false' do
    indexes :searchable, analyzer: "custom_analyzer"
    indexes :year
  end
end
And this query (Rails app):
search(query: { match: { searchable: { query: params[:text_search], minimum_should_match: "80%" } } }, size: 100)
My main problem is that the app always returns 100 documents (the maximum requested). Of these 100 documents, only the first 10 or 15 are relevant; the others are far too distant from the search term.
I tried to:
- increase the max_ngram from 3 to 10
- raise minimum_should_match up to 99%

...but I always get 100 results.
I don't really understand why, for example, when I search for "Boucab", I get about 15 good results first, but I also get "Maucaillou" around the 99th place. How can I cut off these low-relevance matches?
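To see why "Maucaillou" matches "Boucab" at all, it helps to look at the grams your ngram filter produces. Here is a hypothetical pure-Ruby sketch (not Elasticsearch itself) of how a lowercased term is split into trigrams with min_gram 3; any single shared gram is enough for a nonzero score, which is why distant words still appear at the tail of the result list:

```ruby
# Hypothetical illustration of ngram tokenization with min_gram = 3.
# This mimics what the "nGram" filter does to each term, to show
# why unrelated words can still overlap on a gram or two.
def trigrams(word)
  word.downcase.each_char.each_cons(3).map(&:join)
end

query_grams  = trigrams("Boucab")      # ["bou", "ouc", "uca", "cab"]
result_grams = trigrams("Maucaillou")  # ["mau", "auc", "uca", ...]

shared = query_grams & result_grams
puts shared.inspect                    # ["uca"] -- one shared gram gives
                                       # "Maucaillou" a small nonzero score
```

So the 99th-place hits are not a bug in scoring: they genuinely share at least one gram with the query, just with a much lower score than the top hits.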
My app is multilingual.
How can I avoid displaying results with poor scores? Do I need to use the min_score parameter? Is it the only solution?
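For reference, here is a minimal sketch of the search body with min_score added, which is one way to drop low-scoring hits. The 0.5 cutoff is purely an assumption for illustration; Elasticsearch scores are not normalized, so the right threshold has to be found by inspecting the _score values of your own good and bad hits:

```ruby
# Sketch: search body with a min_score cutoff. The 0.5 value is a
# hypothetical threshold, not a recommendation -- tune it against
# the actual _score values returned for your queries.
def search_body(text)
  {
    min_score: 0.5,   # assumed cutoff; hits scoring below this are dropped
    size: 100,
    query: {
      match: {
        searchable: {
          query: text,
          minimum_should_match: "80%"
        }
      }
    }
  }
end

body = search_body("Boucab")
```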