{
"_source": {
"enabled": false
},
"analysis": {
"analyzer": {
"default": {
"type": "custom",
"tokenizer": "uax_url_email",
"filter": "lowercase,standard,stop"
}
}
},
"mappings": {
"table": {
"properties": {
"field1": {
"type": "string",
"include_in_all": false,
"index": "no"
},
"field2": {
"type": "long",
"include_in_all": false,
"index": "no"
},
"field3": {
"type": "string",
"index": "analyzed"
}
}
}
}
}
The analyzer doesn't seem to work when I test it. It should do two things: skip English stop words (such as "is" and "and") and index an email address as a single token. However, when I use "TEST ANALYZER" on the text "Jack is fine", all three words are indexed, stop words included.
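For what it's worth, there are two things in my request body I am unsure about (both guesses on my part, not confirmed): whether the "analysis" block has to be nested under "settings" when creating the index, and whether "filter" has to be a JSON array rather than a comma-separated string. The variant I would try next looks like this:

```json
{
  "settings": {
    "analysis": {
      "analyzer": {
        "default": {
          "type": "custom",
          "tokenizer": "uax_url_email",
          "filter": ["lowercase", "stop"]
        }
      }
    }
  }
}
```

I also dropped "standard" from the filter list, since as far as I understand the standard token filter does nothing on its own. To check the result I would call the _analyze API against the index (index name here is just a placeholder), e.g. `GET /my_index/_analyze?analyzer=default&text=Jack is fine`, and look at whether "is" still shows up in the returned tokens.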