// Index-time analyzer: Turkish lowercasing, apostrophe, ASCII folding and edge n-grams.
CustomAnalyzer productIndexAnalyzer = new CustomAnalyzer
{
    Tokenizer = "standard",
    Filter = new List<string> { TURKISH_LOWERCASE, "apostrophe", "asciifolding", ENGRAM_FILTER }
};

// Create the index with the custom analysis chain and map Product;
// the Suggest property is mapped as a completion field.
IIndicesOperationResponse indicesOperationResponse = _elasticClientFactory.Create().CreateIndex(SearchConstants.ProductIndex, c => c
    .Analysis(analysis => analysis
        .TokenFilters(tf => tf
            .Add(TURKISH_LOWERCASE, turkishLowercaseTokenFilter)
            .Add(ENGRAM_FILTER, edgeNgramTokenFilter))
        .Analyzers(a => a
            .Add(PRODUCT_SEARCH_ANALYZER, productSearchAnalyzer)
            .Add(PRODUCT_INDEX_ANALYZER, productIndexAnalyzer)))
    .AddMapping<Product>(m => m.MapFromAttributes().Properties(props => props
        .Completion(s => s
            .Name(p => p.Suggest)
            .MaxInputLength(20)
            .Payloads()
            .PreservePositionIncrements()
            .PreserveSeparators()))));
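
The token filter instances and the search-time analyzer referenced above (turkishLowercaseTokenFilter, edgeNgramTokenFilter, productSearchAnalyzer) are not shown here; a minimal sketch of how they might be defined with NEST 1.x follows. The constant values and n-gram settings are assumptions, not the actual configuration.

// Sketch only: constant values and filter parameters below are assumptions.
const string TURKISH_LOWERCASE = "turkish_lowercase";
const string ENGRAM_FILTER = "engram_filter";
const string PRODUCT_SEARCH_ANALYZER = "product_search_analyzer";
const string PRODUCT_INDEX_ANALYZER = "product_index_analyzer";

// Turkish-aware lowercasing via the lowercase filter's language option.
var turkishLowercaseTokenFilter = new LowercaseTokenFilter { Language = "turkish" };

// Edge n-grams for prefix matching at index time; gram sizes are illustrative.
var edgeNgramTokenFilter = new EdgeNGramTokenFilter { MinGram = 2, MaxGram = 20 };

// Search-time analyzer: same chain as the index analyzer, but without edge n-grams.
CustomAnalyzer productSearchAnalyzer = new CustomAnalyzer
{
    Tokenizer = "standard",
    Filter = new List<string> { TURKISH_LOWERCASE, "apostrophe", "asciifolding" }
};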

The resulting mapping from the NEST client in Elasticsearch:

"suggest": {
              "type": "completion",
              "analyzer": "simple",
              "payloads": true,
              "preserve_separators": true,
              "preserve_position_increments": true,
              "max_input_length": 20
           },

I then changed productIndexAnalyzer by adding the turkish_keywords token filter and indexed again with the NEST client as follows. The Suggest property is still meant to be mapped as type completion.

// Updated index-time analyzer, now also including the turkish_keywords filter.
CustomAnalyzer productIndexAnalyzer = new CustomAnalyzer
{
    Tokenizer = "standard",
    Filter = new List<string> { TURKISH_LOWERCASE, "apostrophe", "asciifolding", ENGRAM_FILTER, "turkish_keywords" }
};
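
For reference, turkish_keywords is typically defined as a keyword_marker token filter in the index's analysis settings (as in Elasticsearch's Turkish analyzer documentation). A hedged sketch of registering it with NEST 1.x, assuming a KeywordMarkerTokenFilter type with a Keywords property and an illustrative keyword list:

// Sketch only: the keyword list is an assumption; a keywords_path file could be used instead.
var turkishKeywordsTokenFilter = new KeywordMarkerTokenFilter
{
    Keywords = new List<string> { "örnek" }
};

// Registered next to the other filters when recreating the index:
// .TokenFilters(tf => tf
//     .Add(TURKISH_LOWERCASE, turkishLowercaseTokenFilter)
//     .Add(ENGRAM_FILTER, edgeNgramTokenFilter)
//     .Add("turkish_keywords", turkishKeywordsTokenFilter))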

But when I add the turkish_keywords filter, the suggest mapping is distorted: instead of a completion type it comes out as an object with string properties.

The resulting mapping from the NEST client in Elasticsearch:

"suggest": {
              "properties": {
                 "input": {
                    "type": "string"
                 },
                 "output": {
                    "type": "string"
                 },
                 "weight": {
                    "type": "long"
                 }
              }
           }

I'm using NEST version 1.7.0 and Elasticsearch version 1.7.3.

hezarfen
  • What version of NEST are you using and what version of Elasticsearch are you running against? – Russ Cam Mar 23 '16 at 00:38
  • I'm using NEST Version 1.7.0 and Elasticsearch Version 1.7.3 @RussCam – hezarfen Mar 23 '16 at 07:03
  • How are you updating the definition for `PRODUCT_INDEX_ANALYZER` (to include the `turkish_keywords`)? – Russ Cam Mar 24 '16 at 03:23
  • I added "turkish_keywords" filter to productIndexAnalyzer for updating @RussCam – hezarfen Mar 24 '16 at 06:22
  • No, I mean __how__ are you performing the update? In your question, you don't show how you are updating. Are you closing the index, applying the analyzer change, then opening the index again? When you apply the change, are you also sending the mappings again? If you could show a complete, concise example of what you're doing, it would really help :) – Russ Cam Mar 24 '16 at 06:42
  • I just delete the existing index with NEST and then recreate the same index with the analyzer. @RussCam – hezarfen Mar 24 '16 at 07:59
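
Per the last comment, the analyzer change is applied by deleting and recreating the index. A minimal sketch of that flow, assuming the NEST 1.x DeleteIndex descriptor API:

// Sketch only: assumes DeleteIndex accepts a descriptor targeting the index by name.
var client = _elasticClientFactory.Create();
client.DeleteIndex(d => d.Index(SearchConstants.ProductIndex));
// ...followed by the same CreateIndex call shown above, now including the
// "turkish_keywords" filter in the analysis settings.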
