
Here's my query:

"query":{ 
  "query":{ 
     "bool":{ 
        "must":[ 
           { 
              "term":{ 
                 "device_id":"1"
              }
           },
           { 
              "range":{ 
                 "created_at":{ 
                    "gte":"2019-10-01",
                    "lte":"2019-12-10"
                 }
              }
           }
        ],
        "must_not":[ 
           { 
              "term":{ 
                 "action":"ping"
              }
           },
           { 
              "term":{ 
                 "action":"checkstatus"
              }
           }
        ]
     }
  },
  "sort":{ 
     "created_at":"desc"
  },
  "size":10,
  "from":0
}

The logs vary a lot, with completely different sets of fields. What can I do to search whether any of them contains a string that I'm looking for? Let's say I'm looking for "mit" and it shows me one log with

surname -> Smith

and the second one with

action -> "message comMITted"

I've tried using match [ '_all' => "mit" ], but I've heard the _all field is deprecated.

I'm using Elasticsearch 7.3.1


1 Answer


To search across all fields, use copy_to to copy the values of all fields into a single grouped field, and run your queries against that field. If your index mappings are dynamic, you can apply copy_to through a dynamic template.
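
A minimal mapping sketch, assuming an index called logs and a grouped field called all_text (both names are just examples): a dynamic template copies every string field into all_text, and a regular match query then searches everything at once.

PUT logs
{
   "mappings":{
      "dynamic_templates":[
         {
            "strings_to_all_text":{
               "match_mapping_type":"string",
               "mapping":{
                  "type":"text",
                  "copy_to":"all_text"
               }
            }
         }
      ],
      "properties":{
         "all_text":{ "type":"text" }
      }
   }
}

GET logs/_search
{
   "query":{
      "match":{ "all_text":"Smith" }
   }
}

With the standard analyzer this only matches whole tokens, so "Smith" is found but "mit" is not; that is what the ngram setup below is for.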

For infix matching (finding "mit" inside "Smith" or "comMITted"), you can use ngrams on that grouped field.
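
A sketch of the ngram variant, reusing the same logs / all_text names from above (again just example names and example gram sizes): the grouped field is indexed with an ngram analyzer and searched with the standard analyzer, so a query for "mit" matches both "Smith" and "comMITted".

PUT logs
{
   "settings":{
      "index":{ "max_ngram_diff":7 },
      "analysis":{
         "tokenizer":{
            "ngram_tokenizer":{
               "type":"ngram",
               "min_gram":3,
               "max_gram":10
            }
         },
         "analyzer":{
            "ngram_analyzer":{
               "tokenizer":"ngram_tokenizer",
               "filter":[ "lowercase" ]
            }
         }
      }
   },
   "mappings":{
      "dynamic_templates":[
         {
            "strings_to_all_text":{
               "match_mapping_type":"string",
               "mapping":{
                  "type":"text",
                  "copy_to":"all_text"
               }
            }
         }
      ],
      "properties":{
         "all_text":{
            "type":"text",
            "analyzer":"ngram_analyzer",
            "search_analyzer":"standard"
         }
      }
   }
}

GET logs/_search
{
   "query":{
      "match":{ "all_text":"mit" }
   }
}

The 3 to 10 gram range is only an example; the wider it is, the larger the index gets, and the difference between min_gram and max_gram must not exceed index.max_ngram_diff (default 1, raised to 7 here).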

Infix matching can also be done with a wildcard query, but this is not recommended for performance reasons.
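
For completeness, a wildcard version against the plain all_text field from the first sketch. The pattern itself is not analyzed, so keep it lowercase to line up with the standard analyzer's lowercased terms, and the leading * forces a scan over every term in the field, which is what makes it slow on large indices.

GET logs/_search
{
   "query":{
      "wildcard":{
         "all_text":{ "value":"*mit*" }
      }
   }
}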