I have Kafka running on Confluent Cloud, where I produce data with a Node.js client. The data is sent as a plain string, and I see the following fields in Confluent Cloud.
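For reference, the producer side is nothing special; a minimal sketch of what I'm doing, assuming kafkajs (any client that sends a JSON string should behave the same, and the connection details are placeholders):

const { Kafka } = require('kafkajs')

// Placeholder connection details for the Confluent Cloud cluster.
const kafka = new Kafka({
  brokers: ['<bootstrap-server>'],
  ssl: true,
  sasl: { mechanism: 'plain', username: '<api-key>', password: '<api-secret>' },
})

const producer = kafka.producer()

async function send() {
  await producer.connect()
  await producer.send({
    topic: 'locations',
    messages: [
      {
        // The value is a JSON string; no schema is attached.
        value: JSON.stringify({
          phone_number: '919191919191',
          location: { lat: 78.233, lon: 60.23 },
          booked: false,
        }),
      },
    ],
  })
  await producer.disconnect()
}

send().catch(console.error)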
Then I created an Elasticsearch Sink Connector and connected it to Elasticsearch on Elastic Cloud. If I don't create any mapping in Elasticsearch, the data transfers successfully as expected, and an indexed document looks something like this:
"_source" : {
"booked" : false,
"phone_number" : "919191919191",
"location" : {
"lon" : 60.23,
"lat" : 78.233
}
}
Now the problem is that when I try to run any geo query, Elasticsearch won't allow it and gives the following error:
"root_cause" : [
{
"type" : "query_shard_exception",
"reason" : "failed to find geo_point field [location]",
"index_uuid" : "C8Xxu9QlTMKN4Lk1LjpOmQ",
"index" : "locations"
}
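For context, the queries I'm running are standard geo queries; a typical example (a plain geo_distance filter, nothing exotic) looks like this:

GET /locations/_search
{
  "query": {
    "bool": {
      "filter": {
        "geo_distance": {
          "distance": "10km",
          "location": {
            "lat": 78.233,
            "lon": 60.23
          }
        }
      }
    }
  }
}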
The reason is that dynamic mapping does not infer geo_point fields. So I tried to create a custom mapping for Elasticsearch while creating the index, as follows:
PUT /locations
{
  "mappings": {
    "properties": {
      "phone_number": {
        "type": "text"
      },
      "booked": {
        "type": "boolean"
      },
      "location": {
        "type": "geo_point"
      }
    }
  }
}
With this mapping in place, the Confluent connector fails and shows the following error:
There is a mapping collision in your index: Can't merge a non object mapping with an object mapping.
I have also tried booked as a text field, but that does not seem to work either. I haven't enforced any schema on Confluent Cloud. Here is some basic config from Confluent Cloud.
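Since the Cloud UI hides most of the underlying settings, the setup is roughly equivalent to the following self-managed sink config (property names assumed from the self-managed Elasticsearch Sink Connector; values are placeholders):

connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
topics=locations
connection.url=<elastic-cloud-endpoint>
connection.username=<elastic-user>
connection.password=<elastic-password>
# Plain JSON values with no attached schema
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
# Let Elasticsearch own the mapping instead of inferring one from a record schema
schema.ignore=true
key.ignore=true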
How can I enforce the mapping so that I can run geo queries in Elasticsearch?
UPDATE: This problem seems to come down to the format of the data being sent to Kafka. I have tried both of the following formats:
{
  "phone_number": "919191919191",
  "location": {
    "lat": 78.233,
    "lon": 60.23
  },
  "booked": false
}
{
  "phone_number": "+919191919190",
  "location": " 78.233, 60.23",
  "booked": false
}
Both formats fail to map to the mapping defined above in Elasticsearch, and the sink connector shows the following error:
Received Illegal Argument Exception from Elasticsearch: One of your fields' type does not match the mapped type in Elasticsearch
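For debugging, a document of the same shape can be indexed directly through the REST API to check whether the mapping itself accepts the format, taking the connector out of the picture (I haven't confirmed this is where the failure happens):

POST /locations/_doc
{
  "phone_number": "919191919191",
  "location": {
    "lat": 78.233,
    "lon": 60.23
  },
  "booked": false
}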