I'm trying this with both the DataFrame API and the RDD API:
import org.elasticsearch.spark._  // brings sc.esRDD into scope

// Connection and read settings (sensitive values redacted)
val map = collection.mutable.Map[String, String]()
map("es.nodes") = "redacted"
map("es.port") = "redacted"
map("es.nodes.wan.only") = "true"
map("es.net.http.auth.user") = "redacted"
map("es.net.http.auth.pass") = "redacted"
map("es.net.ssl") = "true"
map("es.mapping.date.rich") = "false"
map("es.read.field.include") = "data_scope_id"

val rdd = sc.esRDD("index name", map)
rdd.take(1)
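For completeness, the DataFrame attempt is roughly the following (assuming the spark session from spark-shell and the same options map; org.elasticsearch.spark.sql is the es-hadoop data source name):

// DataFrame API attempt -- hits the same exception
val df = spark.read
  .format("org.elasticsearch.spark.sql")
  .options(map)
  .load("index name")
df.show(1)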
But whatever I try, I get this error:
EsHadoopIllegalArgumentException: invalid map received dynamic=strict
I've tried limiting the fields being read with es.read.field.include, but even if I select a single field that I'm sure has no variants, I still get this error.
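The message suggests the connector trips while parsing the index mapping, so one diagnostic step is to fetch the mapping over the REST API and see where dynamic: strict is declared. A minimal sketch, with host, index name, and credentials as placeholders:

import java.net.{HttpURLConnection, URL}
import java.util.Base64
import scala.io.Source

// Placeholders -- substitute the real (redacted) values
val url = new URL("https://es-host:9200/index-name/_mapping")
val conn = url.openConnection().asInstanceOf[HttpURLConnection]
val auth = Base64.getEncoder.encodeToString("user:pass".getBytes("UTF-8"))
conn.setRequestProperty("Authorization", s"Basic $auth")
val mapping = Source.fromInputStream(conn.getInputStream).mkString
println(mapping)  // search the JSON for "dynamic": "strict"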
How can I work around this? I'd be glad for any advice.
Versions:
- elasticsearch-hadoop 7.13.4
- Spark 3.1.2
- Scala 2.12
Clarification
This is about reading from Elasticsearch in Spark, not indexing.