Hi, could anyone suggest how to fetch the value for a given key from a MapType column in Spark, when both the key and the value are of string type?
|-- properties: map (nullable = true)
| |-- key: string
| |-- value: string (valueContainsNull = true)
I am actually reading a Hive table using HiveContext and trying to fetch data from a column of the Hive map datatype, which Spark converts to MapType(StringType,StringType,true).
I have tried using getField, but it throws an error:
filtered.select($"properties".getField("transferred_bytes")).show()
Error message:

org.apache.spark.sql.AnalysisException: GetField is not valid on fields of type MapType(StringType,StringType,true);
  at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.resolveGetField(Analyzer.scala:307)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$2.applyOrElse(Analyzer.scala:271)
  at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$2.applyOrElse(Analyzer.scala:260)
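For reference, here is a minimal sketch that reproduces the schema and the failing call, using a toy in-memory DataFrame in place of my Hive table (the sample map values are made up; the column name and key match my real schema):

```scala
import org.apache.spark.sql.SQLContext

// Assumes an existing SparkContext `sc`, as in the spark-shell.
val sqlContext = new SQLContext(sc)
import sqlContext.implicits._

// Toy DataFrame with the same map<string,string> schema as my Hive table.
val filtered = Seq(
  Map("transferred_bytes" -> "1024", "status" -> "ok")
).toDF("properties")

filtered.printSchema()
// root
//  |-- properties: map (nullable = true)
//  |    |-- key: string
//  |    |-- value: string (valueContainsNull = true)

// This is the call that fails with the AnalysisException above:
filtered.select($"properties".getField("transferred_bytes")).show()
```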