Hi, could anyone suggest how to fetch the value for a given key from a MapType column in Spark, when both the key and the value are of string type?

    |-- properties: map (nullable = true)
    |    |-- key: string
    |    |-- value: string (valueContainsNull = true)

I am actually reading a Hive table using HiveContext and trying to fetch data from a Hive map column, which Spark converts to MapType(StringType, StringType, true).

I have tried using getField, but it throws the following error:

    filtered.select($"properties".getField("transferred_bytes")).show()

Error message:

    org.apache.spark.sql.AnalysisException: GetField is not valid on fields of type MapType(StringType,StringType,true);
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.resolveGetField(Analyzer.scala:307)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$2.applyOrElse(Analyzer.scala:271)
        at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$7$$anonfun$applyOrElse$2.applyOrElse(Analyzer.scala:260)
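For context, `getField` is only valid on StructType columns; for a MapType column the usual accessors are `getItem(key)` or the `Column` apply syntax. The sketch below uses a hypothetical one-row DataFrame (the `transferred_bytes` data is made up) and the Spark 2.x+ `SparkSession` entry point rather than the `HiveContext` used in the question, so it illustrates the accessor rather than reproducing the 1.3.0 environment:

```scala
import org.apache.spark.sql.SparkSession

object MapAccessSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("map-access-sketch")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical stand-in for the Hive table's map<string,string> column.
    val filtered = Seq(
      Map("transferred_bytes" -> "1024", "status" -> "ok")
    ).toDF("properties")

    // getItem works on MapType columns (getField does not).
    filtered.select($"properties".getItem("transferred_bytes")).show()

    // Equivalent apply syntax on the Column.
    filtered.select($"properties"("transferred_bytes")).show()

    spark.stop()
  }
}
```

Whether this resolves the error on Spark 1.3.0 specifically is uncertain (the comments below report mixed results on that version); on modern Spark both forms return the map value, or null when the key is absent.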

ankush reddy
  • Works just fine for me. What version of Spark do you use? Also, you can try dot syntax as shown in [Querying Spark SQL DataFrame with complex types](http://stackoverflow.com/a/33850490/1560062). – zero323 Mar 14 '16 at 07:58
  • We are actually using Spark 1.3.0. I checked that example and it works for me as well, but in this particular scenario it does not. @DavidGriffin I used getItem() but to no avail. – ankush reddy Mar 14 '16 at 17:19

0 Answers