
I am trying to retrieve Hive tables from Spark (Scala). The Hive cluster, the Spark cluster, and my local machine all run the same JDK version (8).

    var tableData = objHiveContext.hql("select * from intervalmeterdemandbymonth c")
    println(tableData)   // Line 1
    var rowData = tableData.collect()
    println(rowData)

Line 1 prints the table's columns fine, but the `collect()` call fails with a serialVersionUID mismatch, i.e. incompatible classes on the classpath:

    WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, SERVERIP): java.io.InvalidClassException: org.apache.spark.sql.catalyst.expressions.SpecificMutableRow; local class incompatible: stream classdesc serialVersionUID = -2362733443263122426, local class serialVersionUID = -8202316371712574867
        at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:621)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1623)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
        at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1707)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1345)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
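An `InvalidClassException` like the one above means the driver and the executors are loading two different builds of the same class: when a `Serializable` class does not pin `serialVersionUID` explicitly, the JVM derives one from the class's structure, so differently compiled jars disagree. A small plain-Java sketch (the nested class `Row` is made up for illustration) showing how to print the UID the local classpath will use; running the equivalent `ObjectStreamClass.lookup` on both the driver and a cluster node reveals whether the two sides match:

```java
import java.io.ObjectStreamClass;
import java.io.Serializable;

public class SerialVersionCheck {
    // Hypothetical stand-in for any Serializable class from the stack trace.
    // Here the UID is pinned explicitly; when the field is absent, the JVM
    // computes one from the class's shape, so two different builds of the
    // same library can produce two different values.
    static class Row implements Serializable {
        private static final long serialVersionUID = 42L;
        int value;
    }

    public static void main(String[] args) {
        // lookup() reports the serialVersionUID the local classpath will use
        // when deserializing instances of the class.
        ObjectStreamClass desc = ObjectStreamClass.lookup(Row.class);
        System.out.println(Row.class.getName()
                + " serialVersionUID = " + desc.getSerialVersionUID());
        // prints: SerialVersionCheck$Row serialVersionUID = 42
    }
}
```

If the value printed locally differs from the one printed on a cluster node for the same class, the jars are from different builds and need to be aligned.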

sudhir
  • So... is there a question somewhere? – Samson Scharfrichter Dec 23 '15 at 16:13
  • @SamsonScharfrichter I'm unable to retrieve the data (or loop through it). How can I read my data from the object tableData after loading it from the HiveContext? I'm getting the above exception. – sudhir Dec 24 '15 at 04:01
  • I suggest that you search StackOverflow for keywords `InvalidClassException serialVersionUID` and also that you search the Spark JIRAs for bugs affecting your version (that you did not care to mention). Good luck. Sounds like a nasty Java config / compilation issue. – Samson Scharfrichter Dec 24 '15 at 10:01
  • Unable to resolve the issue. I tried adding a @SerialVersionUID(23L) annotation to my object through various trial and error, but couldn't get it to work. Using Hadoop version 2.7.1, Spark 1.4.1, Scala version 2.10.4. In my pom.xml I have org.apache.spark spark-core_2.10 1.4.1 – sudhir Dec 28 '15 at 10:14
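For reference, this serialVersionUID mismatch usually means the Spark jars on the submitting machine were built differently from the ones installed on the cluster, so the usual fix is to align the dependency versions and mark them `provided` so that only the cluster's own Spark classes are used at runtime. A sketch of the relevant pom.xml fragment, assuming the artifacts and versions named in the last comment:

```xml
<!-- Versions taken from the comment above (Spark 1.4.1, Scala 2.10).
     "provided" scope keeps the application jar from shipping its own
     copy of Spark, so the cluster's classes are used at runtime. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-hive_2.10</artifactId>
  <version>1.4.1</version>
  <scope>provided</scope>
</dependency>
```

The version here must match the Spark version actually deployed on the cluster, not just the one that compiles locally.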

0 Answers