I am developing Spark jobs on my local machine and later deploying them to the cluster for the full run. I have created a common library that other people use in their code. In this code I have to use HiveContext for Spark SQL, which many people suggested offers a better SQL parser. I don't have admin access on my machine, so I am not able to create a HiveContext in local mode. The common code is shared as a jar, so we cannot manually switch between HiveContext and SQLContext for testing purposes. Is there any way I can create a common context that runs as a SQLContext in local mode and as a HiveContext in cluster mode, based on some parameter?
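Something along these lines is what I have in mind (a rough sketch only; `ContextFactory` and the `spark.useHive` flag are hypothetical names I made up, the actual switch would come from our job configuration):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.hive.HiveContext

object ContextFactory {
  // HiveContext extends SQLContext, so the shared library can be written against
  // SQLContext and still get the Hive SQL parser when a HiveContext is supplied.
  def sqlContext(sc: SparkContext, useHive: Boolean): SQLContext =
    if (useHive) new HiveContext(sc) else new SQLContext(sc)
}

object SparkTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("SparkTest")
      .setIfMissing("spark.master", "local[*]") // local IDE runs; spark-submit overrides this
    val sc = new SparkContext(conf)

    // Hypothetical parameter: only set to true when submitting to the cluster,
    // e.g. spark-submit --conf spark.useHive=true
    val useHive = sc.getConf.getBoolean("spark.useHive", defaultValue = false)

    val sqlContext = ContextFactory.sqlContext(sc, useHive)
    sqlContext.sql("SELECT 1").show()
  }
}
```

The idea is that local runs default to the plain SQLContext, while on the cluster we would pass `--conf spark.useHive=true` so the same jar picks up a HiveContext. I am not sure whether this is the right approach, hence the question.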
This is the error I get when trying to create a HiveContext in local mode:
```
17/03/16 13:11:53 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
17/03/16 13:11:54 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/03/16 13:11:54 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/03/16 13:11:55 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/03/16 13:11:55 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/03/16 13:11:55 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
17/03/16 13:11:55 INFO ObjectStore: Initialized ObjectStore
17/03/16 13:11:55 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
17/03/16 13:11:56 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
17/03/16 13:11:56 INFO HiveMetaStore: Added admin role in metastore
17/03/16 13:11:56 INFO HiveMetaStore: Added public role in metastore
17/03/16 13:11:56 INFO HiveMetaStore: No user is added in admin role, since config is empty
17/03/16 13:11:56 INFO HiveMetaStore: 0: get_all_databases
17/03/16 13:11:56 INFO audit: ugi= ip=unknown-ip-addr cmd=get_all_databases
17/03/16 13:11:56 INFO HiveMetaStore: 0: get_functions: db=default pat=*
17/03/16 13:11:56 INFO audit: ugi= ip=unknown-ip-addr cmd=get_functions: db=default pat=*
17/03/16 13:11:56 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
	at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:204)
	at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
	at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
	at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
	at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
	at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
	at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
	at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
	at SparkTest$.main(SparkTest.scala:11)
	at SparkTest.main(SparkTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
	at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
	at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
	at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
	... 17 more
```