I would appreciate your response to the query below.
I created a few tables in Vora (e.g. test, addresses). I was able to see these tables listed with SHOW DATASOURCETABLES
and query them. Later I restarted the Vora instance, logged back in as the vora user, and started the Vora Spark shell. I am aware that I won't see these tables in the new shell, since they are not present in the new Spark context. However, I came across a link which says that
<ClusterUtils.markAllHostsAsFailed()>
will load all tables into the Vora Spark context from the metadata. But despite executing the series of commands below,
scala> import org.apache.spark.sql._
import org.apache.spark.sql._
scala> val SapSqlSc = new SapSQLContext(sc)
scala> import com.sap.spark.vora.client
import com.sap.spark.vora.client
scala> client.ClusterUtils.markAllHostsAsFailed()
scala> SapSqlSc.sql(s"""
| SHOW DATASOURCETABLES
| USING com.sap.spark.vora
| OPTIONS
| (
| zkUrls "ip-x-x-x-1.ec2.internal:2181,ip-x-x-x-2.ec2.internal:2181",
| namenodeurl "ip-x-x-x-1.ec2.internal:8020"
| )
| """.stripMargin).collect
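For reference, this is the statement I understood from the documentation should re-register the existing tables in a new Spark context (a sketch only; I am assuming the `REGISTER ALL TABLES` syntax applies here, and the zkUrls/namenodeurl values are the same placeholder endpoints as above):

```scala
// Sketch, assuming the com.sap.spark.vora data source is on the classpath
// and the same SapSqlSc context created above. REGISTER ALL TABLES should
// recreate every table known to the Vora catalog in the current context.
SapSqlSc.sql("""
  REGISTER ALL TABLES
  USING com.sap.spark.vora
  OPTIONS
  (
    zkUrls "ip-x-x-x-1.ec2.internal:2181,ip-x-x-x-2.ec2.internal:2181",
    namenodeurl "ip-x-x-x-1.ec2.internal:8020"
  )
""")
```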
I got the error and exception below:
16/03/04 11:56:24 ERROR Datastore.Schema: Failed initialising database.
Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@3d3efa54, see the next exception for details.
:
:
Caused by: java.sql.SQLException: Failed to start database 'metastore_db' with class loader org.apache.spark.sql.hive.client.IsolatedClientLoader$$anon$1@3d3efa54, see the next exception for details.
at org.apache.derby.impl.jdbc.SQLExceptionFactory40.getSQLException(Unknown Source)
at org.apache.derby.impl.jdbc.Util.newEmbedSQLException(Unknown Source)