
Here is my code to write my DataFrame df into Phoenix:

df.write \
    .format("org.apache.phoenix.spark") \
    .mode("overwrite") \
    .option("table", "TABLETEST") \
    .option("zkUrl", "10.10.10.151:2181") \
    .save()
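
For reference, a minimal self-contained version of the job is below. The SparkSession setup and the two-row sample DataFrame are illustrative assumptions, not my exact code; TABLETEST is assumed to have matching ID and NAME columns.

from pyspark.sql import SparkSession

# Illustrative setup; in the real job the DataFrame comes from elsewhere.
spark = SparkSession.builder \
    .appName("phoenix-write-test") \
    .getOrCreate()

# Hypothetical sample data matching an assumed TABLETEST schema (ID, NAME).
df = spark.createDataFrame([(1, "a"), (2, "b")], ["ID", "NAME"])

# Same write as above.
df.write \
    .format("org.apache.phoenix.spark") \
    .mode("overwrite") \
    .option("table", "TABLETEST") \
    .option("zkUrl", "10.10.10.151:2181") \
    .save()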

On running the code, it shows the connection being established:

INFO ZooKeeper: Initiating client connection, connectString=10.10.10.151:2181 sessionTimeout=90000 watcher=hconnection-0x4fd269230x0, quorum=10.10.10.151:2181, baseZNode=/hbase
INFO ClientCnxn: Opening socket connection to server 10.10.10.151/10.10.10.151:2181. Will not attempt to authenticate using SASL (unknown error)
INFO ClientCnxn: Socket connection established to 10.10.10.151/10.10.10.151:2181, initiating session
INFO ClientCnxn: Session establishment complete on server 10.10.10.151/10.10.10.151:2181, sessionid = 0x1610f2bcaee003d, negotiated timeout = 90000

But then it shows the following error and keeps retrying the call indefinitely:

INFO RpcRetryingCaller: Call exception, tries=10, retries=35, started=68188 ms ago, cancelled=false, msg=row 'SYSTEM:CATALOG,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=bigdata-datanode,16020,1516377985241, seqNum=0
INFO RpcRetryingCaller: Call exception, tries=11, retries=35, started=88386 ms ago, cancelled=false, msg=row 'SYSTEM:CATALOG,,' on table 'hbase:meta' at region=hbase:meta,,1.1588230740, hostname=bigdata-datanode,16020,1516377985241, seqNum=0
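
To check whether the problem is specific to the write path, I could try a plain read through the same connector. This is only a sketch and assumes TABLETEST already exists in Phoenix:

# Diagnostic sketch: if this read hangs with the same RpcRetryingCaller
# retries, the problem is HBase/ZooKeeper connectivity (for example, the
# client failing to resolve the region server hostname 'bigdata-datanode'),
# not the write itself.
df_check = spark.read \
    .format("org.apache.phoenix.spark") \
    .option("table", "TABLETEST") \
    .option("zkUrl", "10.10.10.151:2181") \
    .load()
df_check.show()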

I also added the JARs and hbase-site.xml as follows.

/opt/spark/jars
... 
phoenix-core-4.12.0-HBase-1.2.jar
phoenix-spark-4.12.0-HBase-1.2.jar
hbase-common-1.2.0.jar
hbase-client-1.2.0.jar
hbase-protocol-1.2.0.jar
hbase-server-1.2.0.jar
...
/opt/spark/conf
...
hbase-site.xml
...
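
As a sanity check that hbase-site.xml is actually visible to the driver, one can look it up on the classpath via PySpark's (private) py4j gateway. This snippet is an assumed diagnostic, not part of my job:

# Prints a java.net.URL if hbase-site.xml is on the driver's classpath,
# or None if the Phoenix/HBase client cannot see it.
url = spark.sparkContext._jvm.Thread.currentThread() \
    .getContextClassLoader() \
    .getResource("hbase-site.xml")
print(url)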
