
We have a local standalone Spark cluster (2.3.1) and are trying to connect to and read from a GCP BigQuery table.

df.show() throws a stack of errors:

---error---
m/spark-2.3.1-bin-hadoop2.7/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o59.showString.
: java.lang.IllegalStateException: Could not find TLS ALPN provider; no working netty-tcnative, Conscrypt, or Jetty NPN/ALPN available

Any pointers would help.

Command used to submit:

spark-2.3.1-bin-hadoop2.7/bin/spark-submit --packages com.google.cloud.spark:spark-bigquery-with-dependencies_2.11:0.17.1 pyspark-bq.py
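For context on what has been tried: this error typically means that gRPC (which the BigQuery connector uses for the BigQuery Storage API) cannot find a TLS ALPN provider on the JVM running the driver/executors. One commonly suggested workaround, sketched here as an assumption and not verified against this exact setup, is to add Conscrypt to the classpath so netty has a working ALPN implementation. The `org.conscrypt:conscrypt-openjdk-uber` coordinate and the `2.5.2` version are assumptions; check Maven Central for a version compatible with your JVM.

```shell
# Hypothetical workaround sketch: add Conscrypt as a TLS/ALPN provider
# alongside the BigQuery connector. Version 2.5.2 is an assumption;
# pick one compatible with your JVM from Maven Central.
spark-2.3.1-bin-hadoop2.7/bin/spark-submit \
  --packages com.google.cloud.spark:spark-bigquery-with-dependencies_2.11:0.17.1,org.conscrypt:conscrypt-openjdk-uber:2.5.2 \
  pyspark-bq.py
```

If this does not help, it is also worth checking which Java version the cluster runs, since older Java 8 builds lack native ALPN support.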

Thanks, V
