I am trying to read a table from BigQuery using PySpark.
I have tried the following:
table = 'my-project-id.project-dataset.test_table_spark'
df = spark.read.format('bigquery').option('table', table).load()
However, I am getting this error:
: java.lang.ClassNotFoundException: Failed to find data source: bigquery. Please find packages at http://spark.apache.org/third-party-projects.html
How can I read the BigQuery table from PySpark? (At the moment I'm using Python 2.)
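From what I've read, this error may mean the spark-bigquery connector jar is not on the classpath, so perhaps I need to launch PySpark with the connector package. Something like the following, where the artifact coordinate and version are my guesses and would need to match the Spark/Scala version in use:

```shell
# Launch PySpark with the spark-bigquery connector on the classpath.
# The coordinate/version below are assumptions; choose the release
# that matches your Spark and Scala versions.
pyspark --packages com.google.cloud.spark:spark-bigquery-with-dependencies_2.11:0.17.3
```

The same could presumably be done for a standalone script via `spark-submit --packages ...` or by setting `spark.jars.packages` in the Spark configuration. Is that the right approach?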