I'm new to Spark and working with PySpark 3.0.1; the Python version used by Spark is 3.6.x. I have SQL files containing MERGE statements stored in a Google Cloud Storage bucket. I am trying to pass the contents of those SQL files to spark.sql. Can someone help me with how to achieve this using SparkSession?
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName("sample").getOrCreate()
# load() defaults to Parquet, so the format must be given explicitly for a text-based file
df = spark.read.load("TERR.txt", format="csv")
df.createTempView("example")
Instead of hard-coding the query as below, I want to pass the SQL by reading it from a .sql file located in the Google Cloud Storage bucket, and that .sql file will contain multiple lines:
df2 = spark.sql("SELECT * FROM example")
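Roughly, this is what I'm trying to achieve (a minimal sketch; the bucket path gs://my-bucket/queries/merge_example.sql is hypothetical, and it assumes the GCS connector is available to Spark, e.g. on Dataproc):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sample").getOrCreate()

# Hypothetical path to the .sql file in the Google Cloud Storage bucket
sql_path = "gs://my-bucket/queries/merge_example.sql"

# wholeTextFiles returns (path, content) pairs, so the multi-line file arrives as one string
sql_text = spark.sparkContext.wholeTextFiles(sql_path).collect()[0][1]

# spark.sql() executes one statement at a time, so split on ";" if the file holds several
for statement in (s.strip() for s in sql_text.split(";")):
    if statement:
        df2 = spark.sql(statement)

Is this the right approach, or is there a more idiomatic way to feed a multi-line .sql file from a bucket into spark.sql?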