I'm trying to handle a Py4JJavaError that is raised when invalid SQL is passed to spark.sql(). My function is as follows:
import py4j

def sql_to_df_and_create_view(sql_string: str, view_name: str):
    ''' Takes in a SQL command in string format and creates a spark df from the command.
    Secondly, creates a temporary view with name specified under view_name parameter. Ensures that
    the SQL code is valid and that the view is created. '''
    try:
        df = spark.sql(sql_string)
    except py4j.protocol.Py4JJavaError:
        raise Exception(f'Invalid SQL code passed in by {sql_string}.')
        return
    df.createOrReplaceTempView(view_name)
    return df
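For context, this is how I trigger the error (the view name here is just an example; 'sd' is the invalid SQL from the traceback below):

    # deliberately invalid SQL to test the error handling
    df = sql_to_df_and_create_view('sd', 'my_view')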
The traceback I get is:
---------------------------------------------------------------------------
Py4JJavaError Traceback (most recent call last)
/databricks/spark/python/pyspark/sql/utils.py in deco(*a, **kw)
62 try:
---> 63 return f(*a, **kw)
64 except py4j.protocol.Py4JJavaError as e:
/databricks/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
327 "An error occurred while calling {0}{1}{2}.\n".
--> 328 format(target_id, ".", name), value)
329 else:
Py4JJavaError: An error occurred while calling o213.sql.
: org.apache.spark.sql.catalyst.parser.ParseException:
mismatched input 'sd' expecting {'(', 'SELECT', 'FROM', 'ADD', 'DESC', 'WITH', 'VALUES',
'CREATE', 'TABLE', 'INSERT', 'DELETE', 'DESCRIBE', 'EXPLAIN', 'SHOW', 'USE', 'DROP', 'ALTER',
'MAP', 'SET', 'RESET', 'START', 'COMMIT', 'ROLLBACK', 'MERGE', 'UPDATE', 'CONVERT', 'REDUCE',
'REFRESH', 'CLEAR', 'CACHE', 'UNCACHE', 'DFS', 'TRUNCATE', 'ANALYZE', 'LIST', 'REVOKE',
'GRANT', 'LOCK', 'UNLOCK', 'MSCK', 'EXPORT', 'IMPORT', 'LOAD', 'OPTIMIZE', 'COPY'}(line 1, pos 0)
== SQL ==
sd
^^^
When I run the function with invalid SQL in the sql_string argument, the error is not handled: the original Py4JJavaError is raised rather than my Exception(f'Invalid SQL code passed in by {sql_string}.'). If anyone can figure out why this is not being handled correctly I would be very grateful :)
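One thing I'm unsure of is whether the exception that actually reaches Python is a Py4JJavaError at all. A minimal check like the following (assuming an active spark session, same invalid 'sd' input as above) would print the real exception class that needs to be caught:

    try:
        spark.sql('sd')  # same invalid SQL as in the traceback
    except Exception as e:
        # show which exception type Python actually sees
        print(type(e))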