I installed the newest version "databricks-connect==13.0.0". Now I get this issue:
The command "C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-py3.9\Lib\site-packages\pyspark\bin\spark-class2.cmd" could not be found.
Traceback (most recent call last):
File "C:\X\repositories\schema-integration-customer\tmp_run_builder.py", line 37, in <module>
spark = get_spark()
File "C:\X\repositories\data-common\X\da\common\_library\spark.py", line 60, in get_spark
return builder.getOrCreate()
File "C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\sql\session.py", line 479, in getOrCreate
else SparkContext.getOrCreate(sparkConf)
File "C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\context.py", line 560, in getOrCreate
SparkContext(conf=conf or SparkConf())
File "C:\Users\y\AppData\Local\pypoetry\Cache\virtualenvs\x-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\context.py", line 202, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "C:\Users\y\AppData\Local\pypoetry\Cache\virtualenvs\x-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\context.py", line 480, in _ensure_initialized
SparkContext._gateway = gateway or launch_gateway(conf)
File "C:\Users\y\AppData\Local\pypoetry\Cache\virtualenvs\x-schema-integration-customer-hjO9aLIy-py3.9\lib\site-packages\pyspark\java_gateway.py", line 106, in launch_gateway
raise RuntimeError("Java gateway process exited before sending its port number")
RuntimeError: Java gateway process exited before sending its port number
Process finished with exit code 1
I use Windows, and with version "databricks-connect==11.3.10" everything runs smoothly.
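For context, the traceback shows pyspark's java_gateway failing to start the local JVM, and the first error suggests the Windows launcher script spark-class2.cmd is missing from the installed package. A minimal diagnostic sketch (the helper name is mine, not part of pyspark) that builds the path the launcher is expected at, so you can check whether it actually exists in the affected virtualenv:

```python
import os

def expected_launcher_path(site_packages: str) -> str:
    # Path of the Windows launcher script that pyspark tries to run
    # when it starts a local JVM ("spark-class2.cmd").
    # `site_packages` is the virtualenv's Lib\site-packages directory.
    return os.path.join(site_packages, "pyspark", "bin", "spark-class2.cmd")

# Example: point this at the virtualenv from the error message and
# check whether the script is present (False would match the
# "could not be found" error above).
sp = r"C:\Users\Y\AppData\Local\pypoetry\Cache\virtualenvs\X-py3.9\Lib\site-packages"
launcher = expected_launcher_path(sp)
print(launcher)
print(os.path.exists(launcher))
```

If the file is absent, the package layout shipped by databricks-connect 13.x differs from 11.x, which would explain why the same code works under the older version.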