
My problem is the following: I'm trying to run a job with a spark_submit_task, but the job depends on a Python environment that I build and package beforehand.

But for some reason --archives does not install my environment inside the cluster at runtime. Example job:

{
    "name":"my_test"
    ...
    "new_cluster": {
        ...
        "spark_conf": {
            "spark.databricks.delta.preview.enabled": "true"
        },
        "spark_env_vars": {
            "PYSPARK_PYTHON": "./environment/bin/python"
        },
        ...
    },
    "spark_submit_task": {
        "parameters": [
            "--archives",
            "dbfs:/teste_path/pyspark_conda_env.tar.gz#environment",
            "dbfs:/teste_path/my_script.py"
        ]
    }
}
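
For reference, a minimal sketch of what dbfs:/teste_path/my_script.py could look like for debugging this; the actual script contents are not shown above, so this is an assumption, added only to check which interpreter the driver and executors actually use:

# my_script.py -- assumed contents, for debugging only:
# prints the interpreter path on the driver and on each executor, so you
# can see whether PYSPARK_PYTHON=./environment/bin/python took effect.
import sys
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("env_check").getOrCreate()

print("driver python:", sys.executable)

# run a tiny job so each executor reports its own sys.executable
executor_pythons = (
    spark.sparkContext
    .parallelize(range(2), 2)
    .map(lambda _: sys.executable)
    .collect()
)
print("executor pythons:", executor_pythons)

spark.stop()

If the executors still print a system Python path rather than ./environment/bin/python, the archive was not unpacked next to the executors' working directory as expected.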
