
I'm trying to run `mlflow run` against an MLproject file whose Python code lives in a different directory than the MLproject file itself.

I have the following directory structure:

/root/mlflow_test
.
├── conda
│   ├── conda.yaml
│   └── MLproject
├── docker
│   ├── Dockerfile
│   └── MLproject
├── README.md
├── requirements.txt
└── trainer
    ├── __init__.py
    ├── task.py
    └── utils.py

When I run this from /root/:

mlflow run mlflow_test/docker

I get:

/root/miniconda3/bin/python: Error while finding module specification for 'trainer.task' (ImportError: No module named 'trainer')

This is because the MLproject file can't find the Python code. If I move MLproject up to mlflow_test, it works fine.

This is my MLproject entry point:

name: mlflow_sample
docker_env:
  image: mlflow-docker-sample
entry_points:
  main:
    parameters:
      job_dir:
        type: string
        default: '/tmp/'
    command: |
        python -m trainer.task --job-dir {job_dir}
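The reason moving MLproject up fixes things is that `python -m pkg.mod` resolves packages from the current working directory, which Python prepends to `sys.path`. A throwaway demo (hypothetical `/tmp` paths, not part of the project above) that reproduces the same failure:

```shell
# Build a minimal package under /tmp.
mkdir -p /tmp/modtest/trainer
touch /tmp/modtest/trainer/__init__.py
printf 'print("trainer.task ran")\n' > /tmp/modtest/trainer/task.py

# Works: trainer/ sits directly under the current directory.
(cd /tmp/modtest && python3 -m trainer.task)

# Fails with "No module named 'trainer'": no trainer/ under this cwd,
# which mirrors running mlflow from the docker/ subdirectory.
(cd /tmp/modtest/trainer && python3 -m trainer.task) \
  || echo "fails outside the project root"
```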

How can I run `mlflow run`, point it at this MLproject, and tell it to look for the code in a different folder?

I tried:

"cd .. && python -m trainer.task --job-dir {job_dir}" 

and I get:

/entrypoint.sh: line 5: exec: cd: not found
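That error suggests the image's entrypoint script `exec`s the command string directly, and `cd` is a shell builtin rather than a binary on `$PATH`, so `exec` can't find it. One possible workaround (an untested sketch based on that assumption) is to hand the whole compound command to a real shell:

```yaml
entry_points:
  main:
    parameters:
      job_dir:
        type: string
        default: '/tmp/'
    # cd is a shell builtin, not an executable, so `exec cd ...` fails;
    # /bin/sh -c runs the compound command inside an actual shell.
    command: |
        /bin/sh -c "cd .. && python -m trainer.task --job-dir {job_dir}"
```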

Dockerfile

# docker build -t mlflow-gcp-example -f Dockerfile .
FROM gcr.io/deeplearning-platform-release/tf-cpu
RUN git clone https://github.com/GoogleCloudPlatform/ml-on-gcp.git
WORKDIR ml-on-gcp/tutorials/tensorflow/mlflow_gcp
RUN pip install -r requirements.txt
gogasca
  • use `WORKDIR` in your `Dockerfile` instead of `cd` – LinPy Aug 19 '19 at 05:07
  • I tried it and no luck, this is my Dockerfile: FROM gcr.io/deeplearning-platform-release/tf-cpu RUN git clone https://github.com/GoogleCloudPlatform/ml-on-gcp.git WORKDIR ml-on-gcp/tutorials/tensorflow/mlflow_gcp RUN pip install -r requirements.txt – gogasca Aug 19 '19 at 05:35
  • so your error comes in `pip install -r requirements.txt`, not in `entrypoint.sh`? I am confused now – LinPy Aug 19 '19 at 05:44
  • You tagged this as Docker but I see no reference whatsoever to Docker in the post. Is there a Dockerfile and docker build/run command you could share? – Mihai Aug 19 '19 at 05:47
  • pip install -r requirements.txt works fine, mlflow run it fails when I run: mlflow run mlflow_test/docker since trainer folder is not there, but one level up. – gogasca Aug 19 '19 at 05:49
  • @Mihai updated original question – gogasca Aug 19 '19 at 05:51
  • When you build a docker image, you use a Dockerfile and a context. In your case the context is '.'. The context defines what Docker can see. Anything above that level cannot be accessed in any way by Docker. My suggestion is that you put the Dockerfile at the highest level in your hierarchy (next to requirements.txt) and pass that as build context. This way Docker has access to all your project files. You can always "hide" files from Docker with .dockerignore. – Mihai Aug 19 '19 at 06:14
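Mihai's suggestion might look like the following (a hypothetical sketch; `/app` and the `COPY` layout are assumptions, not from the original project): a single Dockerfile next to requirements.txt, built with the project root as the context, so `trainer/` is baked into the image where `python -m trainer.task` can resolve it.

```dockerfile
# Hypothetical top-level Dockerfile, placed next to requirements.txt and
# built from the project root:  docker build -t mlflow-docker-sample .
FROM gcr.io/deeplearning-platform-release/tf-cpu
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
# Copy the trainer package into the image so `python -m trainer.task`
# finds it relative to WORKDIR.
COPY trainer/ trainer/
```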

0 Answers