So I am wondering: how can I pass a DataFrame and then a model between Kubeflow containers, which I deployed locally on k8s? Right now I am using the v2 SDK and the func_to_container_op decorator, but I am also interested in how to do it using Dockerfiles and creating the containers from Docker.

I found out I should use the Input and Output annotations and artifacts, but I am not sure how that works in the case of passing local files.

minio999

1 Answer

A local file cannot be passed directly between components. You can either serialize it to a string or wrap it in an Artifact. For an Artifact, the URL or path of the Artifact output is passed as a container argument, which lets you write content to that file from inside the container:

https://www.kubeflow.org/docs/components/pipelines/v2/author-a-pipeline/components/#3-custom-container-components

James Liu
  • Oh I see, I realised that like 2 days ago. But if I would like to continue like that, I would have to deploy some kind of db and take the data from the db, right? – minio999 Nov 01 '22 at 19:31
  • If you are using an Artifact, you don't need to deploy a db. MinIO is deployed and used by default. – James Liu Nov 02 '22 at 07:41