I'm relatively new to Kubeflow and I'm trying to create a pipeline that uses Docker images stored in my private GitLab registry. I've looked through the Kubeflow documentation, but I couldn't find a straightforward way to do this.
Here's what I've tried so far:
- I created a Docker image and pushed it to my GitLab registry. I've verified that the Docker image is present in the registry by using the GitLab UI.
- I have set up a Kubeflow instance and have basic pipelines running without any issues using publicly available images.
- I created a GitLab registry secret in Kubernetes.
I am now stuck at integrating the GitLab registry with Kubeflow and defining the pipeline.
Here is my current pipeline file, which is not working:
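For reference, I created the secret roughly like this (the registry server, username, token, and namespace are placeholders for my actual values):

```shell
# Create a docker-registry secret Kubernetes can use to pull images from GitLab.
# Server, username, token, and namespace below are placeholders.
kubectl create secret docker-registry gitlab-registry-secret \
  --docker-server=registry.gitlab.com \
  --docker-username=<gitlab-username> \
  --docker-password=<gitlab-deploy-token> \
  --namespace=kubeflow
```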
import kfp
from kfp import dsl
from kfp.dsl import component

@component
def first_op() -> str:
    ...

@dsl.pipeline(
    name='My pipeline',
    description='My machine learning pipeline'
)
def my_pipeline():
    first_generator = first_op().set_image_pull_secrets([dsl.LocalObjectReference('gitlab-registry-secret')])
    second_generator = second_op(output_first=first_generator.output).set_image_pull_secrets([dsl.LocalObjectReference('gitlab-registry-secret')])

if __name__ == '__main__':
    kfp.compiler.Compiler().compile(my_pipeline, 'my_pipeline.yaml')
Updated code:
@component(base_image=mygitlab_registry_location)
def first_op(output_path: str):
    pass

@dsl.pipeline(
    name='My pipeline',
    description='My machine learning pipeline'
)
def my_pipeline():
    first_generator = first_op(output_path="data/xx.txt")
    # Set the image pull secret for the operation
    first_generator.apply(
        lambda container_op: container_op.container.set_image_pull_secrets([dsl.LocalObjectReference('gitlab-registry-secret')])
    )

if __name__ == '__main__':
    kfp.compiler.Compiler().compile(my_pipeline, 'my_pipeline.yaml')
Currently I am getting this error:

    There is no input/output parameter or artifact.