
I'm relatively new to Kubeflow and I'm trying to create a pipeline that uses Docker images stored in my private GitLab registry. I've looked through the Kubeflow documentation, but I couldn't find a straightforward way to do this.

Here's what I've tried so far:

  1. I created a Docker image and pushed it to my GitLab registry. I've verified that the Docker image is present in the registry by using the GitLab UI.
  2. I have set up a Kubeflow instance and have basic pipelines running without any issues using publicly available images.
  3. I created a GitLab registry secret in Kubernetes
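
For step 3, the secret was created with `kubectl` along these lines (a sketch; the registry URL, namespace, and credential placeholders are assumptions, substitute your own values):

```shell
# Create an image-pull secret for the GitLab registry (all values are placeholders).
kubectl create secret docker-registry gitlab-registry-secret \
  --docker-server=registry.gitlab.com \
  --docker-username=<gitlab-deploy-token-username> \
  --docker-password=<gitlab-deploy-token> \
  --namespace=kubeflow
```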

I am now stuck at integrating the GitLab registry with the Kubeflow pipeline and defining the pipeline.

My current pipeline file, which is not working:

import kfp
from kfp import dsl
from kfp.dsl import component

@component
def first_op() -> str:
    return 'data/xx.txt'

@component
def second_op(output_first: str):
    pass

@dsl.pipeline(
    name='My pipeline',
    description='My machine learning pipeline'
)
def my_pipeline():
    first_generator = first_op().set_image_pull_secrets([dsl.LocalObjectReference('gitlab-registry-secret')])
    second_generator = second_op(output_first=first_generator.output).set_image_pull_secrets([dsl.LocalObjectReference('gitlab-registry-secret')])

if __name__ == '__main__':
    kfp.compiler.Compiler().compile(my_pipeline, 'my_pipeline.yaml')

Updated code:

import kfp
from kfp import dsl
from kfp.dsl import component

@component(base_image=mygitlab_registry_location)  # path to the image in my GitLab registry
def first_op(output_path: str):
    pass

@dsl.pipeline(
    name='My pipeline',
    description='My machine learning pipeline'
)
def my_pipeline():
    first_generator = first_op(output_path="data/xx.txt")

    # Set the image pull secret for the operation
    first_generator.apply(
        lambda container_op: container_op.container.set_image_pull_secrets([dsl.LocalObjectReference('gitlab-registry-secret')])
    )


if __name__ == '__main__':
    kfp.compiler.Compiler().compile(my_pipeline, 'my_pipeline.yaml')

Currently I am getting this error:

There is no input/output parameter or artifact.
  • Is the namespace of the secret correct? – esantix Aug 23 '23 at 19:00
  • Yes I think so, actually I updated the pipeline and I am having this error: "There is no input/output parameter or artifact." – 1dll Aug 24 '23 at 08:44
  • I think the new error is not related to this question anymore. Maybe create a new one. But check this from the kfp docs: "In pipelines, input artifact annotations should be wrapped in an Input type marker and, unlike in components, output artifacts should be provided as a return annotation as shown in concat_pipeline’s Dataset output". https://www.kubeflow.org/docs/components/pipelines/v2/data-types/artifacts/ – esantix Aug 24 '23 at 13:30

0 Answers