Questions tagged [kubeflow-pipelines]

Kubeflow Pipelines is a platform for building and deploying portable, scalable machine learning (ML) workflows based on Docker containers.

273 questions
0
votes
0 answers

Can't create kubeflow pipeline with @dsl.component method

Hello, I'm a kfp beginner. I tried to run a hello pipeline like this: pip install kfp from kfp import dsl @dsl.component def say_hello(name: str) -> str: hello_text = f'Hello, {name}!' print(hello_text) return…
sefgsefg
0
votes
0 answers

How can I tell Kubeflow's kfp command line interface about my self-signed cert?

I am trying to run a pipeline on our Kubeflow cluster via our CI/CD pipeline using the kfp command-line interface. However, when I call the kfp run submit command (with a host of options), I get the following error: ... connection broken by…
Eric
0
votes
1 answer

Different library version during the run of vertex pipeline vs in the docker container

I am using a custom component in a Vertex AI pipeline. @component( base_image=f"gcr.io..." ) def temp_step(): It's supposed to have the same library versions during the run of the component in the Vertex AI Pipeline as in the…
0
votes
0 answers

Configuring Kubeflow Pipelines to run Docker Images

I am new to Kubeflow and I am currently trying to schedule two Docker images that are stored in my GitLab registry. Despite trying to figure it out, I'm still unable to understand how to pull these images and run them using Kubeflow. Any code…
1dll
0
votes
0 answers

How to pass a dynamic argument to the 'text' parameter of Kubeflow's load_component_from_text() method

Is it possible to create a two-container Kubeflow pipeline whose steps would be as follows: Container #1 runs and creates an output, which is a 'text' argument, and stores it as an output txt. Example: name: some-pipeline description:…
0
votes
0 answers

How do I use an input argument to my Python script as the output artifact in Kubeflow Pipeline

I'm currently using Kubeflow Pipeline to train a number of models. The crux of the issue is that I want to use a saved checkpoint as the input to a subsequent model. I was originally running training using a Python script inside a project repository…
Sean
0
votes
2 answers

Same component works when defined through @component but it fails when created with create_component_from_func

I have a Docker container in gcloud with all the code I need (multiple files, etc.). I'm able to run it when I define the component using the @component decorator: I define the function and set up the base_image. The component does a few things…
0
votes
2 answers

How to set up a delay between two Vertex AI components in a pipeline?

I am working on a pipeline using Vertex AI, and I'm facing an issue with scheduling the execution of two components with a delay. Here's the scenario: Component A generates some data as its output. Component B processes the output from Component A…
pajamas
0
votes
0 answers

Vertex AI/Kubeflow Pipelines, ModelBatchPredictOp Does Not Take Outputs of Previous Components

The title of this question is self-explanatory. If I have a Kubeflow pipeline in the following manner: # this is a kubeflow pipelines component dedicated to reformatting csv data to jsonl format reformat_input_op =…
0
votes
1 answer

How do I import external python class into Vertex AI pipeline component?

I have a Vertex AI pipeline that uses a Python component. @component( base_image="python:3.9", packages_to_install=[lots!], ) def my_comp( parms ) -> str: from google.cloud import aiplatform aiplatform.init(project=project,…
schoon
0
votes
0 answers

automatically change pod's workflow resource when deploying kubeflow to gke cluster

When deploying the AI platform service Kubeflow to a Google Cloud GKE cluster, a pod workflow (.yaml) file is automatically created. I want to change the resource requests, CPU limits, and memory settings of the workflow (.yaml) file at the time of…
0
votes
1 answer

How to modify a CustomJob for a specific component in vertexAI

I'm having trouble changing one component in my pipeline. In further detail: the component, which is a reusable component loaded from a .yaml file, should operate on a different network (VPC) than the network that is assigned by default by…
0
votes
0 answers

slurm with K8s or Kubeflow possible?

I understand that both Slurm and Kubeflow/K8s are workload managers. We want to leverage Slurm features such as parallelism and use them with Kubeflow or Kubernetes for ML workflows. Is that possible? Can they coexist or work together? Example: We want to…
0
votes
1 answer

Kubeflow - how to pass Tensorflow Dataset and Tensors from one component to another?

I am implementing a Kubeflow pipeline in Vertex AI. Basically I have two components: prepare_data and train_model: @component( packages_to_install = [ "pandas==1.3.4", "numpy==1.20.3", "unidecode", "nltk==3.6.5", …
0
votes
0 answers

Kubeflow Pipelines - No pods found in namespace kubeflow-user-example-com

No pods found in namespace kubeflow-user-example-com: running `kubectl get pods -n kubeflow-user-example-com` returns "No resources found in kubeflow-user-example-com namespace." The Kubeflow MinIO UI, artifact, and visualisation pods should be running. kubeflow minio ui…