Questions tagged [kubeflow-pipelines]

Kubeflow Pipelines is a platform for building and deploying portable, scalable machine learning (ML) workflows based on Docker containers.

273 questions
1 vote • 0 answers

AI Platform Pipelines sometimes fails at random

I've been using AI Platform Pipelines (v0.2.5) for several months. I rebuilt the Pipelines instance because I found a newer version (v0.5.1) in the Console. Now my pipelines get stuck before completing. It's very strange because there seems to be no failure…
1 vote • 1 answer

Getting an error on microk8s with the Kubeflow Pipelines SDK and Jupyter Notebook

I have set up a local cluster using microk8s and Kubeflow on my local machine. I followed these installation instructions to get my cluster up and running. I have started a Jupyter server and coded a Kubeflow pipeline. The YAML file I have used to…
1 vote • 0 answers

Is it possible with Kubeflow to mount a local folder into multiple container ops?

I have a Docker container which contains my environment, and I have two files on my local machine which I would like to run: first_script.py and second_script.py. My image is called my_env:3.7. In Kubeflow I would like to create a pipeline…
Gabe • 624 • 8 • 19
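A folder on your workstation is not reachable from pods running in the cluster, so the usual workarounds are baking the scripts into the image or staging them on a shared volume that every op mounts. The fragment below is a sketch of the cluster-side mechanism only — a PVC-backed volume mounted into a container — and all names except the question's `my_env:3.7` image and `first_script.py` are placeholders, not from the question.

```yaml
# Sketch: a PVC-backed volume mounted at /scripts, the mechanism
# a pipeline SDK ultimately produces when ops share files.
# "scripts-pvc" and the mount path are illustrative placeholders.
apiVersion: v1
kind: Pod
metadata:
  name: run-first-script
spec:
  containers:
    - name: main
      image: my_env:3.7
      command: ["python", "/scripts/first_script.py"]
      volumeMounts:
        - name: scripts
          mountPath: /scripts
  volumes:
    - name: scripts
      persistentVolumeClaim:
        claimName: scripts-pvc
```

A second op that mounts the same claim at the same path sees the same files, which is what makes one shared folder usable across multiple container ops.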
1 vote • 1 answer

Can the UI show input parameters that are not PipelineParams in kubeflow-pipelines?

In kubeflow-pipelines, the UI shows PipelineParams as input parameters. However, I want to check values that are not PipelineParams but constant values. Is there any way to show these values in the UI? For example, only param_a is shown in the UI in the…
saket • 368 • 1 • 9
1 vote • 2 answers

Kubeflow Pipelines for model serving

I'm beginning to dig into Kubeflow Pipelines for a project and have a beginner's question. It seems like Kubeflow Pipelines work well for training, but what about serving in production? I have a fairly intensive preprocessing pipeline for training…
0 votes • 0 answers

Load a Kubeflow pipeline YAML

I'm looking for a "native" way to load a kubeflow pipeline yaml into memory (ideally in python) and manipulate it. For example, I can load a component yaml into memory and manipulate it: kfp.components.load_component_from_file("component.yaml") #…
Omri • 43 • 1 • 4
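There may be no dedicated loader for whole pipelines in the SDK version the asker used (in recent KFP v2 releases a compiled pipeline can reportedly be loaded back with `kfp.components.load_component_from_file`, since a pipeline is itself a component — worth verifying against your version). Failing that, the compiled file is plain YAML/JSON, so a generic parser can round-trip it. A minimal sketch, assuming the pipeline was compiled to JSON; the keys here are illustrative stand-ins, not the exact kfp schema:

```python
import json
import os
import tempfile

# Minimal stand-in for a compiled pipeline spec. Real specs
# emitted by the kfp compiler are far larger; the keys below are
# illustrative only. The point: the compiled file is plain
# JSON/YAML, so it can be loaded and edited as an ordinary dict.
spec = {"pipelineSpec": {"pipelineInfo": {"name": "demo-pipeline"}}}

path = os.path.join(tempfile.mkdtemp(), "pipeline.json")
with open(path, "w") as f:
    json.dump(spec, f)

# Load the compiled spec back into memory and manipulate it.
with open(path) as f:
    loaded = json.load(f)
loaded["pipelineSpec"]["pipelineInfo"]["name"] = "renamed-pipeline"

# Write the edited spec back out for upload/submission.
with open(path, "w") as f:
    json.dump(loaded, f)
print(loaded["pipelineSpec"]["pipelineInfo"]["name"])  # renamed-pipeline
```

The same round-trip works with a YAML parser for `.yaml` output; editing the spec this way bypasses SDK validation, so re-submitting the file is the only check that the edit is still a valid pipeline.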
0 votes • 0 answers

Always 404 on kfp 2.0.1

I was using kfp==2.0.0b12 and everything was working great. But as soon as I upgraded to 2.0.1 I always get a 404 when trying to use methods from the client, like this: kfp_server_api.exceptions.ApiException: (404) Reason: Not Found HTTP response…
d0m3n1 • 1 • 1
0 votes • 0 answers

kfp.dsl placeholders (used as component inputs) work on the default machine but are not transformed on a custom machine

kfp version 1.8.11. I have a pipeline and I need to use some pipeline/task parameters to keep track of my experiments and build the GCS paths. I provide these as inputs to the…
0 votes • 0 answers

Building a Kubeflow Pipeline using Docker images from a private GitLab Registry

I'm relatively new to Kubeflow and I'm trying to create a pipeline that uses Docker images stored in my private GitLab registry. I've looked through the Kubeflow documentation, but I couldn't find a straightforward way to do this. Here's what I've…
1dll • 39 • 5
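Pulling from a private registry is handled by Kubernetes rather than by Kubeflow itself: the kubelet needs registry credentials, typically a `kubernetes.io/dockerconfigjson` Secret attached to the service account that runs the pipeline pods. A sketch, assuming the Secret was created with `kubectl create secret docker-registry gitlab-registry --docker-server=registry.gitlab.com --docker-username=<user> --docker-password=<token> -n kubeflow`; the service account name varies by install (`pipeline-runner` is common, but check your deployment):

```yaml
# Sketch: attach the registry Secret to the service account used
# by pipeline pods so every step can pull from the private
# GitLab registry. "pipeline-runner" and "gitlab-registry" are
# assumptions to verify against your installation.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: pipeline-runner
  namespace: kubeflow
imagePullSecrets:
  - name: gitlab-registry
```

With the Secret attached at the service-account level, individual components need no per-step pull configuration.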
0 votes • 0 answers

GCP AI Platform doesn't list Kubeflow pipeline application on Pipelines page

I am doing a lab titled "TFX on Cloud AI Platform Pipelines" from the Qwiklabs Machine Learning path. The instructions ask me to run a shell file which grants access to the corresponding accounts, creates a notebook instance and deletes an existing…
0 votes • 0 answers

I'm trying to save a JSON file to a PVC in one component and then read it from another, but keep getting a FileNotFoundError

The two components look like this (they're YAML files compiled from Python functions): Component 1 class DownloadData: @staticmethod def _component_func( mlflow_uri: str, run_id: str, artifact_path: str, …
Sean • 2,890 • 8 • 36 • 78
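A common cause of this FileNotFoundError is that the two components run in separate pods, so the file only survives between them if both mount the same PVC at the same path. A minimal sketch of the handoff contract, using a temporary directory as a stand-in for the shared mount; the file and key names are illustrative, not from the question:

```python
import json
import os
import tempfile

# Stand-in for the PVC mount point: both components must mount
# the SAME claim at the SAME path, or the second pod sees an
# empty (or missing) directory. tempfile only simulates this.
mount = tempfile.mkdtemp()  # pretend this is e.g. /data on the PVC

# Component 1: write the JSON artifact onto the shared mount.
with open(os.path.join(mount, "run.json"), "w") as f:
    json.dump({"run_id": "abc123"}, f)

# Component 2: read it back from the identical mount path.
with open(os.path.join(mount, "run.json")) as f:
    meta = json.load(f)
print(meta["run_id"])  # abc123
```

If the mount paths differ between the two components, or one component writes before the volume is attached, the read side fails exactly as in the question.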
0 votes • 1 answer

Change the URI of a KFP artifact to end in .csv instead of "dataset"

When I output a KubeFlow Output[Dataset] that I know will be in a CSV format output URI is ending with the text dataset. Is it possible to change the name of the URI to dataset.csv or to training_data.csv? The full URI for the artifact is now…
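One pattern sometimes suggested for this is reassigning the artifact's `uri`/`path` inside the component body before the file is written, so the stored object carries a `.csv` suffix. The class below is NOT the kfp `Dataset` type — it is a minimal mock of its `uri` attribute, used only to show the rename pattern; verify against your KFP version whether the backend honors a reassigned URI:

```python
# Minimal mock of a kfp Dataset artifact, exposing only a .uri
# attribute; it stands in for dsl.Output[Dataset] to demonstrate
# the rename-before-write pattern.
class FakeDataset:
    def __init__(self, uri: str):
        self.uri = uri

def write_training_data(dataset: "FakeDataset") -> None:
    # Swap the default trailing "dataset" segment for a .csv name
    # before writing the file to dataset.uri / dataset.path.
    if dataset.uri.endswith("/dataset"):
        dataset.uri = dataset.uri[: -len("dataset")] + "training_data.csv"

artifact = FakeDataset("gs://bucket/pipeline/run-1/dataset")
write_training_data(artifact)
print(artifact.uri)  # gs://bucket/pipeline/run-1/training_data.csv
```

In a real component the same reassignment would be done on the `Output[Dataset]` parameter at the top of the function body, before any data is written.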
0 votes • 0 answers

Use dsl.Collected inside a dsl.Condition context

I'm trying to use the output of dsl.ParallelFor downstream within a dsl.Condition and I get this error: TypeError: None has type NoneType, but expected one of: int. I know regular outputs of components can be used in dsl.Condition contexts without…
BogdanC • 1,316 • 3 • 16 • 36
0 votes • 0 answers

How to use a local file when running dsl.ContainerOp in Kubeflow Pipelines

As I understand it, when we use dsl.ContainerOp in a Kubeflow pipeline with the SDK: ContainerOp( name=step_name, image='katib/mxnet-mnist-example', command=['python', '/mxnet/example/image-classification/train_mnist.py'], arguments=[ …
spc • 1
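Files on the workstation are not visible to the pods a ContainerOp launches, so the simplest fix is usually to bake the local file into the image the op runs. A sketch, reusing the question's image and script path but otherwise a placeholder:

```dockerfile
# Sketch: extend the base image and copy the local script into
# the path the ContainerOp's command expects. The base image and
# target path mirror the question's example; build and push this
# image, then point the ContainerOp at it.
FROM katib/mxnet-mnist-example
COPY train_mnist.py /mxnet/example/image-classification/train_mnist.py
```

The alternative is staging the file on a volume (e.g. a PVC) that the op mounts, which avoids rebuilding the image on every change at the cost of an extra upload step.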
0 votes • 1 answer

Missing buttons for logs, pod etc. on components of pipeline runs in the KFP UI

We have set up an on-prem Kubeflow 1.7 installation and are trying out Kubeflow Pipelines v2.x. When executing a run on the provided pipeline "[Tutorial] Data passing in python components", all the components expose quite a lot of tabs, including…
khituras • 1,081 • 10 • 25