Questions tagged [kubeflow-pipelines]

Kubeflow Pipelines is a platform for building and deploying portable, scalable machine learning (ML) workflows based on Docker containers.

273 questions
0 votes, 1 answer

synchronize kubeflow pipeline files with git

I want to locally make changes to my python scripts, then push them to git, then run the Kubeflow pipeline on Google Cloud. How can I pull the latest commit from git before running the files on the cloud?
Rony Tesler
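A hedged sketch of one approach: run a step that pulls the latest commit before the pipeline code is compiled and submitted. The repo path and branch below are assumptions, not from the question.

```python
import subprocess

def pull_cmd(branch: str = "main") -> list:
    # Build the git command that fast-forwards the checkout to the
    # latest commit on origin.
    return ["git", "pull", "--ff-only", "origin", branch]

def sync_repo(repo_dir: str, branch: str = "main") -> None:
    # Run the pull inside the checked-out repo, then compile/submit
    # the pipeline from that directory.
    subprocess.run(pull_cmd(branch), cwd=repo_dir, check=True)

# sync_repo("/workspace/my-repo")  # hypothetical checkout path
```

This keeps the "fetch latest code" step explicit and separate from the pipeline run itself, so it works the same whether it executes in a Cloud Build step or on a local machine.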
0 votes, 1 answer

AWS SageMaker ML DevOps tooling / architecture - Kubeflow?

I'm tasked with defining AWS tools for ML development at a medium-sized company. Assume about a dozen ML engineers plus other DevOps staff familiar with serverless (lambdas and the framework). The main questions are: a) what is an architecture…
0 votes, 0 answers

Accessing kubeflow pipelines from GKE Cluster

I have installed Kubeflow on a private GKE cluster from the GCP Console using the service AI Platform --> Pipelines. The Kubeflow Pipelines UI can be accessed fine, but my confusion is how I can access the UI from the internet, as there are no ingress/LBs…
0 votes, 3 answers

Save and load a spacy model to a google cloud storage bucket

I have a spacy model and I am trying to save it to a gcs bucket using this format: trainer.to_disk('gs://{bucket-name}/model'). But each time I run this I get this error message: FileNotFoundError: [Errno 2] No such file or directory:…
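spaCy's to_disk writes to the local filesystem only, so a gs:// path raises FileNotFoundError. One common workaround is to save locally first and then upload the directory with the GCS client library. A sketch, assuming google-cloud-storage is installed; bucket and prefix names are placeholders:

```python
from pathlib import Path

def blob_name(local_dir: str, file_path: str, prefix: str) -> str:
    # Map a local file path to its destination object name in the bucket.
    rel = Path(file_path).relative_to(local_dir)
    return f"{prefix}/{rel.as_posix()}"

def upload_dir(local_dir: str, bucket_name: str, prefix: str) -> None:
    # Lazy import so the helper above stays usable without the GCS client.
    from google.cloud import storage
    bucket = storage.Client().bucket(bucket_name)
    for path in Path(local_dir).rglob("*"):
        if path.is_file():
            bucket.blob(blob_name(local_dir, str(path), prefix)) \
                  .upload_from_filename(str(path))

# nlp.to_disk("model")                       # save locally first
# upload_dir("model", "my-bucket", "model")  # then copy to gs://my-bucket/model
```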
0 votes, 2 answers

How to Kickstart Kubeflow Pipeline development in Python

I have been studying Kubeflow and trying to grasp how to write my first hello world program in it and run it locally on my Mac. I have kfp and kubectl installed locally on my machine. For testing purposes I want to write a simple pipeline with…
Volatil3
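A minimal "hello world" sketch, assuming the kfp v1 SDK (`pip install "kfp<2"`); names like `hello_pipeline` are illustrative:

```python
def say_hello(name: str) -> str:
    # Plain Python function the SDK wraps into a containerized component.
    return f"Hello, {name}!"

def build_pipeline():
    # Assumes the kfp v1 SDK is installed; imported lazily so the
    # component function above stays testable on its own.
    import kfp
    from kfp.components import create_component_from_func

    hello_op = create_component_from_func(say_hello)

    @kfp.dsl.pipeline(name="hello-world")
    def hello_pipeline(name: str = "world"):
        hello_op(name)

    return hello_pipeline

# Compile locally, then upload the YAML in the Pipelines UI:
# import kfp
# kfp.compiler.Compiler().compile(build_pipeline(), "hello_pipeline.yaml")
```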
0 votes, 1 answer

How to access artifacts in Kubeflow runtime?

I would like to access mlpipeline-metrics content from another component.
Montenegrodr
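For context, in kfp v1 a component reports metrics by writing a JSON payload to the well-known path /mlpipeline-metrics.json; the UI reads it, but it is not automatically available to other components, so the usual workaround is to also expose the values as a regular output. A sketch of the payload format (metric name and value are examples):

```python
import json

def make_metrics(accuracy: float) -> str:
    # Build the mlpipeline-metrics payload in the shape the KFP UI expects.
    return json.dumps({
        "metrics": [
            {"name": "accuracy-score",
             "numberValue": accuracy,
             "format": "PERCENTAGE"}
        ]
    })

# Inside a component:
# with open("/mlpipeline-metrics.json", "w") as f:
#     f.write(make_metrics(0.92))
# ...and return accuracy as an ordinary output param for downstream steps.
```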
0 votes, 1 answer

kubeflow pipeline ml_metadata mysql_real_connect failed

I am trying to build the Kubeflow Pipelines metadata deployment. When Kubeflow connects to the database, the following error occurs. What do I need to do? I have set up the database, and the user password is confirmed to be correct. That's official…
Mixc
0 votes, 2 answers

Is KubeFlow still supported on GCP?

I am trying to use KubeFlow on GCP and I am following this codelab, but "click-to-deploy" is no longer supported so I followed the documentation of "kubectl and kpt". However, I keep getting this "You cannot perform this action because the Cloud SDK…
0 votes, 1 answer

Index out of range error when converting my Jupyter Notebook into a Kubeflow pipeline with Kale

I am running a simple ANN model on a Jupyter Notebook Server in Kubeflow. I ran my code in my notebook to see if it worked, and everything ran just fine with expected outputs. However, when I use Kale to convert the notebook into a Kubeflow…
0 votes, 0 answers

Is there a way to run a Python script in Cloud Build steps?

I have a series of Cloud Build steps where I am uploading a pipeline to GCP Kubeflow. Now I want to run that pipeline in the next step, so I have written a Python script. What I want is to run this Python script in my next cloudbuild…
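One hedged approach: give the script its own Cloud Build step that uses a Python builder image, with `entrypoint` overriding the image's default command. The image tag and script name below are assumptions:

```yaml
# Hypothetical Cloud Build step: run a Python script in its own step.
steps:
  - name: 'python:3.9'
    entrypoint: 'python'
    args: ['run_pipeline.py']
```

Files written to /workspace by earlier steps are visible here, so the script can pick up artifacts (such as a compiled pipeline package) produced before it runs.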
0 votes, 1 answer

How to persist variables in google cloudbuild steps?

I have a cloudbuild.json which is used to upload a pipeline to GCP Kubeflow. Now I want to add another step in which I fetch the latest pipeline id and then run the pipeline as an experiment. So my main issue is how I should get the pipeline…
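Cloud Build steps share the /workspace volume, so one hedged pattern is: fetch the newest pipeline id in one step, write it to a file under /workspace, and read it back in the next step. The listing shape and file path below are assumptions for illustration:

```python
def newest_pipeline_id(pipelines):
    # Pick the id of the most recently created pipeline from a listing
    # of dicts like {"id": ..., "created_at": ...}.
    newest = max(pipelines, key=lambda p: p["created_at"])
    return newest["id"]

def save_for_next_step(pipeline_id, path="/workspace/pipeline_id.txt"):
    # Persist the id where a later Cloud Build step can read it.
    with open(path, "w") as f:
        f.write(pipeline_id)

# In the fetch step (assumes the kfp SDK and a reachable KFP host):
#   import kfp
#   client = kfp.Client(host="https://<kfp-endpoint>")
#   listing = client.list_pipelines(sort_by="created_at desc")
#   save_for_next_step(listing.pipelines[0].id)
```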
0 votes, 1 answer

I have created a cloudbuild.json for Kubeflow pipeline deployment, but it gives an error saying a file is not present

This is my cloudbuild.json { "steps": [ { "name": "gcr.io/cloud-builders/docker", "args": [ "build", "-t", "trainer_image", "." ], …
0 votes, 2 answers

Is there a way to automate the build of a Kubeflow pipeline in GCP?

here is my cloudbuild.yaml file - name: 'gcr.io/cloud-builders/docker' args: ['build', '-t', 'gcr.io/$PROJECT_ID/$_TRAINER_IMAGE_NAME:$TAG_NAME', '.'] dir: $_PIPELINE_FOLDER/trainer_image # Build the base image for lightweight components -…
0 votes, 1 answer

Kubeflow: How to supply a file as pipeline input (param)

From what I understand, a Kubeflow pipeline only takes string parameters, but in the case of the pipeline I need, the user should be able to supply a file as input. How can I do that?
Hamid
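Since pipeline parameters are primitive values, a common workaround is to pass the file's location (e.g. a gs:// URI) as a string parameter and read the file inside the component. A sketch; the helper name is illustrative:

```python
def parse_gcs_uri(uri: str):
    # Split gs://bucket/path/to/file into (bucket, object name).
    if not uri.startswith("gs://"):
        raise ValueError(f"not a GCS URI: {uri}")
    bucket, _, name = uri[len("gs://"):].partition("/")
    return bucket, name

# Inside a component (assumes google-cloud-storage is installed):
#   from google.cloud import storage
#   bucket, name = parse_gcs_uri(input_uri)
#   data = storage.Client().bucket(bucket).blob(name).download_as_bytes()
```

The user uploads the file somewhere the cluster can reach, then supplies its URI when starting the run.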
0 votes, 1 answer

How to set up data access in a distributed training job (TF Job) for Object Detection API on Azure

I've been trying to set up distributed training for the TensorFlow Object Detection API on Azure for a while. I've been confused a bit about how exactly to set up my data in the job. Previously, I used to make this work pretty easily on gcloud using…
Mercury