
I was exploring the Vertex AI AutoML feature in GCP, which lets users import datasets, then train, deploy, and get predictions from ML models. My use case is to do the data preprocessing myself (I wasn't satisfied with AutoML's preprocessing) and feed that data directly into a pipeline that trains and deploys the model. I also want to feed new data into the dataset and have the entire pipeline run automatically, from data preprocessing to deploying the latest model. How should I approach this problem?

Koushik J

1 Answer


You can create a custom pipeline using the Kubeflow Pipelines SDK v1.8.9 or later, or TensorFlow Extended (TFX) v0.30.0 or later.

  • If you use TensorFlow in an ML workflow that processes terabytes of structured or text data, we recommend building your pipeline with TFX.

  • For other use cases, we recommend building your pipeline with the Kubeflow Pipelines SDK, which lets you implement your workflow by writing custom components or reusing pre-built components (see the sketch after this list).
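For example, here is a minimal sketch of such a pipeline built from custom Kubeflow components, matching your use case of a preprocessing step followed by training. The component bodies, pipeline name, and the `raw_data_uri` parameter are illustrative placeholders, not a complete implementation:

```python
# Minimal two-step pipeline sketch using the KFP v2 DSL (kfp >= 1.8.9).
# Component bodies and names are placeholders for illustration.
from kfp.v2 import compiler, dsl
from kfp.v2.dsl import Dataset, Input, Model, Output, component


@component(base_image="python:3.9")
def preprocess(raw_data_uri: str, clean_data: Output[Dataset]):
    # Placeholder: run your own preprocessing here and write the
    # result to clean_data.path for downstream components to consume.
    with open(clean_data.path, "w") as f:
        f.write(f"cleaned data derived from {raw_data_uri}")


@component(base_image="python:3.9")
def train(clean_data: Input[Dataset], model: Output[Model]):
    # Placeholder: train on clean_data.path and save the model
    # artifact to model.path.
    with open(model.path, "w") as f:
        f.write("trained model artifact")


@dsl.pipeline(name="preprocess-train-deploy")
def pipeline(raw_data_uri: str):
    prep_task = preprocess(raw_data_uri=raw_data_uri)
    train(clean_data=prep_task.outputs["clean_data"])


# Compile the pipeline to a job spec that Vertex AI Pipelines can run.
compiler.Compiler().compile(
    pipeline_func=pipeline, package_path="pipeline.json"
)
```

For the deployment step, rather than writing your own component you can append pre-built components from the google-cloud-pipeline-components library (for example, ModelUploadOp, EndpointCreateOp, and ModelDeployOp) after the training task.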

To create a Kubeflow pipeline, you can follow the official Vertex AI guide on building pipelines.
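Once compiled, the job spec can be submitted to Vertex AI Pipelines with the google-cloud-aiplatform SDK. A minimal sketch follows; the project, region, and GCS paths are hypothetical values you would replace with your own:

```python
# Submit the compiled pipeline to Vertex AI Pipelines.
# Project, region, and GCS paths below are illustrative placeholders.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

job = aiplatform.PipelineJob(
    display_name="preprocess-train-deploy",
    template_path="pipeline.json",
    pipeline_root="gs://my-bucket/pipeline-root",
    parameter_values={"raw_data_uri": "gs://my-bucket/raw/data.csv"},
)
job.run()
```

To handle new data, one common approach is to rerun the same PipelineJob with a new raw_data_uri, either on a schedule or triggered when new data arrives (for example, via a Cloud Function), so the whole flow from preprocessing to deploying the latest model runs automatically.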

Eduardo Ortiz