Questions tagged [google-cloud-vertex-ai]

Usage questions relating to Google Cloud Platform's Vertex AI: https://cloud.google.com/vertex-ai/docs

692 questions
0
votes
1 answer

How to create a Logs Router Sink when a Vertex AI training job fails (after 3 attempts)?

I am running a Vertex AI custom training job (machine learning training using a custom container) on GCP. I would like to create a Pub/Sub message when the job fails so I can post a message to a chat tool like Slack. The log file (Cloud Logging) looks…
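A minimal sketch of one way to do this, assuming the google-cloud-logging client: create a Logs Router sink whose destination is a Pub/Sub topic. The project ID, topic name, and especially the log filter below are assumptions and should be checked against the entries your failed jobs actually write.

```python
# Sketch: route Vertex AI custom-job failure log entries to a Pub/Sub topic.
# The resource.type filter is an assumption -- verify it against your job's
# real Cloud Logging entries before relying on it.
from google.cloud import logging

PROJECT = "my-project"          # hypothetical project ID
TOPIC = "vertex-job-failures"   # hypothetical Pub/Sub topic

log_filter = (
    'resource.type="aiplatform.googleapis.com/CustomJob" '
    'AND severity>=ERROR'
)

client = logging.Client(project=PROJECT)
sink = client.sink(
    "vertex-failed-jobs",
    filter_=log_filter,
    destination=f"pubsub.googleapis.com/projects/{PROJECT}/topics/{TOPIC}",
)
if not sink.exists():
    sink.create()  # the sink's writer identity still needs Pub/Sub Publisher on the topic
```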
0
votes
1 answer

Vertex AI batch predictions from file-list

I want to submit a batch prediction job for a custom model (in my case a PyTorch model, but I think that is irrelevant here). I read the documentation, but as there are no examples I cannot be sure what the schema of the JSON object…
Vasil Yordanov
  • 417
  • 3
  • 14
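A hedged sketch of submitting such a job with the google-cloud-aiplatform SDK, using the `file-list` instances format: each line of the source file is a GCS URI that gets passed to the serving container. Bucket paths, model ID, and machine type are placeholders; verify which instances format your custom container actually accepts.

```python
# Sketch: batch prediction from a file-list for a custom (e.g. PyTorch) model.
# All resource names below are hypothetical.
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

model = aiplatform.Model("projects/my-project/locations/us-central1/models/1234567890")

job = model.batch_predict(
    job_display_name="torch-batch-prediction",
    gcs_source="gs://my-bucket/batch/file_list.txt",   # one GCS file URI per line
    gcs_destination_prefix="gs://my-bucket/batch/output/",
    instances_format="file-list",
    machine_type="n1-standard-4",
    sync=False,
)
```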
0
votes
0 answers

Vertex AI: ResourceExhausted 429 received trailing metadata size exceeds limit

I am using Google Vertex AI online prediction. In order to send an image, it has to go in a JSON payload as uint8 data, and the request must be less than 1.5 MB; when I convert my image to uint8 it definitely exceeds 1.5 MB. To get around this issue we can…
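One common workaround, if the serving container accepts raw bytes, is to send the image base64-encoded rather than as a JSON array of uint8 values, which shrinks the payload considerably. A minimal sketch follows; the instance key and payload shape depend entirely on the model's serving signature, so treat them as assumptions.

```python
# Sketch: send a base64-encoded image to an online prediction endpoint instead
# of a uint8 array. The "image_bytes"/"b64" keys and the endpoint ID are
# assumptions -- adjust to your model's serving signature.
import base64
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")
endpoint = aiplatform.Endpoint("projects/my-project/locations/us-central1/endpoints/1234567890")

with open("image.jpg", "rb") as f:
    encoded = base64.b64encode(f.read()).decode("utf-8")

response = endpoint.predict(instances=[{"image_bytes": {"b64": encoded}}])
print(response.predictions)
```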
0
votes
1 answer

How do I stop a Google Cloud AutoML (now Vertex AI) batch prediction job using the web GUI?

I started a batch prediction job in AutoML (now Vertex AI) for a small CSV in one of my buckets, using a classification model. Then I noticed the CSV had an error, but I was unable to find a way to cancel the job using the web GUI; it just says…
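If the console offers no cancel control, the job can usually be cancelled programmatically instead. A short sketch with the Python SDK, assuming you can look up the job's resource name (the ID below is hypothetical):

```python
# Sketch: cancel a running batch prediction job from the Python SDK when the
# web GUI offers no cancel option (job resource name is hypothetical).
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

job = aiplatform.BatchPredictionJob(
    "projects/my-project/locations/us-central1/batchPredictionJobs/1234567890"
)
job.cancel()          # request cancellation
print(job.state)      # poll the state until it reaches CANCELLING/CANCELLED
```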
-1
votes
0 answers

Can we use an S3 bucket URL for Vertex AI video object model training using Python?

I am exploring Vertex AI for object recognition in video and I want to use an AWS S3 bucket video URL as the video source. I have used Google Cloud Storage, and I need to check whether configuring an AWS bucket as the video source URL is feasible or…
-1
votes
1 answer

Format issue when calling Vertex AI Custom Job Endpoint

I developed a custom training job with scikit-learn 0.23 in Vertex AI and successfully deployed the model to an endpoint. However, when I call the endpoint, I get the following error: raise exceptions.from_grpc_error(exc) from…
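For models served on the prebuilt scikit-learn containers, the request body is typically a plain list of feature rows under `instances`, not a dict keyed by feature name. A hedged sketch, with the endpoint ID and feature values as placeholders:

```python
# Sketch: call a deployed scikit-learn model with the payload shape the
# prebuilt sklearn serving container generally expects -- a list of feature
# rows (endpoint ID and feature values are placeholders).
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")
endpoint = aiplatform.Endpoint("projects/my-project/locations/us-central1/endpoints/1234567890")

# Each instance is one row of features, in the same order used at training time.
instances = [
    [5.1, 3.5, 1.4, 0.2],
    [6.7, 3.0, 5.2, 2.3],
]
response = endpoint.predict(instances=instances)
print(response.predictions)
```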
-1
votes
1 answer

Best practice to deploy multiple models that run concurrently at scale (something like map-reduce)

I have a model that consists of 150 sub-models (run in a for loop). To improve performance, I would like to split it into 150 separately deployed models so that, for every request my server gets, it sends 150 API requests, one to each model, and then combines…
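A minimal fan-out sketch, assuming each sub-model is deployed to its own endpoint: call them concurrently from a thread pool and reduce the results. Endpoint IDs and the combine step are placeholders, and the pool size would need tuning (or an async client) for real scale.

```python
# Sketch: fan one incoming request out to many deployed models concurrently
# and combine the results (endpoint IDs and the combine logic are placeholders).
from concurrent.futures import ThreadPoolExecutor
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

# Hypothetical list of 150 endpoint resource names.
ENDPOINT_IDS = [
    f"projects/my-project/locations/us-central1/endpoints/{i}" for i in range(150)
]

def call_endpoint(endpoint_id, instance):
    endpoint = aiplatform.Endpoint(endpoint_id)
    return endpoint.predict(instances=[instance]).predictions

def combine(results):
    # Placeholder reduce step: flatten all per-model predictions into one list.
    return [p for preds in results for p in preds]

def predict_all(instance):
    with ThreadPoolExecutor(max_workers=32) as pool:
        results = list(pool.map(lambda eid: call_endpoint(eid, instance), ENDPOINT_IDS))
    return combine(results)
```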
-1
votes
1 answer

What should be used for deployed_model in DeployModelRequest in a Vertex AI pipeline?

I am trying to deploy a model in a Vertex AI pipeline component using DeployModelRequest. I try to get the model using GetModelRequest: model_name = f'projects/{project}/locations/{location}/models/{model_id}' model_request =…
schoon
  • 2,858
  • 3
  • 46
  • 78
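The `deployed_model` field takes a `DeployedModel` message that references the model's resource name plus the serving resources to use. A hedged sketch with the `aiplatform_v1` GAPIC client; project, endpoint, model ID, and machine type are hypothetical values.

```python
# Sketch: build the DeployedModel message that DeployModelRequest expects
# (all resource IDs and the machine type are hypothetical).
from google.cloud import aiplatform_v1

project, location, model_id, endpoint_id = "my-project", "us-central1", "123", "456"
model_name = f"projects/{project}/locations/{location}/models/{model_id}"
endpoint_name = f"projects/{project}/locations/{location}/endpoints/{endpoint_id}"

deployed_model = aiplatform_v1.DeployedModel(
    model=model_name,
    display_name="my-deployed-model",
    dedicated_resources=aiplatform_v1.DedicatedResources(
        machine_spec=aiplatform_v1.MachineSpec(machine_type="n1-standard-4"),
        min_replica_count=1,
        max_replica_count=1,
    ),
)

client = aiplatform_v1.EndpointServiceClient(
    client_options={"api_endpoint": f"{location}-aiplatform.googleapis.com"}
)
request = aiplatform_v1.DeployModelRequest(
    endpoint=endpoint_name,
    deployed_model=deployed_model,
    traffic_split={"0": 100},  # "0" routes all traffic to the new deployment
)
operation = client.deploy_model(request=request)
operation.result()  # block until the deployment finishes
```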
-1
votes
1 answer

How to run huge datasets in Vertex AI

I am working with large feature sets (20,000 rows x 20,000 columns) and Vertex AI has a hard limit of 1,000 columns. How can I import data into Google Cloud efficiently so that I can run TensorFlow models or AutoML on my data? I haven't been able…
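Given the column limit on managed tabular datasets, one route is to skip the managed dataset entirely and stream the wide CSVs straight from GCS inside a custom training job. A hedged sketch with tf.data; the bucket path, column count, and label position are assumptions.

```python
# Sketch: stream a very wide, headerless CSV directly from GCS with tf.data
# inside a custom training job, bypassing the managed-dataset column limit
# (paths, column count, and label position are hypothetical).
import tensorflow as tf

NUM_COLUMNS = 20_000
LABEL_INDEX = 0  # assume the label is the first column

def parse_line(line):
    defaults = [[0.0]] * NUM_COLUMNS
    fields = tf.io.decode_csv(line, record_defaults=defaults)
    features = tf.stack(fields[1:], axis=-1)
    label = fields[LABEL_INDEX]
    return features, label

dataset = (
    tf.data.TextLineDataset(tf.io.gfile.glob("gs://my-bucket/wide-data/*.csv"))
    .map(parse_line, num_parallel_calls=tf.data.AUTOTUNE)
    .batch(64)
    .prefetch(tf.data.AUTOTUNE)
)
```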
-1
votes
2 answers

AI/ML Assisted labeling in Vertex AI

Is there a feature in Vertex AI which will allow AI/ML to assist in labeling data? This usually works by providing a small set of labeled data, followed by creating a model that assists in labeling more data. As more and more data is labeled the…
Prashant Saraswat
  • 838
  • 1
  • 8
  • 20
-1
votes
1 answer

Comparing the output of AutoML

I am getting different output for feature importance when I run AutoML in Azure, Google, and H2O, even though the data is the same and all the features are also the same. What would be the reason for this? Is there any other method to compare the models?
g2021
  • 11
-1
votes
1 answer

Merge CSV files in GCP

The dataset I am working with on GCP is in CSV format, and for each feature there is a separate CSV file with no header. There are around 20 files, and I want to create a single file for all these variables, with headers. However, I have access on the…
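A hedged pandas sketch of the merge: read each per-feature, headerless CSV, assign it a column name, and concatenate column-wise. The bucket path and feature names are placeholders; this assumes the rows are aligned across files and that gcsfs is installed so pandas can read and write gs:// paths.

```python
# Sketch: merge ~20 headerless, one-column-per-feature CSVs into a single
# CSV with headers (paths and feature names are placeholders; assumes rows
# are aligned across files and gcsfs is installed for gs:// access).
import pandas as pd

feature_files = {
    "age": "gs://my-bucket/features/age.csv",
    "income": "gs://my-bucket/features/income.csv",
    # ... remaining features
}

columns = [
    pd.read_csv(path, header=None, names=[name])
    for name, path in feature_files.items()
]
merged = pd.concat(columns, axis=1)
merged.to_csv("gs://my-bucket/features/merged.csv", index=False)
```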
-1
votes
1 answer

Grant user permission to bigquery.datasets.create in a scheduled notebook in Vertex AI

I have a notebook in which I access data through APIs, work with the data and send the results to a BigQuery table as well as to a GCS bucket. Everything works as it should when the notebook is run manually. However, when scheduled, it breaks with…
-2
votes
1 answer

CI/CD pipeline using Bitbucket and Vertex AI

Has anyone tried building a CI/CD pipeline for a private repo on Bitbucket and running it on Vertex AI (Google Cloud)? Or any similar use case?
-2
votes
1 answer

Connecting Google's TensorBoard to Vertex AI AutoML

Is there a way to connect Google's Vertex AI TensorBoard to AutoML? I remember a brief mention in a Coursera specialization but don't know where it was.
user2268997
  • 1,263
  • 2
  • 14
  • 35