Vertex AI provides Batch Prediction functionality that is very useful for running predictions on large amounts of data.
However, every time I run a new batch prediction, a new Dataset is created inside BigQuery to hold the results. This is very inconvenient: if I run 100 batch predictions per day, I end up with 100 new Datasets in BigQuery.
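Here is roughly how I launch the jobs with the Python SDK (a minimal sketch; the project, model and table names are placeholders):

```python
from google.cloud import aiplatform

aiplatform.init(project="my-project", location="us-central1")

model = aiplatform.Model("my-model-resource-name")  # placeholder model ID

# Each call like this produces a brand-new BigQuery Dataset
# (named something like prediction_<model>_<timestamp>) containing the output tables.
job = model.batch_predict(
    job_display_name="daily-batch-prediction",
    bigquery_source="bq://my-project.source_dataset.input_table",
    bigquery_destination_prefix="bq://my-project",
    instances_format="bigquery",
    predictions_format="bigquery",
)
job.wait()
```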
Is there a way to make all predictions converge into the same Dataset? In other words, I would like each prediction to become a new Table inside one specific BigQuery Dataset.
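Something along these lines is what I have in mind (hypothetical, I don't know whether the destination prefix can point at an existing Dataset rather than just a project):

```python
# Desired behaviour (hypothetical): every run writes its output tables into
# the same pre-existing dataset instead of creating a new dataset per job.
job = model.batch_predict(
    job_display_name="daily-batch-prediction",
    bigquery_source="bq://my-project.source_dataset.input_table",
    bigquery_destination_prefix="bq://my-project.all_predictions",  # one shared dataset?
    instances_format="bigquery",
    predictions_format="bigquery",
)
```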