
I am working with large feature sets (20,000 rows x 20,000 columns) and Vertex AI has a hard limit of 1,000 columns. How can I import data into Google Cloud efficiently so that I can run TensorFlow models or AutoML on my data? I haven't been able to find documentation for this issue.

Munrock
    You are correct that the [limit for Vertex AI](https://cloud.google.com/automl-tables/docs/quotas#limits) is 1,000 columns and 200,000,000 rows. There is an on-going request to support dataset with >1000 columns, you may check the [public issue tracker](https://issuetracker.google.com/240491813) for update. – Anjela B Aug 30 '22 at 04:02

1 Answer


Are you trying this with datasets / AutoML? One thing to try is Feature Store (https://cloud.google.com/vertex-ai/docs/featurestore) or putting the data into BigQuery (https://cloud.google.com/blog/products/data-analytics/automl-tables-now-generally-available-bigquery-ml). I know that might be a change in your workflow, but BigQuery should be able to accommodate up to 10,000 columns per table.
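If you do stay with Vertex AI tabular datasets, one client-side workaround is to split the wide table into column chunks that each fit under the limit before uploading. A minimal sketch (not a Vertex API call; the chunk size and helper name are assumptions for illustration):

```python
import pandas as pd

MAX_COLS = 1000  # Vertex AI tabular dataset column limit


def split_wide_frame(df: pd.DataFrame, max_cols: int = MAX_COLS):
    """Yield successive column-wise chunks, each within the column limit."""
    for start in range(0, df.shape[1], max_cols):
        yield df.iloc[:, start:start + max_cols]


# Example: a 5-row x 2,500-column frame splits into widths 1000, 1000, 500.
df = pd.DataFrame([[0] * 2500] * 5)
chunk_widths = [chunk.shape[1] for chunk in split_wide_frame(df)]
print(chunk_widths)  # [1000, 1000, 500]
```

Each chunk could then be exported (e.g. to CSV in Cloud Storage) and imported as a separate dataset, though note that splitting features across datasets means AutoML cannot train on all columns jointly — hence the BigQuery suggestion above.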