I am creating a feature store. I do the transformations in Data Wrangler and use the "Export to SageMaker Feature Store" option to ingest all of the transformed data into the feature store. One of my feature groups has nearly 300 columns (after one-hot encoding) and roughly 1 million (10 lakh) rows, and this is a batch ingestion. The processing job has been running for about 7-10 hours, even after I increased the instance type to ml.m5.24xlarge, and it neither completes nor fails. What can I do to make the processing job run faster and get this data ingested into the feature store?