
I have a Delta table whose size will increase gradually. It currently has around 15 million rows. When I run the VACUUM command on that table, I get the error below:

ERROR: Job aborted due to stage failure: Task 7 in stage 491.0 failed 4 times, most recent failure: Lost task 7.4 in stage 481.0 (TID 4116) (10.154.64.26 executor 7): ExecutorLostFailure (executor 7 exited caused by one of the running tasks) Reason: Executor heartbeat timed out after 177186 ms
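The heartbeat timeout in that stack trace usually means an executor stalled (often under memory pressure or a long GC pause) during the delete phase of VACUUM, rather than the command itself being wrong. As context for the error, here is a sketch of cluster settings that are commonly adjusted in this situation; the values are illustrative assumptions, not recommendations for this workload:

```
# spark-defaults.conf (illustrative values, tune for your cluster)
# Default heartbeat interval is 10s; it must stay below spark.network.timeout.
spark.executor.heartbeatInterval                       60s
spark.network.timeout                                  600s
# Databricks-specific: spread VACUUM's file deletes across the cluster
# instead of running them serially on the driver.
spark.databricks.delta.vacuum.parallelDelete.enabled   true
```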

I also wanted to know the best order in which to run:

VACUUM table RETAIN 168 HOURS;

OPTIMIZE table;

FSCK REPAIR TABLE table;

REFRESH TABLE table;
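One commonly suggested ordering for the four commands above (a sketch, not an authoritative answer; `my_table` is a placeholder name, and the comments state common reasoning rather than anything confirmed in this question):

```sql
OPTIMIZE my_table;                 -- compact small files first, so the old small files become unreferenced
VACUUM my_table RETAIN 168 HOURS;  -- then physically delete files no longer referenced by the Delta log
FSCK REPAIR TABLE my_table;        -- only needed if files were removed outside of Delta (out-of-band)
REFRESH TABLE my_table;            -- invalidate cached metadata so subsequent reads see the new state
```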
Alex Ott
John
  • Please use universal measurements instead of local words like *crore*, and also, don't put the same text more than once into your question. – James Z May 17 '23 at 14:20
  • `ExecutorLostFailure`s are really hard to diagnose unless you provide enough information and are usually a side effect of some other memory-related issues. There had to be more information earlier in the logs to pinpoint the root cause of the issue. – Jacek Laskowski May 18 '23 at 12:03

0 Answers