
My Glue job is failing with "Failed due to out of memory".

I have gone through this link: AWS Glue executor memory limit.

I have added the following job parameter:

  • Key: --conf
  • Value: spark.yarn.executor.memoryOverhead=7g

I also have one environment variable: (key) --dynamodb, (value) sample-table.

  • I get the error below: "ambiguous option: --conf could match --dynamodb"
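An "ambiguous option" message of this shape is what Python's `argparse` raises when option-name abbreviation (prefix matching) is enabled, which it is by default; Glue job parameters are passed to the job as command-line arguments, so a clashing parameter name can trigger it. A minimal stdlib sketch of the mechanism (the option names `--configA`/`--configB` are hypothetical and not taken from Glue):

```python
import argparse

# Hypothetical parser illustrating prefix matching; this is NOT the
# actual Glue job-argument parser.
parser = argparse.ArgumentParser(allow_abbrev=True)  # abbreviation is the default
parser.add_argument("--configA")
parser.add_argument("--configB")

try:
    # "--conf" is a prefix of both registered options, so argparse
    # reports "ambiguous option: --conf could match --configA, --configB"
    # and exits.
    parser.parse_args(["--conf", "spark.yarn.executor.memoryOverhead=7g"])
except SystemExit:
    print("argparse rejected --conf as ambiguous")
```

Passing `allow_abbrev=False` to `ArgumentParser` disables prefix matching, which is one way such collisions are avoided when you control the parser.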

https://i.stack.imgur.com/dkSNC.png

  • Can you add the stack trace to your question? Is it the driver or the executor going OOM? Also, have you tried larger instance types? – Prabhakar Reddy Mar 07 '22 at 16:26
  • @PrabhakarReddy It's the driver; I don't know how to try larger instance types. – sim Mar 08 '22 at 09:12
  • 1
    From the image attached it looks like you are running python shell job and not Glue ETL? If yes then there will not be an executor or driver – Prabhakar Reddy Mar 08 '22 at 16:20

0 Answers