
I am getting a BulkWriteError while writing a DataFrame into MongoDB.

df.write\
.format("com.mongodb.spark.sql.DefaultSource")\
.option("uri", connection_string)\
.option("database", db)\
.option("collection", collection)\
.option("ordered", False)\
.option("replaceDocument", False)\
.mode("append")\
.save()

Error:

Write errors: [BulkWriteError{index=2, code=16500, message='Error=16500, RetryAfterMs=12, Details='Response status code does not indicate success: TooManyRequests (429); Substatus: 3200; ActivityId: 6f54c1ee-f9f4-4f78-948c-84c38afadfde

I have tried multiple things but have had no luck bulk writing the full dataset (122 million records) using PySpark, while I am able to write small chunks of data successfully. Note: using Azure MongoDB and Databricks.
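For reference, the small-chunk approach that does work follows roughly this pattern: split the data into batches and back off when a throttling error (429 / code 16500) comes back, using the `RetryAfterMs` hint as a starting delay. This is a plain-Python sketch, not the Spark connector's API; `write_batch`, `batches`, and the `RuntimeError` stand-in for `BulkWriteError` are all illustrative names.

```python
import time

def write_with_backoff(write_batch, batches, max_retries=8):
    """Write each batch, retrying with exponential backoff on throttling.

    `write_batch` is a hypothetical stand-in for the real Mongo bulk-write
    call, and `batches` for the pre-chunked data.
    """
    for batch in batches:
        delay = 0.05  # start near the RetryAfterMs hint from the 429 error
        for attempt in range(max_retries):
            try:
                write_batch(batch)
                break  # batch written, move on to the next one
            except RuntimeError as exc:  # stand-in for BulkWriteError
                if "TooManyRequests" not in str(exc) or attempt == max_retries - 1:
                    raise  # non-throttling error, or retries exhausted
                time.sleep(delay)
                delay *= 2  # exponential backoff before retrying
```

This is only the retry shape; with the Spark connector the equivalent lever would be reducing write parallelism (e.g. repartitioning the DataFrame to fewer partitions) so the aggregate request rate stays under the provisioned throughput.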
