
I am using Python in Databricks to read a JSON file with the code below.

import json

with open('/dbfs/mnt/my_mount/my_json.json', 'r') as fp:
    my_json = json.load(fp)

But I am getting the error below:

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/mnt/my_mount/my_json.json'

However, if I read the file as a Spark DataFrame instead, I can read the contents of the JSON file, which I verified using the DataFrame's .show() method.

df = spark.read.text('/dbfs/mnt/my_mount/my_json.json', wholetext=True)
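
For reference, a minimal sketch (my assumption: the wholetext read returns a single row, with the text in the default value column) of how the same contents could be parsed out of that DataFrame:

import json

df = spark.read.text('/dbfs/mnt/my_mount/my_json.json', wholetext=True)
raw = df.first()['value']  # wholetext=True puts the entire file into one row
my_json = json.loads(raw)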

How can I read the file using `with open()`?

Below is the Databricks Runtime I am using.

(screenshot of the Databricks Runtime version)

Please let me know if you need any further details.

Sarath Subramanian
  • Does this answer your question? https://stackoverflow.com/questions/49318402/read-write-single-file-in-databricks – Sharma Jan 16 '23 at 18:30
  • It's similar, but that question's requirement is to append to a file, although that user is also getting a file-not-found error. @Sharma – Sarath Subramanian Jan 16 '23 at 18:46
  • The comment by @Sharma is correct. For Databricks, I suggest using `dbutils` for direct (non-DataFrame) file access on blob storage; a sketch follows these comments. Otherwise, try the Python lib `S3FS` for AWS, or the equivalent for Azure. – Chris Jan 16 '23 at 19:11
  • @Sarath Subramanian, which Databricks are you using, AWS or Azure? – Rakesh Govindula Jan 17 '23 at 12:21
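
For reference, a minimal sketch of the `dbutils` route suggested above, assuming the mount is reachable at dbfs:/mnt/my_mount/ (the size cap passed to head is illustrative):

import json

# dbutils.fs takes dbfs:/ URIs rather than the /dbfs FUSE prefix used by open()
raw = dbutils.fs.head('dbfs:/mnt/my_mount/my_json.json', 1024 * 1024)  # returns up to maxBytes of the file
my_json = json.loads(raw)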

0 Answers