
I'm new to Databricks and need help writing a pandas DataFrame to the Databricks local file system (DBFS).

I searched Google but could not find a similar case, and I also tried the help guide provided by Databricks (attached), but that did not work either. I attempted the changes below; the commands run just fine, but the file never gets written to the directory (I expect a wrtdftodbfs.txt file to be created).

  1. df.to_csv("/dbfs/FileStore/NJ/wrtdftodbfs.txt")

Result: throws the below error

FileNotFoundError: [Errno 2] No such file or directory: '/dbfs/FileStore/NJ/wrtdftodbfs.txt'

  2. df.to_csv("\\dbfs\\FileStore\\NJ\\wrtdftodbfs.txt")

Result: No errors, but nothing written either

  3. df.to_csv("dbfs\\FileStore\\NJ\\wrtdftodbfs.txt")

Result: No errors, but nothing written either

  4. df.to_csv(path ="\\dbfs\\FileStore\\NJ\\",file="wrtdftodbfs.txt")

Result: TypeError: to_csv() got an unexpected keyword argument 'path'

  5. df.to_csv("dbfs:\\FileStore\\NJ\\wrtdftodbfs.txt")

Result: No errors, but nothing written either

  6. df.to_csv("dbfs:\\dbfs\\FileStore\\NJ\\wrtdftodbfs.txt")

Result: No errors, but nothing written either

The directory exists and files created manually show up, but pandas to_csv never writes anything, nor does it error out.

dbutils.fs.put("/dbfs/FileStore/NJ/tst.txt","Testing file creation and existence")

dbutils.fs.ls("dbfs/FileStore/NJ")

Out[186]: [FileInfo(path='dbfs:/dbfs/FileStore/NJ/tst.txt', name='tst.txt', size=35)]
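Incidentally, the Out[186] above hints at the likely culprit: dbutils.fs and plain Python I/O use different path roots. dbutils resolves paths against the DBFS root (dbfs:/), while pandas and open() see DBFS through the local /dbfs FUSE mount. So dbutils.fs.put("/dbfs/FileStore/NJ/tst.txt", ...) actually created dbfs:/dbfs/FileStore/NJ/tst.txt, and the directory pandas needs (dbfs:/FileStore/NJ, i.e. /dbfs/FileStore/NJ locally) may never have existed. A small hypothetical helper (not a Databricks API, just string normalization) for converting to the FUSE spelling:

```python
def to_fuse_path(dbfs_path: str) -> str:
    """Translate a DBFS URI or path (as used by dbutils/Spark) into the
    local /dbfs FUSE path that plain Python I/O (e.g. pandas) expects."""
    if dbfs_path.startswith("dbfs:"):
        dbfs_path = dbfs_path[len("dbfs:"):]   # strip the scheme
    if not dbfs_path.startswith("/"):
        dbfs_path = "/" + dbfs_path            # normalize to absolute
    if not dbfs_path.startswith("/dbfs/"):
        dbfs_path = "/dbfs" + dbfs_path        # prepend the FUSE mount point
    return dbfs_path
```

With this convention, `to_fuse_path("dbfs:/FileStore/NJ/tst.txt")` and `to_fuse_path("/FileStore/NJ/tst.txt")` both give `/dbfs/FileStore/NJ/tst.txt`, the spelling to hand to pandas.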

Appreciate your time and pardon me if the enclosed details are not clear enough.

Shaan Proms
  • Try converting it to a Spark DataFrame and then save it as a CSV; pandas most likely doesn't have access to the FileStore. – Umar.H Dec 19 '19 at 21:15
  • Is it a Spark dataframe or Pandas? The code at the top talks about Spark but everything else looks like Pandas. If it is involving Pandas, you need to make the file using `df.to_csv` and then use `dbutils.fs.put()` to put the file you made into the FileStore following [here](https://docs.databricks.com/data/filestore.html#). If it involves Spark, see [here](https://towardsdatascience.com/databricks-how-to-save-files-in-csv-on-your-local-computer-3d0c70e6a9ab). – Wayne Dec 19 '19 at 21:16
  • Have you tried: `with open("/dbfs/FileStore/NJ/wrtdftodbfs.txt", "w") as f: df.to_csv(f)`? – PMende Dec 19 '19 at 21:17
  • Thanks for the response, Mende. I did try that, but no luck; it runs fine but the file is not making it into the directory. – Shaan Proms Dec 19 '19 at 21:44
  • Thanks so much Wayne. The second link shared worked. I have converted pandas data frame to spark. Not sure if Databricks filestore works only thru spark commands for writing data to its file system. – Shaan Proms Dec 19 '19 at 21:59

2 Answers


Try this in your Databricks notebook:

import pandas as pd
from io import StringIO

data = """
CODE,L,PS
5d8A,N,P60490
5d8b,H,P80377
5d8C,O,P60491
"""

df = pd.read_csv(StringIO(data), sep=',')
#print(df)
df.to_csv('/dbfs/FileStore/NJ/file1.txt')

pandas_df = pd.read_csv("/dbfs/FileStore/NJ/file1.txt", header='infer') 
print(pandas_df)
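One detail worth noting about the snippet above: to_csv writes the row index by default, so the round-trip read picks up an extra `Unnamed: 0` column. A DBFS-independent sketch using an in-memory buffer shows the difference; passing index=False avoids it:

```python
from io import StringIO

import pandas as pd

df = pd.DataFrame({"CODE": ["5d8A", "5d8b"], "L": ["N", "H"]})

# Default: the index is written as an unnamed first column.
buf = StringIO()
df.to_csv(buf)
buf.seek(0)
with_index = pd.read_csv(buf)     # columns: 'Unnamed: 0', 'CODE', 'L'

# index=False keeps the file limited to the real columns.
buf2 = StringIO()
df.to_csv(buf2, index=False)
buf2.seek(0)
no_index = pd.read_csv(buf2)      # columns: 'CODE', 'L'
```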
GiovaniSalazar
  • Thanks Giovani. It worked; it seems the files are getting written, but they don't physically show up when validated through GUI navigation or through the fs ls command. – Shaan Proms Dec 19 '19 at 22:09
  • `%sh find / -type f -name "file2.txt"` searches recursively. @ShaanProms – GiovaniSalazar Dec 19 '19 at 22:15
  • Awesome! I see it. :) Thanks! The dbfs commands `%fs ls /dbfs/FileStore/NJ` or `dbutils.fs.ls('/dbfs/FileStore/NJ')` do not show this file for some reason. – Shaan Proms Dec 20 '19 at 15:38

This worked out for me:

outname = 'pre-processed.csv'
outdir = '/dbfs/FileStore/'
dfPandas.to_csv(outdir+outname, index=False, encoding="utf-8")

To download the file, add files/filename to your notebook URL (before the question mark ?):

https://community.cloud.databricks.com/files/pre-processed.csv?o=189989883924552#

(you need to edit your own home URL; for me it is:

https://community.cloud.databricks.com/?o=189989883924552#)
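In other words, the download URL is just the home URL with files/<name> spliced in before the query string. A small illustrative helper (the function name is made up, and the path is assumed to be relative to /dbfs/FileStore/):

```python
def filestore_download_url(home_url: str, filename: str) -> str:
    """Build the FileStore download URL from the workspace home URL.

    home_url: e.g. "https://community.cloud.databricks.com/?o=123#"
    filename: path relative to /dbfs/FileStore/, e.g. "pre-processed.csv"
    """
    # Split the home URL at the query string, insert files/<name> between.
    base, sep, query = home_url.partition("?")
    if not base.endswith("/"):
        base += "/"
    return base + "files/" + filename + sep + query
```

For the example above, `filestore_download_url("https://community.cloud.databricks.com/?o=189989883924552#", "pre-processed.csv")` reproduces the download link shown earlier.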

(screenshot: DBFS file explorer)

Nicoswow
  • How do you get the URL to download? Can you tell any generic method to download any file? – MathGeek Jun 06 '20 at 05:46
  • Hi Nani, if you put the path+file_name in the middle of your home URL (after .com/), it should be enough; your download should start automatically. In my case, I had to insert "files/pre-processed.csv" in the middle of the home URL. – Nicoswow Jun 08 '20 at 12:04
  • @MathGeek In Databricks (Python), I use an HTML href to access the file: `from IPython.display import HTML` and then `HTML(...)` with an anchor whose link text is "Get CSV". – langeleppel Jan 05 '22 at 10:36