I'm stuck again! Here's my situation:
I need to find files named "tv.sas7bdat" that exist in different folders under a directory and save the contents of all the files found into a single Excel file on my desktop. With my current code I can collect the paths of all those files and load each one into a DataFrame, but I can't append all the DataFrames into one Excel file.
The Excel file ends up containing only the last DataFrame!
Here is my code:
import pandas as pd
from sas7bdat import SAS7BDAT
import os

path = "\\"
newpath = "\\"

files = []
# r=root, d=directories, f=files
for r, d, f in os.walk(path):
    for file in f:
        if 'tv.sas7bdat' in file:
            files.append(os.path.join(r, file))

lenf = range(len(files))
for f in files:
    print(f)

for df in lenf:
    with SAS7BDAT(f) as file:
        df = file.to_data_frame()
        print(df)
        group = pd.concat([df], axis=0, sort=True, ignore_index=True)
        df.to_excel(newpath + 'dataframes_tv.xlsx', index=False)
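From what I understand, the fix is probably to read every path in the files list, collect the resulting DataFrames in a list, call pd.concat once on that list, and write the Excel file a single time outside the loop. Below is a minimal sketch of what I mean; path and newpath are still placeholders, and the names frames, sas_path, and reader are just illustrative. Is this the right approach?

import os
import pandas as pd
from sas7bdat import SAS7BDAT

path = "\\"      # root folder to search (placeholder)
newpath = "\\"   # desktop folder for the output (placeholder)

# collect the full path of every tv.sas7bdat found under path
files = []
for r, d, f in os.walk(path):
    for name in f:
        if name == 'tv.sas7bdat':
            files.append(os.path.join(r, name))

# read each SAS file into its own DataFrame
frames = []
for sas_path in files:
    with SAS7BDAT(sas_path) as reader:
        frames.append(reader.to_data_frame())

# concatenate everything once and write a single Excel file
# (to_excel needs openpyxl or xlsxwriter installed)
if frames:
    group = pd.concat(frames, axis=0, sort=True, ignore_index=True)
    group.to_excel(os.path.join(newpath, 'dataframes_tv.xlsx'), index=False)

The idea is that pd.concat receives the whole list of DataFrames at once, and the Excel file is written only once at the end, so nothing gets overwritten by the last iteration.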