
So I am looping through multiple CSV files and using FBProphet to produce a forecast for each one. I then append all of the forecasts to a single CSV file with .to_csv. However, I would like to write each source filename above its forecast in the output CSV, and I am unsure whether there is a way to do that with .to_csv.

import pandas as pd
from fbprophet import Prophet

filenames = ['example1.csv', 'example2.csv', 'example3.csv']
for f in filenames:
    df = pd.read_csv(f)

    # Prophet expects the columns 'ds' (datestamp) and 'y' (value)
    df.columns = ['ds', 'y']
    df['ds'] = pd.to_datetime(df['ds'])

    m = Prophet(seasonality_mode='multiplicative')
    m.fit(df)

    future = m.make_future_dataframe(periods=6, freq='M')

    forecast = m.predict(future)
    print(forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].tail())

    fig1 = m.plot(forecast)
    fig2 = m.plot_components(forecast)

    # Appends to the same file on every iteration; the header row is
    # rewritten before each forecast block.
    forecast.to_csv('final_data.csv', mode='a', header=True)

So the results I'm getting are:

ds yhat yhat_lower yhat_upper
example1 forecast

ds yhat yhat_lower yhat_upper
example2 forecast

ds yhat yhat_lower yhat_upper
example3 forecast

The results I would like to get are:

example1 filename
ds yhat yhat_lower yhat_upper
example1 forecast

example2 filename
ds yhat yhat_lower yhat_upper
example2 forecast

example3 filename
ds yhat yhat_lower yhat_upper
example3 forecast
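
One way to get that layout (a sketch, not from the original post) is to open the output file once per iteration in append mode, write the filename on its own line, and then pass the same file handle to .to_csv. The append_forecast helper and the in-memory buffer below are illustrative names, not part of the original code.

```python
import io
import pandas as pd

def append_forecast(handle, filename, forecast):
    # Write the source filename as a separator line, then the
    # forecast block (with its header row) directly beneath it.
    handle.write(filename + '\n')
    forecast[['ds', 'yhat', 'yhat_lower', 'yhat_upper']].to_csv(handle, index=False)

# Demo with an in-memory buffer standing in for final_data.csv;
# in the real loop you would use: open('final_data.csv', 'a')
buf = io.StringIO()
fake_forecast = pd.DataFrame({'ds': ['2018-01-31'], 'yhat': [1.0],
                              'yhat_lower': [0.9], 'yhat_upper': [1.1]})
append_forecast(buf, 'example1.csv', fake_forecast)
```

Inside the loop, replacing the final .to_csv call with `with open('final_data.csv', 'a') as fh: append_forecast(fh, f, forecast)` would produce one filename line above each forecast block.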
  • What about something like: `forecast.assign(filename=f).to_csv('final_data.csv',mode='a',header=True)` ? – Anna Iliukovich-Strakovskaia Dec 27 '18 at 14:49
  • That added the filename as a column next to yhat_upper, which works well. Is there any way, though, to get it above the column headers? – Brandon Brown Dec 27 '18 at 15:25
  • Use pandas's ExcelWriter and set up your filename as a MultiIndex. Plenty of examples here: https://towardsdatascience.com/seven-clean-steps-to-reshape-your-data-with-pandas-or-how-i-use-python-where-excel-fails-62061f86ef9c – dylanjf Dec 27 '18 at 17:46
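
Following the MultiIndex idea from the last comment, a minimal sketch (assuming plain .to_csv rather than ExcelWriter) is to promote the filename to a second column-header level, so it lands on the row above the column names in the CSV. The small forecast frame here is a stand-in for Prophet's output.

```python
import io
import pandas as pd

# Stand-in for one forecast produced inside the loop
forecast = pd.DataFrame({'ds': ['2018-01-31'], 'yhat': [1.0],
                         'yhat_lower': [0.9], 'yhat_upper': [1.1]})
f = 'example1.csv'

# Put the filename in a top column level above the existing headers
labeled = forecast.copy()
labeled.columns = pd.MultiIndex.from_product([[f], labeled.columns])

buf = io.StringIO()  # stands in for final_data.csv
labeled.to_csv(buf, index=False)
```

With mode='a' in the real loop, each block would start with a row of the filename repeated across the columns, followed by the usual ds/yhat header row.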
