
In Excel, I created a Year End Date column of 1000 rows, where every cell contains 12/31/2018, formatted as a Date in Excel. I pulled this into R, did some data manipulation on other columns (NOT the date column), and exported the final data frame using write.csv. When I open the final file (in Excel or Tableau), the entire Year End Date column comes out as 1546214400 in every single cell.

Why is this happening? How can I resolve it?

Thanks!


2 Answers


Every software program stores a date (or date-time) as a number, but displays that number in a familiar date format for humans.
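
For example (a quick sketch in base R), stripping away the class shows the number R actually stores for a Date, namely the count of days since 1970-01-01:

# A Date prints as "2018-12-31" but is stored as a day count
unclass(as.Date("2018-12-31"))
#> [1] 17896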

When you read your data into R, whatever function you used was "smart" enough to know that "12/31/2018" was the date "December 31, 2018" and not a string of characters "1, 2, /, 3, ..." all pasted together. One way that R stores dates is as a POSIXct object. We can see that R stored the date "12/31/2018" as the POSIXct integer 1546214400:

as.integer(as.POSIXct(as.Date("2018/12/31")))
#> [1] 1546214400
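
You can check this in the other direction too. Interpreting 1546214400 as seconds since the Unix epoch recovers the date, which is exactly the conversion Excel and Tableau are not doing for you when they read the raw number out of the CSV:

# Convert the stored number of seconds back into a date-time
as.POSIXct(1546214400, origin = "1970-01-01", tz = "UTC")
#> [1] "2018-12-31 UTC"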

To export your data with write.csv() so that the dates appear in a format Excel recognizes, you first need to format the POSIXct values (like 1546214400) as date strings. So if your data are in a data frame named mydf and the date column is named date, do this:

# Convert the POSIXct dates to character strings in month/day/year form
mydf$date <- format(mydf$date, "%m/%d/%Y")
write.csv(mydf, file="path/to/file.csv")
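
As a usage sketch with made-up data (your real data frame comes from the Excel import, and the file name here is just a placeholder), the whole export step looks like this; row.names = FALSE is optional but keeps write.csv() from adding an extra index column:

# Toy data frame standing in for the imported data
mydf <- data.frame(
  id   = 1:3,
  date = as.POSIXct(rep("2018-12-31", 3), tz = "UTC")
)

mydf$date <- format(mydf$date, "%m/%d/%Y")
write.csv(mydf, file = "year_end.csv", row.names = FALSE)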
DanY

If you pass column types to readr when reading the file, you can control each column's format.

library(readr)

coltypes <- cols(
  col1 = col_double(),
  col2 = col_double(),
  col3 = col_date(format = "%Y%m%d"),
  col4 = col_double()
)

Then, after you have specified what each column should be, read the CSV in like this:

myData <- readr::read_csv("myCSV.csv", col_types = coltypes)
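
For the column in the question, which holds values like "12/31/2018", the format string would be "%m/%d/%Y" rather than "%Y%m%d". A minimal sketch, with the column name guessed from the question (adjust it to whatever read_csv reports for your file):

library(readr)

# Only the date column is specified; the remaining columns are guessed as usual
coltypes <- cols(
  `Year End Date` = col_date(format = "%m/%d/%Y")
)
myData <- read_csv("myCSV.csv", col_types = coltypes)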
Joe Wolf