
I need to write a large data.table to a PostgreSQL table on the same computer (Ubuntu 64-bit, 16 GB RAM).

Before sending the following commands, the System Monitor shows 47% memory used (3% swap), but midway through the long-running dbWriteTable call it reaches 99% memory and 49% swap.

library(data.table)
library(RPostgreSQL)
# con is an open connection, e.g. con <- dbConnect(PostgreSQL(), dbname = "mydb")
my.df <- data.frame(my.dt)  # copies the data.table into a second data.frame
rm(my.dt)
dbWriteTable(con, name = "tableindb", value = my.df)

The table has mostly numeric columns, but also time columns (POSIXct) and a few character columns.

How can this heavy RAM use be avoided? After one hour, the dbWriteTable command is still running...
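
One copy that can be avoided is the explicit data.frame() conversion. A minimal sketch, assuming a data.table version that has the in-place setDF() (it was added to the package after this question was asked):

library(data.table)
library(RPostgreSQL)

# setDF() flips the class of my.dt to data.frame by reference,
# so no second copy of the ~1.9 GB table is allocated.
setDF(my.dt)
dbWriteTable(con, name = "tableindb", value = my.dt)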

  • It might help to know exactly how big the data frame is. – joran Jun 02 '13 at 23:41
  • Well, the number of rows and columns would be a start. You might also check `object.size`. – joran Jun 02 '13 at 23:43
  • 10,731,640 obs. of 39 variables. `object.size` is 1,894,392,072 bytes. – Chris Jun 02 '13 at 23:46
  • 1
    For these sizes, it is better to use the PostgreSQL COPY statement immediately. So dump it to a csv file and use the PostgreSQL COPY command. –  Jun 03 '13 at 10:14
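
A minimal sketch of the COPY route suggested in the last comment, assuming dbWriteTable() creates the table from an empty data frame (otherwise create it with CREATE TABLE first); the file path is illustrative, and fwrite() was added to data.table after this question:

library(data.table)
library(RPostgreSQL)

# 1. Create the target table with matching column types but no rows:
dbWriteTable(con, name = "tableindb", value = my.df[0, ])

# 2. Dump the data to CSV (write.csv() works too, just more slowly):
fwrite(my.df, "/tmp/tableindb.csv")

# 3. Bulk-load server-side; COPY FROM needs file-read rights on the server:
dbSendQuery(con, "COPY tableindb FROM '/tmp/tableindb.csv' WITH (FORMAT csv, HEADER true)")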

0 Answers