
I have a data.frame (table_2) with 12531 rows and 92 columns that I would like to load into MySQL using the 'RMySQL' package, so...

dbWriteTable(con, 'DB.table_2', table_2, row.names = FALSE)

Checking the output in MySQL, I see that the table has 3 extra rows, with all fields missing (NULL):

check1 <- dbGetQuery(con,'select * from DB.table_2')  # 12534 rows
check2 <- check1[is.na(check1$row_1),] # 3 obs

The only way I found to work around this is to filter the table in SQL:

dbGetQuery(con,'create table DB.table_3 select * from
                DB.table_2 where row_1 is not NULL')
table_3 <- dbGetQuery(con,'select * from DB.table_3')  # 12531 rows
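One possible cause worth checking (an assumption, not confirmed from the output above): RMySQL's dbWriteTable loads the data via a temporary file and LOAD DATA LOCAL INFILE, so an embedded newline or carriage return inside a character field can split one record into two, producing extra rows whose remaining fields are NULL. A minimal diagnostic sketch, assuming table_2 and con exist as above:

```r
library(RMySQL)

# Hypothetical pre-write check: find character columns whose fields contain
# embedded newlines or carriage returns (note: if the data.frame was built
# with stringsAsFactors = TRUE, convert factors to character first).
char_cols <- names(table_2)[sapply(table_2, is.character)]

# Count offending fields per character column
bad_counts <- sapply(table_2[char_cols],
                     function(x) sum(grepl("[\r\n]", x), na.rm = TRUE))
print(bad_counts[bad_counts > 0])

# Replace the offending characters with a space before loading
table_2[char_cols] <- lapply(table_2[char_cols],
                             function(x) gsub("[\r\n]", " ", x))

dbWriteTable(con, 'DB.table_2', table_2, row.names = FALSE, overwrite = TRUE)
```

If bad_counts shows nonzero entries, sanitizing those columns before the write should avoid the extra NULL rows without the extra filtering step.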

Does anybody know a better way to do this, or the reason for this problem?

Many thanks for your help. Christian.

chri
  • No. You should not have extra rows. The code below doesn't help to reproduce or even to show any problem. – agstudy Jul 18 '13 at 10:24
  • Right, I should not have any extra rows, and I really have no clue about this outcome. I have used this code several times and this is the first time it has happened. The code below is exactly what I did (nothing left out), and the R output gave no ERROR or WARNING. – chri Jul 18 '13 at 10:56

0 Answers