
I am reading from and writing to MySQL tables which use utf8 encoding and contain a mix of numerical fields and text fields of Japanese characters. The tables can be read without problems and display properly in R, but writing back to MySQL using dbWriteTable produces text output that contains unrecognisable characters. If I write the data frame out as a CSV file, I can then read it into MySQL successfully, so presumably I am missing something simple in the use of dbWriteTable. The R code is:

con <- dbConnect(RMySQL::MySQL(), user='root', password=pw, dbname=dbase)
dbGetQuery(con, 'set character set utf8')  # intended to switch the connection to utf8
dbWriteTable(con, 'assort', full_out)      # full_out is the data frame read earlier
dbDisconnect(con)
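
For reference, a variant I could also try (just a sketch, not verified) is SET NAMES instead of SET CHARACTER SET, since SET NAMES also sets character_set_connection rather than leaving it at the database default:

con <- dbConnect(RMySQL::MySQL(), user='root', password=pw, dbname=dbase)
# SET NAMES sets character_set_client, character_set_connection and
# character_set_results; SET CHARACTER SET leaves the connection charset
# at the database default.
dbGetQuery(con, 'SET NAMES utf8')
dbWriteTable(con, 'assort', full_out, overwrite = TRUE, row.names = FALSE)
dbDisconnect(con)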

Thanks for any pointers.

jgh781
  • Can you try the github version? I spent some time on encoding in the dev version. – hadley Mar 12 '15 at 00:24
  • The GitHub version produces different characters in the MySQL tables, but they are still unrecognisable unfortunately (a series of ???). I suspect the 'set character set utf8' isn't doing anything, although the characters appear the same whether I write the table anew or append the data to an existing table set up with utf8 fields. – jgh781 Mar 12 '15 at 03:32
  • Can you please file a reproducible issue? – hadley Mar 12 '15 at 10:49
  • I've put a simple example on GitHub. I assume that is where the issue should be filed. Thanks – jgh781 Mar 13 '15 at 06:24
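
(For anyone following along, a minimal reproducible example along the lines mentioned in the last comment might look like the sketch below; the table name encoding_test and the sample string are placeholders, not necessarily what was filed in the issue.)

library(DBI)
library(RMySQL)

con <- dbConnect(RMySQL::MySQL(), user='root', password=pw, dbname=dbase)
dbGetQuery(con, 'SET NAMES utf8')

# One numerical field and one text field containing Japanese characters
df <- data.frame(id = 1L, name = "日本語テスト", stringsAsFactors = FALSE)

dbWriteTable(con, 'encoding_test', df, overwrite = TRUE, row.names = FALSE)
back <- dbReadTable(con, 'encoding_test')

identical(df$name, back$name)   # FALSE if the round trip mangles the text

dbDisconnect(con)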

0 Answers