
I'd like to import a bunch of large text files to a SQLite db using RSQLite. If my data were comma-delimited, I'd do this:

library(DBI)
library(RSQLite)
db <- dbConnect(SQLite(), dbname = 'my_db.sqlite')
dbWriteTable(conn=db, name='my_table', value='my_file.csv')

But how about with '\t'-delimited data? I know I could read the data into an R data.frame and then create the table from that, but I'd like to go straight into SQLite, since there are lots of large files. When I try the above with my data, I get a single character field.

Is there a sep='\t' sort of option I can use? I tried just adding sep='\t', like this:

dbWriteTable(conn=db, name='my_table', value='my_file.csv', sep='\t')

EDIT: In fact that works great; a flaw in the file I was working with was producing the error. It's also good to add header=TRUE if you have headers, as I do.
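
So the full call, with the header option added, is:

dbWriteTable(conn=db, name='my_table', value='my_file.csv', sep='\t', header=TRUE)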

arvi1000

1 Answer


Try the following:

dbWriteTable(conn=db, name='my_table', value='my_file.csv', sep='\t')

Per the following, which appears toward the top of page 21 of http://cran.r-project.org/web/packages/RMySQL/RMySQL.pdf:

When dbWriteTable is used to import data from a file, you may optionally specify header=, row.names=, col.names=, sep=, eol=, field.types=, skip=, and quote=.

[snip]

sep= specifies the field separator, and its default is ','.
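
Putting a few of those file-import arguments together, a call might look something like this (the skip and row.names values are illustrative, not from the question):

dbWriteTable(conn=db, name='my_table', value='my_file.csv',
             sep='\t',        # tab-delimited input
             header=TRUE,     # first line holds the column names
             skip=0,          # number of lines to skip before reading data
             row.names=FALSE) # don't add a row-names column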

Mark Silverberg