I'm trying to create a ClickHouse database and copy into one table the values from many files (all with the same column order), but I can't figure out how to go about it.
I'm using RClickhouse, which I don't know whether it differs in any way from clickhouse-r.
library(RClickhouse)
library(DBI)
library(tidyverse)

eggnog_dir <- "/home/acpguedes/projects/sig_trans/data/eggnog/table/"
setwd(eggnog_dir)

myconn <- DBI::dbConnect(drv = RClickhouse::clickhouse())

mytables <- list.files(".")  # all tables are in the same folder
mysqltb <- db_create_table(con = myconn, table = 'eggnog')

dt <- lapply(mytables, function(x) {
  read_tsv(
    file = x,
    col_names = c(  # in my case the tables have no header
      'sequence', 'model', 'start', 'end', 'evalue',
      'cov', 'qstart', 'qend', 'iteration', 'score',
      'talilen', 'qlen', 'estart', 'eend', 'program'
    )
  ) %>%
    dbWriteTable(conn = myconn, value = ., name = "domains", append = TRUE)
})
The specific columns don't matter; I'd just like an example of how to create a table and load into it the contents of many files (TSV, CSV, or any delimited format).
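Roughly, this is the pattern I imagine, with an explicit CREATE TABLE followed by an append loop. This is just a sketch: the column types, the shortened column list, the `tables` directory, and the MergeTree engine/ordering are my guesses, not something I know to be correct.

```r
library(DBI)
library(readr)

# Hypothetical connection; host and credentials depend on the server setup
con <- DBI::dbConnect(RClickhouse::clickhouse(), host = "localhost")

# Create the target table once, with explicit types (column list
# shortened here for the example; types are guesses)
DBI::dbExecute(con, "
  CREATE TABLE IF NOT EXISTS eggnog (
    sequence String,
    model    String,
    `start`  Int64,
    `end`    Int64,
    evalue   Float64,
    score    Float64
  ) ENGINE = MergeTree() ORDER BY sequence
")

# Append every headerless TSV in the folder into the same table
files <- list.files("tables", pattern = "\\.tsv$", full.names = TRUE)
for (f in files) {
  df <- read_tsv(f, col_names = c("sequence", "model", "start",
                                  "end", "evalue", "score"))
  DBI::dbWriteTable(con, name = "eggnog", value = df, append = TRUE)
}
```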
I was also trying dbplyr, using copy_to() instead of dbWriteTable().
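The dbplyr variant I was attempting looked roughly like this (again a sketch: the connection, column names, and directory are placeholders; copy_to() creates the remote table from the first file, and the rest are appended):

```r
library(DBI)
library(dplyr)
library(readr)

# Hypothetical connection and a shortened column list for the example
con <- DBI::dbConnect(RClickhouse::clickhouse(), host = "localhost")
my_cols <- c("sequence", "model", "evalue")

files <- list.files("tables", pattern = "\\.tsv$", full.names = TRUE)

# copy_to() creates the remote table from the first file;
# temporary = FALSE so it survives the session ...
copy_to(con, read_tsv(files[[1]], col_names = my_cols),
        name = "eggnog", temporary = FALSE)

# ... and the remaining files are appended with dbWriteTable()
for (f in files[-1]) {
  dbWriteTable(con, name = "eggnog",
               value = read_tsv(f, col_names = my_cols),
               append = TRUE)
}
```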
Also, after loading all the tables, should I issue a 'commit' statement to save the database permanently, so it can later be accessed from R or other platforms?
Thanks in advance.