I need to import a 10,352,223 KB (roughly 10 GB) database into PostgreSQL. I am trying to do it with the RPostgreSQL library in R, but I have run into problems.
My code is:
i <- 10000                                  # chunk size: rows read per iteration
for (a in seq(0, 36000000, i)) {
  data <- read.table("data.txt", sep = "|", dec = ",",
                     nrows = i, skip = a, fill = TRUE,
                     colClasses = c(rep("NULL", 2), "character", rep("NULL", 64),
                                    rep("character", 2), rep("NULL", 11),
                                    "numeric", rep("NULL", 9), "numeric",
                                    rep("NULL", 2)))
  colnames(data) <- names                   # `names` is a character vector defined earlier in my script
  data$iteration <- a                       # tag every row of the chunk with its offset
  dbWriteTable(con, "data", data, append = TRUE)
  print(paste("registry", a))
}
Everything runs fine until registry 469000, but at some point I get this error:
Error in scan(file = file, what = what, sep = sep, quote = quote, dec = dec, :
could not allocate memory (2048 Mb) in C function 'R_AllocStringBuffer'
I don't understand what the problem is, given that I skip the rows already processed and read only a specific block of rows each time.
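
One alternative I have been considering is reading from an open file connection, so that each read.table call continues from where the previous one stopped instead of skipping over everything already read; I have also disabled quote and comment handling, on the guess that a stray quote character is what makes scan accumulate a giant string (that cause is only an assumption on my part). A rough sketch, not yet tested on the full file:

con_file <- file("data.txt", open = "r")    # stays open across chunks
a <- 0
repeat {
  data <- tryCatch(
    read.table(con_file, sep = "|", dec = ",", nrows = i, fill = TRUE,
               quote = "", comment.char = "",   # treat quotes and '#' as plain data
               colClasses = c(rep("NULL", 2), "character", rep("NULL", 64),
                              rep("character", 2), rep("NULL", 11),
                              "numeric", rep("NULL", 9), "numeric",
                              rep("NULL", 2))),
    error = function(e) NULL)               # read.table errors once the file is exhausted
  if (is.null(data) || nrow(data) == 0) break
  colnames(data) <- names
  data$iteration <- a
  dbWriteTable(con, "data", data, append = TRUE)
  print(paste("registry", a))
  a <- a + i
}
close(con_file)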
What can I do? Should I use PostgreSQL commands directly? Is there a better method in R?
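
For the direct-PostgreSQL route, I imagine something like a server-side COPY issued from R; here data_raw is a hypothetical staging table with one text column per field in the file, and the path must be readable by the database server itself (both are assumptions on my part, since COPY cannot select columns or parse the decimal commas for me):

library(RPostgreSQL)
# the file path is resolved on the *server*, not on my machine
dbSendQuery(con, "COPY data_raw FROM '/absolute/path/data.txt' WITH (FORMAT csv, DELIMITER '|')")
# afterwards I would pick out the columns I need and fix the decimal comma, e.g.:
# INSERT INTO data SELECT col3, ..., replace(col78, ',', '.')::numeric FROM data_raw;

Would that be more robust than the R loop?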
I would appreciate your help.