I load several database files (SQLite) and run a simple query against each:

library("RSQLite")
drv <- dbDriver ("SQLite")
get.wa2 <- function(file){
    con <- dbConnect (drv, dbname = file)
    table <- dbGetQuery (con, "Select data3 from data where data2 like 'xxx' ") 
    return(table)
}
database.files<- dir(database.path)
database.files <- database.files[grep(".db$",database.files, perl = T)] ### select only database files
count.wa  <- sapply(database.files,get.wa2) 

I run into problems because some of my files are randomly corrupted or wiped, appearing as 0 bytes in file size.

Am I doing something wrong? Should I be closing each connection after the query? What is best practice here?


1 Answer


Try an additional logical qualifier so that zero-byte files are filtered out before you query them. Note that grep() returns integer indices, which cannot be combined with a logical vector using &, so switch to grepl(), which returns a logical vector:

database.files <- dir(database.path)
database.files <- database.files[grepl("\\.db$", database.files, perl = TRUE) &
                                 file.info(database.files)$size > 0]  # drop 0-byte files

?file.info

If the errors occur as a result of your processing, then you need to look at ?try or ?tryCatch and the other error-handling facilities that R provides.
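As a minimal sketch of that approach, reusing get.wa2 from the question: the on.exit(dbDisconnect(con)) line and the NULL fallback are additions for illustration, not part of your original code:

get.wa2 <- function(file) {
    con <- dbConnect(drv, dbname = file)
    on.exit(dbDisconnect(con))  # close the connection even if the query fails
    dbGetQuery(con, "SELECT data3 FROM data WHERE data2 LIKE 'xxx'")
}

count.wa <- sapply(database.files, function(f) {
    tryCatch(get.wa2(f),
             error = function(e) {
                 warning(sprintf("Skipping %s: %s", f, conditionMessage(e)))
                 NULL  # placeholder so sapply keeps going past bad files
             })
})

With the error handler returning NULL, a corrupted or unreadable file produces a warning and an empty result instead of aborting the whole sapply loop.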

  • Thanks, but your option isn't a solution to my problem. The corruption occurs randomly because of my R analysis. Your method lets me skip loading them, but that's not the solution. And to which library does the list.info function belong? I don't have it. – Timtico May 27 '14 at 08:08
  • I meant to type `file.info`. Bad transcription on my part. Edited. – IRTFM May 27 '14 at 21:02