I would like to read in a possibly large text file and filter the relevant lines on the fly based on a regular expression. My first approach was to use the package LaF, which supports chunkwise reading, and then grepl to filter. However, this does not seem to work:
library(LaF)
fh <- laf_open_csv("myfile.txt", column_types = "string", sep = "°")
# "°" is just a character that (hopefully) never occurs in the file;
# it would be nice to be able to declare *no* separator at all
fh[grepl("abc", fh[[1]]), ]   # this is the line that fails
This returns an error in as.character.default(x), saying there is no method to coerce this S4 object to a character vector. It seems that grepl is applied to the S4 object returned by fh[[1]] rather than to the underlying chunks of data.
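To make the intended chunkwise approach explicit: I suppose I could drive LaF's block interface by hand, along these lines (just a sketch using next_block()/begin() from LaF; I am not sure this is the intended usage, and the chunk size is arbitrary):

library(LaF)
fh <- laf_open_csv("myfile.txt", column_types = "string", sep = "°")
begin(fh)                                  # rewind to the start of the file
matches <- character(0)
repeat {
  blk <- next_block(fh, nrows = 10000)     # read the next chunk as a data.frame
  if (nrow(blk) == 0) break                # no more rows: end of file
  hits <- grepl("abc", blk[[1]])           # apply the regex to this chunk only
  matches <- c(matches, blk[[1]][hits])    # collect the matching lines
}

That feels rather clumsy for what is conceptually just grepping a file.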
Is there a nice way to read text lines from a large file and filter them selectively?
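For comparison, a plain base R version over a connection would look roughly like this (again only a sketch, with an arbitrary chunk size), but it is exactly this kind of manual loop I was hoping to avoid:

con <- file("myfile.txt", open = "r")
matches <- character(0)
repeat {
  chunk <- readLines(con, n = 10000)                 # read the next 10,000 raw lines
  if (length(chunk) == 0) break                      # end of file reached
  matches <- c(matches, chunk[grepl("abc", chunk)])  # keep only the matching lines
}
close(con)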