
I am trying to read data from a CSV file that is 332,462 KB, with 136 columns and 297,388 rows. Then I want to insert the data into an Oracle database table that has exactly the same column mapping, except that I add one more column at the end of the table to record today's date.

So, everything looks fine and there are no exceptions. The only problem is that I can only read a small part of the file, about 7,619 rows, before the program stops. The part that does reach the database is correct, but I don't know why it stops. I have tried readNext(), readAll(), and passing an InputStreamReader to CSVReader, and all of these approaches give the same result.

What is the cause of this? One thing I suspect is that the CSV file has an empty row that CSVReader reads as the end of the file.
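
For reference, a minimal sketch of the read-and-insert pattern described above, assuming opencsv's CSVReader and a plain JDBC PreparedStatement. The table name, connection URL, and file path are placeholders, not the asker's actual code:

```java
import com.opencsv.CSVReader;

import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class CsvToOracle {

    private static final int COLUMN_COUNT = 136;

    public static void main(String[] args) throws Exception {
        // Build "INSERT INTO my_table VALUES (?, ?, ..., SYSDATE)":
        // 136 bind variables plus the load-date column at the end.
        StringBuilder sql = new StringBuilder("INSERT INTO my_table VALUES (");
        for (int i = 0; i < COLUMN_COUNT; i++) {
            sql.append("?, ");
        }
        sql.append("SYSDATE)");

        try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//host:1521/service", "user", "password");
             PreparedStatement ps = conn.prepareStatement(sql.toString());
             CSVReader reader = new CSVReader(new FileReader("data.csv"))) {

            String[] row;
            int count = 0;
            // Read one record at a time so only a single row is held in memory.
            while ((row = reader.readNext()) != null) {
                for (int i = 0; i < COLUMN_COUNT; i++) {
                    ps.setString(i + 1, i < row.length ? row[i] : null);
                }
                ps.addBatch();
                if (++count % 1000 == 0) {
                    ps.executeBatch();   // flush periodically to keep memory flat
                }
            }
            ps.executeBatch();           // flush the remainder
            System.out.println("Inserted " + count + " rows");
        }
    }
}
```

Flushing the batch every 1,000 rows keeps memory use flat regardless of file size; holding all 297,388 parsed rows in memory at once, as readAll() would, is the most likely way to hit an OutOfMemoryError with a file this large.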

  • Without seeing the code, I can only guess that it is an out-of-memory error whose exception you are trapping. readNext() would be the way to go for large datasets; just make sure you read a single record and write it out to the database before reading the next record (make sure you are not holding on to records in memory). You have a lot of things going on, so let's simplify this and try to narrow down where the problem is. First, write a program that just reads the file and prints the output (take out the Oracle write); that would narrow it down to just the read or the write. A read-only sketch along these lines follows after these comments. – Scott Conway Jul 27 '15 at 14:15
  • Second, make a copy of the data file, remove the first 7,600 rows of data, and see whether it then dies around row 19 (meaning you have bad data near the original row 7,619) or around row 7,619 again (meaning you are running out of memory). Let me know how that goes. – Scott Conway Jul 27 '15 at 14:20
  • Possible duplicate of [openCSV not reading my entire file](http://stackoverflow.com/questions/14988577/opencsv-not-reading-my-entire-file) – harshtuna Mar 11 '16 at 04:55
  • opencsv does not support streaming and fails silently on large files. Try Apache Commons CSV; that worked in my case. A minimal Commons CSV sketch follows below. – harshtuna Mar 11 '16 at 04:56
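
Following Scott Conway's suggestion, here is a minimal read-only sketch that takes the Oracle write out of the picture entirely; the file path is a placeholder:

```java
import com.opencsv.CSVReader;

import java.io.FileReader;

public class CsvReadOnly {
    public static void main(String[] args) throws Exception {
        try (CSVReader reader = new CSVReader(new FileReader("data.csv"))) {
            String[] row;
            int count = 0;
            // Count rows and report progress; no database involved.
            while ((row = reader.readNext()) != null) {
                count++;
                if (count % 1000 == 0) {
                    System.out.println("Read " + count + " rows, last row had "
                            + row.length + " fields");
                }
            }
            System.out.println("Finished: " + count + " rows total");
        }
    }
}
```

If this program also stops around row 7,619, the problem is on the read side (bad data or the parser); if it reads all 297,388 rows, the problem is on the write side.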
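And a minimal sketch of the Apache Commons CSV alternative harshtuna mentions. CSVParser iterates records lazily rather than loading the whole file; the file path is again a placeholder:

```java
import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVParser;
import org.apache.commons.csv.CSVRecord;

import java.io.FileReader;
import java.io.Reader;

public class CommonsCsvRead {
    public static void main(String[] args) throws Exception {
        try (Reader in = new FileReader("data.csv");
             CSVParser parser = CSVFormat.DEFAULT.parse(in)) {
            long count = 0;
            // The parser streams records from the reader one at a time.
            for (CSVRecord record : parser) {
                count++;
                if (count % 1000 == 0) {
                    System.out.println("Parsed " + count + " records, last had "
                            + record.size() + " fields");
                }
            }
            System.out.println("Parsed " + count + " records total");
        }
    }
}
```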

0 Answers