
I have a problem converting/writing DBF files with a large amount of data. I use the jdbf library, which provides an array-based API for writing all the data at once. With a small amount of data it runs fast, but I usually use this conversion to write huge datasets (almost 2 million rows per request).

I tried using threads, but the file can only be written once. Is there any solution for me? Thanks for any answer.

syaloom

2 Answers


Try JDBF: https://github.com/iryndin/jdbf. It can handle files up to 2 GB, which AFAIK is the limit of the DBF format. If that is not enough, you can break your data into parts and write each part as a separate DBF file. JDBF handles big files quite well; it was tested with millions of rows.
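A minimal sketch of the splitting idea, covering only the chunking math (the actual jdbf writer calls are omitted, and the 750,000-rows-per-part figure is just an illustrative assumption; pick a limit that keeps each part's `header + rows * recordLength` under 2 GB):

```java
import java.util.ArrayList;
import java.util.List;

public class DbfSplitPlan {

    /** Returns [start, end) row ranges so that no part exceeds maxRowsPerFile rows. */
    static List<long[]> planParts(long totalRows, long maxRowsPerFile) {
        List<long[]> parts = new ArrayList<>();
        for (long start = 0; start < totalRows; start += maxRowsPerFile) {
            long end = Math.min(start + maxRowsPerFile, totalRows);
            parts.add(new long[]{start, end});
        }
        return parts;
    }

    public static void main(String[] args) {
        // e.g. 2,000,000 rows split into parts of at most 750,000 rows each;
        // each range would then be written to its own DBF file (part_0.dbf, ...).
        List<long[]> parts = planParts(2_000_000L, 750_000L);
        for (int i = 0; i < parts.size(); i++) {
            long[] p = parts.get(i);
            System.out.println("part_" + i + ".dbf: rows " + p[0] + " to " + p[1]);
        }
    }
}
```

Each range can then be streamed to its own writer, so no single file ever crosses the format's size limit.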

iryndin

I created a tool to convert a DBF file into a TXT file separated with \t. Check out my GitHub: https://github.com/miguelschwindt/dbf-converter-java. There you have the source code written in Java, or, if you prefer, there is a .jar in libs/dbf-converter.java.jar to run the tool directly from the command line.

miguel.angel