
I am developing a web-based application that processes a large amount of data (200,000–500,000 rows) with a Java applet; the data then has to be inserted into MySQL. Java is a must, it cannot be ignored or replaced.

Which method would be faster, more hassle-free and more secure?

  • Parse the data in Java, send the generated TreeMap to PHP with GZIP compression, then prepare the rows in PHP with $q->bind_param($a, $b, $c); and run $q->execute();.

  • Or parse the data in Java with batchStatement.addBatch();, create many INSERT statements, write them out to a String, compress it with GZIP (the Java-side upload step is sketched after this list), send it to PHP, then from PHP execute LOAD_FILE('xy.tar.gz'); in MySQL.
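Roughly, the Java side that both options share (build the row payload, GZIP-compress it, POST it to the PHP script) looks like the sketch below; the endpoint URL and the CSV row format here are just placeholders:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;
    import java.util.Map;
    import java.util.TreeMap;
    import java.util.zip.GZIPOutputStream;

    public class UploadSketch {
        // Serialize the parsed rows as CSV, GZIP them and POST them to the PHP script.
        public static void upload(TreeMap<Integer, String[]> rows) throws Exception {
            URL url = new URL("https://example.com/import.php"); // placeholder endpoint
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Encoding", "gzip"); // PHP must gzip-decode the body

            StringBuilder csv = new StringBuilder();
            for (Map.Entry<Integer, String[]> e : rows.entrySet()) {
                csv.append(e.getKey()).append(',')
                   .append(String.join(",", e.getValue())).append('\n');
            }

            try (OutputStream raw = conn.getOutputStream();
                 GZIPOutputStream gz = new GZIPOutputStream(raw)) {
                gz.write(csv.toString().getBytes(StandardCharsets.UTF_8));
            }
            System.out.println("HTTP status: " + conn.getResponseCode());
        }
    }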

Ágota Horváth

2 Answers


I believe LOAD_FILE is the fastest.

Read more:

http://kvz.io/blog/2009/03/31/improve-mysql-insert-performance/
http://dev.mysql.com/doc/refman/5.1/en/insert-speed.html

mjb

I don't think batchStatement.addBatch() will do what you expect. That's for executing a batch of queries through JDBC, connecting directly to your DB from Java; it will not generate the SQL text for you.
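If you can let Java talk to MySQL directly, this is roughly how addBatch() is meant to be used, on a PreparedStatement (the connection URL, table and column names here are made up):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class BatchInsertSketch {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb?rewriteBatchedStatements=true",
                     "user", "password");
                 PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO rows_table (col_a, col_b, col_c) VALUES (?, ?, ?)")) {
                con.setAutoCommit(false);
                for (int i = 0; i < 500000; i++) {      // your 200k-500k parsed rows
                    ps.setInt(1, i);
                    ps.setString(2, "value_b");
                    ps.setString(3, "value_c");
                    ps.addBatch();
                    if ((i + 1) % 1000 == 0) {
                        ps.executeBatch();              // flush every 1000 rows
                    }
                }
                ps.executeBatch();                      // flush the remainder
                con.commit();
            }
        }
    }

With Connector/J, rewriteBatchedStatements=true collapses the batch into multi-row INSERTs, which is where most of the speed-up comes from.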

Furthermore, LOAD_FILE does not do what you expect either: it loads an entire file into a string field. Perhaps you are thinking of LOAD DATA INFILE but, again, that does not expect a bunch of INSERT statements.
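What LOAD DATA INFILE expects is a plain delimited file (not a .tar.gz, and not a list of INSERTs). A rough JDBC sketch, with made-up file, table and column names:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class LoadDataSketch {
        public static void main(String[] args) throws Exception {
            try (Connection con = DriverManager.getConnection(
                     "jdbc:mysql://localhost/mydb?allowLoadLocalInfile=true",
                     "user", "password");
                 Statement st = con.createStatement()) {
                // The CSV must already be decompressed on disk; LOAD DATA cannot read an archive.
                st.execute(
                    "LOAD DATA LOCAL INFILE '/tmp/rows.csv' " +
                    "INTO TABLE rows_table " +
                    "FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n' " +
                    "(col_a, col_b, col_c)");
            }
        }
    }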

It's not clear whether you want to do this only once or regularly, and why you insist on using PHP (are you connecting remotely to a web app that does the insert? Are you aware of the security implications of that?).

leonbloy