
I've written an application which servers can use. It gathers information and sends it to a central server every 10 seconds. The amount of data depends on the players playing, but let's keep it at about 50 servers each sending 100 pieces of data (5,000 total every 10 seconds).

Each piece of data consists of an SQL query (in PreparedStatement syntax), an Object[] of values for the query, and the serverID.
Now, I want to process all that data, and this is what's not going well for me. I have a MySQL server with 5 tables. Three of the tables are constantly being updated and can't keep up with the data flow. The tables are InnoDB tables, mainly because InnoDB gives me row-level locking instead of table-level locking. Most of the queries are UPDATEs anyway; there are barely any INSERT statements.

I've tried executing all queries right away, which just resulted in horrible performance, since the connecting server also had to wait for the execution to finish.
I've also tried putting all queries in a big ConcurrentLinkedQueue and emptying it every few seconds (roughly the shape sketched below). Performance was better, but still way too slow.
Then I tried a per-table solution, which was slightly better again, but still way too slow.
Currently it uses a per-server setup (creating a new thread for every server and executing all of its queries there). This is still too slow; it can't keep up.
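
The queue-based attempts all share the same producer/drain shape. A minimal sketch with a generic payload type (class and method names are illustrative):

    // Sketch of the queue-and-drain shape: producers offer items as they arrive,
    // and a scheduled task drains everything in one go every few seconds.
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.ConcurrentLinkedQueue;

    public class DrainableQueue<T> {
        private final ConcurrentLinkedQueue<T> queue = new ConcurrentLinkedQueue<T>();

        public void submit(T item) {
            queue.offer(item);
        }

        // Called on a schedule; the drained list is then executed as one batch.
        public List<T> drain() {
            List<T> drained = new ArrayList<T>();
            T item;
            while ((item = queue.poll()) != null) {
                drained.add(item);
            }
            return drained;
        }
    }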

As you can see, I've tried numerous things. I've also tried using addBatch() followed by executeBatch(), which is what the current setup uses as well. Of course, I've also looked on here, on Google, etc. There's some useful info, but mostly it's just about adding PreparedStatements to a batch and using batch updates.
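
For reference, the addBatch()/executeBatch() pattern in question looks roughly like this (a minimal sketch; it assumes all entries in one batch share the same SQL string, and the helper name is illustrative):

    // Sketch of the addBatch()/executeBatch() pattern: one PreparedStatement,
    // many parameter sets, a single round of execution, one commit.
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    public class BatchFlusher {
        public static void flush(Connection con, String sql, List<Object[]> batch)
                throws SQLException {
            boolean oldAutoCommit = con.getAutoCommit();
            con.setAutoCommit(false); // commit once per batch, not once per statement
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                for (Object[] values : batch) {
                    for (int i = 0; i < values.length; i++) {
                        ps.setObject(i + 1, values[i]); // JDBC parameters are 1-based
                    }
                    ps.addBatch();
                }
                ps.executeBatch();
                con.commit();
            } finally {
                con.setAutoCommit(oldAutoCommit);
            }
        }
    }

With MySQL Connector/J, adding rewriteBatchedStatements=true to the JDBC URL can also make such batches noticeably faster, since the driver then rewrites them into multi-row statements.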

Any ideas on how to do this?

Lolmewn
  • Take a look at the suggestions here as well: http://dev.mysql.com/doc/refman/5.5/en/optimizing-innodb-bulk-data-loading.html – mellamokb Mar 12 '13 at 15:56
  • What do you mean by process? Can you process it from the object? – Hituptony Mar 12 '13 at 15:56
  • Are there reads going on at the same time or are you really just concerned with getting the data in right now? – WPrecht Mar 12 '13 at 15:57
  • What I mean by process: I'm receiving the data as QueryHolder object (one of my own classes, with the SQL query and the Object[] in it). I want to process the SQL statements. I am not concerned with reading at the same time, I just want to put the data in there. Thanks for the link, I'll see if that fixes anything. I'm already using autocommit, since that's useful for BatchUpdates. – Lolmewn Mar 12 '13 at 16:05
  • Isn't this more of a design problem? Should your database really be updated so frequently with data? Shouldn't the client-server interaction hold the relevant data in cache, and only update the database at termination of the game or at some predefined moments (not every 10 seconds)? – reverse_engineer Mar 12 '13 at 16:21
  • Maybe it shouldn't, but I would very much like it to be. The web end attached to it refreshes all its data every 10 seconds too, hence the reason I chose 10 seconds. Besides, it should be able to execute those queries in 10 seconds, right? – Lolmewn Mar 12 '13 at 16:35
  • Yes, when I see the numbers it should be possible indeed (easily), but more often than not, you see people wanting to scale for the sake of it, while functionally, things could run more easily. Not saying this is your case though... But what do your statements look like? Only one line update per statement? And you can't handle 5000 of them in 10 seconds? – reverse_engineer Mar 12 '13 at 16:42
  • Statements look like this: UPDATE Stats2_block SET amount=amount+1 WHERE player=? AND blockID=? AND blockData=? AND serverID=? The question marks get filled with the data from the Object[]. Does it make it a lot slower when the same blockID + blockData + serverID combination has to be updated very often, perhaps? – Lolmewn Mar 12 '13 at 16:44
  • Hmm, is your table very large? Is it indexed? And about the redundancy, can't you remove redundant updates in the code before processing them? Just the first things coming to mind... – reverse_engineer Mar 12 '13 at 16:47
  • Looks like there are about 900k rows in that table. Indexed with a primary key, NOT NULLs, a unique key, unsigned columns, and an auto-increment column. I might be able to process redundant updates before sending them, although that would be difficult (mostly because I'd have to find out what the value to update is). – Lolmewn Mar 12 '13 at 16:53
  • Hmm, OK, so you're sure indexing is not an issue then? You query on the columns relevant to the index? Then the only thing remaining seems to be the redundancy. Try to replace repeated amount=amount+1 statements with a single amount=value statement, because you should know the current amount in your game logic, right? – reverse_engineer Mar 12 '13 at 17:00
  • I'm not currently caching those values; I thought that'd give a huge increase in memory usage, since there are so many MySQL rows. I'll go ahead and fix the redundancy though (one way of doing that is sketched below); thank you. – Lolmewn Mar 12 '13 at 19:35
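
Following up on the redundancy discussion above: instead of sending one amount=amount+1 statement per event, the increments can be summed in memory per (player, blockID, blockData, serverID) key and flushed as a single UPDATE per key each cycle. A minimal sketch of that idea (the key encoding and class name are illustrative, not the actual Stats code):

    // Sketch: coalesce repeated +1 increments per key, so each distinct key
    // produces one "UPDATE ... SET amount = amount + ? WHERE ..." per flush.
    import java.util.HashMap;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.atomic.AtomicLong;

    public class IncrementBuffer {
        private final ConcurrentHashMap<String, AtomicLong> pending =
                new ConcurrentHashMap<String, AtomicLong>();

        // Record one +1 for the given key; repeated calls coalesce in memory.
        public void increment(String player, int blockID, int blockData, int serverID) {
            String key = player + "|" + blockID + "|" + blockData + "|" + serverID;
            AtomicLong counter = pending.get(key);
            if (counter == null) {
                AtomicLong fresh = new AtomicLong();
                counter = pending.putIfAbsent(key, fresh);
                if (counter == null) {
                    counter = fresh;
                }
            }
            counter.incrementAndGet();
        }

        // Called once per flush cycle (e.g. every 10 seconds): returns the summed
        // increment per key; late increments simply count toward the next cycle.
        public Map<String, Long> drain() {
            Map<String, Long> snapshot = new HashMap<String, Long>();
            for (Map.Entry<String, AtomicLong> e : pending.entrySet()) {
                long n = e.getValue().getAndSet(0); // atomic read-and-reset
                if (n > 0) {
                    snapshot.put(e.getKey(), n);
                }
            }
            return snapshot;
        }
    }

With 5,000 events per cycle hitting far fewer distinct keys, this can cut the number of UPDATE statements considerably, and memory use is bounded by the number of distinct keys seen, not by the number of rows in MySQL.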

1 Answer


I use spring-jdbc, with com.jolbox.bonecp for connection pooling, and I'd recommend you use the same.

I use jdbcTemplate.batchUpdate(query, multyPreparedValues), where multyPreparedValues is a List<Object[]>.
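
For illustration, a minimal sketch of that call (the SQL is the UPDATE from the question's comments; the DataSource is assumed to be configured elsewhere, e.g. via BoneCP):

    // Sketch of JdbcTemplate.batchUpdate(String, List<Object[]>): each Object[]
    // supplies the parameter values for one execution of the statement.
    import java.util.List;
    import javax.sql.DataSource;
    import org.springframework.jdbc.core.JdbcTemplate;

    public class StatsBatchWriter {
        private final JdbcTemplate jdbcTemplate;

        public StatsBatchWriter(DataSource dataSource) {
            this.jdbcTemplate = new JdbcTemplate(dataSource);
        }

        public int[] writeBatch(List<Object[]> multyPreparedValues) {
            String sql = "UPDATE Stats2_block SET amount=amount+1 "
                    + "WHERE player=? AND blockID=? AND blockData=? AND serverID=?";
            return jdbcTemplate.batchUpdate(sql, multyPreparedValues);
        }
    }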

but you can also use a BatchPreparedStatementSetter; see this example: https://stackoverflow.com/a/8873528/814304
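
And a minimal sketch of the BatchPreparedStatementSetter variant (the Row holder is illustrative; the question's QueryHolder could play the same role):

    // Sketch of batchUpdate with a BatchPreparedStatementSetter, which sets
    // parameters row by row without building an Object[] per row first.
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;
    import org.springframework.jdbc.core.BatchPreparedStatementSetter;
    import org.springframework.jdbc.core.JdbcTemplate;

    public class SetterBatchWriter {
        // Illustrative row holder for the four statement parameters.
        public static class Row {
            String player;
            int blockID;
            int blockData;
            int serverID;
        }

        public static int[] writeBatch(JdbcTemplate jdbcTemplate, final List<Row> rows) {
            String sql = "UPDATE Stats2_block SET amount=amount+1 "
                    + "WHERE player=? AND blockID=? AND blockData=? AND serverID=?";
            return jdbcTemplate.batchUpdate(sql, new BatchPreparedStatementSetter() {
                public void setValues(PreparedStatement ps, int i) throws SQLException {
                    Row r = rows.get(i);
                    ps.setString(1, r.player);
                    ps.setInt(2, r.blockID);
                    ps.setInt(3, r.blockData);
                    ps.setInt(4, r.serverID);
                }

                public int getBatchSize() {
                    return rows.size();
                }
            });
        }
    }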

iMysak