
I have a table. Due to the nature of the website, once I have 400+ users or so, roughly 100,000 new rows per hour will be inserted into that table.

With close to 4,000 users, a minimum of 500,000 rows will be inserted over 2 hours or so. At peak that could mean 800-1,000 inserts per second, probably even more.

How do I prepare the table for such intense use? I am talking about a lot of inserts per second, and I don't want the table to get locked or slow down.

The table engine is currently MyISAM, but it could be changed if needed.


1 Answer


It all depends on your usage pattern - do you need high performance for lookups in this table? Will lookups be evenly distributed across the whole table, or hit mostly the latest data? If SELECTs will mostly touch recently inserted rows, consider partitioning - see the sketch below.
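For example, something along these lines - a minimal sketch assuming a hypothetical `events` table partitioned by day (table name, columns, and dates are made up for illustration, not from the original post):

```sql
-- Hypothetical insert-heavy table, range-partitioned by day.
-- MySQL requires the partitioning column to be part of every unique key,
-- hence created_at is included in the primary key.
CREATE TABLE events (
  id         BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
  user_id    INT UNSIGNED    NOT NULL,
  created_at DATETIME        NOT NULL,
  payload    VARCHAR(255),
  PRIMARY KEY (id, created_at),
  KEY idx_user_time (user_id, created_at)
) ENGINE=InnoDB
PARTITION BY RANGE (TO_DAYS(created_at)) (
  PARTITION p20110101 VALUES LESS THAN (TO_DAYS('2011-01-02')),
  PARTITION p20110102 VALUES LESS THAN (TO_DAYS('2011-01-03')),
  PARTITION pmax      VALUES LESS THAN MAXVALUE
);
```

A nice side effect: old data can be removed almost instantly with `ALTER TABLE ... DROP PARTITION` instead of slow, lock-heavy DELETEs.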

MyISAM plus a heavy write load sounds like a bad idea to me - go InnoDB. Learn about tuning that engine, use fast disks, and have enough memory / a large enough buffer pool to fit at least the indexes in RAM.
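The conversion itself is a one-liner, and information_schema gives you a rough idea of how large the buffer pool needs to be - again assuming the hypothetical `events` table from the sketch above:

```sql
-- Convert the existing table to InnoDB (rebuilds the table, so do it off-peak).
ALTER TABLE events ENGINE=InnoDB;

-- Rough sizing data for innodb_buffer_pool_size (set in my.cnf):
-- index_mb is what should fit in RAM at a minimum.
SELECT table_name,
       ROUND(index_length / 1024 / 1024) AS index_mb,
       ROUND(data_length  / 1024 / 1024) AS data_mb
FROM information_schema.TABLES
WHERE table_schema = DATABASE();
```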

Check the postings at mysqlperformanceblog - you'll find a lot of good advice there.

Spend a lot of time on benchmarking - see how the system behaves under prolonged load. You might see drops in performance due to checkpointing - is that acceptable, or do you need MySQL 5.5 or XtraDB to solve the problem?
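One crude way to generate sustained insert load from within MySQL itself - a sketch with a hypothetical procedure name, reusing the made-up `events` table; run it from several client sessions at once to approximate concurrent writers:

```sql
DELIMITER //
-- Hypothetical helper: fire n single-row inserts in a loop.
CREATE PROCEDURE bench_inserts(IN n INT)
BEGIN
  DECLARE i INT DEFAULT 0;
  WHILE i < n DO
    INSERT INTO events (user_id, created_at, payload)
    VALUES (i % 4000, NOW(), 'x');
    SET i = i + 1;
  END WHILE;
END//
DELIMITER ;

-- With autocommit on, every insert flushes the InnoDB log - the worst case
-- for write load. Time this from multiple sessions in parallel.
CALL bench_inserts(100000);
```

Watch throughput over tens of minutes, not seconds - checkpoint-related stalls only show up under prolonged load.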
