
I have a table where I track various statistics about site usage. Once a week, I plan to use these statistics, together with information from other tables, to calculate various key indicators for use in multiple reports, so that the complex data doesn't have to be computed each time a report is accessed. These indicators will be stored in a separate table called "indicators".

There are approx. 50 indicators for 220 countries (i.e. a set of 50 for each country separately). Given that these figures are calculated weekly, I get the mind-boggling number of:

50 x 220 = 11,000 inserts/updates each week

My indicators table is structured in the following way:

id PK | type (TINYINT) | country_id (TINYINT) | year | week | value

... where I would basically insert/update 50 rows of different "type" per country.
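As a sketch of what that weekly batch might look like (using Python's built-in sqlite3 here purely for illustration; in MySQL the equivalent upsert would be `INSERT ... ON DUPLICATE KEY UPDATE`), assuming a unique key on (type, country_id, year, week) so re-running a week replaces the old values instead of duplicating rows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE indicators (
        id         INTEGER PRIMARY KEY,
        type       INTEGER NOT NULL,
        country_id INTEGER NOT NULL,
        year       INTEGER NOT NULL,
        week       INTEGER NOT NULL,
        value      REAL,
        UNIQUE (type, country_id, year, week)
    )
""")

# One weekly batch: 50 indicator types x 220 countries = 11,000 rows.
rows = [(t, c, 2010, 9, 0.0) for t in range(50) for c in range(220)]

# Upsert the whole batch inside a single transaction; in MySQL this
# would be INSERT ... ON DUPLICATE KEY UPDATE value = VALUES(value).
with conn:
    conn.executemany("""
        INSERT INTO indicators (type, country_id, year, week, value)
        VALUES (?, ?, ?, ?, ?)
        ON CONFLICT (type, country_id, year, week)
        DO UPDATE SET value = excluded.value
    """, rows)

print(conn.execute("SELECT COUNT(*) FROM indicators").fetchone()[0])  # 11000
```

Because the unique key covers (type, country_id, year, week), running the same week twice updates the 11,000 existing rows rather than inserting new ones.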

What I'd like to ask is: am I going completely the wrong way here? Might I be better off setting the countries as columns instead of rows (220 columns...)? Or perhaps making the indicators the columns, and losing some flexibility? How are such high volumes of inserts/updates generally dealt with?

Thanks in advance for your time.

CONCLUSION: Looks like this is clear enough, thanks for all the contributions.

Tom

  • 11k per week? That's breakfast! :) – Diadistis Mar 01 '10 at 02:06
  • 11,000 inserts/week is nothing to worry about. People regularly put 10's of millions, or even billions of rows into tables. Once you start getting really huge datasets you start having to be more careful with how the db is set up, some kinds of joins become impractical, etc. etc., but that's not likely to be an issue for you for a century or so. – Steve B. Mar 01 '10 at 02:06
  • @both above... so I don't need to worry about my app grinding to a halt while it's doing that? (I'm a n00b) – Tom Mar 01 '10 at 02:07
  • Nope. No worries. That's exactly what it's for. It is worth reading up a bit on MySQL optimization, as well as some simple setup things you can do (e.g. not logging every query, slow query logging) that are very well documented. But what you're doing is really very low volume. I'm currently running apps with 50-100 million row tables, and although I tuned them a bit I'm hardly an expert and they worked well even at those sizes with just default settings. – Steve B. Mar 01 '10 at 02:11
  • Well, depends, if your server is a 25MHz 486SX with 16MB of RAM then you might have a problem. – Diadistis Mar 01 '10 at 02:11
  • @Steve B - thanks. @Diadistis - yeah, I installed that just last week, found one in the attic. – Tom Mar 01 '10 at 02:15

1 Answer


While I don't know your exact application, this shouldn't be problematic at all. Inserts and updates shouldn't be table locking. I doubt your users would even notice.

As a comparison, under load my mysql server peaks at nearly 3.5k queries / second.

Typically you'll want to run your maintenance scripts during periods of low use anyway.
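If you do want to keep the weekly job as light as possible, one common trick (a sketch, with a hypothetical `chunked` helper; the chunk size of 500 is arbitrary) is to group the 11,000 rows into multi-row INSERT statements, so the server handles a few dozen statements instead of 11,000 individual round trips:

```python
def chunked(rows, size):
    """Yield successive chunks of at most `size` rows."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

# The weekly batch from the question: 50 types x 220 countries.
rows = [(t, c, 2010, 9, 0.0) for t in range(50) for c in range(220)]

batches = list(chunked(rows, 500))
print(len(batches))  # 22 multi-row statements instead of 11,000 single-row ones

# Each batch would then become one statement, e.g. for MySQL:
# INSERT INTO indicators (type, country_id, year, week, value)
# VALUES (%s, %s, %s, %s, %s), (%s, %s, %s, %s, %s), ...
placeholders = ", ".join(["(%s, %s, %s, %s, %s)"] * len(batches[0]))
```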

jasonbar