3

Currently I'm working on an online web application for construction materials. Companies can log in on our website and then use the web app.

From the beginning, the idea was to create a database per customer. But it keeps growing (100+), so we now have over 100 databases to manage.

We have to run an update script for database maintenance approximately twice a year.

The advantage I see is that when a customer wants to quit, we just delete their database and that's it.

When I want to add a new customer, I have to fill their database with approximately 1,000,000 unique records for that specific customer, because every customer has different prices/materials.

For backups I use a MySQL dump script that creates a *.sql file per database, which I download every day.
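A simplified sketch of that kind of per-database dump loop, driven from C# (database names, credentials and file naming below are placeholders, and it assumes the mysqldump client is on the PATH):

```csharp
// Simplified sketch only: dump every customer database to its own .sql file.
// Database names, credentials and file naming are placeholders.
using System;
using System.Diagnostics;

class NightlyBackup
{
    static void Main()
    {
        // In reality this list would come from wherever the customers are tracked.
        string[] customerDatabases = { "customer_001", "customer_002" };
        string stamp = DateTime.Now.ToString("yyyyMMdd");

        foreach (string db in customerDatabases)
        {
            var psi = new ProcessStartInfo
            {
                FileName = "mysqldump",
                // --single-transaction gives a consistent InnoDB dump without locking.
                Arguments = $"--single-transaction --user=backup_user --password=secret " +
                            $"--result-file={db}_{stamp}.sql {db}",
                UseShellExecute = false
            };

            using (var proc = Process.Start(psi))
            {
                proc.WaitForExit();
                if (proc.ExitCode != 0)
                    Console.Error.WriteLine($"Dump of {db} failed (exit code {proc.ExitCode}).");
            }
        }
    }
}
```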

What is your opinion: one large database, or a database per customer?

I'm using MySQL with ASP.NET/C#...

Ruutert

4 Answers

3

But it keeps growing (100+), so we now have over 100 databases to manage

I think you have your answer right there.

Hogan
3

I don't want to make a suggestion because there are far too many variables.

I do want to note, however, that my employer has thousands of deployed databases -- we use one database per customer, with replication (so two or more databases per customer).

So, the idea is workable. My job isn't related to DB management, but I do recall that we do a lot in the way of automation and online tools. Backups and DB management are handled by a team.

Ultimately, you can make the 100+ deployments work, but you are going to want to start investing in utilities and tooling to help automate the backup and/or management of the DBs.

Ideally, nothing (DB Management) should be done by hand. Furthermore, the connection strings should be abstracted away from a given web app deployment.
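For example (purely illustrative -- the directory table, column names and credentials below are invented, and it assumes the MySQL Connector/NET package), the per-customer connection string could be looked up in a central directory database rather than baked into each deployment:

```csharp
// Illustrative sketch: resolve the customer's connection string from a central
// "directory" database instead of hard-coding it per deployment. Table and
// column names are invented; uses the MySQL Connector/NET (MySql.Data) package.
using MySql.Data.MySqlClient;

public static class TenantConnections
{
    private const string DirectoryConnectionString =
        "Server=central-db;Database=customer_directory;Uid=app;Pwd=secret;";

    public static string GetConnectionString(int customerId)
    {
        using (var conn = new MySqlConnection(DirectoryConnectionString))
        using (var cmd = new MySqlCommand(
            "SELECT connection_string FROM customer_databases WHERE customer_id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", customerId);
            conn.Open();
            // Null/missing-customer handling omitted for brevity.
            return (string)cmd.ExecuteScalar();
        }
    }
}
```

A central list like that also gives the backup/management tooling one authoritative place to discover every customer database.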

Frank V
  • thanks for the reactions, if I had to do it from scratch I would choose a single DB ... but for now we'll continue working this way .. because it's the cheapest way, and the db schema doesn't change that often (once a year maybe ..) once a customer logs in .. it goes through a central user database and the connection string is stored in the session .. – Ruutert Dec 22 '10 at 15:41
1

Have to agree with @Hogan - the overhead of managing that many databases is probably far from ideal - especially if you ever need to make schema changes, etc. in the future.

That said, if you use a single database, are you ever likely to need to separate out a given customer's data into a standalone database/site? If so, how long would it take to carry out this separation?

In essence, if it's likely to take less effort to write a set of tools to handle the above case, then I'd be tempted to go for the single database approach. However, you'll also need to factor in the likely timescales for creating a unified version of the database schemas that handle datasets for each customer, etc.

Also, are the schemas precisely the same for all of the existing 100+ databases? If not, there's potentially a world of pain if you decide to migrate the existing data into a single database.
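To give a feel for what that migration tooling might involve, here's a very rough sketch (database, table and column names are all hypothetical, and it assumes the per-customer databases sit on the same MySQL server as the unified one, so a cross-database INSERT ... SELECT works):

```csharp
// Very rough sketch: fold one per-customer database into a unified schema that
// is keyed by a customer_id column. All database/table/column names are
// hypothetical; assumes both schemas live on the same MySQL server.
using MySql.Data.MySqlClient;

public static class TenantMigration
{
    public static void MigrateCustomerPrices(string serverConnectionString,
                                             int customerId, string customerDbName)
    {
        string sql = $@"
            INSERT INTO shared_app.prices (customer_id, material_id, price)
            SELECT {customerId}, material_id, price
            FROM `{customerDbName}`.prices;";

        using (var conn = new MySqlConnection(serverConnectionString))
        using (var cmd = new MySqlCommand(sql, conn))
        {
            cmd.CommandTimeout = 0; // ~1M rows per customer can take a while
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

You'd repeat that per table, per customer, wrapped in whatever validation/reconciliation you need -- which is exactly where the pain lives if the schemas have drifted apart.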

Update - Incidentally, all of the above is a bit generalised, but it's hard to be specific without knowing more about the amount of data, traffic, etc. involved. (e.g. if you ever had a high-demand site for a customer, it would be trivial to put it onto its own DB server if you were using a per-customer database.)

John Parker
0

I agree with @Hogan and @middaparke... if the schemas are the same, you should put it all in one instance.

Unfortunately it is impossible to tell from here whether your schemas would benefit from reusing most of those million rows or not; if they're normalized well, then certainly it would be beneficial.
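To make that concrete, purely a guess at what "normalized well" might look like here (table and column names are invented): if the materials themselves are largely shared and only the prices differ, a single instance only needs the per-customer price rows instead of ~1,000,000 duplicated rows per customer. Something like:

```csharp
// Illustrative only -- a guess at a normalized shared schema: `materials` holds
// one row per material (shared by every customer) and `customer_prices` holds
// only the per-customer price rows (assumed INT id and DECIMAL price columns).
// Table and column names are invented; uses MySQL Connector/NET (MySql.Data).
using System.Collections.Generic;
using MySql.Data.MySqlClient;

public static class PriceList
{
    public static Dictionary<int, decimal> LoadFor(string connectionString, int customerId)
    {
        var prices = new Dictionary<int, decimal>();
        const string sql = @"
            SELECT m.material_id, cp.price
            FROM materials m
            JOIN customer_prices cp ON cp.material_id = m.material_id
            WHERE cp.customer_id = @customerId;";

        using (var conn = new MySqlConnection(connectionString))
        using (var cmd = new MySqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@customerId", customerId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    prices[reader.GetInt32(0)] = reader.GetDecimal(1);
            }
        }
        return prices;
    }
}
```

Whether your data actually splits that cleanly is, as above, impossible to tell from here.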

It is also impossible to tell how difficult any application changes resulting from this move would be.

Unfortunately, it sounds like you have a large customer base with working applications, and therefore momentum to keep going in that direction - which throws you into the realm of sucking it up and dealing with it by automating the management of so many DBs... not the way you would do it from scratch, but maybe the cheapest since you are where you are.

Randy