Our current setup has Django hosted on Google App Engine with a MySQL database on Google Cloud SQL.
Our users (clients) are typically small businesses; we give each one a subdomain, backed by a multi-tenant database structure (one database per client).
To determine which request should hit which database, there is existing middleware that parses the request's host name to get the subdomain and returns the corresponding database alias defined in settings.py:
    from django.conf import settings
    import threading

    # Thread-local storage so concurrent requests don't overwrite each other's alias.
    currentCompanyConnection = threading.local()

    class DatabaseMiddleware(object):
        def process_request(self, request):
            # get_host() returns e.g. "acme.example.com"; the first label is the tenant,
            # regardless of whether the request came in over http or https.
            company = request.get_host().split(".")[0]
            if company in settings.DATABASES:
                currentCompanyConnection.company = company
            else:
                currentCompanyConnection.company = 'default'
            request.currentCompany = currentCompanyConnection.company
            return None

    class DBRouter(object):
        def read_and_write(self, model, **hints):
            # Route both reads and writes to the tenant picked by the middleware.
            return currentCompanyConnection.company
        db_for_read = read_and_write
        db_for_write = read_and_write
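For completeness, a pair like this is wired up in settings.py roughly as follows; the module path `myapp.middleware` is an assumption, not from our codebase, and `MIDDLEWARE_CLASSES` matches the older Django style used above:

```python
# settings.py (sketch; "myapp.middleware" is an assumed module path)
MIDDLEWARE_CLASSES = (
    "myapp.middleware.DatabaseMiddleware",
    # ... the rest of the middleware stack ...
)

# Routers are consulted for every query, so the DBRouter above sends
# reads and writes to whichever alias the middleware selected.
DATABASE_ROUTERS = ["myapp.middleware.DBRouter"]
```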
However, to offer a freemium self-service tier, our web application must generate a new database on the fly and add it to Django's settings.py dynamically for each user who signs up.
The last part is something I can't seem to figure out, since each time I change settings.py, I must redeploy to App Engine. Beyond that, I'm not sure how to create a new database with pre-defined tables in Google Cloud SQL from our web application.
Thanks for your help! I love resolving challenges from work, but this is something I simply haven't done before =O