I have a conventional web application (a CRM tool) that requires nginx, Python (Flask), PostgreSQL, and Redis to run.
I plan to sell this application to different companies, host every instance on my powerful dedicated server, and use Docker to quickly spin up an instance for each new company.
One of my goals is to protect clients from DDoS attacks: if one client is under attack, it should not hurt the other clients. I also want to be able to easily scale an instance (or move it to a separate server) if it creates too much load.
Should I run separate web server, database, and cache containers for each copy of the application? Will that create too much overhead? Is there a better way to ensure reliability and isolation?
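For context, here is a minimal sketch of the per-client stack I have in mind. The service names, image tags, host port, and the `client1` project name are placeholders, not a finished config:

```yaml
# docker-compose.yml for one client.
# Running it under a unique project name gives each client its own
# containers, network, and volumes:
#   docker compose -p client1 up -d
services:
  nginx:
    image: nginx:1.25
    ports:
      - "8081:80"          # unique host port per client
    depends_on:
      - web
  web:
    build: .               # the Flask application image
    environment:
      DATABASE_URL: postgresql://crm:secret@db:5432/crm
      REDIS_URL: redis://cache:6379/0
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: crm
      POSTGRES_PASSWORD: secret
      POSTGRES_DB: crm
    volumes:
      - db-data:/var/lib/postgresql/data
  cache:
    image: redis:7
volumes:
  db-data:
```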
Thanks in advance.