I'm building an application that will start small but could grow, so I might as well think about scalability from the start.
I'm going to use a message queue, partly for scalability but also to process some tasks in the background.
I'll be heavily using the Twitter/Facebook APIs too (although caching as much as I can to reduce the number of calls).
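To illustrate, this is roughly the kind of caching layer I have in mind. I'm using the Memcached extension here purely as an example; the actual cache backend isn't decided yet, and the URL is just a placeholder:

```php
<?php
// Rough sketch of the caching I'm planning (Memcached is only an example
// backend; any key/value cache would do the same job).
function getCachedApiResponse(Memcached $cache, string $url, int $ttl = 300): string
{
    $key = 'api:' . md5($url);

    $cached = $cache->get($key);
    if ($cached !== false) {
        // Serve from cache and avoid hitting Twitter/Facebook at all.
        return $cached;
    }

    // Cache miss: make the external call, then store the result for $ttl seconds.
    $response = file_get_contents($url); // simplified; in practice I'd use curl + auth
    $cache->set($key, $response, $ttl);

    return $response;
}

$cache = new Memcached();
$cache->addServer('127.0.0.1', 11211);
$profile = getCachedApiResponse($cache, 'https://api.twitter.com/2/users/me');
```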
It's a PHP application using a heavy framework, and I expect CPU, RAM and the DB to get a fair amount of use.
The message queue is a service accessed via an external API.
Many web requests could involve both API calls and message queue calls, and now I'm wondering whether I'll end up hitting network limits. Even so, I'm thinking that putting data on the message queue would be better than leaning on the DB so much.
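To make that concrete, a typical web request might push work onto the queue with something like the sketch below. The endpoint, token and payload shape are made up for illustration, not the real queue service's API:

```php
<?php
// Hypothetical sketch: handing work to the external queue service instead of
// writing it to the DB synchronously. Endpoint, token and payload are placeholders.
function enqueueTask(string $queueUrl, string $apiToken, array $payload): bool
{
    $ch = curl_init($queueUrl);
    curl_setopt_array($ch, [
        CURLOPT_POST           => true,
        CURLOPT_POSTFIELDS     => json_encode($payload),
        CURLOPT_HTTPHEADER     => [
            'Content-Type: application/json',
            'Authorization: Bearer ' . $apiToken,
        ],
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_TIMEOUT        => 2, // keep the web request snappy even if the queue is slow
    ]);
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_RESPONSE_CODE);
    curl_close($ch);
    return $code >= 200 && $code < 300;
}

// In a single web request this could sit alongside one or two social API calls,
// which is where my "3 external calls per request" worry comes from.
enqueueTask('https://queue.example.com/v1/messages', 'my-token', [
    'type' => 'refresh_timeline',
    'user' => 42,
]);
```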
Is there a limit in Linux on the NUMBER of connections? I'm guessing that if I keep within the server's bandwidth allowance (probably 500 Mb/s) I should be fine, but surely the number of individual calls also needs to be considered.
I can fiddle with settings and increase the server size now or in the future, but I'm wondering whether making three external calls for every web request is already a lot, and whether I should design the application differently from the start?