
First of all, apologies for my bad English.

I just want to clarify things.

If multiple website users (let's say 1,000) try to access my API endpoint at the same time (or, if not exactly the same time, within a nanosecond of each other), and everyone is hitting the registration endpoint, what will happen? Will everyone get the same response time, or will the first user to access it get a response faster than the second?

Based on my knowledge (yeah, I'm stupid), I think the API will handle the requests in a queue, so if you're the 1,000th user you will receive your response after a much longer wait. If this is true, is there a way to lessen the delay?

Thank you very much for your time explaining things :)

Emman
  • For this you have to tune your server settings. Let's say you are using `PHP` and `Nginx`; then you have to tune `nginx` and `php-fpm` to handle those requests, for example by increasing the background worker processes and the php-fpm processes. The better you tune, the lower the response time will be. – Sahil Gulati Mar 20 '17 at 03:39
  • I see. I will research those recommendations! Thank you so much. I'm actually thinking of creating 10 more servers that connect to one database; then if one is busy, requests will be redirected to another. I'm such a newbie :) – Emman Mar 20 '17 at 03:48
  • If you want to go multi-server for handling requests, it would be better to go through the concept of `load-balancing` on the server side. – Sahil Gulati Mar 20 '17 at 03:49
  • Ohh, so what I'm thinking is possible? Great! Two options in just a minute. – Emman Mar 20 '17 at 03:52
  • Welcome..... :) – Sahil Gulati Mar 20 '17 at 03:53
  • There are a lot of variables that contribute to the answer to your question. Perhaps provide more specifics, so you can get a specific answer. For example, specifying the web server you will be using, as well as what your API endpoint does, would be a good start. – Joe Niland Mar 20 '17 at 04:19
  • As for the web server, I'll be using Apache. As for the endpoint, I don't have one at the moment; I just thought up a possible scenario and came up with the registration form, as it is a commonly used form. Sahil's response was short yet easy to understand and research. Also, someone below named Alex supports his answer. I'm going to research this fine-tuning and load balancing and how to achieve them. As for testing, I don't know how to test yet; I'll worry about that later. – Emman Mar 20 '17 at 05:20
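On the testing question in the comment above: a simple way to simulate many simultaneous requests is ApacheBench (`ab`), which ships with Apache. The URL below is just a placeholder for a hypothetical registration endpoint; substitute your own.

```shell
# Fire 1000 requests total, 100 at a time, at a placeholder endpoint.
# ab reports requests/sec, mean response time, and the latency
# distribution, so you can see how much the later requests wait.
ab -n 1000 -c 100 http://localhost/api/register/
```

Watching the percentile table in the output (50%, 90%, 99%) shows directly how queueing stretches the tail latencies as concurrency rises.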

1 Answer


You're right about the queue. If 1,000 users access your API at the same time, some of them will most likely have to wait.

You can fine-tune the number of simultaneous requests your server accepts. I assume you're using nginx or Apache. In nginx, for example, you would raise the worker process and worker connection counts as much as possible, while making sure your server can actually handle that load.
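A minimal sketch of that tuning, assuming nginx; the numbers are purely illustrative and should be sized to your machine's CPU cores and memory, not copied verbatim.

```nginx
# /etc/nginx/nginx.conf (illustrative values only)
worker_processes auto;         # one worker per CPU core

events {
    worker_connections 1024;   # simultaneous connections per worker
}
```

If PHP is behind nginx via php-fpm, the pool settings (`pm.max_children` and friends in the php-fpm pool config) cap the number of PHP processes and need a matching increase, otherwise nginx will accept connections that php-fpm can't serve.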

If you want to use more servers, you can put a load balancer in front of them; it will pass each request to whichever server is available at the moment, or simply to one of them in turn.
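As a sketch of that setup, nginx itself can act as the load balancer. The backend addresses below are hypothetical placeholders for identical API servers.

```nginx
# Hypothetical load-balancer config: nginx spreading requests
# across three identical API servers (addresses are placeholders).
upstream api_backend {
    server 10.0.0.1:8080;
    server 10.0.0.2:8080;
    server 10.0.0.3:8080;
    # Round-robin is the default; uncomment the next line to
    # instead pick the server with the fewest active connections.
    # least_conn;
}

server {
    listen 80;

    location / {
        proxy_pass http://api_backend;
    }
}
```

With round-robin, each of the 1,000 simultaneous users is queued on one of three machines instead of one, which shortens the wait for the users at the back of the queue.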

Alex