
I'm currently running a couple of Web servers (one is Apache 2.4 based, another is running nginx 1.8).

I need to set a rate limit, on each individual GET/POST request, that can throttle both download and upload speed.

I've googled around but every solution I've found seems to focus on download rate only, and I can't find a way to set a limit also on data sent from the client to the server in a request body.

I need this because most requests are related to file uploads.

Any suggestion would be greatly appreciated. Thanks!

Drifter104
Alberto Pastore

1 Answer


You probably can't find this information because it isn't generally needed; your use case is rare.

Downloads can be of any size, even extremely large, and rate limiting allows equitable distribution of network capacity between users: one huge download can't slow down typical browsing, etc.
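For the common download case, nginx throttles each response with the `limit_rate` directive. A minimal sketch (the location path is an assumption for illustration):

```nginx
# Hypothetical download location: send the first 10 MB at full
# speed, then throttle the rest of each response to 500 KB/s.
location /downloads/ {
    limit_rate_after 10m;
    limit_rate      500k;
}
```

Note that `limit_rate` applies per connection, so a client opening several parallel connections can still exceed the limit unless you also cap connections.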

HTTP uploads, however, are normally limited in size, and they may hold memory for the duration of the upload, which can only be released once the upload is over and the file is saved. Therefore, I'd suggest limiting the number of concurrent uploads instead, as I'd want a single upload to finish as quickly as possible.
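In nginx, limiting concurrent uploads can be sketched with the `limit_conn` module; the zone name and upload path below are assumptions, not part of the original question:

```nginx
http {
    # Track concurrent connections per client IP in a 10 MB shared zone.
    limit_conn_zone $binary_remote_addr zone=uploads:10m;

    server {
        # Hypothetical upload endpoint.
        location /upload/ {
            # Allow at most 2 simultaneous upload connections per IP;
            # further requests get HTTP 503 until one finishes.
            limit_conn uploads 2;
        }
    }
}
```

This caps how many uploads a single client can run at once without slowing any individual upload down.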

Besides, the upload speed is in most cases already limited by an asymmetric Internet connection, so your users would suffer a poor experience if you limited it even further.

Esa Jokinen
  • Thanks Esa for your reply. My servers are not publishing web sites, however. They are serving API calls for managing security devices; the uploads involved store backup data on the server's storage space for my users. My goal was to avoid delaying standard API calls for everyone when a few big uploads were in progress from some users, but I think you're right: I need to redesign the architecture to separate the two API call families and manage bandwidth differently. – Alberto Pastore May 19 '17 at 05:42
  • Sounds like a great idea. You could use a completely different destination for the backups. Then you wouldn't need to limit bandwidth by user, but would still ensure quality of service for the API calls. Notice that network bandwidth might not be the only bottleneck here, so a separate backup server could be good for more than one reason. – Esa Jokinen May 19 '17 at 05:56