
The documentation states:

> Note that you can always place this on per-URL routes to enable different request rates to different resources (if, for example, one route, like /my/slow/database, is much easier to overwhelm than /my/fast/memcache).

I am having trouble finding out how to implement this exactly.

Basically, I want to serve static files at a different throttle rate than my API.

ilovett
  • Try this out: [restify-throttle @ GitHub](https://github.com/thisandagain/restify-throttle/tree/master#use-without-restify) – dgm Nov 04 '13 at 15:29

1 Answer


Set up throttling (rate limiting) with restify for specific endpoints like this:

    var rateLimit = restify.throttle({ burst: 100, rate: 50, ip: true });
    server.get('/my/endpoint',
        rateLimit,
        function(req, res, next) {
            // Do something here
            return next();
        }
    );
    server.post('/another/endpoint',
        rateLimit,
        function(req, res, next) {
            // Do something here
            return next();
        }
    );

Or like this.

    server.post('/my/endpoint',
        restify.throttle({ burst: 100, rate: 50, ip: true }),
        function(req, res, next) {
            // Do something here
            return next();
        }
    );

Even when throttling per endpoint, you may still want a global throttle; that can be set up like this:

    server.use(restify.throttle({ burst: 100, rate: 50, ip: true }));
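
Tying this back to the original question (serving static files at a different rate than the API), a rough sketch could look like the following. It assumes the classic restify API where `throttle` and `serveStatic` live directly on the `restify` object; the routes, directory, and numbers are made up for illustration.

    var restify = require('restify');
    var server = restify.createServer();

    // Generous limit for cheap static content (hypothetical numbers).
    var staticLimit = restify.throttle({ burst: 200, rate: 100, ip: true });
    // Tighter limit for API routes backed by a slow database.
    var apiLimit = restify.throttle({ burst: 10, rate: 5, ip: true });

    // Static files: the full request path is resolved under `directory`.
    server.get(/\/static\/?.*/, staticLimit, restify.serveStatic({
        directory: './public'   // hypothetical directory
    }));

    // API route with the stricter throttle.
    server.get('/api/slow/query', apiLimit, function(req, res, next) {
        // ... query the slow database here ...
        res.send({ ok: true });
        return next();
    });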

For reference: throttle is one of restify's bundled plugins.

Mark Maruska
  • What's the difference between `rate` and `burst`? – mcont Nov 06 '15 at 21:13
  • Restify uses the [token bucket](https://en.wikipedia.org/wiki/Token_bucket) algorithm to throttle traffic. With this, the `burst` value is the maximum possible number of requests per second, and the `rate` value is the average rate of requests per second. Even if requests are steady from a caller's perspective, the Restify server may not receive those requests at a steady pace (due to transmission congestion or other reasons), so the `burst` value provides some tolerance level beyond the average `rate` value (a rough sketch of the token-bucket idea follows these comments). – Mark Maruska Nov 11 '15 at 22:44
  • How do you know when the rate limit has been hit? Is an error raised here? – Aamir Apr 18 '18 at 20:28
  • @Aamir, on the server-side, a TooManyRequestsError is thrown. From the client-side, the caller should receive an HTTP 429 status, Too Many Requests. – Mark Maruska Apr 26 '18 at 13:46
  • @MarkMaruska So `burst` is the maximum overall number of requests per second that the server will accept, and `rate` is the maximum number of requests per second that a single IP can make? – Vpp Man Jul 20 '19 at 04:54
  • @VppMan The **burst** size and average **rate** (requests/second) translate to the [Token bucket](https://en.wikipedia.org/wiki/Token_bucket) algorithm, with the ability to throttle on IP (or x-forwarded-for) and username (from req.username). – Mark Maruska Jul 29 '19 at 12:14
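
To make the `burst`/`rate` distinction in the comments above concrete, here is a minimal token-bucket sketch. It is not restify's actual implementation; the names and numbers are purely illustrative.

    // Minimal token bucket: `burst` is the bucket capacity, `rate` is the refill speed.
    function TokenBucket(burst, rate) {
        this.capacity = burst;   // maximum tokens the bucket can hold
        this.tokens = burst;     // start full, so an initial burst is allowed
        this.rate = rate;        // tokens added back per second
        this.last = Date.now();
    }

    TokenBucket.prototype.tryRemoveToken = function() {
        var now = Date.now();
        // Refill based on elapsed time, capped at the burst capacity.
        this.tokens = Math.min(this.capacity,
            this.tokens + (this.rate * (now - this.last)) / 1000);
        this.last = now;
        if (this.tokens < 1) {
            return false;        // over the limit; a server would answer HTTP 429 here
        }
        this.tokens -= 1;
        return true;
    };

    // burst:100 allows a short spike of up to 100 requests,
    // while rate:50 caps the sustained average at 50 requests/second.
    var bucket = new TokenBucket(100, 50);
    console.log(bucket.tryRemoveToken());   // true while tokens remain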