
I'm currently debating whether disallowing concurrent GET requests on a given resource constitutes a violation of RFC 2616 (in particular the safety and idempotency properties required of the GET method, §9.1).

For instance, if my server receives GET /data/?dataId=123456 twice simultaneously, would you consider it a violation of safety or idempotency if one or both requests returned an error message?

As I understand it, the RFC specifies that the same request should yield the same result when called again. However, I haven't seen anything about what behaviour is acceptable for concurrent requests.

My feeling is that disallowing concurrent GET access to a given resource (not as a general rule, of course) does not constitute a violation of the RFC. Returning a 423 response code, or a 500 (although not very elegant), or even a 429 or a 420 (although the meaning is slightly different) seems acceptable to me.

However, I would like to know whether there are valid arguments that the RFC forbids this.

Thanks in advance / Best Regards

Remi
  • It's done on a daily basis by file-sharing services: one download at a time, or even only one download per URL. The same URL used later produces a very different result. – Marc B Feb 26 '13 at 14:45
  • 423 would be incorrect. 420 is undefined. You could use 429, but I don't fully understand your use case. Why would you *want* to block the request? – Julian Reschke Feb 26 '13 at 15:13
  • (You're right about 423.) The question is mostly rhetorical, but the use case is a GET request that requires access to a resource that doesn't support concurrent access. I could definitely queue requests until the resource is freed, but I was wondering about RFC compliance if I decide not to do so and just return an error code (which is what we currently do, since in our case concurrent accesses do not normally happen). – Remi Feb 26 '13 at 16:04
  • (420 is the response code used by the Twitter API to rate-limit queries.) – Remi Feb 27 '13 at 10:21

2 Answers


9.1.1 Safe Methods

In particular, the convention has been established that the GET and HEAD methods SHOULD NOT have the significance of taking an action other than retrieval.

Blocking a resource might qualify as an action other than retrieval. But the wording SHOULD NOT still allows you to do so: it's not recommended, but it's valid. It's even better if the user doesn't know he is causing that side effect:

Naturally, it is not possible to ensure that the server does not generate side-effects as a result of performing a GET request; [..] The important distinction here is that the user did not request the side-effects.

9.1.2 Idempotent Methods

A sequence that never has side effects is idempotent, by definition (provided that no concurrent operations are being executed on the same set of resources).

The only side effect of your implementation is blocking concurrent requests; sequential requests have no side effects at all. I read the quoted passage as saying that 9.1.2 does not require you to be side-effect free under concurrent requests. So you get a pass for 9.1.2 from me as well.

BTW, I would answer with a 503 Service Unavailable together with a Retry-After header.

Markus Malkusch

Operationally, your server is allowed to do whatever it likes regarding protecting its resources from attacks.

That said, disallowing any concurrent GETs to a resource will surprise many clients, to put it mildly.

Mark Nottingham