
I'm building an event registration website. When someone clicks a link:

<a href="/reserve/10" rel="nofollow">Reserve id=10 event</a>

The system places a "lock" on this event for ten minutes for that visitor, so no one else can reserve the event during those ten minutes. If the payment is completed in that time, everything is OK; otherwise the event is unlocked again. I hope the idea is clear.

PROBLEM: When a bot (Google bot, a malicious bot, or an angry customer's script :P) visits this page, it sees this link. Then it follows it. Then the lock is placed...

Also, if someone requests /reserve/1, /reserve/2, /reserve/3, ... in sequence, he can lock all the events.


I thought about creating a random MD5 string for each event. That way every event has (in addition to its id) a unique code, for example: 1987fjskdfh938hfsdvpowefjosidjf8243

Next, I can adapt my code so the links work like this:

<a href="/reserve/1987fjskdfh938hfsdvpowefjosidjf8243" rel="nofollow">
    Reserve
</a>

That way I can prevent the "brute-force" locking. But the link is still visible to bots.

Then I thought about adding a CAPTCHA. That would solve it. But CAPTCHAs are... not so great in terms of usability and user experience.


I have seen a few websites with reservation engines working like this. Are they protected? Maybe there is a simple AJAX/JavaScript solution to prevent bots from reading this as plain text? I thought about:

<a href="/registerthisvisitorasbot" id="reserve">Reserve</a>
<script type="text/javascript">
    $('#reserve').click(function(e) {
        e.preventDefault();
        var address = ...; 
        // something not so obvious to follow? 
        // for example: md5(ajaxget(some_php_file.php?salt=1029301))
        window.location.href = '/reserve/' + address;
    });
</script>

But I'm not sure what I should do there to prevent bots from calculating it. Stupid bots won't even be able to follow JavaScript or jQuery code, but sometimes someone wants to destroy something, and if the source is obvious, it can be broken in a few lines of code. Then the whole database of events would be locked down, with no reservation option for anyone.

Jacek Kowalewski
  • Why don't you use session control and verify that the event you are locking is 'lockable' by that user? This helps to prevent a few things, but won't kill them for good. – Ismael Miguel Jan 27 '15 at 11:22
  • I can try that. But almost all bots use sessions too, have a good HTTP_USER_AGENT, etc., so this will block only a small percentage of malicious visitors. This is the kind of website where some "evil" users may really try to tear something down, and this method must be almost 100% safe, like the most modern CAPTCHA. – Jacek Kowalewski Jan 27 '15 at 11:25
  • You should consider making a login system... This will help you in other areas, like debugging in case of a total lockdown. You will know who did it, when, where (to a certain point) and how. – Ismael Miguel Jan 27 '15 at 11:42
  • Nice question; I learned a lot from your question and its answers. – Mateen Jan 27 '15 at 12:13

2 Answers


CSRF token + AJAX POST + an event token generated on each load.

Summary: don't rely on GET requests, especially through <a> elements.

Better still, add some event-lock rate limits (by IP, for instance).

EDIT: (this is a basic sketch)

  1. replace all the href="..." with data-reservation-id=ID
  2. delegate click on the parent element for a[data-reservation-id]
  3. in the callback, simply make a POST ajax call to the API
  4. in the API's endpoint check rate limits using IP for instance
  5. if OK, block the event and return OK, if not return error.
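Steps 1-3 above could be sketched like this. This is a rough illustration: the `/api/reserve` endpoint and the `makeClickHandler` helper are hypothetical names, and the transport function is injected so the same logic works with jQuery's $.post in the browser:

```javascript
// Sketch of steps 1-3: links carry data-reservation-id instead of an
// href, and a delegated click handler POSTs the id to the server.
// `postReservation(url, data)` is the transport function (e.g. $.post).
function makeClickHandler(postReservation) {
    return function (event) {
        event.preventDefault(); // never navigate via a plain GET link
        var id = event.target.getAttribute('data-reservation-id');
        if (id) {
            postReservation('/api/reserve', { id: id });
        }
    };
}

// Browser wiring with jQuery, delegated on a parent element:
// $(document).on('click', 'a[data-reservation-id]', makeClickHandler($.post));
```

A crawler that only follows hrefs never triggers the POST, and the server can additionally require the CSRF/event token in the POST body.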
avetisk
  • Yeah, I know that GET requests are nothing more than plain text. But could you please provide more information? I know PHP very well and know all the abbreviations you used, but a simple sketch of the solution would be great here. – Jacek Kowalewski Jan 27 '15 at 11:22
  • This is great!... Assuming you don't have IE5.5 or anything. Most bots won't work with it. This is a great solution! – Ismael Miguel Jan 27 '15 at 11:43
  • Would give +100 if I could :). I hope this will help someone in the future too, because I think this is a common problem with no solutions online. – Jacek Kowalewski Jan 27 '15 at 11:47
  • Happy that it helped you :) As for the rate limit, you can consider a combination of IP/user agent and some other data to avoid IP collisions. – avetisk Jan 28 '15 at 14:13

IP-Specific maximum simultaneous reservations

Summary: Rely on the fact that many simple bots operate from a single host. Limit the number of simultaneous reservations per host.

Basic sketch:

  • Store the requesting IP alongside the reservation
  • On each reservation request, count that IP's non-completed reservations.

    SELECT COUNT(ip) FROM reservations WHERE ip = :request_ip AND status = 'open';
    
  • If the number is above a certain threshold, block the reservation.
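The check above could look roughly like this. `canReserve` and the threshold parameter are illustrative names, and the array of records is an in-memory stand-in for the SQL count:

```javascript
// Sketch of the per-IP threshold check. In the real application the
// count would come from the SQL query above; here `openReservations`
// is an in-memory stand-in (an array of {ip, status} records).
function canReserve(openReservations, requestIp, maxPerIp) {
    var openForIp = openReservations.filter(function (r) {
        return r.ip === requestIp && r.status === 'open';
    }).length;
    return openForIp < maxPerIp; // block once the threshold is reached
}
```

Completed or expired reservations don't count against the limit, so legitimate repeat customers are unaffected.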

(this is mostly an expansion of point 4 in avetisk's excellent answer)

Vogel612
  • This is really a great idea. However, in places where one IP serves many, many hosts (buildings, offices) it can be a little frustrating. But after ten minutes the reservation can still be made. Your answer gave me an opportunity to think in a quite different direction; +1 from me. – Jacek Kowalewski Jan 27 '15 at 12:04