
I have a website with an AJAX cart. The concept is pretty simple: you end up on a page with a product and can click the Buy Now button. When you do so, the JavaScript code adds the product to the cart, changes the cart visual with the new count, and sends an AJAX request to the server to report the change.

What I'm wondering about, since the client and the server may take a while to process the AJAX request, is this: if the user clicks a link to move to another page (e.g. "next product") before the AJAX request is reported as successful, will that stop the AJAX request?

// prepare the request options...
...snip...
// send the request (usually a POST)
jQuery.ajax(uri, ajax_options);
// execution immediately returns to the user

// will a click on a link cancel the AJAX call after this point?
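
For reference, a self-contained version of that flow might look something like the sketch below; the /cart/add endpoint, the .buy-now selector and the #cart-count element are hypothetical placeholders, not the actual site's code.

// Hypothetical, self-contained sketch of the flow described above.
jQuery(document).on('click', '.buy-now', function () {
  var product_id = jQuery(this).data('product-id');

  // update the cart visual with the new count right away
  var counter = jQuery('#cart-count');
  counter.text(parseInt(counter.text(), 10) + 1);

  // report the change to the server; the page does not wait for this
  jQuery.ajax('/cart/add', {
    type: 'POST',
    data: { product_id: product_id }
  });

  // control returns to the user here; the request may still be in flight
});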

Further, I have timed AJAX requests. If the user clicks on a link before those timed requests happen, they will be lost for sure. Assuming that the click does not cancel an AJAX request, would starting one in the unload event work? Would using a cookie be better/safer than attempting another AJAX request? (although if the user clicks an external link, the unload is really the only solution I can think of to save that data...)
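
As a rough sketch of the "timed requests + cookie" idea (the pending_changes queue, the /cart/sync endpoint and the pending_cart cookie are hypothetical placeholders): changes are queued, flushed on a timer to keep the request count down, and mirrored into a cookie so a later page on the same site can re-send anything that never made it to the server.

// Hypothetical sketch: batch cart changes on a timer, keep a cookie copy.
var pending_changes = [];
var FLUSH_DELAY_MS = 5000;

function queue_change(change) {
  pending_changes.push(change);
  // mirror the queue into a cookie so a later page on the same site
  // can re-send anything that never made it to the server
  document.cookie = 'pending_cart=' +
      encodeURIComponent(JSON.stringify(pending_changes)) + '; path=/';
}

setInterval(function () {
  if (pending_changes.length === 0) {
    return;
  }
  // take the whole queue as one batch to keep the number of requests low
  var batch = pending_changes.splice(0, pending_changes.length);
  jQuery.ajax('/cart/sync', {
    type: 'POST',
    data: { changes: JSON.stringify(batch) },
    success: function () {
      // the server confirmed the batch, so the cookie copy can go
      document.cookie = 'pending_cart=; path=/; expires=Thu, 01 Jan 1970 00:00:00 GMT';
    },
    error: function () {
      // put the batch back so the next tick retries it
      pending_changes = batch.concat(pending_changes);
    }
  });
}, FLUSH_DELAY_MS);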

As a side note: I do not want to darken the screen when the user adds an item to the cart, so that the user can continue doing things... but if the AJAX request needs to be confirmed before a link can be clicked, I'd have to make sure links cannot be clicked until then.


Update:

I think that some of you are missing the point. I do not care about the done() or complete() callbacks getting called on the client side. What I do care about is making sure that, in the end, all the data reaches the server.

I understand that this is asynchronous, but what I want to make sure of is avoiding loss of data, especially if the link goes to another website. (For links within the same website, I am really thinking of using a cookie to make sure that the data of delayed AJAX requests gets to the server no matter what.)

Also, the idea of timed data requests is to avoid heavy loads on the server. With a properly timed set of AJAX requests, the client and server both work a lot better!

Alexis Wilke
    Any pending AJAX requests will be aborted when the browser navigates away from the page. – user229044 Jan 12 '15 at 02:16
  • starting requests in unload event is far too late. Leaving page aborts requests and unload won't wait for request either – charlietfl Jan 12 '15 at 02:16
  • One of the words in AJAX is *asynchronous.* You seem to be describing a modal dialog, which is about as synchronous as it gets. You can't have it both ways. – Robert Harvey Jan 12 '15 at 02:16
  • What you could do is separate the request to the server and the response. That is, have `Buy Now` hit the server, which responds immediately, and adds stuff to the cart on the backend in a separate thread. When that completes, you can notify the browser of the new cart contents using something like SignalR – which will poll the server independently. As for the timed requests, same thing: implement the delay server-side. Essentially, you will never return the result of an AJAX request that takes a long time to complete on the backend, you update the UI using an independent mechanism. – millimoose Jan 12 '15 at 02:21
  • Another option is replacing the content of your webpage using AJAX as well, so that no page reloads occur, but that might be even more effort than the previous suggestion. – millimoose Jan 12 '15 at 02:24
  • @meagar, so if the C/C++ send() or write() call of the browser was not finished sending all the data of the request, the server will not receive it, correct? – Alexis Wilke Jan 12 '15 at 03:10
  • store the cart in session storage, keep it updated for all user interactions and you don't have to worry about unload (see the sketch after these comments) – charlietfl Jan 12 '15 at 04:16
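
A minimal sketch of that sessionStorage suggestion (the 'cart' key, its shape and the /cart/sync endpoint are hypothetical placeholders): keep the cart in sessionStorage as the source of truth, mark it "dirty" when it changes, and let every page load re-send anything the previous page never got confirmed. Note that sessionStorage only survives navigation within the same site and tab, so it does not cover the external-link case.

// Hypothetical sketch of the sessionStorage approach from the comments.
function load_cart() {
  var raw = sessionStorage.getItem('cart');
  return raw ? JSON.parse(raw) : { items: [], dirty: false };
}

function save_cart(cart) {
  sessionStorage.setItem('cart', JSON.stringify(cart));
}

function add_to_cart(product_id) {
  var cart = load_cart();
  cart.items.push(product_id);
  cart.dirty = true;          // not yet confirmed by the server
  save_cart(cart);
}

// on every page load, re-send anything the previous page never confirmed
jQuery(function () {
  var cart = load_cart();
  if (cart.dirty) {
    jQuery.ajax('/cart/sync', {
      type: 'POST',
      data: { items: JSON.stringify(cart.items) },
      success: function () {
        cart.dirty = false;
        save_cart(cart);
      }
    });
  }
});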

1 Answer


@meagar summed this up pretty well in the comments:

Any pending AJAX requests will be aborted when the browser navigates away from the page.

So depending on how you define "killing" an AJAX request: the request may have been started, but it might not have finished. If it's a short request, it will most likely finish before it can be aborted. But if it's a long request (lots of data processing, taking a second or two to complete), then it will most likely be aborted somewhere in the middle.

Of course, this all depends on the browser. Typically the request makes it to the server, but the browser aborts it before the response comes through. What happens then depends on the server and how it processes the data.

Some servers will interrupt the execution of your view, where the request's data is being processed and the response is being generated. Many servers will just let the code run and only trigger an error when you try to write output to the response, because there is nobody on the other end and you're writing to a closed connection.

although if the user clicks an external link, the unload is really the only solution I can think of to save that data

From my own experience, most browsers will allow you to send out a request during the beforeunload event. This is not always true for unload though, as by that time the page change cannot typically be stopped.
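
As a rough sketch of that (the pending_changes queue and the /cart/sync endpoint are hypothetical placeholders): a synchronous request keeps the browser from unloading before the data is sent. Browsers may still drop it, so treat this as best effort rather than a guarantee.

// Hypothetical sketch: flush any unsent cart changes before the page unloads.
var pending_changes = [];   // filled elsewhere as cart changes are queued

jQuery(window).on('beforeunload', function () {
  if (pending_changes.length === 0) {
    return;
  }
  jQuery.ajax('/cart/sync', {
    type: 'POST',
    async: false,   // synchronous, so the browser waits for the request to finish
    data: { changes: JSON.stringify(pending_changes) }
  });
  pending_changes.length = 0;
});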

One way to get around this, especially when the response matters, is to halt the page change after the user clicks the link. This can be as simple as calling evt.preventDefault() on the click event for the link, and then later redirecting the user to where they wanted to go when the request is finished. You should make sure to indicate to the user that their request has not just been ignored, but that they're waiting on something to finish first. Users don't want to be left in the dark, so make sure to give them some feedback (like changing the button text, disabling it, etc.).
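
A rough sketch of that idea, assuming a hypothetical pending_request variable that holds the jqXHR returned by jQuery.ajax():

// Hypothetical sketch: hold off navigation while a cart request is in flight.
var pending_request = null;   // set to the jqXHR returned by jQuery.ajax()

jQuery(document).on('click', 'a', function (evt) {
  if (pending_request) {
    evt.preventDefault();               // halt the page change for now
    var destination = this.href;

    // give the user some feedback instead of silently ignoring the click
    jQuery(this).addClass('disabled').text('Saving your cart...');

    // continue to the link once the request settles (success or failure)
    pending_request.always(function () {
      window.location.href = destination;
    });
  }
});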

Kevin Brown-Silva
  • Yeah, I have been thinking about putting a turning thingy in the cart representing the fact that it's currently working with the server. But that in itself would not prevent the user from clicking a link (possibly external) or closing the window/tab. I guess I could look into capturing all clicks and if the cart (or whatever else) is not done, grey out the window until it finishes... that way I can also display a message explaining the problem. But additional clicks on the Buy Button and possibly many other buttons should not be prevented... Food for thought. – Alexis Wilke Jan 12 '15 at 04:20