So I'm playing around with promises and jQuery and I came up with the code below:
var timeoutAsync = function (millis) {
    var deferred = $.Deferred();
    setTimeout(function () {
        deferred.notify(millis);
        deferred.resolve();
    }, millis);
    return deferred.promise();
};
$(document).ready(function () {
    var firstPromise = timeoutAsync(1000);
    var secondPromise = timeoutAsync(2000);
    var thirdPromise = timeoutAsync(3000);
    var fourthPromise = timeoutAsync(1234);

    $.when(firstPromise, secondPromise, thirdPromise, fourthPromise)
        .done(function () { alert("LOL"); })
        .fail(function () { alert("FAIL"); })
        .progress(function (msg) { console.log(msg); });
});
I'd expect my console to show four values, namely 1000, 1234, 2000, and 3000. And it did when I put a console.log statement in the setTimeout callback. But with the code above, I get 1000 four times.
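For clarity, this is roughly what I mean by logging inside the setTimeout callback; the output there is as expected (timeoutAsyncLogged is just an illustrative name for this variant):

var timeoutAsyncLogged = function (millis) {
    var deferred = $.Deferred();
    setTimeout(function () {
        console.log(millis); // logs 1000, 1234, 2000, 3000 as each timer fires
        deferred.notify(millis);
        deferred.resolve();
    }, millis);
    return deferred.promise();
};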
I'm pretty sure I'm missing something here, but I can't seem to find anything in the docs explaining this behavior. Why does this happen?
I'm using jQuery 2.1.1 and testing in Firefox, but the same thing happens in IE.