I have an upload queue: when a user chooses file(s), they are added to the queue. The files may not all be chosen at once.
The problem is that, on starting the upload, each file opens a separate XHR connection and a separate request. This is not OK for two main reasons: (1) if a user chooses 100 files, there will be one hundred concurrent connections, and (2) if one connection fails, the whole upload fails. I have tried different things, but none of them works. Is there any workaround for this? Maybe limiting concurrent uploads? I don't think that's an option, since it only works with multipart uploads.
Here is some information about what I'm doing:
I'm not using multipart uploads (I'm using chunked uploads).
The user can choose files from different directories (i.e. they can click the input any time they want and append files to the queue).
And here is the code I'm using:
// Upload queue array
var filesList = [];
$('#fileupload').fileupload({
url: 'http://example.com/listener',
maxChunkSize: 200000000,
multipart: false,
autoUpload: false,
limitConcurrentUploads: 3,
add: function(e, data) {
// Just for rendering html table
$.each(data.files, function(index, file) {
$('#files table>tbody').append('<tr><td>' + file.name + '</td><td>' + humanFileSize(file.size) + '</td></tr>');
});
// Add to fileList array
filesList.push(data);
}
}).prop('disabled', !$.support.fileInput).parent().addClass($.support.fileInput ? undefined : 'disabled');
// Start upload button
$(document).on('click', '#startup', function() {
filesList.forEach(function(data) {
data.submit();
});
});
// Clear queue button
$(document).on('click', '#clearqueue', function() {
$('#files table>tbody').html('');
// Clear queue
filesList.length = 0;
// Replace file input element
$("#fileupload").replaceWith($("#fileupload").val('').clone(true));
});