
So I'm using this function I made to read images from the user's hard drive and display them as thumbnails, provided the browser supports the File API:

// loop over them sequentially to prevent hogging memory
function load(i) {
    reader.onload = (function(file) {
        return function(e) {
            var loadNext = function(i) {
                if (data[i + 1] !== undefined) {
                    setTimeout(function() {
                        load(i + 1);
                    }, 100);
                }
            };

            if (e.target.result.match(/^data\:image\/.*$/) === null) {
                loadNext(i);
                return;
            }

            // output file name
            placeholders[i].custom.children.title.html(file.name);

            // the server has returned an error, don't load the image
            if (placeholders[i].custom.error === true) {
                placeholders[i].custom.update({loaded: true});
                loadNext(i);
                return;
            }

            // create new image
            var $image = $('<img />')
                .attr('alt', '')
                .attr('src', e.target.result)
                .attr('title', file.name);

            // setup remove link
            placeholders[i].custom.children.remove
                .click(function(event) {event.preventDefault();})
                .attr('data-file-name', file.name);

            // once finished loading
            $image.load(function() {
                if (placeholders[i].custom.error !== true) {
                placeholders[i].addClass('success').attr('id', 'preview-' + file.name);
                    placeholders[i].custom.children.content.html($image);
                }
                placeholders[i].custom.update({loaded: true});

                loadNext(i);
            });
            return;
        }
    })(data[i]);

    reader.readAsDataURL(data[i]);
}
load(0);

The problem is that if they upload a particularly large file, or a large number of files, CPU usage (in Chromium) and memory usage (in Firefox) tend to shoot up.

I was addressing this problem a while ago (I had to stop this project and come back to it) and managed to mitigate some of it by making sure the files (from an <input type="file" multiple="multiple" /> element) were loaded sequentially. Unfortunately that still wasn't enough, so I started loading the data in chunks, with a short delay between reading each chunk (slice and readAsArrayBuffer), which fixed the reading side of the problem for the most part. The sketch below shows roughly what I mean by chunked reading.
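Something along these lines (a simplified sketch rather than my exact code; the chunk size, delay, and the onChunk/onDone callback names are just illustrative):

function readInChunks(file, onChunk, onDone) {
    var CHUNK_SIZE = 1024 * 1024; // 1 MB per slice, an arbitrary choice
    var offset = 0;
    var reader = new FileReader();

    reader.onload = function(e) {
        onChunk(e.target.result); // ArrayBuffer for this slice
        offset += CHUNK_SIZE;
        if (offset < file.size) {
            // short pause between slices so the UI thread gets a chance to breathe
            setTimeout(readNext, 100);
        } else {
            onDone();
        }
    };

    function readNext() {
        reader.readAsArrayBuffer(file.slice(offset, offset + CHUNK_SIZE));
    }

    readNext();
}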

How do I output the data from an ArrayBuffer to a canvas? I got something displaying last time, but it was scrambled beyond recognition.

rich97
  • Why is this a problem? What's the point in spending an extra $100 to get more RAM or a CPU that provides a bit more performance if you're not going to use all of it? Why do people make having resources but not using them a goal, when they are not resources that can be saved? It's not like if you use 50% of the CPU today you can use 200% tomorrow. – David Schwartz Sep 20 '12 at 10:25
  • Because I'm developing a client-side application and the FileReader slows the browser to a halt for a good half a minute while it's processing this. – rich97 Sep 20 '12 at 10:32
  • Please split your three questions into three questions. – Caspar Kleijne Sep 20 '12 at 10:44
  • I've made an edit. Just got the most important one for me now. – rich97 Sep 20 '12 at 10:47

1 Answer


http://jsfiddle.net/8A3tP/1/

Here's how I ended up doing it. I use window.btoa() to convert the binary data (read with readAsBinaryString instead of readAsArrayBuffer) into a base64 string and wrap it in a data URI. From there I can create an image object with var image = document.createElement('img'); image.src = result; (where result is the data URI) and use that to populate the canvas with CanvasRenderingContext2D.drawImage(image, 0, 0);.
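In outline it looks something like this (a simplified sketch of the approach; the canvas lookup and the use of file.type for the MIME type are illustrative rather than copied from the fiddle):

var reader = new FileReader();
reader.onload = function(e) {
    // e.target.result is a binary string, so base64-encode it and build a data URI
    var result = 'data:' + file.type + ';base64,' + window.btoa(e.target.result);

    var image = document.createElement('img');
    image.onload = function() {
        // draw the decoded image onto the canvas once it has finished loading
        var context = document.getElementById('preview').getContext('2d'); // assumed canvas element
        context.drawImage(image, 0, 0);
    };
    image.src = result;
};
reader.readAsBinaryString(file);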

rich97