
I know that performing multiple DOM manipulations is bad, as it forces multiple repaints.

E.g.:

$('body').append('<div />')
         .append('<div />')
         .append('<div />')
         .append('<div />');

Instead, a better practice is apparently:

$('body').append('<div><div></div><div></div><div></div><div></div></div>');

But I am curious about virtual manipulation, i.e. building the structure on a detached element first:

$('<div />').append('<div />')
            .append('<div />')
            .append('<div />')
            .append('<div />')
            .appendTo('body');

Is it still bad? Obviously there will be some overhead from calling a function several times, but are there going to be any severe performance hits?


The reason I am asking is this:

var divs = [
    {text: 'First',  id: 'div_1', style: 'background-color: #f00;'},
    {text: 'Second', id: 'div_2', style: 'background-color: #0f0;'},
    {text: 'Third',  id: 'div_3', style: 'background-color: #00f;'},
    {text: 'Fourth', id: 'div_4', style: 'background-color: #f00;'},
    {text: 'Fifth',  id: 'div_5', style: 'background-color: #0f0;'},
    {text: 'Sixth',  id: 'div_6', style: 'background-color: #00f;'}
];

var element = $('<div />');

$.each(divs, function(i,o){
    element.append($('<div />', o));
});

$('body').append(element);

Imagine that the divs array has come from a database table describing a form (OK, I'm using divs in the example, but they could easily be replaced with inputs) or something similar.
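
For instance, the same pattern with inputs might look something like this (the field definitions below are made up purely for illustration):

var fields = [
    {type: 'text',  name: 'first_name', placeholder: 'First name'},  // hypothetical fields
    {type: 'email', name: 'email',      placeholder: 'Email address'}
];

var form = $('<form />');

$.each(fields, function(i,o){
    form.append($('<input />', o));
});

$('body').append(form);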

Or, with the "recommended" version, we have:

var divs = [
    {text: 'First',  id: 'div_1', style: 'background-color: #f00;'},
    {text: 'Second', id: 'div_2', style: 'background-color: #0f0;'},
    {text: 'Third',  id: 'div_3', style: 'background-color: #00f;'},
    {text: 'Fourth', id: 'div_4', style: 'background-color: #f00;'},
    {text: 'Fifth',  id: 'div_5', style: 'background-color: #0f0;'},
    {text: 'Sixth',  id: 'div_6', style: 'background-color: #00f;'}
];

var element = '<div>';

$.each(divs, function(i,o){
    element += '<div ';

    $.each(o, function(k,v){
        if(k != 'text'){
            element += k+'="'+v+'" ';
        }            
    });

    element += '>'+o.text+'</div>';

});

element += '</div>';

$('body').append(element);
Hailwood
  • @nathan hayfield: "seems like its still bad" --- why so? "why not just use" --- and why not just keep it as it is? – zerkms Nov 02 '12 at 23:37
  • @nathan hayfield: "much faster"??? Can you tell the **REAL** difference for 5 `append` calls vs concatenation? The code is written for people, and it's optimized **ONLY** it it doesn't fit performance requirements. – zerkms Nov 02 '12 at 23:39
  • Repaints are probably deferred until after the event you're handling is complete anyway. – millimoose Nov 02 '12 at 23:43
  • @millimoose Unless you make a call that requires the engine to calculate the dimensions of the node you modified; in that case it will get re-rendered – Ruan Mendes Nov 02 '12 at 23:45

4 Answers


Firstly, although it is great to read about potential performance hits like this, you should always start by measuring to see whether you even have a problem.

If you cannot perceive a problem, write the most readable code.

If you can perceive a problem, measure, change and measure.

Having said all this, the last example you have posted involves elements that are not yet written to the DOM, so there would be no repaint until the `appendTo` call adds the elements to the DOM.

I would be very surprised if you could capture a difference in speed between the second and third examples, and quite surprised if you could see any major difference between any of them.
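
If you do want numbers, a rough way to check is to wrap each variant in `console.time` / `console.timeEnd` and compare, something like this (purely illustrative; the label and iteration count are arbitrary):

console.time('chained appends');          // arbitrary label
for (var i = 0; i < 1000; i++) {          // 1000 iterations just to make timings visible
    $('<div />')
        .append('<div />')
        .append('<div />')
        .appendTo('body');
}
console.timeEnd('chained appends');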

Fenton
  • But there's nothing wrong with using the document fragment approach as a best practice (if possible) – Ruan Mendes Nov 02 '12 at 23:47
  • As long as it doesn't damage readability. Browser vendors are all working hard on making their browsers ultra-fast, so really you just need to worry about making your code easy to understand and change. – Fenton Nov 02 '12 at 23:51
  • I am not sure how big the apps you write are. But I've been writing enterprise level apps that are big time performance hogs and though I'm a big time hater of premature optimization, we've chosen to adopt a number of practices that are known to be faster... I don't love it, but I hate the app being slow in IE even more – Ruan Mendes Nov 03 '12 at 00:00
  • That was my thoughts too, if you see the edit to my question, you will see the third method (in my opinion) becomes the most readable, so thought I would find out if there are any gotchas of doing that. – Hailwood Nov 03 '12 at 00:02
  • @JuanMendes if there is a performance issue, of course it should be fixed. My point is that far more work goes into perceived potential performance issues. If it helps, I recently finished a real-time telephony application with thousands of on-call agents needing real-time in-browser updates all served to them via HTML and JavaScript. – Fenton Nov 03 '12 at 00:05
  • @Sohnee My point is that we didn't wait for performance to degrade; our best practices are easy ways to do something faster. It's never a different technique that requires dirty code (unless we're chasing a performance problem, at which point any hacks are acceptable). But I think we do agree that keeping your code readable is more important; I just wanted to mention that in this case they are both as readable, so you may as well use the one with better performance – Ruan Mendes Nov 03 '12 at 00:08

If you're really worried about performance when appending nodes, you should use document fragments. These allow you to build elements and append them to the DOM without triggering a repaint for each one. John Resig has an excellent article on this topic. He notes a 200-300% increase in performance. I implemented document fragments in one of my apps and can confirm his claim.
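
In plain DOM terms the idea is roughly the following (an illustrative sketch, not code taken from the article):

// Build everything in a detached DocumentFragment, then attach it in one go.
var fragment = document.createDocumentFragment();

for (var i = 0; i < 100; i++) {
    var div = document.createElement('div');
    div.textContent = 'Item ' + i;        // placeholder content
    fragment.appendChild(div);
}

document.body.appendChild(fragment);       // single insertion into the live DOM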

Daniel Szabo
  • jQuery already uses document fragments internally when you do something like `$("<div/>")` – Ruan Mendes Nov 02 '12 at 23:44
  • @Juan: Ah - so it does. Thanks for pointing that out (sorry, I come from a vanilla background). – Daniel Szabo Nov 02 '12 at 23:46
  • A point to note is that post was written 4 years ago. Recording the timeline in the Chrome developer tools tells me that his test page only ever performs layout and painting once. – millimoose Nov 02 '12 at 23:47

Whatever happened to good old markup generation at runtime? Seriously, what happened?

I agree with @Sohnee's point about the importance of readability, but DOM manipulations are some of the most expensive operations a browser can perform. The option of building a string of markup can be made perfectly readable and can offer a user-experience improvement that is far from negligible.

In this jsperf, we're creating a 10x100 table at runtime - a perfectly reasonable (and far from the most complex) scenario for data pagination. On a quad-core machine running a recent version of Chrome, the direct DOM manipulation script takes 60ms to complete, as opposed to 3ms for markup caching.
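
The markup-caching side of it amounts to something like the following (an illustrative sketch, not the actual jsperf code; the cell contents are placeholders):

var html = '<table>';

for (var row = 0; row < 100; row++) {
    html += '<tr>';
    for (var col = 0; col < 10; col++) {
        html += '<td>' + row + ',' + col + '</td>';
    }
    html += '</tr>';
}

html += '</table>';

$('body').append(html);                    // the DOM is touched only once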

That difference is imperceptible on my setup, but what about the poor number-crunching folk sitting behind a corporate firewall who are still forced to use an obsolete version of IE? What if the required DOM operations were heavier, with attribute manipulation aggressively forcing repaints/reflows?

All I'm saying is that if you ever want to optimize some JavaScript, this is not a bad place to start.

Oleg
  • AJAX happened ;) If you are concerned about the poor IE users then you could offer an AJAX-less version as well, but often JavaScript markup generation provides a much more user-friendly interface. As an example, let's say you had an `add 10 more attributes` button that showed 10 more inputs: sure, you could generate 500 inputs to begin with and just show/hide them, or you can just generate the 10 on the fly. Yes, generate your markup at runtime if it makes sense, but there are real use cases for building it on the client. – Hailwood Nov 03 '12 at 08:56
  • @Hailwood: ಠ_ಠ *of course* ajax happened! When I was talking about markup generation at runtime, I assumed it was clear it was suggested to be *on the client* - i.e. transforming json data into an html string first and only then having it added to the DOM – Oleg Nov 03 '12 at 09:12
  • Well, in that case it really is just a matter of readability (that, and jQuery has made us lazy; it's easier to let jQuery handle building it than to do it ourselves). On the point of readability and `markup generation at runtime`: that's exactly what the last example in my question does, but look at the example above it; how would you make them equally readable? – Hailwood Nov 03 '12 at 10:45

I think that if you start getting to the point where this kind of code is a performance bottleneck, you should be using templates instead of DOM manipulation.

But yes, using the document fragment approach (putting all nodes into a detached node before attaching to the DOM) is going to be faster in most cases, almost never slower.
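
As an illustration of what "templates" can mean here, even a tiny hand-rolled helper gets you most of the way. The `render` function below is hypothetical (not from any particular library) and reuses the `divs` array from the question:

// Hypothetical minimal template helper: fills {key} placeholders from a data object.
function render(template, data) {
    return template.replace(/\{(\w+)\}/g, function(match, key){
        return data[key];
    });
}

var rowTemplate = '<div id="{id}" style="{style}">{text}</div>';

var html = $.map(divs, function(o){
    return render(rowTemplate, o);
}).join('');

$('body').append('<div>' + html + '</div>');   // one append, one string of markup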

Ruan Mendes