Speed up your JavaScript, Part 1

In my last post, I talked about the conditions under which the dreaded long-running script dialog is displayed in browsers. Browsers will stop executing script either when they’ve executed too many statements (Internet Explorer) or when the JavaScript engine has been running for a specific amount of time (others). The problem, of course, isn’t the way that the browser is detecting long-running scripts, it’s that the script is taking too long to execute.

There are four main reasons why a script can take too long to execute:

  1. Too much happening in a loop.
  2. Too much happening in a function.
  3. Too much recursion.
  4. Too much DOM interaction.

In this post, I’m going to focus on the first issue: too much happening in a loop. Loop iterations happen synchronously, so the amount of time it takes to fully execute the loop is directly related to the number of iterations. There are, therefore, two situations that cause loops to run too long and lock up the browser: either the loop body is doing too much on each iteration, or the loop is running too many times.

The secret to unraveling this problem is to evaluate the loop to answer two questions:

  1. Does the loop have to execute synchronously?
  2. Does the order in which the loop’s data is processed matter?

If the answer to both of these questions is “no,” then you have some options for splitting up the work done in the loop. The key is to examine the code closely to answer these questions. A typical loop looks like this:

for(var i=0; i < items.length; i++){
    process(items[i]);
}

This loop doesn’t look too bad, though it may take a long time to execute depending on how much time the process() function needs per item. If there’s no code immediately after the loop that depends on its results, then the answer to the first question is “no.” Each iteration deals with just one value and doesn’t depend on the previous iteration, so the answer to the second question is also “no.” That means the loop can be split up in a way that frees the browser and avoids long-running script warnings.
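For contrast, here is a contrived counter-example (not from the original post) where the answer to the first question is “yes” — the code immediately after the loop consumes its result, so the loop must finish synchronously:

```javascript
// The statement after the loop needs the finished total, so this loop
// must execute synchronously — it can't be split up with timers.
var values = [1, 2, 3, 4];
var total = 0;
for (var i = 0; i < values.length; i++){
    total += values[i];
}
console.log(total);   // 10 — depends on the loop having completed
```

A loop like this would need restructuring (for example, accumulating in a callback) before any deferral technique could apply.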

In Professional JavaScript, Second Edition, I introduce the following function as a way to deal with loops that may take a significant amount of time to execute:

function chunk(array, process, context){
    setTimeout(function(){
        var item = array.shift();
        process.call(context, item);

        if (array.length > 0){
            setTimeout(arguments.callee, 100);
        }
    }, 100);
}

The chunk() function is designed to process an array in small chunks (hence the name), and accepts three arguments: a “to do” list of items, the function to process each item, and an optional context variable for setting the value of this within the process() function. A timer is used to delay the processing of each item (100ms in this case, but feel free to alter it for your specific use). Each time through, the first item in the array is removed and passed to the process() function. If there are still items left to process, another timer is used to repeat the process. The loop described earlier can be rewritten to use this function:

chunk(items, process);

Note that the array is used as a queue and so is changed each time through the loop. If you want to maintain the array’s original state, there are two options. First, you can use the concat() method to clone the array before passing it into the function:

chunk(items.concat(), process);

The second option is to change the chunk() function to do this automatically:

function chunk(array, process, context){
    var items = array.concat();   //clone the array
    setTimeout(function(){
        var item = items.shift();
        process.call(context, item);

        if (items.length > 0){
            setTimeout(arguments.callee, 100);
        }
    }, 100);
}

Note that this approach is safer than just saving an index and moving through the existing array, since the contents of the array that was passed in may change before the next timer runs.

The chunk() function presented here is just a starting point for how to deal with loop performance. You can certainly change it to provide more features, for instance, a callback to execute when all items have been processed. Regardless of the changes you may or may not need to make to the function, it is a general pattern that can help optimize array processing to avoid long-running script warnings.
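As one sketch of such an extension (my naming, not from the post), here is a chunk() variant that accepts a completion callback. It uses a named function expression in place of arguments.callee, which is disallowed in strict mode:

```javascript
// Sketch of an extended chunk(): processes items on a timer and invokes
// `callback` once, after the last item has been handled.
// The 100ms delay mirrors the original; tune it for your use case.
function chunkWithCallback(array, process, callback, context){
    var items = array.concat();   // clone so the caller's array is untouched
    setTimeout(function next(){
        process.call(context, items.shift());

        if (items.length > 0){
            setTimeout(next, 100);
        } else if (callback){
            callback.call(context);
        }
    }, 100);
}

// Usage: double each number, then report when done.
chunkWithCallback([1, 2, 3], function(n){
    console.log(n * 2);
}, function(){
    console.log("all items processed");
});
```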



  1. John Rockefeller

    Don't you think, though, that having to take perfectly normal code and mangle it to jump through browser hoops is not a sign that something is wrong with your code (although some people's loops do need to be cut down) but that there is something inherently wrong with the JS engine of that browser in that it can't complete simple things in a short time-frame?

    Maybe I'm a little crazy but why do browsers handle JS synchronously? i.e., Why do they stop what they're doing to run your loop and then continue?

    Do you know of any threaded JS engines for web browsers?

  2. Nicholas C. Zakas

    I completely agree. There are a lot of times when the issue is poorly written code. I have, however, come across times when I needed to do a lot of array processing to get something to work. This technique helped a great deal.

    JavaScript really has no choice but to run synchronously because it can update the DOM, and thus, affect the appearance of the page. Imagine the trouble we'd get into if two different threads could each set the left coordinate for a DOM element at the same time!

  3. Thomas Peri

    I found your article through an entry on Ajaxian that also linked to me.

    Coincidentally, the same day you posted this article, I released an update to a library that takes a different approach to the same problem:


  4. Aaron T Grogg

    @John Rockefeller:

    My understanding is that JS runs synchronously because it doesn't know what might happen within a loop that might affect what the /next/ loop might do.


  5. mat

    would it be possible or prudent to create an iframe and load the intensive javascript in it?
    then maybe have the procedure/function issue some kind of call on completion?

  6. Raymond Ie

    Thank you, Nicholas.

    For anyone using this to replace jQuery's $.each(arr,function(i,v){...}), if your process function needs to access the iterator value (i), this might help:

    function chunk(array, process, context){
        var i = 0;
        setTimeout(function(){
            var item = array.shift();
            process.call(context, item, i++);

            if (array.length > 0){
                setTimeout(arguments.callee, 20);
            }
        }, 100);
    }

  7. Renzo Kooi

    Why not use Duff's device? See: http://www.websiteoptimizat...

  8. Nicholas C. Zakas

    @Renzo - Duff's device may speed up array processing in some cases, but it can still cause the long-running script dialog to appear.

  9. Renzo Kooi

    @Nicholas: you're right, I recently experienced it, testing insertion of a huge bunch of new divs in the DOM tree. Thanks for your writings by the way.

  10. Ajay Nair

    "Perform node operations out of the DOM" - since they cause reflows
    However, IE, has a dom insertion order leak pattern (http://msdn.microsoft.com/e.... My tests find that if I have a dynamically created node, set innerHTML to it and then append it to the DOM, it leaks memory! But if I append it to the dom and set innerHTML, and then remove it, it recovers the memory. How can I get around it?

  11. Bob

    With regards to the first comment to this topic (yeah, old, I know), browsers are implementing the Web Workers API, which introduces true threading for JavaScript. Of course, for the reasons posted, it doesn't mess with the DOM, but it does reduce or eliminate the need for timers.

    It would be a nice topic for another blog post. ;)
