Monday, March 28, 2011

High Performance Javascript

Here are some tips on high performance JavaScript that I have picked up. Most of them come from the books High Performance JavaScript by Nicholas C. Zakas and High Performance Web Sites by Steve Souders.

Loading

Load files at the end of the HTML page

Load the JavaScript files right before the closing </body> tag; this allows the page to render without having to wait for all the JavaScript files to download and execute.

Group files together

With normal loading, the files are loaded sequentially: each file is downloaded and parsed before the next one starts to load. Merge them into one large file and, while you are at it, minify it. There are several tools that help with concatenation and minification.

Load files asynchronously

If normal loading with grouped files is not good enough, it is also possible to load the files asynchronously with a script loader. This also allows you to load files on demand.
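
A common sketch of asynchronous loading is to inject a script element from JavaScript; the file name and callback below are placeholders for illustration.

// Load a script file asynchronously by injecting a script element.
// The file name and callback are placeholders for illustration.
function loadScript(url, callback) {
  var script = document.createElement("script");
  script.type = "text/javascript";

  if (script.readyState) { // older IE
    script.onreadystatechange = function() {
      if (script.readyState === "loaded" || script.readyState === "complete") {
        script.onreadystatechange = null;
        callback();
      }
    };
  } else { // other browsers
    script.onload = function() {
      callback();
    };
  }

  script.src = url;
  document.getElementsByTagName("head")[0].appendChild(script);
}

// Usage: load the merged, minified file and start the application when it is ready
loadScript("all-scripts.min.js", function() {
  // initApplication();
});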

Variable Access

Literal values and local variables can be accessed very quickly. Array access and member access take longer. If the prototype chain or scope chain needs to be traversed, it will take longer the further up the chain the access is. Global variables are always the slowest to access because they are always last in the scope chain.

You can improve the performance of JavaScript code by storing frequently used object members, array items, and out-of-scope variables in local variables.
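
For example (a small sketch; the object and property names are made up), repeated member lookups inside a loop can be cached in locals:

// Cache an out-of-scope member lookup and the array length in local
// variables instead of resolving them on every iteration.
// The object and property names are made up for illustration.
function sumValues(data) {
  var items = data.config.items;   // cached member access
  var length = items.length;       // cached array length
  var total = 0;

  for (var i = 0; i < length; i++) {
    total += items[i].value;
  }
  return total;
}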

DOM

All DOM manipulation is slow.

  • Minimize DOM access
  • Use local variables to store DOM references you'll access repeatedly.
  • HTML collections represent the live, underlying document, so:
    • Cache the collection length into a variable and use it when iterating
    • Make a copy of the collection into an array for heavy work on collections (see the example after this list)
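
The example below is a small sketch of both points; the element id 'mylist' is made up for illustration.

// Iterate over a live HTML collection with a cached length, and copy
// the collection into a plain array before doing heavier work on it.
// The element id 'mylist' is made up for illustration.
var items = document.getElementById("mylist").getElementsByTagName("li");

// Cache the length so the live collection is not re-queried on every pass
for (var i = 0, len = items.length; i < len; i++) {
  items[i].className = "processed";
}

// Copy the collection into a static array for repeated heavy access
function toArray(collection) {
  var arr = [];
  for (var i = 0, len = collection.length; i < len; i++) {
    arr[i] = collection[i];
  }
  return arr;
}

var itemArray = toArray(items);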

Reflow and rendering

The browser maintains two trees: the DOM tree and a render tree. Whenever the layout or geometry of the DOM changes, the browser has to recalculate the layout and re-render the affected parts of the page. This is known as reflow.

Reflow happens when:

  • Visible DOM elements are added or removed
  • Elements change position
  • Elements change size (margin, padding, border thickness, width, height, etc.)
  • Content is changed (text changes or an image is replaced with one of a different size)
  • The page renders initially
  • The browser window is resized

Combine multiple DOM and style changes into a batch and apply them once. This can be done with documentFragments or by cloning the node.

// Create a document fragment
var fragment = document.createDocumentFragment();

// Do something with the fragment, e.g. build new elements off-document
var li = document.createElement('li');
li.appendChild(document.createTextNode('New item'));
fragment.appendChild(li);

// Append the fragment's children to the DOM in one operation
document.getElementById('mylist').appendChild(fragment);

// Clone the node
var old = document.getElementById('mylist');
var clone = old.cloneNode(true);

// Do something with the clone (it is not part of the live document)

// Replace the original node with the clone in one operation
old.parentNode.replaceChild(clone, old);
 

Algorithms and Flow

Use algorithms with better algorithmic complexity for large collections.

  • for-in loops are slower than for, while and do-while loops. Avoid for-in unless you need to iterate over a number of unknown object properties.
  • Lookup tables are faster than multiple conditionals (see the example after this list).
  • Recursion can be rewritten as iteration if you run into stack overflow errors.
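
A small sketch of a lookup table replacing a chain of conditionals (the mapping is made up for illustration):

// Replace a chain of if/else statements with a lookup table.
// The mapping is made up for illustration.
function colorName(code) {
  // Instead of: if (code === 0) {...} else if (code === 1) {...} else ...
  var names = ["red", "green", "blue", "yellow"];
  return names[code] || "unknown";
}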

Strings and Regexes

String concatenation is quite fast in most browsers. In IE, you may need to use Array.join, as shown below.
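
A sketch of the Array.join technique (the markup being built is made up for illustration):

// Build a large string by pushing the pieces into an array and joining
// them once, instead of repeated += concatenation.
// The markup being built is made up for illustration.
function buildList(items) {
  var parts = [];
  for (var i = 0, len = items.length; i < len; i++) {
    parts.push("<li>" + items[i] + "</li>");
  }
  return "<ul>" + parts.join("") + "</ul>";
}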

Regular expression performance can be improved in several ways:

  • Focus on failing faster.
  • Start regexes with simple, required tokens.
  • Make quantified patterns and the pattern that follows them mutually exclusive, e.g. /"[^"]*"/.
  • Use noncapturing groups, (?:) instead of ().
  • Capture text to reduce postprocessing.
  • Expose required tokens, e.g. /^(ab|cd)/ instead of /(^ab|^cd)/.
  • Reuse regexes by assigning them to variables (see the example after this list).
  • Split complex regexes into simpler pieces.
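
For instance (a sketch; the pattern and data are made up), a regex used in a loop can be created once, assigned to a variable, and reused:

// Create the regex once and reuse it, instead of building a new one on
// every call. The pattern and data are made up for illustration.
var quotedString = /"[^"]*"/g;   // quantified pattern and following quote are mutually exclusive

function countQuotedStrings(lines) {
  var count = 0;
  for (var i = 0, len = lines.length; i < len; i++) {
    var matches = lines[i].match(quotedString);
    if (matches) {
      count += matches.length;
    }
  }
  return count;
}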

Responsiveness

No single JavaScript operation should take longer than about 100 milliseconds. If it does, it needs to be split up; this can be done using timers.

Two determining factors for whether a loop can be done asynchronously using timers:

  • Does the processing have to be done synchronously?
  • Does the data have to be processed sequentially?
// Process an array asynchronously with timers, one item per tick
function processArray(items, process, callback) { 
  var minTimeToStart = 25; 
  var copyOfItems = items.concat(); 
  setTimeout(function() { 
    process(copyOfItems.shift()); 
    if (copyOfItems.length > 0) 
      setTimeout(arguments.callee, minTimeToStart); 
    else 
      callback(items); 
  }, minTimeToStart); 
} 
 
// Process a series of tasks asynchronously with timers, one task per tick
function processTasks(tasks, args, callback) {
  var minTimeToStart = 25;
  var copyOfTasks = tasks.concat();
  setTimeout(function() { 
    var task = copyOfTasks.shift(); 
    task.apply(null, args || []); 
    if (copyOfTasks.length > 0) 
      setTimeout(arguments.callee, minTimeToStart); 
    else 
      callback(); 
  }, minTimeToStart); 
} 
 

You should limit the number of high-frequency repeating timers in your web application. It is better to create a single repeating timer that performs multiple operations with each execution.
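
A minimal sketch of a single repeating timer driving several periodic operations (the task functions are placeholder stubs):

// One repeating timer performing several operations per tick, instead
// of one high-frequency timer per task. The tasks are placeholder stubs.
function updateClock() { /* ... */ }
function checkForNewMessages() { /* ... */ }

setInterval(function() {
  updateClock();
  checkForNewMessages();
}, 1000);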

It is not recommended to have minTimeToStart less than 25 milliseconds, because there is a risk that the timers will fill up the queue.

// Timed version of processArray, where each timer execution is able to
// process items from the array for up to 50 milliseconds.
function timedProcessArray(items, process, callback) { 
  var minTimeToStart = 25; 
  var copyOfItems = items.concat(); 
  setTimeout(function() { 
    // (+) converts the Date object into a numeric representation 
    var start = +new Date(); 
    do { 
      process(copyOfItems.shift()); 
    } while (copyOfItems.length > 0 && (+new Date() - start < 50)); 
    if (copyOfItems.length > 0) 
      setTimeout(arguments.callee, minTimeToStart); 
    else 
      callback(items);
  }, minTimeToStart); 
} 
 

Newer browsers support web workers. Web workers do not run in the UI thread and do not affect responsiveness at all. To make this possible, their environment is limited to:

  • A navigator object, which contains only four properties: appName, appVersion, userAgent, and platform.
  • A location object (same as on window, except all properties are read-only)
  • A self object that points to the global worker object
  • An importScripts() method that is used to load external JavaScript for use in the worker
  • All ECMAScript objects, such as Object, Array, Date, etc.
  • The XMLHttpRequest constructor
  • The setTimeout() and setInterval() methods
  • A close() method that stops the worker immediately

It is not possible to create a web worker from inline code; it needs to be started from its own JavaScript file. You can, however, communicate with it through message events.

// Application code 
var worker = new Worker("code.js"); 
worker.onmessage = function(event) { 
  alert(event.data); 
}; 
worker.postMessage("Tapir"); 
 
// Worker code (code.js)
importScripts("file1.js", "file2.js"); // import helper scripts used by the worker
 
self.onmessage = function(event) { 
  self.postMessage("Hello, " + event.data + "!"); 
}; 
 

Any code that takes longer than 100 milliseconds to run should be refactored to use web workers to decrease the load on the UI thread.

Ajax

Favor lightweight formats in general; the best are JSON and a character-delimited custom format. If the data set is large and parse time becomes an issue, use one of these two techniques:

JSON-P data, fetched using dynamic script tag insertion. This treats the data as executable JavaScript, not a string, and allows for extremely fast parsing. This can be used across domains, but shouldn't be used with sensitive data.

A character-delimited custom format, fetched using either XHR or dynamic script tag insertion and parsed using split(). This technique parses extremely large datasets slightly faster than the JSON-P technique, and generally has a smaller file size.
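
A sketch of the XHR variant; the URL and the record layout (one record per line, fields separated by semicolons) are made up for illustration.

// Fetch a character-delimited response with XHR and parse it with split().
// The URL and field layout are made up for illustration.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/data/users.txt", true);
xhr.onreadystatechange = function() {
  if (xhr.readyState === 4 && xhr.status === 200) {
    var rows = xhr.responseText.split("\n");
    var users = [];
    for (var i = 0, len = rows.length; i < len; i++) {
      if (!rows[i]) continue;            // skip empty trailing lines
      var fields = rows[i].split(";");
      users.push({ id: fields[0], name: fields[1] });
    }
    // users is now an array of plain objects
  }
};
xhr.send(null);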

XML has no place in high-performance Ajax.

Cache data! The fastest Ajax request is one that you don't have to make. There are two main ways of preventing an unnecessary request:

  • On the server side, set HTTP headers that ensure your response will be cached in the browser.
  • On the client side, store fetched data locally so that it doesn't have to be requested again (see the sketch after this list).
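
A minimal sketch of a client-side cache keyed by URL; the xhrGet helper is a made-up convenience wrapper, not a library function.

// Cache responses locally so repeated requests for the same URL are
// served without touching the network. xhrGet is a made-up helper.
var localCache = {};

function xhrGet(url, callback) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);
  xhr.onreadystatechange = function() {
    if (xhr.readyState === 4 && xhr.status === 200) {
      callback(xhr.responseText);
    }
  };
  xhr.send(null);
}

function cachedGet(url, callback) {
  if (localCache.hasOwnProperty(url)) {
    callback(localCache[url]);   // served locally, no request made
    return;
  }
  xhrGet(url, function(responseText) {
    localCache[url] = responseText;
    callback(responseText);
  });
}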

Multipart XHR can be used to reduce the number of requests, and can handle different file types in a single response, though it does not cache the resources received.

Some more guidelines that will help your Ajax appear to be faster:

  • Reduce the number of requests you make, either by concatenating JavaScript and CSS files, or by using MXHR.
  • Improve the perceived loading time of your page by using Ajax to fetch less important files after the rest of the page has loaded.
  • Ensure your code fails gracefully and can handle problems on the server side.
  • Know when to use a robust Ajax library and when to write your own low-level Ajax code.

Programming Practices

  • Avoid the use of eval() and the Function() constructor.
  • Pass functions into setTimeout() and setInterval() instead of strings.
  • Use object and array literals when creating new objects and arrays.
  • Avoid doing the same work repeatedly.
  • Use lazy loading or conditional advance loading when browser-detection logic is necessary.
  • When performing mathematical operations, consider using bitwise operators that work directly on the underlying representation of the number (see the example after this list).
  • Native methods are always faster than anything you can write in JavaScript.
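
A couple of sketched bitwise alternatives; these are micro-optimizations that only matter in very hot code paths.

// Bitwise alternatives for some common integer operations.
var i = 10;

var isEven = (i & 1) === 0;   // instead of i % 2 === 0
var half   = i >> 1;          // instead of Math.floor(i / 2), for non-negative integers
var floor  = 7.9 | 0;         // instead of Math.floor(7.9), for non-negative numbers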

Building and Deploying

The build and deployment process can have a tremendous impact on the performance of a JavaScript-based application. The most important steps in this process are:

  • Combining JavaScript files to reduce the number of HTTP requests
  • Minifying JavaScript files using the YUI Compressor
  • Serving JavaScript files compressed (gzip encoding)
  • Making JavaScript files cacheable by setting the appropriate HTTP response headers
  • Working around caching issues by appending a timestamp to filenames
  • Using a Content Delivery Network to serve JavaScript files
  • Automating all of these steps using build tools

Tools

Minification

Profiling

Development

  • Firebug
  • Internet Explorer Developer Tools
  • Safari Web Inspector
  • Chrome Developer Tools

Proxies
