Stop paying your jQuery tax
##Reminder, script tags block rendering
It is common advice to move all your external JavaScript includes to the footer.
Recently, the movement for fancy JavaScript loaders like ControlJS, script.js and so on has also picked up steam.
The reason for this advice is sound and pretty simple: nothing, nada, zilch below a non-asynchronous, non-deferred script tag gets rendered until that script is downloaded, parsed (perhaps compiled) and executed, with one tiny exception being Delayed Script Execution in Opera. There are quite a few technical reasons why scripts block: you need a “stable” DOM while executing a script, and browsers need to make sure document.write works.
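For example, in a hypothetical page like this (the script URL is made up), nothing below the external include can paint until it has arrived and run:

<!-- nothing below this tag is rendered until the script downloads and runs -->
<script src="//example.com/slow-script.js"></script>
<script>
  // document.write is one reason browsers must block: it injects markup
  // at the exact point in the document where the script tag sits
  document.write('<p>injected right here</p>');
</script>
<p>This paragraph has to wait for both scripts above.</p>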
People often avoid the async attribute on script tags because asynchronous scripts execute in whatever order they finish downloading, which means you need to be more fancy about figuring out when a particular script is ready. The various JavaScript loaders out there give you a clean API for that.
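As a rough sketch (the URLs are placeholders), the two attributes trade off blocking against ordering:

<!-- async: downloads without blocking, but runs in whatever order the
     downloads finish, so plugin.js may execute before jquery.js -->
<script async src="//example.com/jquery.js"></script>
<script async src="//example.com/plugin.js"></script>

<!-- defer: also does not block rendering and preserves document order,
     but only runs once the document has finished parsing -->
<script defer src="//example.com/jquery.js"></script>
<script defer src="//example.com/plugin.js"></script>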
##Why is that synchronous jQuery script include stuck in the HTML ‘HEAD’ section?
jQuery solves the age-old problem of figuring out when your document is ready. The problem is super nasty to solve in a cross-browser way; it involves a ton of hacks that took years to formulate and polish, including weird and wonderful ones like calling doScroll for legacy IE. To us consumers this all feels so easy. We can capture functions at any spot in our page and run them later, when the page is ready:
$(wowThisIsSoEasy); // aka. $(document).ready(wowThisIsSoEasy);
It allows you to arbitrarily add bits of rich functionality to your page with incredible ease. You can include little scripts that spruce up your pages from deep inside a nested partial. In general, people break the golden rule and include jQuery in the header. That is because jQuery(document).ready is so awesome.
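For example, a hypothetical nested partial can register its own behaviour without knowing anything about the rest of the page:

<!-- somewhere deep inside a nested partial -->
<div id='fancy-widget' style='display:none'>...</div>
<script type='text/javascript'>
  $(function () {
    // runs once the DOM is ready, wherever this partial lands in the page
    $('#fancy-widget').fadeIn();
  });
</script>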
##The jQuery tax
There are three types of tax you pay when you have a script in your header: the initial tax, the refresh tax and the constant tax.
###The initial tax
The most expensive tax is the initial hit. Turns out more than 20% of the page views we get at Stack Overflow involve a non-primed cache and actual fetching of JavaScript files from our CDN. These are similar numbers to the ones reported by Yahoo years ago. The initial hit can be quite expensive: first, the DNS record for the CDN needs to be resolved; next, a TCP/IP connection needs to be established; then the script needs to be downloaded and executed.
Browsers these days have a pack of fancy features that improve performance, including HTTP pipelining, speculative parsing and persistent connections. These often alleviate some of the initial jQuery tax you pay. You still need your CSS prior to render, and modern browsers will download the CSS and scripts concurrently thanks to some of the above optimisations. However, slow connections may be bandwidth constrained: if you are forced to download multiple scripts and stylesheets prior to render, they all need to arrive and run before rendering starts.
In the image below (taken from Stack Overflow’s front page) you can see how screen rendering could have started 100 or so milliseconds earlier if jQuery were deferred - the green line:
An important note: it is common to serve jQuery from a shared CDN, be it Google’s or Microsoft’s. The CDN you use for your own content will often have different latency and performance to the CDN serving jQuery. If you are lucky, Microsoft and Google are faster; if you are unlucky, they are slower. If you are really unlucky, there may be a glitch with the Google CDN that causes all your customers to wait seconds before they see any content.
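A common mitigation - a sketch, not necessarily what you should ship - is to fall back to a copy on your own servers when the shared CDN fails (the local path is hypothetical):

<script type='text/javascript' src='//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js'></script>
<script type='text/javascript'>
  // if the CDN copy did not load, write out a local copy instead
  window.jQuery || document.write('<script src="/scripts/jquery-1.7.1.min.js"><\/script>');
</script>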
###The refresh tax
People love clicking the refresh button. When that happens, requests need to be made to the server to check that local resources are up to date. In the screenshot below you can see how it took 150ms just to confirm we have the right version of jQuery.
This tax is often alleviated by browser optimisations: when you check on jQuery you also check on your CSS. Loading CSS asynchronously is risky and may result in a flash of unstyled content (FOUC). If you decide to inline your CSS you may avoid this check, but you then have a much harder problem on your hands.
###The constant tax
Since we are all good web citizens (or at least the CDNs are), we have expires headers, which means we can serve jQuery from the local cache on repeat requests. When the browser sees jQuery in the header it can quickly grab it from the local cache and run it.
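Concretely, those cached responses carry headers along these lines (the values are illustrative):

Cache-Control: public, max-age=31536000
Expires: Thu, 31 Dec 2037 23:55:55 GMT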
Trouble is, even parsing and running jQuery on a modern computer with IE8 can take upwards of 100 milliseconds. From my local timings using this humble page on a fairly modern CPU (an i7 960), the time to parse and run jQuery varies heavily between browsers.
Chrome seems to be able to do it in under 10ms, IE9 and Opera take around 20ms, Firefox around 80ms (though I probably have a plugin that is causing that pain) and IE7 over 100ms.
On mobile the pain is extreme: on my iPhone 4S, for example, this can take about 80ms.
Many people run slower computers and slower phones. This tax is constant and holds up rendering every time.
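If you want to take rough timings like these yourself, something along these lines works; with a primed cache the elapsed time is dominated by parsing and executing rather than downloading (the CDN URL is just an example):

<script type='text/javascript'>var start = new Date().getTime();</script>
<script type='text/javascript' src='//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js'></script>
<script type='text/javascript'>
  // with a primed cache this roughly measures parsing and executing jQuery
  console.log('jQuery took ' + (new Date().getTime() - start) + 'ms');
</script>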
##Pushing jQuery to the footer
Turns out that pushing jQuery to the footer is quite easy for the common case. If all we want is a nice $.ready function that is accessible everywhere, we can explicitly define it without jQuery. Then we can pass the functions we capture to jQuery later on, after it loads.
In our header we can include something like:
window.q=[];              // queue of functions waiting for jQuery
window.$=function(f){     // stub $ that just remembers the function for later
    q.push(f);
};
Just after we load jQuery, we can pass all the functions we captured to the real ready function:
$.each(q,function(index,f){
    $(f);   // hand each captured function to the real jQuery ready
});
This gives us access to a “stub” ready function anywhere in our page.
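For example, a partial written like this (the selector is made up) keeps working whether it runs before or after jQuery has loaded:

<script type='text/javascript'>
  // before jQuery loads this is queued by the stub;
  // after jQuery loads it runs through the real $(document).ready
  $(function () {
    $('#comments').show();
  });
</script>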
Or, more concisely, the header stub becomes:
<script type='text/javascript'>window.q=[];window.$=function(f){q.push(f)}</script>
and the footer, just after we pull in jQuery (the CDN URL below is just an example), becomes:
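<script type='text/javascript' src='//ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js'></script>
<script type='text/javascript'>$.each(q,function(index,f){$(f)})</script>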
##Why have we not done this yet at Stack Overflow?
Something that may seem trivial on a tiny and humble blog may take a large amount of effort in a big app. For one, we need to implement a clean pattern for registering scripts at the page footer, something that is far from trivial. Furthermore, we need to coordinate with third parties that may depend on more than a $ function - or, god forbid, on document.write for ads.
The big lesson learned is that we could have avoided this whole problem if we had started off with my proposed helper above.
We spend an inordinate amount of time shaving 10% off our backend time, but often forget the golden rule: in general, the largest bottleneck is your front end. We should not feel powerless to attack front end performance issues; we can do quite a lot to improve JavaScript bottlenecks and perceived performance.
Edit: also discussed on Hacker News - thanks for the great feedback.