This is something I've known for a while but tend to forget until it slaps me across the face. A reader wrote in with something odd she saw on her web site. She had a basic search site where some of the content was a bit slow to render. Instead of delaying the search results, she simply used an Ajax call to update the results in real time. I think this is a great idea. Even if the "total" time remains the same (or close to it), the "perceived" time is much lower for the end user.

However, in her testing she noticed something odd. She thought she was running N Ajax-based calls all at once, but in her network tools they appeared to come in "chained", i.e. one after another. She had expected that if each call took about 1 second and she made 30 of them, they should run asynchronously and all complete within 1 second (or within a second or two, given network latency randomness). I whipped up a quick example so I could see this in action.
First, I began with a simple front end client that uses jQuery. This topic isn't jQuery specific of course but all good Ajax posts should mention jQuery at least once.
<script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js"></script>
<script>
$(document).ready(function() {
	for(var i=1; i<=20; i++) {
		runCall(i);
	}
});

function runCall(x) {
	console.log('doing call '+x);
	$.post("slow.cfm", {"id":x}, function(res,code) {
		$("#show"+x).html("Result was "+res);
		console.log("back and res="+res);
	});
}
</script>

<cfloop index="x" from="1" to="20">
<cfoutput>
<div id="show#x#"></div>
</cfoutput>
</cfloop>
My template has 20 divs, representing search results, and on document load will run 20 calls to fetch data for each div. Warning - I'm using console.log for debugging. This is not a bug. When you click my demo, please do not report this as a bug. Ajax developers should use a browser (or browser+plugin) that supports console! All in all, pretty simple, right? Technically this is not "20 at once" since I have a loop, but I think we agree that it is close enough for our testing.
For the back end I wrote a quick ColdFusion script to simulate a slow process and result. It's hard to write slow ColdFusion code, so I made use of the sleep() function.
<cfset sleep(2000)>
<cfparam name="form.id" default="1">
<cfoutput>#form.id#=#randRange(1,100)#</cfoutput>
Running this in the browser shows some interesting results. I tested in both Chrome and Firefox. While I prefer Chrome, I thought Firefox (plus Firebug) gave the best graphical result:
I think you can clearly see that the results are staggered. You can test this yourself by clicking the button below - but have Firebug or Chrome Dev Tools ready to go:
For me, I clearly saw "spurts" of loading. Typically 2 to 4 results popped in at once. This is not a bug though; it's completely normal. The browser sets a limit on how many simultaneous network requests it will make to a single server. This is good for both the client and the server. We see this every day when loading a web page, especially over mobile: things 'pop' into view over time. On a broadband connection, however, it can be easy to forget. In this very clear-cut example we ask the browser to quickly make a bunch of network requests at once, so we can see the 'throttle' more clearly.
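If you ever want that throttle under your own control, you can impose one client-side. Here's a minimal sketch (not part of the demo above - queueCall, startCall, and the maxConcurrent value are all made up for illustration) of a tiny queue that caps how many of these calls run at once, assuming a jQuery build with deferred support:

// Illustrative sketch of a client-side request queue; not from
// the original demo. Caps in-flight Ajax calls at maxConcurrent.
var maxConcurrent = 4;
var pending = [];
var active = 0;

function queueCall(x) {
	pending.push(x);
	next();
}

function startCall(x) {
	active++;
	$.post("slow.cfm", {"id": x}, function(res) {
		$("#show" + x).html("Result was " + res);
	}).always(function() {
		// When a call finishes (success or failure), free up a
		// slot and see if anything is waiting.
		active--;
		next();
	});
}

function next() {
	// Start queued calls until we hit our own concurrency cap.
	while (active < maxConcurrent && pending.length > 0) {
		startCall(pending.shift());
	}
}

Swapping runCall(i) for queueCall(i) in the loop would then keep at most four requests in flight, no matter what the browser's own per-host limit happens to be.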
Archived Comments
This is why you have multiple domains for your website and spread images, css, js etc. across them.
Example:
cfjedi1.com/images/image1.jpg
cfjedi2.com/images/image2.jpg
cfjedi3.com/js
cfjedi3.com/css
Make sure the above domains are cookieless, and of course use sprites.
This way ya browser (on a fast connection) can download say 10 items at once.
Of course this won't work for Ajax requests across different domains, but it does make a page load a lot faster.
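As a rough illustration of that idea (the cfjedi domains are just the example hosts from the comment, not real servers), a tiny helper that rotates assets across the shard hosts might look like:

// Sketch only - the cfjedi domains are the example hosts from the
// comment above, not real servers.
var shards = ["cfjedi1.com", "cfjedi2.com", "cfjedi3.com"];
var nextShard = 0;

function shardUrl(path) {
	// Rotate through the asset domains so the browser can open its
	// per-host connection limit against each one in parallel.
	var host = shards[nextShard];
	nextShard = (nextShard + 1) % shards.length;
	return "http://" + host + path;
}

// shardUrl("/images/image1.jpg") -> "http://cfjedi1.com/images/image1.jpg"
// shardUrl("/images/image2.jpg") -> "http://cfjedi2.com/images/image2.jpg"

One caveat: if the rotation order changes between page loads, the same image can land on a different host and miss the browser cache - a tradeoff the later comments dig into.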
Damn good points m@.
(I was like - why didn't he give his name - then I sounded it out. ;)
Ray,
Have you tried the IE 9 release candidate? The debugging tools give you quite a bit of information, especially for network requests. It also supports console.
Got IE9 - just don't use it very often. Glad to hear they added console.*.
The impact of spreading across a CDN is always worth bearing in mind (especially if you're running your Ajax requests in true asynchronous mode), but remember there's a max limit of total requests that you can hit if you're not careful.
Plus it's quite possible to melt a web server* by permitting too many simultaneous requests from one client, as you lose the natural staggering of traffic that comes with single-host throttling.
BTW - you know there's a 2-line test to see if console is supported, right?
*And yeah, I know a decent web server should be able to hold its own, but I've killed poorly configured Apache boxes on a number of occasions by ramping up the number of running forked processes.
@Rob: The console remark was a bit of a... snarky comment by me. I've been dinged a few times (ok, many times) on my blog for forgetting console messages. I'm trying to be better about removing them. However, in this case, the console messages were an integral part of the testing. Hence the snarky remark.
Can I blame the lack of (enough) coffee?
No need to apologise! I've lost count of the number of times my colleagues have moaned about my "broken ajax" stuff only for me to realise that I've left console traces all over the shop. So now I always wrap in:
if (window.console && window.console.log) {
//log msg here
}
or drop:
window.console||(console={log:function(){}});
in at the top of the script to stub the console method.
Either way I get to skip* happily about the office knowing that my JavaScript is simple to debug and users in non-console browsers won't have any problems :o)
*and yeah - that would be metaphorical skipping
@Ray/m@: You can also set up a wildcard subdomain (so anything that doesn't fit into what you have defined still resolves), and then just generate a random string of characters for the subdomain. The result is that you can have as many connections as the browser/machine/connection/etc. can handle.
Example:
sldjfkjhdfg.datanotion.com
iweikcvl.datanotion.com
etc...
Just something I've done in the past.
That's slick PB.
Very interesting post. I can confirm that on Google Chrome, 6 requests are allowed through at once.
The same problem occurs with loading images. On one of my sites, we display like 100 little images per page... And when a browser only requests 4 at a time, it takes a long time to load them all.
The problem with using a random string and a wildcard domain is that the browser then has to do a DNS lookup for every request... which might cancel out the async savings. Also, if what you're requesting is static, you lose some browser caching performance.
I ended up settling on having 5 subdomains that I randomly picked from. I decided that it was better to load a page more quickly the first time, and a little slower subsequent times.
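A rough sketch of that compromise (the static1-static5 subdomain names here are hypothetical stand-ins for those five): hashing the path rather than picking randomly means a given image always maps to the same subdomain, so repeat views still hit the cache:

// Sketch only - static1..static5.datanotion.com are hypothetical
// stand-ins for the five subdomains described above.
var shards = ["static1", "static2", "static3", "static4", "static5"];

function imageUrl(path) {
	// Derive a stable index from the path so the same image always
	// resolves to the same subdomain and stays cacheable.
	var hash = 0;
	for (var i = 0; i < path.length; i++) {
		hash = (hash + path.charCodeAt(i)) % shards.length;
	}
	return "http://" + shards[hash] + ".datanotion.com" + path;
}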
Here is an answer about configuring Firefox:
http://www.speedguide.net/f...