Tuesday, May 29, 2007

Lessons learned from "Getting your ajax app to run faster and better"

While waiting for feedback from the client, I thought I would share a few educational tidbits that my colleagues and I learned in the process of speeding up our ajax web app.
  • It's not just about loading time anymore. Web development a few years ago was all about minimizing the loading time of web pages: shrinking the file size of graphics and using CSS for styling. That still matters, but now that javascript is as ubiquitous as people during rush hour, web developers also have to consider how long their javascript takes to execute. The key is auditing the javascript code and identifying exactly which parts are taking the longest (a rough timing sketch appears after this list).
  • A faster PC makes a difference. One observation is how the same web app running on the same browser can perform very differently on PCs of varying speed. A few years ago I would have shrugged at the need to have the absolutely latest and greatest hardware just for surfing the net. Apparently, web apps like the one we are developing really put the browser to the test, demanding more CPU power and memory; the more your machine has, the better the experience seems to be. Obviously, you have very little control over what your users surf to your web app with, unless of course it's an internal web application like ours.
  • The browser cache is your friend. Pre-loading some of the CSS and javascript files on lighter, less busy pages like the login screen helps. By the time the user logs in, the majority of the javascript and CSS files will already be cached in the browser, so loading time is much improved. We use YUI, with the actual javascript files served from Yahoo's fast servers. Yahoo's own pages use YUI too, so if the user happens to be a regular Yahoo visitor, you're in luck: the YUI javascript files may already be cached (see the login-page snippet after this list).
  • Two domains are better than one. Browsers limit the number of simultaneous connections they will open to a single domain (the HTTP/1.1 spec suggests only two). Loading our CSS and other static files from a second domain gave the browser more connections to play with and let us work around that limit.
  • Compression is relative. One of the things I thought of implementing was gzip compression to minimize load time. Most mainstream browsers released since 1999 can decompress gzipped content as it is received, which means I can gzip a 50KB javascript file down to roughly 10KB, serve that to IE or Firefox, and they will know what to do with it. If you happen to be using Apache, you won't need to gzip your files manually: mod_gzip (or mod_deflate on Apache 2.x) lets Apache do it automatically with a few configuration changes (a configuration sketch follows the list). On AOLServer 4.0.10 there is the Ns_Compress module, but you have to compile AOLServer with a zlib parameter that points to the zlib library files on your server for it to work. However, keep in mind that the browser has to decompress the file before it can use it, so the time you save on the download can be taken right back if the user has a slow PC and decompression takes longer.
  • Measure, measure, measure. Unlike me :-), Dave knows the value of good measurements, so he took the time to profile the speed of the web app before and after each of our attempts and changes. Otherwise, how would we have known that compression doesn't necessarily speed up the web app?
  • Optimize javascript. I think the most important thing I learned here is that even if you are using a fast, object-oriented javascript library, how you code with it affects performance a great deal. The tips I found on this page helped me optimize my implementation of the javascript code; a couple of the patterns are sketched after this list. I didn't see a major jump in raw speed, but I did notice improvements in perceived execution time.
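
To give a feel for the kind of audit I mean, here is a minimal timing sketch using plain Date objects. renderGrid() is a made-up stand-in for whatever part of your app you want to measure, not something from our code:

    // Time a suspect block of javascript with Date objects.
    function timeIt(label, fn) {
        var start = new Date().getTime();
        fn();
        var elapsed = new Date().getTime() - start;
        alert(label + " took " + elapsed + " ms"); // or write it to the page instead
    }

    // Example usage with a hypothetical function.
    timeIt("render grid", function () {
        renderGrid();
    });

This is also the cheapest way to record timings before and after each change, which is the whole point of the "measure, measure, measure" item above.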
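
For the pre-loading and second-domain tricks, the login page simply references the same static files the heavier pages use. This is only a sketch: static.example.com is a placeholder hostname, and the YUI version and path should be taken from Yahoo's hosting documentation rather than copied from here:

    <!-- On the lightweight login page: reference the shared CSS and javascript
         so the browser caches them before the user reaches the heavy pages. -->
    <link rel="stylesheet" type="text/css"
          href="http://static.example.com/css/app.css">
    <script type="text/javascript"
            src="http://yui.yahooapis.com/2.2.0/build/yahoo-dom-event/yahoo-dom-event.js"></script>

    <!-- Serving static files from static.example.com instead of the main domain
         gives the browser a second set of parallel connections to work with. -->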
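
On the compression side, we run AOLServer, so the following is only an illustration of the Apache route mentioned above. On Apache 2.x the module is mod_deflate rather than mod_gzip, and a minimal configuration looks something like this (assuming mod_deflate is already loaded):

    # Compress HTML, CSS and javascript on the fly with mod_deflate (Apache 2.x).
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript

    # Some older Netscape 4.x releases mishandle compressed content, so the
    # usual advice is to exclude them while leaving IE alone.
    BrowserMatch ^Mozilla/4 gzip-only-text/html
    BrowserMatch ^Mozilla/4\.0[678] no-gzip
    BrowserMatch \bMSIE !no-gzip !gzip-only-text/html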
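
Finally, two of the javascript patterns that helped me the most were hoisting repeated lookups out of loops and building strings with an array instead of repeated concatenation. A small sketch, where items and the "container" element are hypothetical names for this example:

    // Slower: repeated .length lookups, a DOM lookup and a DOM write inside the loop,
    // and string += on every iteration.
    function buildListSlow(items) {
        var html = "";
        for (var i = 0; i < items.length; i++) {
            html += "<li>" + items[i] + "</li>";
            document.getElementById("container").innerHTML = "<ul>" + html + "</ul>";
        }
    }

    // Faster: cache the length, collect the pieces in an array, and touch the DOM once.
    function buildListFast(items) {
        var parts = [];
        for (var i = 0, len = items.length; i < len; i++) {
            parts.push("<li>" + items[i] + "</li>");
        }
        document.getElementById("container").innerHTML = "<ul>" + parts.join("") + "</ul>";
    }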
