Faster Webpages, Fewer Images with CSS Sprites

In a previous post, I made a point about how to improve performance by reducing HTTP connections, by stitching together the content of css and js fragments. At that time I also mentioned that a further step is to reduce the number of images on a page, because images are also loaded over HTTP connections. Again, the same Cache-Control headers on the client can be used to cut much of this traffic, but to reduce the number of HTTP connections themselves, you can employ a technique called CSS Sprites.

I have to confess, I didn't know about this technique until I was researching GWT's ImageBundle interface. Had I known this trick, my SarathOnline2.0 site would not have been so lame - text only!

Simply put, the technique is to stitch all the images together into one big image. This big image is called a sprite. The images used on my site are sprite-d as shown on the left. How do we display them separately? Using css backgrounds. To achieve this, we serve a blank 1px gif and insert an img tag with src=blank.gif, with height and width set to those of the image that was originally intended to be there (to check this, go Firebug > Inspect Element :) ). Then, using css background: url(path/to/sprite.png) no-repeat top left; background-position: 0 -5px; you can clip the background so that only the intended image shows through, as in the sketch below.
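Here is a minimal sketch of the idea; the class name, file names, dimensions and offsets are only placeholders for illustration:

<!-- markup: the img only reserves space, the pixels come from the css background -->
<img class="icon-home" src="blank.gif" width="16" height="16" alt="Home">

/* css: slide the big sprite behind the transparent gif and clip it */
.icon-home {
  background: url(path/to/sprite.png) no-repeat top left;
  background-position: 0 -5px; /* shift the sprite up 5px so the right icon lines up */
}

Because every icon points at the same sprite.png, the browser makes one image request instead of one per icon.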

This technique is discussed in detail all over the web, but you don't have to get intimidated by doing all the stitching and calculating the co-ordinates yourself. This online css sprite generator is simply awesome. Supply it with a zip file containing all the small gifs or pngs, and it will spit out a sprite and a piece of code for your css. If you are like me, keep that css in a separate file and deploy that image to your site. And boom, you've got your site running faster. If you need to add one more icon, don't worry: add it to your list of images, zip them up, and generate the sprite again. Just keep the names of the images the same. The class names generated will be the same, so you don't have to worry about modifying existing pages.
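The exact css a generator spits out will vary, but as a rough, hypothetical sketch (the icon names and offsets here are made up), regenerating with the same image names gives you back the same class names:

/* hypothetical output for a zip containing home.png and rss.png */
.home { background: url(sprite.png) no-repeat 0 0; width: 16px; height: 16px; }
.rss  { background: url(sprite.png) no-repeat 0 -16px; width: 16px; height: 16px; }

As long as home.png and rss.png keep their names, .home and .rss stay valid, even if their offsets inside the regenerated sprite change.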

Awesome, ain't it?
