
Re: Return of the Mobile Style Sheet

In a recent article on A List Apart, a Return of the Mobile Style Sheet is discussed. For a whole lot of reasons, I don't think this is really going to be a return. Or at least, it will not be a happy camper in the web community.

At least 10% of your visitors access your site over a mobile device. They deserve a good experience (and if you provide one, they'll keep coming back). Converting your multi-column layout to a single, linear flow is a good start. But mobile devices are not created equal, and their disparate handling of CSS is like 1998 all over again. Please your users and tame their devices with handheld style sheets, CSS media queries, and (where necessary) JavaScript or server-side techniques.

Before I jump in to dog-pile on this "re-emergence", I would like to say that using the handheld media attribute on a CSS link is a very good way to QUICKLY make your site mobile-friendly. All you have to do is add a new CSS link to your template and voilà, your site is right up on the iPhone. Cool. But are you willing to leave it to the browser to handle this for you?
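As a sketch, that quick switch amounts to one extra line in the template's head (the file names here are illustrative, not from any particular site):

```html
<!-- Regular style sheet, used by desktop browsers -->
<link rel="stylesheet" media="screen" href="site.css">
<!-- Picked up by devices that honour the handheld media type -->
<link rel="stylesheet" media="handheld" href="mobile.css">
<!-- CSS media-query alternative, keyed on viewport width -->
<link rel="stylesheet" media="only screen and (max-width: 480px)" href="mobile.css">
```

The catch, as the rest of this post argues, is that you are trusting each browser to interpret these hints correctly.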

Assuming you are the type who would: let's see. Most sites these days are dynamic, either on the server side or on the AJAX front, and "dynamic" here means not just appearance but data and behaviour too. Pages are most commonly templated and generated at run time, with a lot of data in them. For one thing, it would be easier to maintain a separate template altogether and switch the presentation based on the user agent on the server side. Moreover, sending all the bits over the wire and then NOT showing them just because the other end is a small device does not make sense. The same goes for the Ajaxy parts: you would have to write another JS file and switch to it somehow to achieve the desired effect. Overall, you are only band-aiding something that needs total surgery.
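The server-side switch I'm describing can be sketched in a few lines of Python. This is a minimal, framework-agnostic illustration; the template names and the user-agent tokens are assumptions for the example, and a real site would use a maintained device database rather than this crude sniff:

```python
# Illustrative user-agent tokens; a production site would use a
# maintained UA database instead of a hand-rolled list like this.
MOBILE_TOKENS = ("iphone", "android", "blackberry", "symbian", "windows ce")

def is_mobile(user_agent):
    """Crude check for a mobile device based on the User-Agent header."""
    ua = (user_agent or "").lower()
    return any(token in ua for token in MOBILE_TOKENS)

def pick_template(user_agent, force_classic=False):
    """Choose which template to render for this request.

    force_classic models the "classic version" link at the bottom of
    a mobile page, letting the user opt back into the full site.
    """
    if force_classic or not is_mobile(user_agent):
        return "home_desktop.html"
    return "home_mobile.html"
```

Because the decision happens before rendering, the small device never receives the bits it would not show, and the mobile template can ship its own lighter JS instead of hiding the desktop behaviour after the fact.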

Let's say it is still up to those who vouch for the semantic web; how will you sell this idea to Google, or Facebook? The case there is even more complex, but even for a home page as simple as Google's, a user-agent-based switch of the template itself on the server side has been preferred (that is how the iPhone's Safari opens www.google.com). To let users choose the classic version, a link is provided at the bottom. Talk about freedom to choose :)

Images from Google blogs, Infomotions
