Use ETags and Cache-Control: Caching Your Dynamic Web Content Better

This site renders its JS and CSS (and now even images) dynamically. The contents of the files are stored in the Google BigTable Datastore and served by a DynServlet, which maps a file name to that file's content in the DB. This was slowly using up a lot of resources. Adding a memcache layer saved some Datastore API calls, but the memcache calls were still significant. So an ETag mechanism was introduced.
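A minimal sketch of what such a memcache layer might look like with App Engine's Java APIs; the FileContent entity kind, the content property, and the class name are assumptions for illustration, not the site's actual code.

import com.google.appengine.api.datastore.DatastoreService;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import com.google.appengine.api.datastore.EntityNotFoundException;
import com.google.appengine.api.datastore.KeyFactory;
import com.google.appengine.api.datastore.Text;
import com.google.appengine.api.memcache.MemcacheService;
import com.google.appengine.api.memcache.MemcacheServiceFactory;

public class FileContentLoader {
    private final MemcacheService memcache = MemcacheServiceFactory.getMemcacheService();
    private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

    // Returns the file's content, trying memcache before falling back to the Datastore.
    public String load(String fileName) throws EntityNotFoundException {
        String cached = (String) memcache.get(fileName);
        if (cached != null) {
            return cached;                      // memcache hit: no Datastore API call
        }
        Entity entity = datastore.get(KeyFactory.createKey("FileContent", fileName));
        String content = ((Text) entity.getProperty("content")).getValue();
        memcache.put(fileName, content);        // warm the cache for subsequent requests
        return content;
    }
}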

A global ETag (G-ETag) is generated whenever a file is posted or updated, and is stored in the Datastore. To save App Engine resources (which count toward the bill), the ETag header of each incoming request is checked against the G-ETag from the backend. If the two match, a "304 Not Modified" response is sent back, telling the client to use its locally cached copy. If they differ, the request is passed through and rendered dynamically. This guarantees that all content is served fresh after every deployment, while requests between deployments may be served from the client-side cache, because they carry the same ETag.
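In servlet terms, the check might look roughly like the following sketch; getGlobalETag() and renderContent() stand in for the site's own Datastore lookup and rendering logic and are assumptions here.

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class DynServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String globalETag = getGlobalETag();            // G-ETag kept in the Datastore
        String clientETag = req.getHeader("If-None-Match");

        if (globalETag.equals(clientETag)) {
            // Client already holds the latest version: answer with an empty 304.
            resp.setStatus(HttpServletResponse.SC_NOT_MODIFIED);
            return;
        }

        resp.setHeader("ETag", globalETag);             // tag the freshly rendered copy
        renderContent(req, resp);                       // fall through to dynamic rendering
    }

    private String getGlobalETag() { /* fetch from memcache/Datastore */ return "\"v1\""; }

    private void renderContent(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.getWriter().print("...dynamically rendered content...");
    }
}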

However, an ETag alone still produces a "GET" request and "304 Not Modified" response (a ping) for every resource, even when the client already has the latest version in its cache. In other words, there is still a network round trip for each page. So a Cache-Control header is used as well. The header Cache-Control: public, max-age=3600 caches the content at the user agent (client) for an hour, so these validation pings are not made for at least 1 hr.
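Setting the header is a one-liner next to the ETag (note that max-age takes an equals sign, not a colon); a small helper like this hypothetical one keeps the two together:

import javax.servlet.http.HttpServletResponse;

public final class CacheHeaders {
    private CacheHeaders() {}

    // Marks the response as cacheable by any cache for the given number of seconds.
    public static void setPublicCache(HttpServletResponse resp, String eTag, int maxAgeSeconds) {
        resp.setHeader("ETag", eTag);
        resp.setHeader("Cache-Control", "public, max-age=" + maxAgeSeconds);
    }
}

DynServlet.doGet() would then call CacheHeaders.setPublicCache(resp, globalETag, 3600) before rendering a full response.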

This way, it is guaranteed that data can be stale for at most 1 hr. (which can also be considered a downside). However, according to the site's analytics, the longest time a user spends on the site is 14 minutes; 1 hr. covers even the most dedicated visitor :)
