
How to Make a Local (Offline) Repository in Ubuntu / Debian

If you are somewhere without internet access (or with a bad connection), you may want to download .deb packages and install them offline. Each .deb file is packaged as a separate unit but may declare dependencies (recursively). apt-get automatically resolves all the dependencies and installs everything that is necessary; installing .deb files manually, one by one, resolving each dependency yourself, would be tedious.
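For contrast, here is a sketch of the manual route the rest of this post avoids. dpkg can install a batch of .deb files in one go and will order them among themselves, but it stops as soon as any dependency is missing from the batch:

# install every deb in the current directory in one shot;
# dpkg sorts out ordering within the batch, but errors out
# on any dependency that is not present in it
sudo dpkg -i ./*.deb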

A better approach is to make your own local repository. Before you actually make the repo, you need *all* the .deb files. In practice you don't have to mirror every package on the internet, just enough to resolve all dependencies. Also make sure the debs match your system's architecture (i386, amd64, etc.).
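One way to gather them, assuming you have temporary access to another machine running the same release and architecture with internet: let apt download everything into its cache without installing, then carry the cache over. (The USB-stick path below is just an example, and the-package-from-local is the same placeholder used later in this post.)

# on the online machine (same release/architecture -- check with:)
dpkg --print-architecture
# empty the cache so only the debs you need end up in it
sudo apt-get clean
# download the package plus all missing dependencies; install nothing
sudo apt-get install --download-only the-package-from-local
# everything is now under /var/cache/apt/archives -- copy it off
cp /var/cache/apt/archives/*.deb /media/usb-stick/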
# 1. make a dir accessible (at least by root)
sudo mkdir /var/my-local-repo
# 2. copy all the deb files to this directory.
# 3. turn the directory into a repository by generating a Packages index
#    (run from inside the dir so the index records relative paths that
#    match the "file:/var/my-local-repo ./" source line below)
cd /var/my-local-repo
dpkg-scanpackages . /dev/null | sudo tee Packages > /dev/null

# 4. add the local repo to sources 
echo "deb file:/var/my-local-repo ./" > /tmp/my-local.list
sudo mv /tmp/my-local.list /etc/apt/sources.list.d/my-local.list
sudo apt-get update
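A note for newer releases: modern apt warns about (or refuses) repositories that carry no signed Release file. Assuming you trust what you put in the directory, marking the source trusted silences that, using the same pattern as step 4:

echo "deb [trusted=yes] file:/var/my-local-repo ./" > /tmp/my-local.list
sudo mv /tmp/my-local.list /etc/apt/sources.list.d/my-local.list
sudo apt-get update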
You can now install with Synaptic or from the command line:
sudo apt-get install the-package-from-local

If all dependencies are available, the package will install and you are done. If not, download those missing .deb files (the dependencies) too, and repeat steps 2 and 3.
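To see what is still missing without actually installing anything, apt's simulate mode is handy; it prints the install plan, or fails naming the dependencies it cannot find in the repo:

# dry run: no root needed, nothing is installed
apt-get install -s the-package-from-local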
