Monthly Archives: September 2013

Parallelism & HTCondor

I recently ran into HTCondor, an implementation of High-Throughput Computing (HTC). It is developed and maintained by the University of Wisconsin–Madison.

HTCondor makes use of all available computational resources to get a job done. Computational resources could be the processors in a single machine or the nodes of a distributed computing system. It allocates jobs to different machines based on rules and has the ability to transfer jobs from one machine to another.

Condor generally finds its applications in clusters, server farms, and distributed networks. However, it can also be used to execute jobs in parallel on a single machine.

1. Install condor

sudo apt-get install condor

2. Run condor_status to see your processors

condor_status -available -> shows available procs

condor_status -run -> shows procs that are running a job
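On a single machine, each core shows up as a slot. The output of condor_status looks roughly like this (a sketch only; the exact columns vary between HTCondor versions, and the hostname here is made up):

```
Name              OpSys    Arch   State     Activity  LoadAv  Mem
slot1@myhost      LINUX    X86_64 Unclaimed Idle      0.000   1977
slot2@myhost      LINUX    X86_64 Unclaimed Idle      0.000   1977
```

Slots in the Unclaimed/Idle state are available to accept jobs.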

Condor takes a submit description file (SDF) as input. If the various tasks you are running have dependencies between them, you can use DAGMan. DAGMan reads a directed acyclic graph describing the dependencies and submits each job to Condor in the right order, so you don't have to sequence the submissions yourself.
A sample submit description file can be found in the HTCondor web documentation.
My submit file can be found below:
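The original file did not survive here, but a minimal submit description file for a vanilla-universe job might look like the following. The /bin/sleep executable and the file names are illustrative, not the author's original:

```
# Illustrative submit description file (not the author's original)
universe   = vanilla
executable = /bin/sleep
arguments  = 60
output     = sleep.$(Cluster).$(Process).out
error      = sleep.$(Cluster).$(Process).err
log        = sleep.log
queue
```

The queue command at the end tells Condor how many instances of the job to enqueue; a bare queue enqueues one.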

This job can be submitted to Condor with condor_submit, which will allocate it to a free processor.
Let's say I submit the job three times:
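Assuming the submit file is saved as sleep.sub (an illustrative name), submitting it three times is just:

```
condor_submit sleep.sub
condor_submit sleep.sub
condor_submit sleep.sub
```

Each submission gets its own cluster ID in the queue, so the three jobs run independently.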

Now condor_status -run will show the list of procs running a Condor job.


Koding for Online Linux

I use Linux for a lot of things, and depend on the command line to a great extent. Very recently I started using Koding.

This service provides every user with a dedicated Ubuntu VM with sudo access. I generally develop on this for the following reasons:

1. My work can be accessed from any place

2. Ubuntu provides apt, which is a superb package manager (combine this with the sudo access, and you know where I am going)

The net connection at my home is bad, so I install modules directly onto the Koding VM instead of downloading them locally, and that works really quickly.

By the way, this is no advertisement 🙂