Physicists in the UK have shared their computers with biologists from around the world to create a grid computing network spanning more than 27 countries.  In an attempt to find drug compounds that could fight malaria, a disease which still kills more than a million people each year, scientists on the WISDOM project analysed around 80,000 compounds per hour; in total, 140,000,000 compounds were analysed.  The project ran from 1 October 2006 to 31 January 2007, during which time a grid of 5,000 computers carried out 420 years' worth of processing and data analysis, generating 2 terabytes of data along the way.  The processing involved analysing potential interactions between drug compounds and proteins of the malaria parasite.  A full account of the story can be found here.
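
As a quick back-of-envelope check (my own arithmetic, not figures from the WISDOM team), the published numbers are roughly self-consistent with the four-month window:

```python
# Rough sanity check of the WISDOM figures (my own arithmetic).
compounds = 140_000_000   # total compounds analysed
rate_per_hour = 80_000    # quoted throughput
cpu_years = 420           # total processing time across the grid
machines = 5_000          # size of the grid

hours_at_rate = compounds / rate_per_hour
print(f"{hours_at_rate:,.0f} hours at the quoted rate "
      f"(~{hours_at_rate / 24:.0f} days)")   # ~1,750 hours, ~73 days

days_per_machine = cpu_years * 365 / machines
print(f"~{days_per_machine:.0f} days of work per machine")  # ~31 days
```

Both figures fit comfortably inside the roughly 123 days the project ran, allowing for machines joining and leaving the grid.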

Reading about this project reminded me of SETI@home, a project run by the Search for Extraterrestrial Intelligence (SETI) group at the University of California, Berkeley.  With over 3,000,000 subscribers it is the world's largest distributed computing network.  The theory is simple: data collected by radio telescopes is broken down into small work units and sent out to be processed by clients, usually while a machine is idle and running a screensaver.  Once a work unit has been processed, the result is sent back to base and a new block of raw data is received.  This is an amazingly efficient way to analyse and process vast amounts of data.
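
For illustration, here is a minimal sketch of that work-unit loop. It is my own simplified model (an in-memory queue, a stand-in "analysis" step, no networking), not SETI@home's actual protocol:

```python
import queue
import hashlib

# A toy coordinator: raw data is split into small work units,
# handed out one at a time, and the results are collected centrally.
work_units = queue.Queue()
for chunk_id in range(10):
    work_units.put((chunk_id, f"raw-telescope-data-{chunk_id}"))

results = {}

def process(unit):
    """Stand-in for the real signal analysis: just hash the chunk."""
    chunk_id, data = unit
    return chunk_id, hashlib.sha256(data.encode()).hexdigest()

# Each iteration plays the role of an idle client fetching a unit,
# processing it, and returning the result in exchange for new data.
while not work_units.empty():
    unit = work_units.get()
    chunk_id, result = process(unit)
    results[chunk_id] = result

print(f"processed {len(results)} work units")
```

In the real system the queue lives on a central server and each client pulls units over the network, which is what lets the work scale across millions of machines.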

Both of these examples show the huge potential that distributed grid computing holds for research across all fields of science.  I would be interested to hear of any other large-scale grid computing projects.