BitTorrent, the peer-to-peer file-sharing protocol, and the software that implements it have drawn much ire from the content-creation industries. Yet certain fundamental aspects of the technology could solve large-scale content-distribution problems in computing more broadly.
The IT department at INHOLLAND University used the BitTorrent protocol to push 22TB of patches to 6,500 PCs in four hours. Gargantuan download volume aside, that same task used to take almost two dozen servers four days.
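The speed-up is easy to see with a back-of-the-envelope calculation. In a client-server model, the servers must upload the full payload once per PC, so distribution time grows linearly with the number of machines; in a swarm, every machine that finishes downloading becomes an additional seed, so the number of sources roughly doubles each "round" and time grows only logarithmically. The sketch below uses purely illustrative figures (image size per PC, link speeds, server count), not INHOLLAND's actual numbers:

```python
# Back-of-the-envelope comparison of client-server vs swarm distribution.
# All figures (payload size, link speeds, server count) are illustrative
# assumptions, not INHOLLAND's actual deployment parameters.
import math

def client_server_hours(peers, gb_per_peer, servers, server_gbps_each):
    """Servers must upload the full payload once per peer."""
    total_gb = peers * gb_per_peer
    gb_per_hour = servers * server_gbps_each * 3600 / 8  # Gbps -> GB/h
    return total_gb / gb_per_hour

def swarm_hours(peers, gb_per_peer, peer_gbps):
    """Each finished peer re-seeds, so the source count roughly
    doubles per round; rounds grow with log2 of the peer count."""
    hours_per_copy = gb_per_peer / (peer_gbps * 3600 / 8)
    rounds = math.ceil(math.log2(peers + 1))
    return rounds * hours_per_copy

# 6,500 peers pulling ~3.4 GB each (~22 TB total, as in the article),
# 22 servers on 100 Mbit/s links vs desktops seeding at 100 Mbit/s.
cs = client_server_hours(6500, 3.4, 22, 0.1)
sw = swarm_hours(6500, 3.4, 0.1)
print(f"client-server: {cs:.1f} h, swarm: {sw:.1f} h")
```

Under these assumed numbers the swarm finishes in roughly an hour while the server farm needs the better part of a day, and the gap widens as more machines join, because each new desktop adds upload capacity instead of consuming it.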
Leo Blom of ITeleo, who came up with the idea of using BitTorrent, told Ars, “Let me put it this way: if INHOLLAND wants to migrate to Windows Vista, they only have to send out an image through BT. All 6,500 desktops can be migrated overnight in two hours’ time—with one push of a button. It’s a real migration killer. Migration used to mean a lengthy and trying process. At INHOLLAND, we took a different approach.”
Bandwidth was an abundant resource when most content on the Web was textual. Now, with the proliferation of media, ISPs are feeling the pressure and are demanding pricing schemes that take data usage patterns into account.
Peer-to-peer networks are also being used to open up Wi-Fi networks for free Internet access on a voluntary basis.
Applying P2P protocols to applications such as these could optimize bandwidth usage on a far wider scale. Is it not time to ponder the legal applications of BitTorrent?