
Applying technology laws to the emerging Web services market

Are you wondering where the Web services market is headed? Columnist Tim Landgrave applies four familiar technology laws to analyze the future of the market and forecast its growth.

Just like it’s not nice to fool Mother Nature (with apologies to those of you old enough to remember the now ancient Chiffon margarine commercial), it’s also not healthy to ignore the laws that will determine the nature of the emerging Web services market. The same laws that served as guideposts for the PC revolution and the Internet revolution apply to the Web services market as well. In this article, we’ll review these four basic doctrines and how they will shape the coming market adoption of Web services.

1. Moore’s Law
Processor power doubles every 18 months.
Gordon Moore, who later cofounded Intel, made this observation in 1965. Do we still need faster processors? For years, today's server has simply become tomorrow's workstation. But that cycle will soon change as processors become powerful enough to make new types of input devices work effectively (e.g., through voice and handwriting recognition). From a Web services perspective, faster processors deliver stronger encryption at lower cost, an improvement that will be required for secure transactions.
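To see what doubling every 18 months compounds to, here’s a minimal sketch in Python. The baseline of 1x is an arbitrary index, not a measured benchmark.

```python
# Moore's Law as compound growth: processing power doubles every 18 months.

def relative_power(years: float, doubling_months: float = 18.0) -> float:
    """Relative processing power after `years` at the given doubling rate."""
    return 2 ** (years * 12 / doubling_months)

for years in (3, 6, 9):
    print(f"After {years} years: {relative_power(years):.0f}x today's power")
# After 3 years: 4x today's power
# After 6 years: 16x today's power
# After 9 years: 64x today's power
```

A decade of that curve turns a server-class chip into something cheap enough to embed almost anywhere, which is exactly the dynamic the next paragraph describes.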

The advances in processor size and flexibility may do even more to facilitate commerce. Researchers have tested chips on plastic substrates instead of silicon and expect to have them in production within five years. As a result of these advancements, fast processors with small amounts of memory could be placed on Federal Express packages, for example, so the packages themselves become “self-tracking.” Once these processors can be printed on “plastic paper” using an ink jet with “conductive ink” and slapped on a box for mere pennies, the amount of data they collect and store will demand even more processing power to turn it into useful information. The new Web services economy will depend on regular advances in processor speed to keep up with the information created by its participants.

2. Metcalfe’s Law
The value of a network is proportional to the square of its number of users.
Although it wasn’t formally stated until Robert Metcalfe (the inventor of Ethernet) coined it, the effects of Metcalfe’s Law were demonstrated throughout the 20th century. Consider the value of telephones or televisions with one user vs. two, four, or eight. It’s clear that any services network increases in value as the number of participants increases. In the 90s, this principle played out again in the public computer network. In the 00s, Web services will usher in the era of the business computer network. The dream of electronic data interchange (EDI), killed by fragmented standards and limited adoption due to high cost, will finally be fulfilled.
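The arithmetic behind the telephone example is easy to sketch. The n² figure is Metcalfe’s approximation; n(n − 1)/2 counts the actual distinct connections.

```python
# Metcalfe's Law: network value grows with the square of the user count.

def network_value(users: int) -> int:
    return users ** 2  # Metcalfe's n-squared approximation

for n in (1, 2, 4, 8):
    print(f"{n} users -> value {network_value(n)}, "
          f"distinct connections {n * (n - 1) // 2}")
# 1 users -> value 1, distinct connections 0
# 2 users -> value 4, distinct connections 1
# 4 users -> value 16, distinct connections 6
# 8 users -> value 64, distinct connections 28
```

Doubling the users quadruples the value, which is why each new participant makes the network more attractive to the next one.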

Standards allow a network to grow. Important examples include the standardization of the telephone interface, the broadcast standards that allowed mass production of television sets that could receive any broadcaster’s programming, and, most recently, the HTML and TCP/IP standards that allowed millions of people to share information on the Internet. Any network growing to support a large number of users requires adherence to standards.

3. Coase’s Law
The cost of a transaction rises in direct proportion to the friction involved in completing it.
In “The Nature of the Firm” (1937), the work that later earned him the 1991 Nobel Prize in Economics, Ronald Coase anticipated the ultimate effect of applying the Internet to standard business practices. What he actually said was that firms exist solely to minimize the cost of doing transactions (those costs beyond the price of the purchase, whether in money, time, or inconvenience). The elimination of transaction costs (which he thought at the time was unlikely, if not impossible) would enable individuals to transact business with the same efficiency as corporations, obviating the need for a corporate structure.

Companies in the 1970s focused on using computer systems to reduce the cost of doing business, resulting in lower per-transaction costs and higher margins. In the 1980s, forward-thinking companies sought to reduce transaction costs through internal business process improvement. In this environment, quality assurance programs and Japanese management principles became the rage. Through the 90s, businesses focused on disintermediation, the process of eliminating the middleman and going directly to the customer to reduce transaction costs.

Now we’re in the age of data exchange and process integration at the business-to-business level. By eliminating technical constraints on processing transactions between companies, the overall cost of delivering products and services will decline. The added value becomes information about the transaction rather than the product or the service itself. Companies that participate in an industry-standard Web services infrastructure based on XML will prosper in this environment. Those that choose to keep their systems proprietary will incur a higher cost-per-transaction that will put them at a competitive disadvantage to those that have eliminated the friction.
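What such a standards-based exchange might look like on the wire is easy to sketch. The element names below (PurchaseOrder, Buyer, Item) are hypothetical illustrations, not drawn from any real industry schema.

```python
# A minimal sketch of the kind of XML message a standards-based Web
# services exchange might carry. All element names are hypothetical.
import xml.etree.ElementTree as ET

order = ET.Element("PurchaseOrder", id="PO-1001")
ET.SubElement(order, "Buyer").text = "Acme Corp"
item = ET.SubElement(order, "Item", sku="WIDGET-42")
ET.SubElement(item, "Quantity").text = "500"

# Any trading partner that agrees on the schema can parse this message,
# no matter what platform produced it.
print(ET.tostring(order, encoding="unicode"))
```

Because both sides read and write the same self-describing format, neither needs to know anything about the other’s internal systems, and that is precisely where the friction disappears.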

4. Gilder’s Law
Optical capacity doubles every 18 months.
Anyone who’s read George Gilder’s book Life After Television or his articles in Forbes magazine may dispute my summary of his beliefs with this simple statement. But this principle is central to his argument that physical assets are slowly being replaced by electronic ones. Although that may be the long-term effect, the short-term impact will be seen first in the way we store and process transactions (supporting Coase’s theory). Having larger pipes means that we can move larger amounts of data over greater distances at greater speeds. This may eliminate the need to store large, redundant data sets in multiple places if systems can instead use large, redundant pipes to access the data necessary to complete transactions.
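A back-of-the-envelope sketch shows what that doubling does to transfer times. The 100-Mbps starting bandwidth is an arbitrary assumption for illustration.

```python
# If optical capacity doubles every 18 months, how long does a 10 GB
# transfer take? The 100 Mbps starting point is assumed, not measured.

def transfer_seconds(gigabytes: float, mbps: float) -> float:
    return gigabytes * 8_000 / mbps  # 1 GB = 8,000 megabits

for years in (0, 3, 6):
    mbps = 100.0 * 2 ** (years * 12 / 18)
    print(f"Year {years}: {mbps:,.0f} Mbps -> "
          f"{transfer_seconds(10, mbps):,.0f} seconds for 10 GB")
# Year 0: 100 Mbps -> 800 seconds for 10 GB
# Year 3: 400 Mbps -> 200 seconds for 10 GB
# Year 6: 1,600 Mbps -> 50 seconds for 10 GB
```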

The overall impact will be an increase in the number of transactions per second that our network backbone can support. Most importantly, these incredible increases in capacity bring equally incredible decreases in price. Once we can push large amounts of data around the network without paying large fees to do so, services can be priced in pennies and fractions of pennies rather than in large monthly fees or per-transaction fees in dollars. It’s a good thing that Moore’s Law provides the processing power to deal with the increased transaction volume.

Transaction security will also increase as a result of Gilder’s Law. The “hidden price of encryption” today stems from the fact that encrypted information is, by its nature, larger and therefore takes more time to transmit. More time to transmit means fewer transactions and less data moved on the existing infrastructure. Once data transfer becomes “virtually free,” as processor power has, system designers can build encryption and data security into applications by default rather than compromising to save either bandwidth or processing power.
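That size penalty is easy to quantify. The sketch below assumes a 16-byte block cipher (such as AES) plus base64 transport encoding, with random bytes standing in for real ciphertext.

```python
# The "hidden price of encryption": block padding plus base64 transport
# encoding inflate the bytes on the wire. Random bytes stand in for
# real ciphertext; 16-byte blocks match a typical cipher such as AES.
import base64
import os

plaintext_len = 1000
block = 16
padded = (plaintext_len // block + 1) * block  # PKCS#7-style padding
wire = base64.b64encode(os.urandom(padded))    # stand-in ciphertext, encoded

print(f"plaintext: {plaintext_len} bytes")
print(f"padded ciphertext: {padded} bytes")
print(f"base64 on the wire: {len(wire)} bytes "
      f"({len(wire) / plaintext_len:.0%} of the original)")
# plaintext: 1000 bytes
# padded ciphertext: 1008 bytes
# base64 on the wire: 1344 bytes (134% of the original)
```

A third more data per message is exactly the kind of overhead that stops mattering once bandwidth is effectively free.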

An increase in transaction capacity and the ability to secure those transactions are two of the essential nutrients needed to grow the Web services market. The number of business-to-business transactions processed today is minuscule compared to the number that the Web services infrastructure will enable. When any company can connect with any other company to purchase goods or services, or to use electronic services exposed through a Web services interface, the network effect predicted by Metcalfe will drive up transaction volume exponentially, making the effects of Gilder’s Law a necessity for the success of the ecosystem.

What’s your scenario for the evolution of technology?
Do you have another interpretation of where it’s all headed? Or does Landgrave have it about right? Share your thoughts by sending us an e-mail or posting a comment below.
