Seeing how fast data can go

Graphics-intensive applications have become increasingly the norm in computers today. Whether you’re talking about the graphics necessary to create movies such as Cars, to render 3D objects in AutoCAD, or the emerging graphics requirements coming in Vista, computers have long since gone from text-only to basic GUIs to completely graphical in nature. Along with this has come an increasing need for faster graphics cards and wider bandwidth to process the millions of pixels being manipulated.

We’ve all seen the progression of graphics in a PC environment. The first graphics card available for a PC compatible could display 4 colors on the screen at a resolution of 320 x 200. You had a choice of a cyan/magenta/white palette or a red/green/yellow palette. The first cards were all 8-bit ISA based, with a maximum bandwidth of about 8Mbps (megabits per second).

Today, a high-end video card can produce full 32-bit color with millions of colors at a resolution exceeding 3840x2400. AGP 8x slots have a maximum bandwidth exceeding 2GBps (gigabytes per second), and newer PCI Express cards have a maximum bandwidth exceeding 4GBps.

So that raises the question: just how much bandwidth would something as complicated as the human eye require? Apparently, not much.

According to a recent article in New Scientist, human retinas operate with a bandwidth of about 8.75Mbps. That means you could easily run one on a 16-bit ISA slot or an old PCI slot. As with most engineering projects, there’s headroom: the retina is theoretically capable of processing about 4,000 times as much data as it does in practice, but the extra power needed to do so is too expensive. When you think about how much data that is, it’s pretty impressive that the eye is designed in such a way that it can process that much information and send it over a data line that today would be considered obsolete.
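The arithmetic behind that 4,000x figure is easy to check. Here’s a quick sketch; the numbers are just the ones quoted above, not independent measurements:

```python
# Back-of-the-envelope check of the figures quoted above
# (Mbps = megabits per second).

RETINA_MBPS = 8.75   # estimated retinal output bandwidth, per the article
HEADROOM = 4000      # the retina's claimed theoretical capacity vs. practice

theoretical_mbps = RETINA_MBPS * HEADROOM
print(f"Retina in practice: {RETINA_MBPS} Mbps")
print(f"Retina in theory:   {theoretical_mbps:,.0f} Mbps"
      f" (~{theoretical_mbps / 1000:.0f} Gbps)")
```

At roughly 35 Gbps, the retina’s theoretical capacity would swamp any expansion bus of the day, which puts the “too expensive to use” trade-off in perspective.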

A better analogy may be the amount of bandwidth consumed by streaming media. In that case, it takes about 5Mbps of data to transfer video at DVD quality, and HDTV quality is supposed to take about 15Mbps of bandwidth. The human retina uses far less bandwidth and produces a much higher resolution picture.
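Putting those streaming figures side by side with the retina’s number makes the comparison concrete (again, these are the rough figures quoted above, not measurements):

```python
# Rough per-stream bandwidth figures from the text, in Mbps.
streams = {
    "DVD-quality video": 5,
    "Human retina": 8.75,
    "HDTV video": 15,
}

# List the streams from lightest to heaviest.
for name, mbps in sorted(streams.items(), key=lambda kv: kv[1]):
    print(f"{name:18} {mbps:>6} Mbps")

# By these numbers, the retina sends less data than a single HDTV stream.
print(f"Retina vs. HDTV: {streams['Human retina'] / streams['HDTV video']:.0%}"
      " of the bandwidth")
```

So by these figures the eye gets by on a bit more than half the bandwidth of an HDTV stream, while delivering a far higher effective resolution.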

In any case, whether you’re talking about streaming media bandwidth or the amount of bus bandwidth needed to process and display graphics images, it would seem that man-made technology is still pretty inefficient. Whether you’re a fan of evolution or intelligent design, the human eye seems to get the job done pretty well.
