Most people just don't understand how to balance bandwidth and productivity
Written in Salt Lake City when it was -17C one night and not worth chancing the additional wind chill, and dispatched to silicon.com via my hotel's free wi-fi service.
There you are driving along a freeway with plenty of space ahead when red brake lights suddenly accelerate towards you. What happened? Probably nothing - or an incident that cleared hours earlier.
This is all part of the natural action of traffic flow and a good example of non-linear dynamics in action. It turns out that sporadic 'clumping' is common in nature and many human activities.
We have all experienced waves in road traffic that see speeds swing between 0 and 120km/h - and highly variable arrival times of data packets sent over the internet.
What is happening? It turns out that all networks have a critical capacity up to which traffic flow is linear and well behaved - but go beyond this point and all manner of unexpected effects break out.
In the case of major road traffic, loadings up to 20 per cent capacity might see reasonably linear behaviour, while 100 per cent capacity results in the perfect parking lot where nothing moves.
For IT the percentages are wider ranging. The old fixed line phone network peak-to-mean traffic ratio was about 4:1, while mobile nets are about 40:1, and the internet can exceed 1000:1.
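To see what those peak-to-mean ratios imply, consider a quick back-of-the-envelope calculation. The ratios are the article's figures; the sizing rule applied to them - provision capacity for the peak, then see what the average load works out to - is an illustrative assumption of mine, not something the column spells out:

```python
# Sketch (assumption): if capacity is provisioned for the peak,
# the mean utilisation of that capacity is simply 1/(peak-to-mean ratio).

def mean_utilisation(peak_to_mean):
    """Average utilisation of a link sized for its traffic peak."""
    if peak_to_mean < 1:
        raise ValueError("peak-to-mean ratio must be at least 1")
    return 1.0 / peak_to_mean

# Ratios quoted in the article for each network type.
for name, ratio in [("fixed-line phone", 4), ("mobile", 40), ("internet", 1000)]:
    print(f"{name:>16}: peak:mean {ratio:>4}:1 -> mean load {mean_utilisation(ratio):.2%}")
```

On these figures a link sized for internet peaks would sit at a mean load of just 0.1 per cent - which is why bursty networks look almost empty most of the time.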
The point at which strange networking effects appear is also well spread at around 40 per cent for the phone, about 30 per cent for mobiles and about 10 per cent for the net.
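Why do the strange effects start so far below 100 per cent? Queueing theory offers one standard picture. The article doesn't name a model, so the choice of the classic M/M/1 queue here is my assumption - but it captures the non-linear flavour: mean delay grows as 1/(1 - utilisation), so it is gentle at low loads and explodes as the link fills up:

```python
# Sketch (assumption): M/M/1 queue as a stand-in for a loaded link.
# Mean time in system = service_time / (1 - utilisation), which blows up
# non-linearly as utilisation approaches 100 per cent.

def mm1_delay(utilisation, service_time=1.0):
    """Mean time in an M/M/1 system, in multiples of the service time."""
    if not 0.0 <= utilisation < 1.0:
        raise ValueError("utilisation must be in [0, 1)")
    return service_time / (1.0 - utilisation)

for rho in (0.1, 0.3, 0.5, 0.9, 0.99):
    print(f"{rho:>4.0%} load -> mean delay {mm1_delay(rho):6.1f}x service time")
```

At 10 per cent load the delay penalty is barely 10 per cent; at 99 per cent load it is a hundredfold - the parking lot of the road-traffic analogy.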
One big exception to the rule is broadcast communications, which can operate at 100 per cent capacity. The subtle difference here is the unidirectional traffic flow, and a lack of any participation or interaction.
What does this all mean? Most commentators, politicians and even some IT people just don't get it! If you had 1Gbps broadband to your home and office, would it be running at a substantial percentage of that speed? Extremely unlikely - unless it is loaded with pure broadcast channels with no quiet times, still or blank frames. And even then the usage would be less than 100 per cent.
For humans the critical factor of bandwidth and capacity is latency. The time between seeing, understanding and clicking has to be short to maintain our creativity and engagement. Delay turns out to be a key parameter in every aspect of human communication but one that is widely ignored or misunderstood.
Some of these concepts are illustrated in this graph:
In the 1950s and 1960s, printing or downloading materials meant taking a coffee break. But hey, back then programs could have runtimes of hours and days, and we all had other things to do anyway. There was an expectation that everything was going to be slow, clunky and limited. After all, in the 1960s, 20MB memory was a big deal with a price tag of more than $20,000!
As time and technology have moved on, we've become more focused on screens and machine interaction. Delay times that were measured in hours are now down to seconds, and yet we can easily become irritated by the relatively short amount of time it takes to scan, print, download and upload.
The basic golden rules that have emerged for maximising our output are:
- Everything has to be three clicks or less away.
- Commands have to be executed in less than one second.
- Bandwidth has to be symmetric.
- Uptimes have to be better than 99.9 per cent.
The good news is that all this can be achieved by expanding the cheapest technology commodity we have - bandwidth. And should anyone measure our bandwidth use, they should do it at our eyes, lips and fingertips - and not on the wire, fibre or radio links.
Our most expensive asset is human creativity and application, which is crippled by a general lack of understanding of network economics and performance. A 10 per cent average utilisation of bandwidth can equate to a 100 per cent output of human ability, while 100 per cent network utilisation can see human performance significantly degraded by the attendant latency.