
The Singularity: the rise of the machines?

The Singularity is the hypothetical point in time when machine intelligence will exceed human intelligence. Read about the science fiction-esque implications of this theory.

Moore's Law (i.e., the observation that the number of transistors that can be placed on an integrated circuit doubles approximately every two years) has been generalized to describe technological progress as a whole. As of this writing, Moore's Law is expected to hold for at least five more years, and perhaps much longer. The question becomes, though: What will eventually cause us to deviate from that trend?
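To make the doubling arithmetic concrete, here is a minimal sketch in Python (not from the original article; the starting transistor count and time spans are illustrative assumptions) showing how a double-every-two-years trend compounds:

```python
# Minimal sketch of Moore's Law as compound doubling.
# Assumptions (illustrative only): a chip starting at 1 million
# transistors, doubling once every 2 years.

DOUBLING_PERIOD_YEARS = 2

def projected_transistors(initial_count: int, years_elapsed: float) -> float:
    """Project a transistor count forward under Moore's Law:
    the count doubles once every DOUBLING_PERIOD_YEARS."""
    return initial_count * 2 ** (years_elapsed / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    start = 1_000_000  # hypothetical starting transistor count
    for years in (2, 10, 20):
        count = projected_transistors(start, years)
        print(f"After {years:2d} years: {count:,.0f} transistors")
    # After  2 years: 2,000,000
    # After 10 years: 32,000,000
    # After 20 years: 1,024,000,000
```

Ten doublings in twenty years turn one million transistors into roughly a billion, which is why even a modest-sounding doubling period produces explosive growth.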

Some say it will be a dramatic slowdown in technological advancement. The human race will eventually hit a wall and remain stuck at a technological plateau. This might happen centuries from now, or it may be next year, but we will reach that point; our brains can only handle so much advancement and innovation. Entomologist and population biologist Paul Ehrlich gave a lecture on June 27, 2008, in San Francisco in which he stated that the human brain has not changed much in about 50,000 years. Findings like this, contrasting rapid cultural evolution with a slowly changing brain, lend support to the wall theory.

But what if the opposite happened? What if that wall were beyond the point at which humans could create autonomous, thinking, self-improving machines? Such machines could rewrite their own source code and essentially learn freely without human intervention (think Data on Star Trek). Reaching this point could trigger what has become known as The Singularity (or the Technological Singularity).

If humans can reach this point in technological advancement, the machines could advance their own intelligence, either by creating new, better versions of themselves or simply by rewriting their own source code. Without the limitations of the human brain, there may be no cap on intelligence at all.

There are science fiction-esque implications to The Singularity theory. If artificial intelligence (AI) exceeded human intelligence, what would stop the machines from taking over and potentially destroying humanity? Several safeguards have been proposed, from an AI box (the AI is confined to a virtual world where it cannot affect the external world) to a friendly AI (which would likely be harder to create than an unfriendly AI, but which might keep unfriendly AIs from developing in order to preserve its own existence).

This turning point seems likely to happen eventually, even though none of us alive today may ever see it. Although transistor counts on integrated circuits have indeed been doubling every two years or so for decades, the trend is not likely to continue indefinitely.

This basic overview of The Singularity is meant to be a jumping-off point to get Geekend readers talking about the theory, which has genetic, biological, and philosophical implications.
