Did we miss something by ditching analog?

I am certainly no hardware expert, but I think we lost something when computing went purely binary. Maybe I have just read Destination: Void by Frank Herbert one too many times. But something inside of me keeps nagging that a system of many parallel circuits performing identical functions, but allowing some randomness to occur, could be incredibly useful, particularly in the field of Artificial Intelligence. Any kind of randomness would do: slight delays in processing by some circuits, slightly inaccurate results due to minor differences in physical state, or even random interference from stray EMF radiation.
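The idea of many identical-but-imperfect circuits can be caricatured in a few lines. This is only a toy simulation under my own assumptions (Gaussian drift standing in for physical imprecision, and a doubling operation as the "function" the circuits all perform), not a claim about how real analog hardware behaves:

```python
import random

def noisy_unit(x, noise=0.05):
    """One 'analog' unit: computes x * 2, but with slight physical drift."""
    return x * 2 + random.gauss(0, noise)

def parallel_ensemble(x, n_units=8):
    """Many identical circuits run in parallel; each answer differs slightly."""
    return [noisy_unit(x) for _ in range(n_units)]

results = parallel_ensemble(3.0)
consensus = sum(results) / len(results)  # averaging recovers a stable answer
```

The interesting part is not the consensus but the spread: each unit's small deviation is exactly the kind of "free" randomness a learning system could exploit.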

Why do I think this? Because I believe that making mistakes is a crucial part of the human learning process. Until one makes a mistake, it is impossible to distinguish between understanding and the repetition of memorized information. Currently, with binary computing, all "intelligence" is simply a form of memorization. The computer is handed a set of initial patterns and some rules for building new patterns, and it is supposed to "understand" linkages between abstract concepts. But since the programmer gives the computer the rules for making new patterns (even if those rules are patterns for developing new rules for pattern generation), the computer is ultimately tied to the programmer's inherent biases and concepts.

What I envision instead is a tabula rasa of hardware, with some sort of input mechanism giving the computer access to data feeds (camera data, audio, keyboard, or whatever), a simple output mechanism, and a mechanism for displaying approval or disapproval of the results. It is up to the computer to reprogram itself in a way that avoids disapproval at the very least, and seeks approval at best. In a nutshell, we create "pain" for making a mistake and "pleasure" for success.
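The pain/pleasure loop is essentially trial-and-error reinforcement. Here is a minimal sketch under assumptions of my own: three arbitrary actions, a hypothetical trainer who approves only of action "b", and a crude weight update standing in for "reprogramming itself":

```python
import random

random.seed(42)  # fixed seed so the toy run is repeatable

def feedback(action):
    """Hypothetical approval signal: this trainer happens to want 'b'."""
    return 1 if action == "b" else -1  # pleasure vs. pain

# Blank slate: no action is preferred at the start.
weights = {"a": 1.0, "b": 1.0, "c": 1.0}

for _ in range(200):
    actions, w = zip(*weights.items())
    choice = random.choices(actions, weights=w)[0]
    # Reinforce approved behavior, suppress disapproved behavior.
    weights[choice] = max(0.1, weights[choice] + 0.5 * feedback(choice))

best = max(weights, key=weights.get)
```

The machine is never told what "b" means; it only ever sees pain and pleasure, which is the whole point of the proposal.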

Why the tabula rasa? Because that is how human minds start. Humans may have an instinct for language, just as other animals have instincts of their own, but the language itself is undefined. There are language portions of the brain, but no particular language is hardwired into them; hence the proliferation of languages all over the world. It is quite interesting to note that cultures reflect themselves in language, but even more interesting that language itself affects the way in which people think. I believe this is because language imprints patterns upon our thought, and our thought tends to follow those patterns.

Underneath it all should be a massive pattern recognition system that the computer can rewrite as it sees fit, possibly through a process of evolution with some random mutation thrown in for good measure. Hopefully, many of these systems could be networked together in a less-than-perfect manner, so that they may swap patterns, but with a communication system that occasionally alters or even completely corrupts a pattern in transit. It seems to me that the transmission of mutated ideas could help spur additional evolution of the pattern system.
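A toy version of this, under assumptions I am inventing for illustration (a fixed target string as a stand-in fitness measure, two "machines" each evolving a small population, and a deliberately lossy channel between them), might look like:

```python
import random

random.seed(7)

TARGET = "pattern"  # stand-in fitness target, purely for illustration
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(p):
    """How many symbols already match the target pattern."""
    return sum(a == b for a, b in zip(p, TARGET))

def mutate(p, rate=0.1):
    """Random mutation: each symbol may flip to a random one."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in p)

def noisy_transmit(p, corruption=0.05):
    """The imperfect network link: swapped patterns arrive slightly altered."""
    return mutate(p, corruption)

# Two 'machines' start from the same random population of candidate patterns.
pop_a = ["".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
         for _ in range(20)]
pop_b = list(pop_a)

for gen in range(300):
    for pop in (pop_a, pop_b):
        pop.sort(key=fitness, reverse=True)
        pop[10:] = [mutate(p) for p in pop[:10]]  # survivors breed, mutated
    # Each machine's best pattern crosses the lossy channel to the other.
    pop_a[-1] = noisy_transmit(pop_b[0])
    pop_b[-1] = noisy_transmit(pop_a[0])

best = max(pop_a + pop_b, key=fitness)
```

The corrupted exchanges occasionally inject a near-fit pattern the receiving machine had not found on its own, which is the "mutated ideas" effect in miniature.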

I think a good beginning for this system would lie in evolutionary game theory. Current techniques involve establishing set strategies in advance; this system would develop its own strategies, and the evolutionary game itself would provide the reward and punishment system. Many techniques from functional programming could be used for such a system. Overall, I think it would be an absolutely fascinating project.

This type of system could be used for a wide variety of tasks. Often in data, it is the exceptions to the norm (outliers) that are the truly interesting pieces of information. If you are a police detective, the lives of average citizens are not of much interest to you, but the behavior of a small segment of the population is much more important. This kind of system would be excellent at finding the exceptionally interesting data.
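To make the evolutionary-game starting point concrete, here is the textbook hawk-dove game under replicator dynamics. The payoff numbers (resource value V, fight cost C) are my own assumed example values; the point is only that the game itself rewards and punishes, and the population mix evolves toward the known equilibrium V/C with nobody prescribing a strategy:

```python
# Replicator dynamics for the classic hawk-dove game.
V, C = 2.0, 4.0  # assumed values: resource worth 2, cost of a fight 4

def payoffs(p_hawk):
    """Expected payoff of each strategy against the current population mix."""
    p_dove = 1 - p_hawk
    hawk = p_hawk * (V - C) / 2 + p_dove * V     # fight half the time
    dove = p_hawk * 0 + p_dove * V / 2           # share with other doves
    return hawk, dove

p = 0.1  # start with only a few hawks
for _ in range(2000):
    hawk, dove = payoffs(p)
    avg = p * hawk + (1 - p) * dove
    p += 0.01 * p * (hawk - avg)  # strategies beating the average grow

# The mix settles near the theoretical equilibrium V/C = 0.5.
```

Here the strategies are still fixed in advance, which is exactly the limitation the proposed system would remove; but the reward-and-punishment machinery is already supplied by the game.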

Another great use for this kind of system would be to replace the folksonomy systems that are sprouting up everywhere. Folksonomy systems are a direct result of the fact that quality, contextually relevant search mechanisms are incredibly difficult to program. I have a major problem with the "tagging" trend, and it is the same problem I have with Wikipedia: only the truly enthusiastic bother to do it consistently, and the truly enthusiastic (or worse, the fanboys) are not very reliable precisely because of their enthusiasm. Just as it is very difficult to get good information about any OS thanks to the fanboys and zealots, I do not like the idea of using tagging or folksonomy to generate metadata. A system capable of generating its own contextually relevant patterns would be excellent at this.
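Even a crude statistical pattern can tag documents without any human taggers. The sketch below uses plain TF-IDF over a tiny made-up corpus (the documents and scoring are assumptions for illustration, and a real system would need far more than word counts), just to show machine-generated tags falling out of the data itself rather than out of an enthusiast's opinions:

```python
import math
from collections import Counter

# Tiny hypothetical corpus, purely for illustration.
docs = {
    "doc1": "analog circuits add noise to parallel computation",
    "doc2": "binary computation follows exact memorized rules",
    "doc3": "noise and mutation drive evolution of patterns",
}

def tags(doc_id, corpus, k=2):
    """Tag a document with its k most distinctive words (plain TF-IDF)."""
    tf = Counter(corpus[doc_id].split())
    n = len(corpus)
    def idf(w):
        df = sum(w in text.split() for text in corpus.values())
        return math.log(n / df)  # rarer across the corpus = more distinctive
    scored = {w: count * idf(w) for w, count in tf.items()}
    return sorted(scored, key=scored.get, reverse=True)[:k]
```

Words shared across documents (like "computation" here) score low and drop out of the tags, while words unique to one document rise to the top; no fanboy required.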

If you have any ideas on this subject, I would love to hear them. Pattern matching is one of my personal pleasures (I drive my friends nuts with high-speed association games), and AI completely fascinates me.


About Justin James

Justin James is the Lead Architect for Conigent.
