This brief overview of polymathic analytics explains how da Vinci, as well as Star Trek, may serve as inspiration when designing AI solutions.
Most analytic applications focus on a very specific task. Weather applications try to predict the weather, and driving applications try to get you to your destination as quickly and safely as possible. But what if your next big idea with artificial intelligence (AI) wasn't specific at all? What if you built a solution that tried to learn a lot of very different things at once, and then learned how to draw inferences from a wide set of subject areas?
Just as the great polymaths of the Renaissance era faced few limits on what they could learn, there's no practical limit on what our advanced data and analytics systems can learn today. An ambitious but extremely valuable application of data science incorporates AI across a wide variety of subjects -- this is what I call polymathic analytics.
Channeling da Vinci
Polymathic analytics was inspired by the great thinkers of the Renaissance era, such as Leonardo da Vinci. During this time, humanity embraced the philosophy that our capabilities are limitless, and many strove to excel in a wide variety of disparate skills and talents. Da Vinci was a paragon of Renaissance humanism, earning lasting fame for his work in painting, architecture, cartography, botany, and many other disciplines. The quintessential Renaissance Man, da Vinci enkindles our creativity and challenges us to widen our scope when designing our next big analytic solution.
Polymathic analytics extends the idea of predictive analytics to something more like an android. The stereotypical android, like Star Trek's Lieutenant Commander Data, wasn't built to solve a particular problem or predict a specific type of outcome; Data was built to be an artificial human. And as he grew (yes, Data physically grew from an infant into a man), he learned just like you or me -- well, maybe a little better than you and me. And although Data is a science fiction character, there's no reason we can't draw on examples from real life (da Vinci) and fiction (Data) to design 21st century AI.
From data to knowledge and beyond
Polymathic analytics exploits the knowledge layer in the ascent from data to wisdom on the DIKW (Data, Information, Knowledge, Wisdom) Pyramid. Data forms the building blocks for information (applied data), and information forms the building blocks for knowledge (synthesized information).
By definition, the knowledge layer implies multiple sources of information -- that could be 2 or 20. With polymathic analytics, you should strive for as many information sources as possible. This is consistent with da Vinci's unquenchable thirst to know everything about everything. I envision a polymathic analytic solution having the ability to draw from at least a dozen disciplines.
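As a minimal sketch of that ascent, consider how raw readings become information, and how information from multiple sources combines into knowledge. All of the values and the "allergy season" conclusion below are made up purely for illustration:

```python
# Illustration of the DIKW ascent, with made-up values:
# data -> information (applied data) -> knowledge (synthesized information).

raw_temps = [21.0, 22.5, 24.1, 26.0]                  # data: raw readings (deg C)
warming = raw_temps[-1] > raw_temps[0]                # information: "it's warming"

pollen_counts = [30, 45, 60, 80]                      # data from a second source
pollen_rising = pollen_counts[-1] > pollen_counts[0]  # more information

# Knowledge, by definition, draws on multiple sources of information:
knowledge = "allergy season ramping up" if (warming and pollen_rising) else "no pattern"
print(knowledge)
```

The point is not the trivial comparisons; it's that the final conclusion can only be reached by synthesizing more than one information source.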
Your polymathic analytic solution should be more than just additive. You could easily take two predictive analytic solutions and form an amalgam that performs two different functions. For example, an application that predicts both the weather and traffic for a target destination and time period is an amalgamated solution; however, it's not a polymathic solution -- it's just a convenient interface for two predictive functions.
A polymathic analytic solution could apply the principles of technical analysis (i.e., techniques for predicting stock market movement) to predict the movement of a tornado, and vice versa. In fact, technical analysis and tornadoes would be only two of many information systems your solution could draw from. Let's explore how we might design one of these things.
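To make the cross-discipline idea concrete, here is a hedged sketch in which one technical-analysis principle (a simple moving-average crossover) is applied unchanged to two very different series. The data and function names are invented for illustration, not drawn from any real market or storm:

```python
# The same trend-detection principle from technical analysis applied to
# two unrelated domains: stock prices and a tornado's eastward position.
# All data is fabricated for illustration.

def moving_average(series, window):
    """Trailing moving average over up to the last `window` points."""
    return [
        sum(series[max(0, i - window + 1): i + 1]) / min(i + 1, window)
        for i in range(len(series))
    ]

def trend_signal(series, short=3, long=5):
    """'up' if the short-window average sits above the long-window average."""
    return "up" if moving_average(series, short)[-1] > moving_average(series, long)[-1] else "down"

stock_prices = [101, 102, 100, 103, 105, 107, 108]               # dollars
tornado_lon = [-98.2, -98.1, -98.1, -97.9, -97.8, -97.6, -97.5]  # degrees east

print(trend_signal(stock_prices))  # momentum in the market
print(trend_signal(tornado_lon))   # eastward drift of the storm
```

The same function answers both questions, which is precisely the kind of transfer a polymathic solution is after.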
Starting the blueprint
The design of a polymathic analytic solution comprises three parts: the information systems, an inference engine, and a learning system. Each information system resembles the narrowly focused analytic systems discussed earlier, except that its design stops at the information layer of the DIKW Pyramid. The information layer provides the "so what" of all the data beneath it: the point at which the application can and should make a recommendation based on the data it has collected and analyzed. You should have a dozen or so information systems spanning a variety of subject areas.
The inference engine is more sophisticated and intentionally unfocused; this is where the real power of the polymathic solution lies. The inference engine examines the learnings from all the disparate information systems and applies them to a knowledge system that spans every discipline in the solution. Synergy is vital when designing the inference engine: it must be able to integrate learnings from various disciplines to solve problems in new ways. This is where the learning system comes in -- it works with the knowledge system to ensure the solution is constantly improving.
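The three-part design described above might be wired together as follows. This is a minimal sketch under my own assumptions -- every class, method, and threshold here is hypothetical, invented to show the shape of the architecture rather than any established API:

```python
# Minimal sketch of the three-part design: several narrow information
# systems, an inference engine that synthesizes their findings into a
# knowledge layer, and a learning loop that feeds outcomes back in.
# All names and values are hypothetical.

class InformationSystem:
    """One narrow discipline: turns raw data into a recommendation."""
    def __init__(self, domain, recommend):
        self.domain = domain
        self.recommend = recommend  # callable: raw data -> finding

class InferenceEngine:
    """Synthesizes findings across disciplines into knowledge."""
    def __init__(self, systems):
        self.systems = systems
        self.knowledge = {}  # cross-discipline findings accumulate here

    def infer(self, data_by_domain):
        findings = {
            s.domain: s.recommend(data_by_domain[s.domain])
            for s in self.systems
        }
        # Cross-discipline synthesis: here, simply record where independent
        # domains agree -- a real engine would do far more than this.
        values = list(findings.values())
        self.knowledge["consensus"] = {v for v in values if values.count(v) > 1}
        return findings

    def learn(self, feedback):
        """Learning loop: fold observed outcomes back into the knowledge system."""
        self.knowledge.setdefault("history", []).append(feedback)

# Two toy information systems (thresholds are arbitrary):
weather = InformationSystem("weather", lambda risk: "stay" if risk > 0.7 else "go")
traffic = InformationSystem("traffic", lambda risk: "stay" if risk > 0.8 else "go")

engine = InferenceEngine([weather, traffic])
findings = engine.infer({"weather": 0.9, "traffic": 0.3})
engine.learn({"findings": findings, "outcome": "trip went fine"})
```

The deliberate design choice is that each `InformationSystem` stops at a recommendation (the information layer), while only the engine holds cross-discipline knowledge and the learning history.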
A polymathic analytic solution takes big data analytics to another level and frees your scope from the confines of narrow thinking. This is in the spirit of the great thinkers of the Renaissance era who brought us from the Middle Ages to the Modern Age.
Architect your polymathic solution with as many information systems as you can and then layer on a sophisticated inference engine that can synthesize the learnings into something greater than the sum of its parts. Take some time today to sketch out what that would look like. Your da Vinci system is just a creative brainstorm away.