
IBM Watson supercomputer beats Jeopardy champs in practice round

IBM's Watson supercomputer won a practice round against Jeopardy champions Ken Jennings and Brad Rutter and raised a lot of questions about the capabilities of artificial intelligence.

This guest post from Larry Dignan was originally published on TechRepublic's sister site SmartPlanet.

Watson, a four-year effort by IBM, was quicker on the draw, didn't fall prey to emotion and had a voice that could be mistaken for HAL 9000, the wayward computer from the movie 2001: A Space Odyssey. For IBM, Watson is about tackling verticals and bringing hardware and analytics to the fore.

As one of the dozens of humans watching this practice round, I can't deny I was a bit squeamish about seeing a supercomputer wing it, adapt and show off its artificial intelligence.

Is this thing going to be Skynet? That's a bit far-fetched (today IBM is thinking health care will make the most use of Watson), but a supercomputer that has self-awareness and can learn gives this human pause. David Ferrucci, principal investigator of Watson DeepQA technology, said Watson can conduct self-assessments and learn.

Naturally, Ferrucci was asked whether Watson carries the same risk as HAL 9000. "That's science fiction," Ferrucci said. "We're not even close to that." Ferrucci did say that Watson is more like the computer on Star Trek than HAL.

Here's a look at the practice round, which I captured on video with my smartphone from the audience.

When the duel was finished, Watson had won the round with $4,400; Jennings had $3,400, and Rutter brought in $1,200.

IBM execs, Jeopardy host Alex Trebek, Rutter and Jennings fielded questions after the round.

Among the highlights:

  • The Jeopardy clues hit Watson's chips at the same moment the human players' retinas get the data. To take away any advantage, Watson has a mechanism that physically presses a signaling device, said Trebek.
  • It was odd watching a supercomputer play a category called "Chicks Dig Me."
  • The Jeopardy champs joked about their concerns that Watson would come back from the future and harm them; Skynet quips were plentiful.
  • Watson cannot be psyched out, which is a problem for human players. Jennings and Rutter said that it's a disadvantage to play a computer that has no emotions. Both Jeopardy champs said they were able to psych out rivals during their win streaks.
  • Rutter said Watson can be a bit overwhelming. Jennings and Rutter quipped about how computer capabilities are part of human advancement, but acknowledged that they were a bit uncomfortable. Jennings said he "didn't want technology to advance that far just yet." When John Kelly III, director of IBM Research, reminded Jennings and Rutter that computer and human intelligence were at an intersection point and that computing would only improve, Rutter quipped: "So we're all extinct."
  • I asked Rutter what it was like playing against Watson. "I'm impressed with Watson and its speed," he said. "But after 10 or 15 questions Watson is just another good player. I have every confidence that we'll do well."
  • If Rutter had Watson to help as a singularity tool, the Jeopardy champ said, he'd most likely be able to better assess risks in Double Jeopardy. "Watson's biggest advantage is that it has algorithms that can make bets instantly," said Rutter. "If I had Watson's algorithm, I could make bets and assess risk."
  • What's Watson's biggest weakness? Rutter said it's Watson's ability to understand human language and to get the quips inserted by the show's writers. "They tell me Watson can get jokes and have fun," said Rutter.
15 comments
PB ib CA

The first principle of natural language is: communication is writ in a stream of data that is rich in patterns, but whose "primitive" data states are face-value only (free of symbols coded by a human to signify some meaning). This is of great consequence, for if the system's lowest-level input stream symbols are free of coded meanings, then all of the information in the stream is vested in patterns that are learnable by the receiver. However, if the lowest-level tokens in the input stream are already coded (laden with meaning by humans), the system cannot gain access to those meanings.

This basic law of knowledge acquisition is violated when you feed ASCII text words to a computer (the predominant data stream Watson receives). Watson can only look for patterns in the stream of data. When it receives the pattern "Shakespeare" in ASCII, it is receiving just a string of ASCII bit patterns. Watson has never watched a play, nor even watched people interact dramatically, so it has no clue what a playwright does, or what the different fine arts consist of. All Watson (or any computing system, including the brain) can do is accept the input tokens at face value (meaningless as individual symbols) and try to draw connections based on patterns found in the continuing stream of face-value data.

From this vantage, it ignores basic semiotics principles to start with the assumption that natural language can be learned bottom-up from an ASCII text stream. We can predict that such systems will have poor success understanding humans on their terms. Rather, semiotics principles say that a system which receives a sensory data stream from a microphone and camera interface has the possibility of learning from the patterns in its input. As it builds a repertoire of patterns, it can start to be taught language, by an overlaid set of sound patterns that it can associate with more primitive (earlier-learned) phenomena. The elementary states making up its input data stream do not code for anything (they are face value), so the learner is not missing any meaning, as would be the case if the primitive tokens were already signifying something (as ASCII text does). There are AI scientists and philosophers who understand this distinction, but I'm not seeing much influence on IBM Watson's design.
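
To make that concrete, here is a toy sketch in Python (an illustration only; it has nothing to do with Watson's actual DeepQA pipeline): to a program, "Shakespeare" arrives as face-value byte codes, and the only structure a text-only learner can exploit is the pattern of co-occurrences in the stream itself.

    # Toy illustration only -- not Watson's DeepQA pipeline.
    from collections import Counter

    print(list("Shakespeare".encode("ascii")))
    # -> [83, 104, 97, 107, 101, 115, 112, 101, 97, 114, 101]  (bit patterns, no semantics)

    stream = "shakespeare wrote plays . marlowe wrote plays . shakespeare wrote sonnets ."
    tokens = stream.split()

    # Adjacent-word co-occurrence counts: the kind of pattern that IS learnable
    # from the stream itself, as opposed to knowing what a play or a playwright is.
    cooccurrences = Counter(zip(tokens, tokens[1:]))
    print(cooccurrences.most_common(3))
    # e.g. [(('shakespeare', 'wrote'), 2), (('wrote', 'plays'), 2), (('plays', '.'), 2)]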

AnsuGisalas

Artificial intelligence is more than one thing - if this is purely digital AI, then how was it trained for this? Remember, an AI needs to be taught the human language in question, so answering to "Chicks dig me" would require programming about A) slang in general or that phrase in particular, B) a repertoire of phrases for producing talk about human interaction, and C) a list of things that define "me" to the AI. In other words: is this a breakthrough in AI, a breakthrough in computer-simulated semantics, or something else entirely? We have so far had no success teaching a computer to speak in a spontaneous, grammatically and semantically correct way. So... WTF?

buzzbuzzard

It doesn't sound at all like HAL to me; I was surprised at how human it sounded. I'm a Jeopardy dork, and I'm certain that the second-to-last bullet meant to refer to the Daily Double, not Double Jeopardy. The Daily Double is where bets are made, and it often trips up contestants. It's also an area where I've noticed that women tend to bet less than men. (This isn't meant as criticism; it's purely observation.) Unless a contestant is really behind and/or knows the category well, getting a Daily Double is a mixed blessing! In addition to the risk of losing money, it often makes players lose their focus because they have to be more strategic on top of focusing on providing questions (answers).

AnsuGisalas

It reads a bit choppily now. Throw in some hyphens and some commas. Start new paragraphs where the pace or perspective shifts; minor stuff - but you'll notice it works better. And... please put a space before and after the hyphens!

reisen55

This really refers to a dead era at IBM. Today, most of IBM is in India, and China has IBM hardware by the ton, copying all of it. Thomas Sr. and Jr. would be rolling over in their graves at the thought of American workers being RIF'd just because Bangalore offers cheap wages and no health care bennies.

Slayer_

Fancy answering machine for call centres - no more outsourcing to India, we now outsource to the server room :)

buzzbuzzard

You're not taking into account that India has an extensive public health system, with free services for the poor and affordable services for the middle class. It also has world-class medical facilities in some cities, which is why it's becoming a leading destination for international "medical tourism."

HAL 9000

To take this unit: it's still probably cheaper to pay the Indian worker than to pay the power bills for a unit like this. ;) Col

AnsuGisalas

what they pay me for... Oh, wait, they don't pay me. Crap.

buzzbuzzard

I can't see how power bills for a room of 50+ call center employees would be lower than for a few of these units (assuming they can handle more than one call at a time, which seems feasible). Especially given that wages in India are rising quickly, and the country will probably (pretty intentionally) price itself out of much of the call center market in 10 or so years as it moves into higher-end tech products and services, while other countries with cheaper labor start providing them. For example, some Caribbean and Central American countries have established call centers for firms from the US (and other countries) and have perceived advantages of closer cultural ties, lighter accents, and closer time zones. I say "perceived" because these usually mean nothing except to biased US consumers. Though the difference in labor costs is partly due to the fact that Indian workers on night shifts are usually paid more than day workers, who obviously handle far fewer calls from the US and don't have to work insane hours.

santeewelding

But, yes. Why else do you think I persist here?

AnsuGisalas

Do you need to bring them the scalps to get paid?
