IBM's DeepQA Computer Watson Flexes AI Muscle On Jeopardy!

Remember the date. Because February 16, 2011, may go down in history as the day that human intelligence was surpassed -- or at least matched -- by the intelligence of computers and software programming. For three days this week, a computer named Watson was a contestant on the television game show Jeopardy!, where it handily defeated two worthy opponents in the two-game "IBM Challenge" named for Watson's creator.

Representing all of humanity (as he put it) was Ken Jennings, whose 74-game winning streak stands as the quiz show's longest, and Brad Rutter, whose cash winnings across three separate championship series totaled a record $3.2 million.

The epitome of throwing hardware at a problem, Watson consists of 10 equipment racks, each populated with 10 IBM Power 750 "embarrassingly parallel" servers, for a total of 2,880 processor cores and 15 terabytes of system memory. Watson is reportedly capable of operating at 80 teraflops. Oh, and by the way, it's running Linux. A pair of giant cooling units is required to maintain a comfortable temperature in Watson's living room.

A successor of sorts to IBM's Deep Blue chess computer, the DeepQA project tackles artificially intelligent question answering (the QA part), and its algorithms serve as Watson's brain.

The animated avatar that represented Watson on the set of the game show has 27 states of being, or facial expressions, that can be triggered by its own actions or by other events taking place during the game. For example, when Watson answers correctly, it displays pleasure and elation with swirling green leaders and followers streaking around the top of the sphere.

Other states include answered, answering, answer correct, answer revealed, answer wrong, buzzer enabled, buzzer time out, category selected, clue revealed, opponent answering, score gain, score loss, opponent score gain, daily double and others. Yellow and red indicate low and lower moods, and Watson shines blue when it lands on a Daily Double.
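Under the hood, this is essentially an event-to-state mapping. The Python sketch below shows one hypothetical way such a mapping might be represented; the event names, mood labels and color assignments are illustrative, loosely based on the descriptions above, and are not taken from IBM's avatar software.

```python
# Hypothetical sketch of an event-to-avatar-state lookup.
# Event names, moods and colors are illustrative only.

AVATAR_STATES = {
    "answer_correct":     ("elated", "green"),    # swirling green threads
    "answer_wrong":       ("subdued", "red"),
    "buzzer_timeout":     ("muted", "yellow"),
    "daily_double":       ("excited", "blue"),
    "opponent_answering": ("watchful", "neutral"),
}

def avatar_state_for(event):
    """Return the (mood, color) the avatar would display for a game event."""
    return AVATAR_STATES.get(event, ("idle", "neutral"))

print(avatar_state_for("answer_correct"))  # ('elated', 'green')
```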


When formulating each Jeopardy! response, Watson generates an answer panel listing its top three guesses in order of confidence. If its confidence in at least one answer isn't above a certain threshold, Watson won't take the risk of buzzing in. This "buzz threshold" can also vary with the stakes of the game, forcing Watson to take fewer risks to preserve a lead, for example, or to play aggressively as Final Jeopardy draws near.
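In other words, the buzz decision boils down to a confidence check against a threshold that shifts with the game state. Here is a minimal Python sketch of that idea; the function, candidate answers, confidence scores and threshold adjustment are all hypothetical illustrations, not IBM's actual strategy code.

```python
# A minimal sketch of a confidence-gated buzz decision (hypothetical, not IBM's code).

def should_buzz(candidates, base_threshold=0.5, lead_fraction=0.0):
    """Decide whether to buzz in, given ranked (answer, confidence) guesses.

    lead_fraction is the share of the scoreboard held by our lead; a bigger
    lead raises the effective threshold so fewer risks are taken.
    """
    effective_threshold = base_threshold + 0.2 * lead_fraction
    best_answer, best_confidence = max(candidates, key=lambda c: c[1])
    if best_confidence >= effective_threshold:
        return True, best_answer
    return False, None


# Example: a top-three answer panel for one clue.
panel = [("Toronto", 0.14), ("Chicago", 0.62), ("New York", 0.09)]
buzz, answer = should_buzz(panel, base_threshold=0.5, lead_fraction=0.3)
print(buzz, answer)  # True Chicago -- the best guess clears the raised bar
```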

For call centers, incoming questions that are beyond the knowledge of a first-level support technician might be solved using a natural-language processor like Watson. For financial institutions, current events can be monitored in real time and analyzed along with what-if scenarios. In health care, a physician might use it to confirm diagnoses in highly complex cases or to catch drug interactions that might not otherwise be anticipated. There are applications in banking, insurance, telephone networks and any industry where large amounts of natural-language text exist. In essence, all industries stand to benefit from Watson's DeepQA capabilities.

Unlike its human counterparts, who have to decipher language before formulating a response, Watson is fed its questions electronically, as text files. This led one person to theorize on a YouTube page that Watson had an advantage on longer questions because it could begin formulating a response before the humans were done listening to the question.

After reading this and watching the match again, I saw no evidence to support the theory. However, I did notice that the humans were slower at buzzing in, and that when they did have an opportunity to answer, Watson invariably lacked high confidence. Perhaps Watson should be handicapped by the 0.10-second reaction-time limit from the IAAF's false-start rule for track and field events. Because the humans never had a chance.

In the end, Watson had amassed more than $77,000 in the two-game tournament, nearly doubling the combined scores of its human challengers. Jeopardy! producers awarded IBM the $1 million top prize, which the company said it would donate to the charities World Vision and World Community Grid.