The Next 7-10 Years of IBM’s Watson

Jeopardy! Was Just The Beginning

Alex Trebek, Ken Jennings, Watson and Brad Rutter
Photo courtesy of Jeopardy!

IBM’s achievement with their Watson system and software was more than good television:

  • It’s a major advance in natural-language processing. Computing systems will no longer be limited to responding to simple commands.
  • The data management aspect lends itself to specialization, i.e., medical sub-sets, legal data sets, call/support-center databases, etc. John Markoff, in a recent NY Times article on the subject, said “any job that now involves answering questions and conducting commercial transactions by telephone will soon be at risk. It is only necessary to consider how quickly A.T.M.’s displaced human bank tellers to have an idea of what could happen.”
  • The language processing is amazing, illuminating, and lets one dream of a future where the promises of human-robot (or, for that matter, human-device) interaction and instantaneous translation are really going to be fulfilled soon.
  • A staggering amount of horsepower was harnessed to work harmoniously, using massively parallel technology on 2,700 processors spread over 90 servers, to enable the Jeopardy! win. If history is any guide, this capability will migrate to smaller devices within a few years. Ray Kurzweil, quoted in The Economist, notes that it was only five years after the massive and hugely expensive Deep Blue beat Mr Kasparov in 1997 that Deep Fritz was able to achieve the same level of performance by combining the power of just eight personal computers. In part, that was because of the inexorable effects of Moore’s Law halving the price/performance of computing every 18 months. It was also due to the vast improvements in pattern-recognition software used to make the crucial tree-pruning decisions that determine successful moves and countermoves in chess. Now that the price/performance of computers has accelerated to a halving every 12 months, Mr Kurzweil expects a single server to do the job of Watson’s 90 servers within seven years, and a PC within a decade. If cloud computing fulfills its promise, then bursts of Watson-like performance could be available to the public at nominal cost even sooner.
  • And most importantly, right after the Jeopardy! win, IBM announced partnerships with a few hospital groups to provide diagnostic physician assistance using Watson’s DeepQA software and data management methods. And their website displays other areas where Watson might be particularly helpful. IBM is bringing Watson to the marketplace.
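Kurzweil’s seven-year estimate in the list above can be checked with simple arithmetic: closing a 90× performance gap, with price/performance halving every 12 months, takes log2(90) ≈ 6.5 doublings, i.e. roughly seven years. A minimal sketch (the function name is mine, for illustration):

```python
import math

def years_to_close_gap(gap, doubling_period_years=1.0):
    """Years until cumulative performance doublings cover a given gap."""
    return math.log2(gap) * doubling_period_years

# Watson ran on 90 servers; a single server must get ~90x better.
print(round(years_to_close_gap(90), 1))  # -> 6.5
```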
It’s important to keep in mind that inside a computer there is no connection from words to human experience or cognition. To Watson, words are just tokens. In parsing a question such as those on Jeopardy!, a computer has to decide what’s the verb, the subject, the object, the preposition and the object of the preposition. It must remove uncertainty from words with multiple meanings by taking into account any and all contexts it can recognize. When people talk among themselves, they bring so much contextual awareness that answers become obvious. The computer must use logic to “disambiguate” incoming tokens into choices which can be measured (scored) against alternative choices. And it must do all that within seconds.
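The disambiguation-by-scoring step just described can be sketched as rating each candidate sense of an ambiguous token against the surrounding context and keeping the best-scoring choice. The tiny sense inventory and overlap scoring below are illustrative assumptions, not Watson’s actual DeepQA pipeline:

```python
# Toy word-sense disambiguation: score each candidate sense of an
# ambiguous token by how many context words overlap with the cue words
# associated with that sense, then keep the highest-scoring sense.
SENSES = {
    "bank": {
        "financial institution": {"money", "teller", "loan", "account"},
        "river edge": {"river", "water", "shore", "fishing"},
    }
}

def disambiguate(token, context_words):
    context = set(context_words)
    # Score = size of the overlap between the context and each sense's cues.
    scores = {sense: len(cues & context)
              for sense, cues in SENSES[token].items()}
    return max(scores, key=scores.get)

print(disambiguate("bank", ["the", "teller", "counted", "the", "money"]))
# -> financial institution
```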

What about robots and robotics?

The AI system managing a robot gathers facts through sensors or human input, compares this to stored data, and decides what the information signifies. The system then runs through various possible actions and predicts which action will be most successful.
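The gather–compare–predict cycle described above is essentially a sense–plan–act loop. A minimal sketch, where the sensor states, action set, and stored success estimates are all invented for illustration:

```python
# Minimal sense-plan-act decision step: read a sensor state, look up the
# predicted success of each candidate action in stored data, pick the best.
STORED_KNOWLEDGE = {
    "obstacle_near": {"turn_left": 0.9, "forward": 0.1},
    "path_clear":    {"turn_left": 0.2, "forward": 0.95},
}

def decide(sensor_state, actions=("turn_left", "forward")):
    predictions = STORED_KNOWLEDGE[sensor_state]       # compare to stored data
    return max(actions, key=lambda a: predictions[a])  # predict and choose

print(decide("obstacle_near"))  # -> turn_left
print(decide("path_clear"))     # -> forward
```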

Some robots also have a limited ability to learn. Learning robots recognize whether a certain action achieved a desired result and store that information for the next time they encounter the same situation. Naturally, they can’t absorb information like a human, but in Japan, roboticists have taught a robot to dance by demonstrating the moves themselves.
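Storing whether an action worked, and reusing that record the next time the same situation arises, can be sketched as a simple outcome memory. This is a stand-in for illustration, not any specific robot’s learning algorithm:

```python
# A learning robot's outcome memory: remember which action succeeded in
# which situation, and prefer remembered successes on the next encounter.
memory = {}  # (situation, action) -> number of observed successes

def record(situation, action, succeeded):
    if succeeded:
        memory[(situation, action)] = memory.get((situation, action), 0) + 1

def choose(situation, actions):
    # Pick the action with the most remembered successes (ties -> first listed).
    return max(actions, key=lambda a: memory.get((situation, a), 0))

record("door_closed", "push", succeeded=False)
record("door_closed", "pull", succeeded=True)
print(choose("door_closed", ["push", "pull"]))  # -> pull
```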

It’s important to remember that IBM isn’t the only AI game in town. There are many companies and research facilities developing and providing AI software, the most visible of which is Google.

IBM 701 Computer

From Wired’s Danger Room: Back in 1954, IBM announced that its 701 computer crunched a bit of Russian text into its English equivalent. A Georgetown professor who worked on the project predicted the computerized translation of entire books “five, perhaps three years hence.”

Thus was born a scientific (and sci-fi) drive that’s lasted 57 years, from Star Trek to Babel Fish to Google Translate: instantaneous speech translation. But even though no one’s mastered that yet, the Pentagon’s out-there research branch is asking for even more with its Boundless Operational Language Translation, or BOLT, as outlined in Darpa’s fiscal 2012 budget request. For the low, low starting cost of $15 million, Congress can “enable communication regardless of medium (voice or text), and genre (conversation, chat, or messaging).”

Not only will BOLT be a universal translator — the creation of which would be a revolutionary human development — but it will “also enable sophisticated search of stored language information and analysis of the information by increasing the capability of machines for deep language comprehension.” In other words, a 701 translator that works.

So What’s The Holdup?

There are many reasons for the delay in robotic training and interaction with humans – some of which can be seen in the mammoth resources it took IBM to achieve their Watson Jeopardy! victory. You cannot place those resources into a robot, nor can you rely on a computer controlling a robot (or a series of robots) via a wireless communication channel as they go about their various tasks.

Matthias Scheutz, an Associate Professor of Cognitive Science, Computer Science and Informatics and Director of the Human-Robot Interaction Lab at Tufts University, adds research funding to the equation, saying:

The fields of robotics and human-robot interaction are growing, with the highest expected growth rates not in industrial, but service robots. Several countries (Japan, South Korea, the EU, etc.) around the world are heavily investing in service and social robotics. In the US, there are very few funding programs specifically targeted at artificial cognitive systems that would enable complex autonomous service robots. My hope is that this will be changing soon given enormous market potential of this area and the heavy investments other countries are making. To keep the US competitive and to enable, not Watson-like, but more modest, more natural interactions between humans and autonomous robots in natural language, we will need interdisciplinary funding programs that are aimed at developing the right kinds of integrated control architectures for these systems, which we are currently still lacking.

Scheutz goes on to say:

Computing power is obviously a critical component for a lot of AI technology (e.g., algorithms that are data-based and need to be trained on large data sets, or algorithms that have to explore large search spaces in a short amount of time). Equally important is the architecture of an intelligent system, the way in which different components operate and interact. And here is where we have made much less progress compared to the hardware side. Consequently, although the performance of Watson is very impressive and clearly a break-through, from an engineering perspective, it does not yet address the problem of human-like natural language processing as we will need it for robots. And while there will likely be applications in the context of recommender systems in the near future, it is not clear to me how the technology used on Watson can be put on a robot and make it have natural task-based dialogues with humans.

The EU, Japan and Korea have roadmaps which lay out the science that needs to be tackled before effective products can be produced. And they have national direction and public-private funding to make their plans happen. America does not yet have such a plan nor any national direction regarding robotics. And this is a critical holdup.

President Obama, in his State of the Union address, specifically excluded robotics when he discussed the need for strategic investment in key areas of innovation. How the President could overlook the fact that not a single sector is devoid of robotics applications is one question. Another is whether he is aware that 12 of the 13 major robotic manufacturers selling industrial and manufacturing robots in the US are off-shore companies.