Q&A: IBM is focusing its Watson cognitive computing technology on the area of embodied cognition, according to Grady Booch, chief scientist of Watson/M.
At the close of IBM’s recent World of Watson conference in Las Vegas, eWEEK interviewed Grady Booch, Big Blue’s chief scientist of Watson/M, about the future of IBM’s Watson cognitive computing platform and where IBM is taking the technology to benefit enterprise customers, consumers and developers alike.
Among other areas, IBM is applying Watson to embodied cognition, or putting artificial intelligence (AI) into the physical world. In this interview, Booch talks about Watson in robots, avatars and spaces, as well as the future of AI in general.
“This is embodied cognition: By placing the cognitive power of Watson in a robot, in an avatar, an object in your hand or even in the walls of an operating room, conference room or spacecraft, we take Watson’s ability to understand and reason and draw it closer to the natural ways in which humans live and work,” Booch said in a talk. “In so doing, we augment individual human senses and abilities, giving Watson the ability to see a patient’s complete medical condition, feel the flow of a supply chain or drive a factory like a maestro before an orchestra.”
What was your biggest takeaway from the IBM World of Watson conference?
There are two takeaways. One is sort of an emotional one—that the buzz here is unlike IBM conferences I’ve seen in the past. There’s just a different level of energy that’s pretty cool. And I think that’s related to point two, which is I’m just astounded by the number of people and customers who are doing some radically cool things and in ways that are outside the normal IBM wheelhouse. That’s very encouraging. We have to get out of our wheelhouse.
So, with that in mind, who would you most like to partner with—either individuals or companies—to advance this cognitive cause?
I’m going to answer the question globally and then selfishly. The global question in terms of partnering is, as Ginni [Rometty, CEO of IBM] was saying, we think every business needs something cognitive in it. So, frankly, it’s all of our traditional customers in financial, banking, insurance and all those kinds of things. We’ll continue to transform them. And I think this is essential for IBM’s turnaround. There are a lot of companies who never bothered working with IBM, and it’s those that I’m excited to partner with. Like the whole GM thing, I think that’s so cool. And if you think about where we’ve gone in the health care space, that’s not an unexpected trajectory for what we’ve been doing. But things like GM and Staples—that’s different and I think that’s really encouraging.
What did you think of IBM announcing a partnership with Slack?
Well, I use Slack so that’s really cool. Since IBM kind of blew up Rational, we haven’t done a lot in the software engineering space. But the reality is that things like Git, which pervades IBM, and things like Slack, which also pervades it, that’s a pretty sexy connection, it really is.
That was the global answer. The selfish answer alludes to what I am up to these days. I’m chief scientist of a project called Watson/M. So I am particularly interested in customers who are in the embodied space. And that means robots, avatars, spaces and such. This is also out of IBM’s wheelhouse, which I think is exciting because this gives us an on-ramp for Watson that we never had before.
It also represents the collision course between us and Watson IoT [internet of things]. And I mean that in the most positive way—that we ultimately are driving toward a common architecture for dealing with devices as you deal with Watson. So as one brings cognition to the edge … if you think of anybody that might need a robot, from the concierge at Hilton hotels to industrial robots like Baxter or ABB or any of those kinds of things, that really interests me.
Devices are an obvious one because that’s an IoT play. Avatars and spaces require a little bit more thinking. Let’s talk about spaces. … Robots are your obvious physical things, and they are either humanoid or not humanoid. But imagine if you take a 3D physical robot and flatten it so that now you have an avatar.
You ought to go take a look at the work of the University of Auckland, which spun out a company called Soul Machines. They have the most awesome photo-realistic avatars. Their back story is that they were commissioned by James Cameron to do the facial skinning and mapping for the movie Avatar. They developed some really wickedly cool technology. What I’m telling you is all public knowledge. They built a model of the musculature of the human face, put a neural network behind it, put a skin on it, and you have a photo-realistic avatar.