IBM fancies itself a cognitive computing enabler. So it’s ditching most of the vestiges from its legacy computer business and, for better or for worse, going all in.
The 105-year-old Armonk, N.Y., company made news two weeks ago when it announced it would supply the brains behind a self-driving bus project planned for Washington, D.C. Although the idea of talking to inanimate objects might seem strange, it's actually part of the bigger bet that chief executive Ginni Rometty has made about the future of IBM. She has been aggressively moving the company away from hardware and toward cloud-based analytics and artificially intelligent software.
Like so many other CEOs, Rometty sees a world full of smart things. In that world, it’s completely natural to carry on a conversation with a bus, kiosk or even a household appliance. It’s the future we were all promised but never really thought would happen.
To make it happen, companies like IBM need to do a lot of heavy lifting behind the scenes. Powerful cloud-computing architectures running sophisticated algorithms must work hard to recognize natural language, making sense of our thick accents and colloquialisms. IBM got a head start on this sort of thing in 2011 with Watson, a sassy Jeopardy!-playing computer program. Now the company is rolling out those capabilities, and much more, across a wide swath of industries.
In 2015 the company launched the Watson Health Cloud. The concept was to build a secure, open innovation platform where corporations and researchers could build systems and exchange data via application programming interfaces, or APIs. Already, pharmaceutical firms like Johnson & Johnson and Medtronic (MDT) are working with it to develop new drugs. Apple and Under Armour are using Watson analytics to decipher the deluge of data from connected watches and fitness bands. And medical facilities like Memorial Sloan-Kettering Cancer Center have made Watson the centerpiece of their oncology research.