In the 1980s, the television program Knight Rider gave a glimpse into a world where artificial intelligence could learn, communicate, and make independent decisions. The star of the show, a self-aware computer called KITT (Knight Industries Two Thousand), was housed in a 1982 Pontiac Trans Am and intrigued viewers, who imagined how such technology could change their lives, handling dull or dangerous tasks at a simple spoken command. At the time, such software sounded like science fiction. Now, less than four decades later, artificial intelligence and machine learning are a reality, communicating with users through natural language interfaces.
When the average person thinks of bots entering the workforce, scenes from X-Men: Days of Future Past leap to mind. But bots are not the future; they have already been adopted and deployed among us. The reality of bots in day-to-day life is less "dominating Sentinel" and more "helpful, productive assistant."
A bot is a piece of software designed to automate the kinds of tasks we would usually do ourselves. Powered by a set of simple rules and varying degrees of artificial intelligence (AI), bots can now hold natural-sounding conversations with human beings to accomplish tasks such as answering questions or enabling product purchases.
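At its simplest, the "set of simple rules" behind such a bot can be a list of pattern-and-reply pairs. The sketch below is purely illustrative (the rules and replies are invented for this example); real bots layer natural language understanding on top of mechanisms like this.

```python
import re

# Each rule pairs a regex pattern with a canned reply. These example rules
# (greetings, opening hours, purchases) are invented for illustration.
RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.I), "Hello! How can I help you today?"),
    (re.compile(r"\bhours?\b", re.I), "We are open 9am-5pm, Monday to Friday."),
    (re.compile(r"\b(buy|order|purchase)\b", re.I), "Sure - which product would you like?"),
]

FALLBACK = "Sorry, I didn't understand that. Could you rephrase?"

def reply(message: str) -> str:
    """Return the reply of the first rule whose pattern matches the message."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return FALLBACK
```

Swapping the regex matcher for an intent classifier is what moves a bot from "simple rules" toward the AI-driven end of the spectrum, without changing this basic request-reply shape.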
Everyday jobs will soon be transformed as this technology advances.
It has been a dream of science fiction authors since the advent of computers: hands-free interfaces that can respond to our every whim — without the need to strike a single key.
That future is now closer than ever, with engineers across dozens of industries hard at work designing computers and mobile devices that can interact through simple conversation. These natural language interfaces (NLIs) are expected to spread from talking assistants such as Siri, Alexa, Cortana, and most recently Samsung's Bixby to a multitude of interactive apps and programs in the coming months. In many cases, they already have.
As it gets easier to connect with one another, the lines between our physical reality and the digital world become increasingly blurred. The million-dollar question, says one expert in the field of innovation, is whether enterprises will be able to manage both their intelligent machines and their human talent effectively.
The power of voice-controlled intelligence systems like Alexa is no longer reserved for the home. We have enhanced Alexa with enterprise skills, allowing seamless integration between Alexa and our Digital Assistant, Wanda. The video below, presented by Thomas Staven of Unit4's Innovation Labs team and me, demonstrates what happened when Wanda met Alexa for the first time, and what it means for the future of enterprise computing.
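One plausible shape for such an integration is an Alexa custom skill that forwards the user's utterance to the assistant's back end and speaks the answer back. The sketch below is a hypothetical illustration only: the intent name (`AskWandaIntent`), slot name (`query`), and the `ask_wanda` stub are assumptions, not the actual Unit4 integration, and the response dictionary follows the general Alexa skill JSON envelope.

```python
def ask_wanda(query: str) -> str:
    # Placeholder for the real back-end call (e.g. an HTTPS request to the
    # assistant's API). Invented for illustration.
    return f"Wanda received: {query}"

def handle_alexa_request(event: dict) -> dict:
    """Turn an Alexa-style request into a spoken reply from the back end."""
    request = event.get("request", {})
    if request.get("type") == "IntentRequest":
        # Pull the free-text slot out of the intent and hand it to the assistant.
        slots = request.get("intent", {}).get("slots", {})
        query = slots.get("query", {}).get("value", "")
        answer = ask_wanda(query)
    else:
        # LaunchRequest and anything else gets a generic greeting.
        answer = "Welcome. What would you like to ask Wanda?"
    # Minimal Alexa-style response envelope with plain-text speech.
    return {
        "version": "1.0",
        "response": {"outputSpeech": {"type": "PlainText", "text": answer}},
    }
```

The point of the sketch is the division of labour: Alexa handles the voice front end (wake word, speech recognition, text-to-speech), while the enterprise assistant behind `ask_wanda` does the domain work.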
In the last few years, commercial organizations have come to rely heavily on immersive technology, transforming their operations and even the way markets compete. The internet makes cross-device integration increasingly common, particularly as more devices come online.