For almost a decade, we've talked about wanting to modernize enterprise software user interfaces (UI) to match consumer software, but we've gone about it the wrong way. UI modernization was proposed as a way to meet rising expectations that enterprise software should be as simple to use and as pleasant to navigate as the applications we use at home, from any device. But investing in software UI that merely looks beautiful is a waste of time and resources.
In the 1980s, the television program Knight Rider gave a glimpse into a world where artificial intelligence could learn, communicate, and make independent decisions. The star of the show, a self-aware computer housed in a 1982 Pontiac Trans Am, intrigued viewers, who imagined how the technology could change lives, handling dull or dangerous tasks with a simple spoken command. At the time, such software sounded like a creation of science fiction. Now, less than four decades later, artificial intelligence and machine learning are our reality, communicating with users through natural language interfaces.
It has been a dream of science fiction authors since the advent of computers: hands-free interfaces that can respond to our every whim — without the need to strike a single key.
That future is now closer than ever, with engineers across dozens of industries hard at work designing computers and mobile devices that can interact through simple conversation. These natural language interfaces (NLI) are expected to spread from talking assistants such as Siri, Alexa, and Cortana — and most recently, Samsung's Bixby — to a multitude of interactive apps and programs in the coming months. And, in many cases, they already have.
As it gets easier to connect with one another, the lines between our physical reality and the digital world become increasingly blurred. The million-dollar question, according to one expert in the field of innovation, is whether enterprises will be able to effectively manage both their intelligent machines and their human talent.
The power of voice-controlled intelligence systems like Alexa is no longer reserved for the home. We have enhanced Alexa with enterprise skills, allowing for seamless integration between Alexa and our digital assistant, Wanda. The video below, presented by Thomas Staven of Unit4's Innovation Labs team and me, demonstrates what happened when Wanda met Alexa for the first time, and what it means for the future of enterprise computing.
In the last few years, commercial organizations have relied heavily on immersive technology, transforming their operations and even market competition. The internet makes cross-device integration increasingly common, particularly as more devices become connected. And as it becomes easier to connect with one another, the lines between reality and the digital world become increasingly blurred.