Talk to Siri, Alexa or a company chatbot and sooner or later you hit the “I don’t understand that…” moment. Customers and developers alike dream of that blockbuster-movie AI that understands the context and meaning of what we say, and in reality, we might not be that far off.
Do you feel jealous of sci-fi heroes when they have mundane conversations with their robots or home AIs? Do you feel upset when Siri sends you some pointless Google search in an attempt to help with something she clearly can’t deal with? Or do you wonder, as you type into a chatbot, why it feels like playing something not much more advanced than an early text adventure game, without the reward?
In reality, most bots are designed to deal with only a tiny fragment of what’s going on inside our heads. A travel company’s chatbot can help you find an ideal destination in generic terms, but it can’t really picture your mind’s-eye view of holiday heaven. Similarly, customer service chatbots will successfully deal with 95% of queries, but there’s that odd one (likely the one you have) that they don’t quite get.
Such bots can use self-learning or managed learning to respond better to troublesome queries in the future, but that won’t help you there and then. Instead, the many people with bots, AIs, digital concierges or virtual avatars on their smart home devices, smartphones and other screens are left waiting for that “wow” moment when the AI does something they thought was tough or impossible.
The drive to great AI
Meanwhile, experts and scientists in labs around the world are working on AI avatars and tools that can deliver better, if not great, functions and services. All the major players regularly roll out modest updates that will eventually add up to the super-bots we crave.
On the voice side, it looks like Apple plans to make Siri’s voice more flexible for developers in iOS 14. And AIs are gradually improving in lots of small areas, making the road to genius AI seem like it is taking forever. For example, Samsung makes regular progress with Bixby, as demonstrated in a recent video highlighting how Bixby controls interactions with phones.
Samsung also remains hard at work on the visual front end with Star Labs’ Neon characters, unveiled earlier in the year but still in development. The team is hiring more people for the work, according to its latest posts, but hasn’t shown any sign of moving to a consumer release. This could be the next big thing for AIs and avatars, creating a flashy front end, but what about the underlying smartness?
Are bots getting smarter?
Talking to MIT News, Posh Bots founder and CEO Karan Kashyap highlights some of the problems: “The platforms on the market today are almost spreading themselves too thin to make a deep impact in a particular vertical. If you have banks and telcos and health care companies all using the same [chatbot] service, it’s as if they’re all sharing the same customer service rep. It’s difficult to have one person trained across all of these domains meaningfully.”
“Context understanding allows us to more intelligently understand user inputs and handle things like changes in topics without having the bots break,” Kashyap says. “One of our biggest pet peeves was, in order to have a successful interaction with a bot, you as a user have to be very unnatural sometimes to convey what you want to convey or the bot won’t understand you.”
On another note, SnatchBot founder Henri Ben Era reckons that “right now, there are thousands of NLP models being created, each to solve a particular task. This isn’t the smartest way to advance towards a sophisticated conversational experience with software because there is a lot of reinvention and repetition. We need to move to a meta-model level where chatbot creators can share and enrich each other’s work.”
This creates a gap between bots built for businesses, which have narrow requirements, and the bots on our devices that aspire to be our future virtual assistants, offering a total service that could cover any topic of conversation and fulfil any request.
If the super-bots can take over the role of the narrower bots in the future, we could perform all of our online interactions from one place, which would increase the value of those bots, even if they are interacting with other bots on our behalf below the surface.
MIT Tech Review highlights how AI continues to make waves in customer experience, and from there it will spread across other parts of our lives as our smart home AIs develop more strengths. As the business agenda for bots and the wants of home users meet in the middle, the two ends of the smartness spectrum will slowly merge, giving us the bots we see in the movies and deserve. Soon, we could be just like Tony Stark talking to Jarvis, asking robots to handle whatever tasks we need done, ordering flying taxis and enjoying all the other joys of the future.