- Turn speech into structured data objects.
- Analyze text from any communications channel with a single natural language understanding model.
- Available now in developer preview.
This May at SIGNAL, we announced Twilio Understand, a new TwiML verb that uses natural language understanding to determine the intent of what someone is saying or texting. We’ve seen a lot of interest from people who want to build more intuitive phone trees (IVRs) and smarter messaging bots. Today, we’re excited to share that Twilio Understand is now available in developer preview.
With Twilio Understand, you can not only transform your customers’ digital experience across IVR, SMS, Facebook Messenger, and a variety of other messaging channels, but also build experiences in new channels like Amazon Alexa.
Twilio Understand can be trained to handle everything from simple questions, such as “How late is your Austin store open?” or “I’m calling to confirm my appointment,” to more sophisticated interactions such as rebooking a flight or purchasing health insurance.
Face it: your customers don’t want to navigate phone menus with cumbersome keypad input. Keypad menus are tedious for the customer and slow at capturing intent, which forces you to build and maintain long, deeply nested menus over time. Speech is far more efficient, but parsing speech is not a simple task. Now, with Twilio Understand, you can train models to recognize the intent of a communication, so you can build applications that interact with natural human language.
Here are some areas where Twilio Understand will come in handy:
- IVR and Message Routing: Analyze answers to “What can I help you with?” to determine the best way to route the call.
- Data Capture: Automate data capture to improve lead qualification, support escalations, and new business opportunities.
- Conversational Interfaces: Build fully-integrated bots, trained within the context of your business, with the intelligence to understand human language and help customers without human oversight.
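To make the routing use case concrete, here is a minimal sketch of what a handler for the analysis result might look like. The payload shape (`intent`, `confidence` fields) and queue names are assumptions for illustration, not the actual Twilio Understand schema:

```python
# Hypothetical sketch: route a caller based on a natural-language analysis
# result. The payload fields ("intent", "confidence") are assumptions for
# illustration, not the actual Twilio Understand response format.

# Map detected intents to the queue that should handle them.
INTENT_QUEUES = {
    "billing_question": "billing",
    "store_hours": "self_service",
    "confirm_appointment": "self_service",
    "rebook_flight": "agents",
}

def route_call(analysis, threshold=0.7):
    """Pick a queue for the call, falling back to a human agent
    when the model is not confident enough to guess."""
    if analysis.get("confidence", 0.0) < threshold:
        return "agents"  # low confidence: hand off rather than guess
    return INTENT_QUEUES.get(analysis["intent"], "agents")

# A confident "store hours" question is handled by self service.
print(route_call({"intent": "store_hours", "confidence": 0.94}))
# A low-confidence result falls back to a human agent.
print(route_call({"intent": "billing_question", "confidence": 0.41}))
```

The confidence threshold is the key design choice here: it is usually better to escalate an uncertain call to a person than to misroute it.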
Twilio Understand helps users talk to businesses by speaking naturally. Let’s look at an example of how it can improve a customer interaction:
Letting your customers just say what they need is faster, easier, and leaves them happy. We think every customer interaction can be this good.
How Does It Work?
Twilio Understand can be trained to recognize the intents and entities in both spoken and written language. Developers design, train, and build a natural language application once, and it works across all existing (and future) channels such as voice, SMS, chat, Messenger, Twitter, WeChat, and Slack.
Example: Turning human language into structured data
“I want to book a hotel in London for two people checking in on September 18th and checking out on September 20th.”
Now you can transform the way your business interacts with customers by offering a better, more accurate voice experience. Twilio Understand is available in developer preview today.
To see a live demo of how it works, check out this video from SIGNAL London.
This is only the beginning for voice-driven interfaces built on Twilio. Stay tuned for more.
We can’t wait to see what you build!