For over a decade I’ve worked to make the interactions between machines and humans more natural and efficient. Through visual design, user research, demographic studies, user personas and UX best-practice principles, I have helped create intuitive software products that accelerate task-completion via immersive, conversational language: a critical virtue for any enterprise or consumer-focused digital solution.
The evolution of technology has reduced our dependence on visual interfaces. Virtual assistants such as Amazon Alexa, Siri, Microsoft Cortana, and Google Assistant provide a more conversational experience through voice (or text). In a few years, I’m certain we’ll be able to tell our cars to turn on the lights, set a cruising speed, even tell them where to go—and then sit back and enjoy the ride.
In the enterprise there’s already a growing trend of adding Conversational AI to products and services to boost user engagement and further accelerate task completion. But Conversational UX is a discipline that requires a huge design effort, as it involves sociological and conversational analysis to ensure appropriate, effective language patterns that enable great user interactions. Complex Conversational AI systems transcend traditional chatbots by understanding, empathizing, and responding to users in a much more natural way, independent of context, topic, or request. This capability is powered by Natural Language Processing (NLP).
Conversational User Experience Design
Without getting into the technical aspects of Artificial Intelligence, the approach to Conversational UX design is similar to that of designing a graphic user interface (GUI), since both interfaces need to cover the 5 Phases of Design Thinking.
The best way to create user empathy is by gathering real-life conversations (some of which the Conversational AI itself may have taken part in). Gathering conversational interactions between two humans (agent and customer) allows the team to produce transcriptions for further analysis, as well as to recognize the emotions associated with the conversations: anger, frustration, joy, uncertainty, etc. Additionally, recordings and transcripts help the team better understand typical conversation patterns around queries, commands, and requests.
Besides experimenting with real-life situations, holding interviews with users is essential for filling in details around the meanings, attitudes, and motivations that might not arise in casual conversation. For instance, during these interviews, users can speak to past situations and discuss how they influenced their experience and mood.
Once you have the results of your user research, the team will be able to define one to three user personas and map those users’ needs when interacting with a virtual assistant. For instance: requesting answers, recommendations, or assistance. These personas could also document user pain-points and gain points, allowing teams to determine which specific tasks to focus on first.
During this process, the team analyzes recordings and transcripts to familiarize themselves with natural conversation patterns. For example, turns in conversation, deviations from the main point, interruptions, moments of silence, multiple words for the same concept, figures of speech, and expressions that aren’t necessarily relevant to the conversation but which denote feelings (e.g., mm hm, Ooh!, ha ha, etc.).
The assistant’s personality is also defined in this phase. Since the AI provides communication around the brand, it must reflect the tone and values of the company; a friendly, relaxed personality might be appropriate for a restaurant assistant while a formal, straightforward tone might work better for a medical insurance assistant.
Ideation at this stage also involves brainstorming to define how the virtual agent will respond to every type of utterance, be it on- or off-topic, a request or a demand, impassioned or lighthearted.
Prototypes are the most effective tool for envisioning and testing design solutions without fully implementing them—enabling the detection of errors and UX gaps in early stages and keeping the budget in check. In contrast with visual design, the first round of Conversational UX mockups is focused on creating transcripts that demonstrate interaction scenarios between the user personas and agent profiles developed in the previous steps. These transcripts represent conversations that link user utterances with context as a conversation progresses. That is, they diagram how the assistant will empathize with and understand the user to create a better bond and find the best resolution for the user’s request. All these scenarios try to encompass the intent-entity-context-response paradigm and build conversation logic that is translated into pseudo-code during the design phase.
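The intent-entity-context-response paradigm can be sketched as a simple data structure. This is an illustrative sketch only—the class name and fields are assumptions, not a real framework API—showing how each scripted turn in a transcript ties a user utterance to the intent, entities, context, and reply the assistant would work with:

```python
from dataclasses import dataclass, field

# Hypothetical structure for one scripted turn in a prototype transcript.
# Field names follow the intent (#), entity (@), context ($) conventions
# described later in this article.
@dataclass
class ScriptedTurn:
    utterance: str                                  # what the user says
    intent: str                                     # classified goal, e.g. "#book_table"
    entities: dict = field(default_factory=dict)    # matched keywords, e.g. {"@time": "7pm"}
    context: dict = field(default_factory=dict)     # variables carried across turns
    response: str = ""                              # the assistant's scripted reply

turn = ScriptedTurn(
    utterance="Can I get a table for two at 7?",
    intent="#book_table",
    entities={"@party_size": "2", "@time": "7pm"},
    context={"$restaurant": "downtown"},
    response="Sure! Booking a table for two at 7pm.",
)
print(turn.intent, turn.entities["@time"])
```

Writing transcripts against a fixed shape like this makes it easier to check that every scenario accounts for all four parts of the paradigm before any implementation work begins.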
Commands in Dialog Design:
- Create condition (if). Conditions are compared to input utterances.
- Assign default (else). Assign action if no conditions are met.
- Set variable (set). Capture the context of the current input for future turns.
- Route to node (goto). Route to another dialog node.
- Respond to user (say). Output text to the user.
In creating the first type of basic action: “create condition,” designers typically combine the following elements:
Components of Dialog Conditions:
- Intents (#). Linguistic classes against which the similarity of a text input can be scored.
- Entities (@). Keywords or phrases to be matched exactly.
- Context ($). Variables for capturing events in the conversation.
From Robert J. Moore’s “Conversational UX Design: A practitioner’s guide to the natural conversation framework.”
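The three components above can be combined into a single dialog condition. The helper below is a hypothetical sketch (the function name and input shape are assumptions) of how a "create condition" action might test an incoming turn against an intent, an entity, and a context variable at once:

```python
# Hypothetical helper: a condition matches only if every supplied
# component (intent, entity, context variable) is present in the turn.
def condition_matches(turn, intent=None, entity=None, context_var=None):
    if intent is not None and turn.get("intent") != intent:
        return False
    if entity is not None and entity not in turn.get("entities", {}):
        return False
    if context_var is not None and context_var not in turn.get("context", {}):
        return False
    return True

turn = {
    "intent": "#order_pizza",
    "entities": {"@topping": "mushroom"},
    "context": {"$address": "123 Main St"},
}

# Matches: both the intent and the entity are present in the turn.
print(condition_matches(turn, intent="#order_pizza", entity="@topping"))
# Fails: no $payment variable has been captured yet.
print(condition_matches(turn, context_var="$payment"))
```

Real dialog engines evaluate richer expressions than this (fuzzy intent scores, entity values, boolean operators), but the and-of-components logic is the core idea.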
During the pseudo-code process, small interactions called node branches are identified. These branches form the basis of the conversation structure and cover all possible paths, allowing virtual assistants to respond and interact in a fluent, natural way, regardless of the user’s command type.
Pseudocode – Node Branch.
- if #EXAMPLE_REQUEST
  - if $example has value
    - say $example
  - else say “I’m afraid I can’t think of an example.”
When building the prototype, scenarios will incorporate personas and assistant profiles to create start-to-finish dialogs, using node branches to cover every conversation path.
The objective of prototype dialogs is to identify the cases where users might disengage from the conversation with the virtual assistant, and to account for these cases by providing additional node branches. These tests could be called beta versions, though they will not necessarily be implemented in the chatbot or AI. Iterating on these dialogs with the team, stakeholders, and users will strengthen your conversational UX and better prepare it for version 1 implementation.
Our recent Amelia platform partnership takes Anexinet’s UX design team to the next level by enabling us to design creative enterprise Conversational UX solutions. As effective as our team has been at delivering great visual experiences, we are working to achieve that same quality and effectiveness with our Conversational Interfaces—providing intuitive solutions, enabling lightning-fast responses, and fostering a special, immersive bond with your customers and users. Please take a moment to check out Amelia’s skills; you’ll find they feel very natural. Almost human. To learn more about how your organization can achieve greater loyalty and satisfaction through Conversational AI, check out our Conversational AI Strategy & Roadmap Kickstart. In just three weeks, our accelerator helps your organization transform into a first-class customer experience organization by developing the perfect Customer Service Automation Solution Strategy.
Lastly, some concepts and ideas were taken from Robert J. Moore’s book, “Conversational UX Design”: a great practitioner’s guide to Natural Conversation Design.