# IxD17 - "Design Is All Talk" (CUI Workshop)

It is my favourite time of the year as a designer: the IxDA Interaction conference!

And for the first time since it started, I didn't have to leave my current domicile to experience it! This year, the conference came to NYC.

It had been a while since I took a workshop before the main conference (the last time was a full-day mobile prototyping workshop at Interaction 12 in Dublin). This time around, it couldn't have covered anything more exciting and relevant than chatbots & Alexa!

RGA offered a very interesting workshop at their NYC studio called "Design is All Talk: Conversational UI's Impact on Brands" (with that title, I simply couldn't miss it).

CUIs & NUIs are something I've been looking into more closely in the past couple of years --basically ever since Alexa entered my home and changed the way I interact with "the wired" (any LAIN fans out there?).

The workshop was primarily about getting to know all the intricacies behind "chatbots": what they are, how they are manifesting in various ecosystems (especially with brands), and what to keep in mind when building and designing a bot.

We learned how closely related chatbots are to Artificial Intelligence and Machine Learning (which is all about identifying patterns, learning from and adapting to users). These were also themes that would come up in talks throughout the conference.

It was also clear just how important chatbots are in today's interactions with the web: they are where people already are. Users don't need to download an app to interact with a bot. A bot lives in Facebook Messenger, WeChat, iMessage, or just about any website you can visit (and buy things from or get help on).

One of the biggest takeaways from the workshop was a set of principles to apply when designing for bots:

  • Use them! (so you can understand them). Seems obvious, but advice like this is important to always keep in mind --especially when dabbling in new technologies.
  • Choose Wisely: chatbots are limited in capability. They can't do it all. Instead, how can they complement other channels in your product/service/company?
  • Don't Leave People Hanging: always provide an answer, even when you don't know the answer.
  • Learn How To "Hide" Most Information: try to get as close to the desired answer in one question. This is one of the most crucial differences from the ability to "browse" all the information (on the web).
  • Respect Your Customer: avoid building a bot that's "too clever"; be mindful of gender pronouns; keep flows simple; and offer witty (funny) responses only for unsupported topics.
  • "Troll Your Own Bot": there is no better way to test it. Early adopters will surely do this, and so should you as a designer.
  • Be Honest: set expectations on how you're gonna interact with the bot.
  • "Hug a Tree": decision trees are the basis for a rules-based or a Natural Language Processing (NLP) bot. (You'll need a flow for your dev anyway!) It'll help you think through each step from the user's perspective.
  • Prototype: learn by observation. Learn what people are writing in. Refine along the way.
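The "Hug a Tree" principle can be made concrete with a quick sketch: a rules-based bot is essentially a walk down a decision tree, with a fallback so you never leave people hanging. The node names, prompts, and choices below are invented for illustration, not from any real product.

```python
# A minimal rules-based bot driven by a decision tree.
# Each node has a prompt and a mapping of user choices to the next node.
TREE = {
    "root": {
        "prompt": "Hi! Do you want to (a) track an order or (b) talk to support?",
        "choices": {"a": "track", "b": "support"},
    },
    "track": {
        "prompt": "What's your order number?",
        "choices": {},  # leaf node: would hand off to an order lookup
    },
    "support": {
        "prompt": "Okay, connecting you with a human. Anything else?",
        "choices": {},
    },
}

def step(node: str, user_input: str) -> str:
    """Advance one step down the tree, re-prompting on unrecognized input."""
    choices = TREE[node]["choices"]
    next_node = choices.get(user_input.strip().lower())
    if next_node is None and choices:
        # "Don't Leave People Hanging": stay on this node and ask again
        # rather than dead-ending the conversation.
        return node
    return next_node or node

print(TREE["root"]["prompt"])
node = step("root", "a")
print(TREE[node]["prompt"])
```

Sketching the tree as data like this is also roughly the flow your developer will need anyway, which is the point of the principle.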

When it comes to this last point, we were shown a really great tool called Reply.AI, which not only helps to prototype bots, but also makes it possible to quickly build one for real! The tool is primarily optimized for bots deployed in Facebook Messenger, but it can be used in other ecosystems as well. Very powerful tool --definitely check it out!

Once we were done talking about bots, the workshop turned to something I'm even more keenly interested in: Designing for Alexa.

This is an area I've been wanting to learn more about in the last couple of years, and this workshop certainly made me understand all sorts of different elements I'll need to have in mind when designing skills for Amazon's voice assistant.

We were told all about the Alexa Skills Kit; the Alexa Voice Service; and AWS Lex & Polly (for apps). We also learned about some of the key components, such as Automatic Speech Recognition (ASR), Natural Language Understanding (NLU) and Text-To-Speech (TTS).

Learning all about the structure of an Alexa command.
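The command structure we were walked through breaks down roughly into a wake word, a launch phrase, a skill invocation name, and the utterance itself. A toy parser makes the anatomy visible; the regex and the "MyHost" skill name are my own illustrative assumptions, not part of the Alexa Skills Kit.

```python
import re

# Toy breakdown of an Alexa command:
#   wake word + launch phrase + invocation name + utterance
# e.g. "Alexa, ask MyHost where the best tacos are"
COMMAND = re.compile(
    r"^(?P<wake>Alexa),?\s+"          # wake word
    r"(?P<launch>ask|tell|open)\s+"   # a few common launch phrases
    r"(?P<invocation>\w+)\s*"         # skill invocation name
    r"(?P<utterance>.*)$",            # the rest is the utterance
    re.IGNORECASE,
)

def parse(command: str) -> dict:
    """Split a spoken command into its structural parts (empty dict if no match)."""
    m = COMMAND.match(command)
    return m.groupdict() if m else {}

parts = parse("Alexa, ask MyHost where the best tacos are")
# parts["invocation"] is the skill name; parts["utterance"] is what NLU resolves
```

In a real skill the utterance would then be matched against sample utterances to resolve an intent and its slots, which is where ASR and NLU do the heavy lifting.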

We were also presented with some key aspects of conversation with voice:

  • Make prompts for responses clear to users.
  • Provide a clear path to help users.
  • Keep it brief --unless the user is asking for details.
  • State management is critical to maintain context for conversations.
  • Leverage re-prompting and re-stating to clarify.
  • Prompt users to view visual cards.
  • Keep conversations open until the user is done.
  • Always ask "would it be much faster for the user to use an app or the web?"
  • Design for your ears, not your eyes.
  • Don't lift and shift your app into a conversational interface. Use the power of constraint.
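Two of the points above --state management and re-prompting-- can be sketched together: the bot remembers what it asked across turns, and re-states the question when a reply is unclear. The cuisine slot and canned responses are invented for illustration.

```python
# A single dialog turn for a hypothetical food-finding voice flow.
# `state` persists between turns, carrying the conversational context.
def handle_turn(state: dict, user_text: str) -> str:
    text = user_text.strip().lower()
    if state.get("awaiting") == "cuisine":
        if text in ("tacos", "pizza", "sushi"):
            state["cuisine"] = text
            state["awaiting"] = None  # slot filled; context resolved
            return f"Great, looking for {text} nearby."
        # Re-prompt and re-state to clarify, keeping the conversation open.
        return "Sorry, I didn't catch that. Tacos, pizza, or sushi?"
    # No pending question yet: ask one and remember that we did.
    state["awaiting"] = "cuisine"
    return "What kind of food are you in the mood for?"

state = {}
handle_turn(state, "find me food")   # opens the question
handle_turn(state, "burgers")        # unclear reply triggers a re-prompt
handle_turn(state, "tacos")          # context survived across turns
```

Without the `state` dict, every turn would start from scratch --exactly the failure mode the "state management is critical" point warns about.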

This last point really resonated with me --especially since this is one of the most important aspects I take into account when designing for mobile: constraint is an opportunity, not a liability. Using the power of constraint is also one of the points I like to make clearest, whether at work or when teaching the "Introduction to Mobile UX Design" class at General Assembly.

The constraints with voice (and CUIs) are on a whole different level, which makes it very interesting.

At the end of the seminars, the workshop turned from listening to doing. The group was divided into 4 separate sub-groups: 3 designing a chatbot using Reply.AI and one designing for Alexa.

I chose to work for the "lady from Seattle".

The scenario we worked with was one that I would actually find very useful (and can't wait to see in real life). Assuming a partnership with Airbnb, we designed a few scenarios around "MyHost": an Amazon Echo using Alexa as a virtual host in a home rented through Airbnb.

We brainstormed all sorts of different commands a guest would be likely to ask Alexa when staying somewhere through Airbnb. We thought of the things Alexa would say to the renter upon entering the space for the first time; the types of things a guest would ask that are relevant to this particular context; and we finally developed the scenario of "asking for a good place to eat nearby" in the new city. 

We did all of this using decision trees, as well as exploring various ways of composing a sentence to deliver the same intent behind a question or command.
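That exercise --many phrasings, one intent-- is easy to sketch: several sample utterances all resolve to the same intent, with a fallback so the bot always answers. The intent names, sample phrases, and keyword matching below are illustrative assumptions standing in for real NLU.

```python
# Hypothetical sample utterances for the "MyHost" scenario, grouped by intent.
SAMPLE_UTTERANCES = {
    "FindFoodIntent": [
        "where should i eat",
        "recommend a restaurant nearby",
        "i'm hungry, any suggestions",
    ],
    "WifiIntent": [
        "what's the wifi password",
        "how do i get online",
    ],
}

def resolve_intent(utterance: str) -> str:
    """Map a phrasing to its intent; exact matching stands in for real NLU."""
    u = utterance.strip().lower()
    for intent, samples in SAMPLE_UTTERANCES.items():
        if u in samples:
            return intent
    # "Don't Leave People Hanging": always resolve to something.
    return "FallbackIntent"

resolve_intent("Where should I eat")  # different phrasings, same intent
```

Enumerating utterance variants like this is essentially what the decision-tree exercise produced on paper: the intent stays fixed while the surface phrasing varies.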

Here's the final deliverable we came up with at the end of the workshop, which a couple of us presented to the rest of the workshop participants:

Alexa sure knows where the best tacos in town are...

We had every intention of building a quick real demo using an Echo Dot as part of the presentation, but we ran out of time.

Overall, this was a really informative and relevant workshop to be a part of, not only because it's a topic I'm deeply interested in, but because the very next day we would all get almost an entire day of chatbot and conversational/natural user interface talks during the first day of Interaction 17.

(more on all the talks I saw during the conference coming soon...)

I would like to sincerely thank RGA for offering such an amazing workshop to all of us who attended this conference. It sure was a fantastic way of kicking off this year's Interaction conference on the right note.