I was invited to speak at the first ASOS UI event late last year - an evening of learning for front-end engineers. It was a fascinating evening, with insights to be gleaned from both speaking and listening.
The key takeaway is that Artificial Intelligence is quietly revolutionising customer experience for many companies. Tools have become readily and cheaply available. It's now possible to build complex apps that can understand what a person wants to do by having a conversation.
The talk really needed a story on which to hang the more technical aspects. I went right back to primary school and crafted a beginning, a middle, and an end to help guide my audience through the talk.
When we launched a new product in the Russian market, aimed at Russian speakers, we did a lot of research to try to understand the idiosyncrasies of users in that market. Other Russian websites are really plain: black text with blue links on a white background.
Several rounds of designs and user testing later, the result was a really complex interface with deeply nested information. It looked nothing like the initial trends we’d seen. We were completely blinkered initially: because we didn’t speak the language, we weren’t looking at the content of the sites, only at the aesthetics. What we saw as simple designs were actually very information-heavy sites.
People in Russia didn’t have any concept of free commerce until 25 years ago and no credit cards until about 2007. They’re just not comfortable buying anything online without swathes of information to dive into.
This presented us with a problem. Our admin tools simply couldn’t be used to nest information in this way.
We really try to deliver simplicity and great customer experience to both users and also the people managing the sites. We just wouldn’t settle for a clunky fix.
We set out to evaluate tools we could use to build this novel user interface. What we found in React was perfect. First, it enabled us to approach development by splitting components into vertical silos rather than horizontal layers. It was akin to the invention of the assembly line. Rather than each member of the team building something in isolation from beginning to end, we could now think about building individual components and assembling them further down the line.
React also gave us the ability to define a user interface independent from the data that would be displayed. Passing a list of things into a list component would generate a row for each item on the list without needing prior knowledge of how it was all put together.
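To illustrate the idea (this is a minimal sketch, not our actual code, and React itself abstracts this far more elegantly): a list "component" can be defined with no prior knowledge of the data it will display, and passing in a list of things generates a row per item.

```typescript
// How each item is turned into a row is supplied from outside;
// the component itself is data-agnostic.
type Renderer<T> = (item: T) => string;

// The list component only knows layout: wrap rows in a <ul>.
function listComponent<T>(render: Renderer<T>): (items: T[]) => string {
  return (items) =>
    `<ul>${items.map((item) => `<li>${render(item)}</li>`).join("")}</ul>`;
}

// A product list is just the generic component plus a renderer.
const productList = listComponent<{ name: string }>((p) => p.name);

productList([{ name: "Boots" }, { name: "Scarf" }]);
// → "<ul><li>Boots</li><li>Scarf</li></ul>"
```

The same `listComponent` could render users, orders, or anything else by swapping the renderer - which is the separation of interface from data that made the nested Russian designs manageable.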
More recently, I gave the same talk at Voxxed Days in Zurich, and you can watch it in full here:
Another benefit of attending events like ASOS UI is that the breadth of talks is really inspiring. Here is a roundup of the things that really caught my attention.
Peter Gasston and Arran Ross-Paterson talked about how user interfaces will develop in the near future. State-of-the-art technology always raises expectations for successive generations of users. Google blew people away with a search box that didn’t require complex boolean syntax to find results. However, search boxes are now the bane of engineering teams across the globe. We can easily add simple text matching, and even account for spelling mistakes and synonyms. That’s just not good enough any more; Google has raised expectations.
Here are some examples of searches we’ve had on sites we’ve built:
You and I can figure out the user’s intent quite easily, but to make a computer understand requires something called Natural Language Processing. This is a branch of Artificial Intelligence. Even 18 months ago, unless you had the resources of Google, the thought of Natural Language Processing was just a pipe-dream.
The entire landscape has quietly changed in recent months. Artificial Intelligence is now fully commoditized. I can add a credit card and simply pay as I go for artificial-intelligence-as-a-service and build smart apps on top of it.
We’re now in a position to build some sophistication into our search boxes. If a user who searches for “cheap honeymoon destinations” clicks on Bali, we can teach our app about the user’s intent and improve the experience for subsequent searchers.
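As a toy sketch of that feedback loop (a real system would sit on top of an NLP service rather than a lookup table, and these function names are mine, not from any product): record which result users click for a query, then surface the most-clicked result for subsequent searchers.

```typescript
// Clicks observed so far: query → (result → click count).
const learned = new Map<string, Map<string, number>>();

// Teach the app: a user who searched `query` chose `result`.
function recordClick(query: string, result: string): void {
  const counts = learned.get(query) ?? new Map<string, number>();
  counts.set(result, (counts.get(result) ?? 0) + 1);
  learned.set(query, counts);
}

// The result most often clicked for this query, if any.
function bestGuess(query: string): string | undefined {
  const counts = learned.get(query);
  if (!counts) return undefined;
  return [...counts.entries()].sort((a, b) => b[1] - a[1])[0][0];
}

recordClick("cheap honeymoon destinations", "Bali");
recordClick("cheap honeymoon destinations", "Bali");
recordClick("cheap honeymoon destinations", "Lisbon");

bestGuess("cheap honeymoon destinations"); // → "Bali"
```

The interesting part isn’t the data structure - it’s that intent is inferred from behaviour rather than demanded from the user.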
Ultimately this is heading in the direction of allowing us to engage with products in the way we engage with each other: conversation. If we can understand a user’s natural language then we’re in a good position to switch the app to the user’s context rather than asking the user to switch to our context.
“Hey, Siri” and “Ok, Google” are already commonplace, but WeChat in China is taking it further. Their City Services product allows you to message a dentist, plumber, taxi driver or even a doctor to arrange an appointment. The conversation sits naturally alongside chats with friends. It’s just a natural extension of how people communicate.
This means one thing: the next generation of users will have conversations with products so regularly that you won’t be able to afford not to offer that means of communication. By Christmas your children may be playing with Hello Barbie, a doll with a conversational interface.
The revolution has already started.
Josh regularly writes on the future of technology on Medium. Follow his posts here.