Artificial Intelligence and the Future of UX

This was partially written as an answer to a Quora question.

How do you think AI will impact the UX design world?

We are already seeing the beginnings of this with chat bot interaction, voice interaction, and augmented reality.

In the short term, there’s going to be a lot of experimentation and growth around these new methods of interaction. I think UX practitioners will need to be more grounded in HCI and able to work in the worlds of psychology and sociology.

The biggest issues to solve will not result in a cool shot to post to Dribbble.

As we move into augmented and virtual reality, we’re going to see development and design start to affect more than just the visual channel. More aural and haptic interactions will make experiences feel more integrated.

AI will power many of these interactions, and as we discover and develop best practices, we’ll start to codify the practice of artificial intelligence interaction design.

How will we interact with complex applications such as CAD or 3D modeling when it’s more efficient to speak commands than to use traditional input devices?

How will AI improve pattern matching and the integration of digital information onto physical environments? I see this as one of the largest boons for the AEC (architecture, engineering, and construction) industry. When a user can open Procore or Co Construct on their phone, have it integrate with a pair of (not awful looking) glasses, and start to see updates as they walk through a construction project? That’ll be step 1.

Step 2 (or possibly a parallel step 1) will be when the AI starts to predict common issues. There will be a phase of tuning the AI until there is enough data for it to start processing and adjusting things on its own. You’ll still need something more akin to a traditional interface for a small subset of users to interact with the AI.

When that tech makes it down to the consumer level, you’ll see a mix of pattern matching with predictive behavior (already happening on the iPhone) and advertising that will be beneficial in many circumstances but probably overwhelming for most people who didn’t grow up with it.

My phone already suggests a possible destination when I get in my truck around a certain time (typically around when I pick up or drop off my kids for school). It’s easy if there’s a calendar event, but otherwise it’s guessing based on my repeated trips from home to preschool and back again.
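Stripped down, that kind of guess is mostly frequency counting over past trips at a similar time. Here is a minimal sketch of the idea, assuming trip history is available as (weekday, hour, destination) records; the record format, function name, and threshold are all hypothetical, not how any phone OS actually does it.

```python
from collections import Counter
from datetime import datetime

def suggest_destination(trip_history, now=None, min_trips=3):
    """Guess a likely destination from repeated trips at a similar time.

    trip_history: list of (weekday, hour, destination) tuples, e.g. logged
    whenever a drive starts. Entirely hypothetical format for illustration.
    """
    now = now or datetime.now()
    # Consider past trips that started on the same weekday within +/- 1 hour.
    similar = [
        dest for weekday, hour, dest in trip_history
        if weekday == now.weekday() and abs(hour - now.hour) <= 1
    ]
    if not similar:
        return None
    dest, count = Counter(similar).most_common(1)[0]
    # Only suggest once the pattern has repeated enough times to be meaningful.
    return dest if count >= min_trips else None

# Example: weekday-morning trips to preschool dominate, so that's the guess.
history = [(0, 8, "Preschool"), (0, 8, "Preschool"), (0, 8, "Preschool"), (2, 14, "Grocery")]
print(suggest_destination(history, datetime(2024, 1, 1, 8)))  # Monday 8am -> "Preschool"
```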

That’s just the tip of the iceberg.

On the design and professional practice side, Krishna Nandula points us to the tip of a different iceberg. Wix’s ADI tool is really just a very complex pattern-matching machine, but tools like it are the beginnings of design by computer. So much of what was “web design” when I started is now completely handled by templates, site-building tools, and content management systems (e.g. WordPress).

We will likely see more tools that analyze actual customer use of products (imagine FullStory, but with a user researcher built in), identify problematic workflows early, and suggest best practices. Heuristic analysis will probably become mostly automated. Design tools will suggest, and eventually adjust, color palettes to handle color blindness and low-contrast issues.
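The low-contrast piece is already well defined: WCAG 2.x specifies a relative-luminance formula and a contrast ratio that a design tool could compute automatically. Here is a small sketch of that check; the function names and the way the AA threshold is applied are my own framing, not any particular tool’s API.

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance for an (r, g, b) color with 0-255 channels."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def flag_low_contrast(fg, bg, threshold=4.5):
    """Flag color pairs below WCAG AA for normal-size text (4.5:1)."""
    ratio = contrast_ratio(fg, bg)
    return ratio < threshold, round(ratio, 2)

# Example: light gray text on white fails AA; black on white passes easily.
print(flag_low_contrast((170, 170, 170), (255, 255, 255)))  # (True, ~2.32)
print(flag_low_contrast((0, 0, 0), (255, 255, 255)))        # (False, 21.0)
```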

We will probably get to the point where we can design a system and it will build the UI in HTML, requiring only some wiring between the database and the application code. Heck, we may not even be working in HTML at that point. Who knows :)
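As a toy illustration of that “describe the system, get the UI” idea, here is a sketch that turns a small declarative field spec into an HTML form; the spec format and helper are entirely hypothetical, and the output still needs the wiring to application code mentioned above.

```python
from html import escape

def build_form(fields, action="/submit"):
    """Render a tiny declarative spec into an HTML form.

    fields: list of dicts like {"name": "email", "label": "Email", "type": "email"}.
    The spec format is made up purely for illustration.
    """
    rows = []
    for field in fields:
        name = escape(field["name"])
        label = escape(field.get("label", field["name"].title()))
        input_type = escape(field.get("type", "text"))
        rows.append(
            f'  <label for="{name}">{label}</label>\n'
            f'  <input id="{name}" name="{name}" type="{input_type}">'
        )
    body = "\n".join(rows)
    return (
        f'<form action="{escape(action)}" method="post">\n'
        f"{body}\n"
        '  <button type="submit">Save</button>\n'
        "</form>"
    )

# Example spec: two fields; the generated markup still needs to be tied to
# whatever application code handles the submitted data.
spec = [
    {"name": "email", "label": "Email", "type": "email"},
    {"name": "password", "label": "Password", "type": "password"},
]
print(build_form(spec))
```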

As with all great UX, you’ll know it’s great when it doesn’t stand out too much to you. It’s going to be a very interesting few decades in the craft.