Get ready UXers, it’s time to get up close and personal with virtual reality. As a UX practitioner, I’ve been closely watching the mainstreaming of VR and how it’s evolving. I’m particularly interested in how user interface design methods and principles will have to adapt to this immersive, three-dimensional world. In particular, the mainstreaming of VR technologies in industries such as gaming, home theater, and live event broadcasting will open up new environments that UXers must master. eCommerce will soon follow, leading to a new revolution of truly adaptive design. As an eCommerce expert, I’m also keenly interested in how UX designers can elegantly expose common design patterns to enrich VR experiences, such as search, browse, refinement, and purchasing.

Out with the old, in with the new

Recently, I had an opportunity to demo some virtual experiences on the HTC Vive, and I came away with a few interesting observations. Familiar design patterns have been thrown out the window: elements such as top navigation and mega menus have been replaced by “dashboards” that are exposed via gestures. Yet, as with traditional two-dimensional screens, primary and secondary focus will be the foundation for designers to correctly expose and overlay content opportunities. The real X-factor in virtual reality will be depth of field: teaching the user that content can be initiated from the primary or secondary viewspace via ever-available menu components.

A UXer’s responsibility will be to create an organized design system that allows the user to stay focused on their primary task while simultaneously providing access to multiple avenues of content. If I’m sounding too UX-geeky for you, just imagine a few potential future VR scenarios:

Immersive action-adventure ads
Let’s imagine you sit down to watch your favorite show or movie. You have your wearable remote and lightweight headset on, and you are ready for some action adventure! During the show, you really like the shoes the villain is wearing. With a quick touch and swipe off to the side of the show, a product information module is exposed, providing additional details and a marketplace of options for where to buy right now. Even in the infancy of VR retail, the ability to access (and navigate) additional information or content “tasks” in an interface alongside your primary viewing experience will be paramount; imagine watching a show and pulling up a commercial for a new shoe line off to the side. From a TV/home theater perspective, the ability to extend your physical viewing experience will definitely be the next interface frontier: multitasking at its best!

Virtual front-row tickets
You’ve bought an immersive ticket for the latest playoff game, and you have access to multiple views that drop you straight into seats at the stadium. Along with a great view of the action, you are keeping up with fans’ tweets as the game unfolds. Let’s admit it: you are a fantasy nut and have to see your team’s progress and other real-time scores while watching the big game. Don’t like the stats ticker covering part of your view? Move it off your primary viewing experience. This is adaptive design at its most complex.

UX considerations for VR

Going forward, UXers will have to figure out what to account for in immersive design systems, what toolbox of design principles and information systems will be relevant, and how to cross over from the current 2-D interface world into a new 3-D one. While we have no way to know for sure what that transition will look like, I think the following considerations will be top-of-mind as UXers design the experiences of virtual reality:

Common design elements: What uses of foreground/background, left side/center/right side or carousel wheels will be solidified as standard design elements in an immersive experience (similar to the home button in upper left/top center or the cart in the top right in traditional 2-D interfaces)?
Contextual clues: How will users open/close/access content and information?
Use of focus: How will focus be used to ensure optimal performance, rendering sharply only what the eye is focusing on?
Gestures: UXers better be ready to understand and define what users are doing (and can do) with their hands. Gestures will be king, and may in fact hold clues as to what actions can be performed in any given experience. Static navigation and menus may be replaced by a flick of the wrist or a nod of the head.
Acoustical cues: How will acoustic cues help orient the user within the environment and the task at hand?
Shared experiences: What will the collaborative VR experience look like when you’re sitting beside your wife and kids in your living room? A staple of family living is the ability to come together for a shared experience, whether that’s a live audience singing show, a live sporting event, or a movie. All of these will need the capability to extend personal preferences beyond visual ones (for example, Haley has Snapchat and Instagram running; Henry keeps an eye on Clash of Clans; Mom has a Facebook chat open; and Dad has work email open on the side) while keeping everyone in the same immersive context.
UI balance/equilibrium: What ramifications will multiple interfaces (and their inherent gestures) in a 3-D environment have on our sense of balance and equilibrium? When viewing multiple pieces of content, when is enough enough? Is there a natural limit to what a user can consume in a VR environment?

Only time will tell which patterns will emerge for VR applications, but one aspect will always hold true for the user experience: successful UXers build a cohesive design system by understanding the customer and their needs and goals, deciphering the environment(s) in which they are interacting, and matching tools to their behaviors. Designing based on experiential evidence will become ever more imperative as VR continues to push the limits of how we integrate our digital environment with reality.