Want to explore "what if"?

AR/VR and how to interact with gesture technology

Written by Diogo, our former Ideas Factory Manager

Heads up: Our Ideas Factory has been refreshed, levelled up, and grown up into Alphero Intelligence. Some of our old posts are pretty cool tho'. Check this one out.

[Image: HTC Vive controller]
  • AR and VR technology hasn't quite taken off (yet) in the mainstream, but has instead found very specific applications.
  • The big thing first-time users have to re-learn about digital navigation is gestural interaction.
  • We built a few prototypes to test how natural it feels to interact with AR by way of gesture.

One of the areas that currently feels most like “the future is here” is virtual and augmented reality (AR/VR for short). There are enormous possibilities for creating new worlds and overlaying information in front of people’s eyes for context, and good uses are popping up around the globe. A favourite with the Alphero team is the use of VR to help children get through vaccinations in Brazil.

The many applications and potential shortcomings of AR/VR are a topic for a larger discussion. For now, the technology hasn’t found widespread use outside specific campaigns and the gaming world.

We’re here to talk about a very specific aspect of AR/VR experiences — how you interact with it! *jazz hands* 👐

Where do I click?

Pretty much all VR/AR headsets on the market today (Oculus, HTC, HoloLens) come with a remote-control-looking thing with buttons encrusted all over. Some manufacturers call them joysticks, some call them controllers, and they do more than just hold buttons: they also work with the headset to detect where the user’s hands are in space, and some feature a little touchpad to detect finer movements such as scrolling left, right, up and down.

It’s up to different developers to translate what each user action means in their AR/VR application. Some will make the back button grab something, while others will have the top one do that. Depending on the application, a button can mean point and hold, or enlarge something in your view. The range of actions depends largely on the task, and unsurprisingly users can find interacting with such devices confusing and unnatural.
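
To make that concrete, here’s roughly what the “developer decides” part looks like in Unity, using Unity’s XR input API. The class and the OnGrab/OnScroll handlers are our own illustration, not any headset’s official mapping; the hardware just reports “grip pressed”, and everything after that is the app’s own vocabulary.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: the same physical grip button means "grab" in one app and something
// else entirely in another. The mapping lives in app code, not the hardware.
public class ControllerActions : MonoBehaviour
{
    bool wasGripped;

    void Update()
    {
        var device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!device.isValid) return;

        // Grip button -> "grab": the choice of button is arbitrary and app-specific.
        if (device.TryGetFeatureValue(CommonUsages.gripButton, out bool gripped))
        {
            if (gripped && !wasGripped) OnGrab();
            wasGripped = gripped;
        }

        // Touchpad / thumbstick -> scroll, like the Vive's little touchpad.
        if (device.TryGetFeatureValue(CommonUsages.primary2DAxis, out Vector2 axis)
            && axis.sqrMagnitude > 0.01f)
        {
            OnScroll(axis);
        }
    }

    void OnGrab() { Debug.Log("Grabbed"); }                 // app-specific meaning
    void OnScroll(Vector2 dir) { Debug.Log("Scroll " + dir); }
}
```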

Put your hands in the air!

Gesture detection technology has been around for a while (carmakers have been experimenting with it since 2012), so we decided to give it a go. How good is it?

[Image: Jeff gets ready to take his car for a spin]

Using Unity and Leap Motion, we set out to build a few prototypes to test exactly that. It turns out there are cheapish devices that give surprisingly good accuracy and respond very well to the user’s hands. Driving a car in a VR space would be a cumbersome experience with controllers, but with gestures you can hold your hands up where a steering wheel would be and turn them where you want the car to go. The challenge then becomes not falling off your chair if you turn too aggressively and flip the car (or so I’ve been told by a friend).
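
For the curious, the maths behind that kind of steering is pretty simple. Here’s a simplified sketch: the palm positions are plain fields to keep it self-contained, but in a real prototype they’d come from your hand tracker (Leap Motion, in our case) every frame.

```csharp
using UnityEngine;

// Sketch: read the "wheel" angle from the line between the two palms.
public class GestureSteering : MonoBehaviour
{
    public Vector3 leftPalm;     // set these from your hand-tracking provider
    public Vector3 rightPalm;

    [Range(1f, 90f)] public float maxSteerAngle = 45f;  // degrees for full lock

    // Returns a steering value in [-1, 1]: 0 = hands level, ±1 = full lock.
    public float GetSteering()
    {
        Vector3 acrossWheel = rightPalm - leftPalm;

        // Tilt of the hands' line relative to horizontal, assuming the player
        // faces down the z axis. Right hand up gives a positive angle; whether
        // that means a left or right turn is a convention for your car code.
        float angle = Mathf.Atan2(acrossWheel.y, acrossWheel.x) * Mathf.Rad2Deg;

        return Mathf.Clamp(angle / maxSteerAngle, -1f, 1f);
    }
}
```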

Swiping and changing the size of something can be done with gestures the user already expects, and the space feels much more natural to interact with. The challenge then becomes interacting with objects that are far away from you in the space, which other gestures (such as the “come here” beckon) can help overcome.
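
Two-handed resizing is a nice example of a gesture users already expect, because it’s pinch-to-zoom scaled up to whole hands. A rough sketch, assuming your tracker reports both palm positions (ours came from Leap Motion):

```csharp
using UnityEngine;

// Sketch: the object's scale follows the ratio of the current hand distance
// to the distance when the gesture started.
public class TwoHandResize : MonoBehaviour
{
    public Transform target;     // the object being resized
    Vector3 startScale;
    float startDistance;
    bool resizing;

    public void BeginResize(Vector3 leftPalm, Vector3 rightPalm)
    {
        startScale = target.localScale;
        startDistance = Vector3.Distance(leftPalm, rightPalm);
        resizing = startDistance > 0.01f;   // guard against a degenerate start
    }

    public void UpdateResize(Vector3 leftPalm, Vector3 rightPalm)
    {
        if (!resizing) return;
        float ratio = Vector3.Distance(leftPalm, rightPalm) / startDistance;
        target.localScale = startScale * ratio;
    }

    public void EndResize() { resizing = false; }
}
```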

Understanding the technological possibilities means we can explore ways to design for them. One of the ideas we’re also experimenting with is taking gesture technology outside the VR and AR realm: can we interact with presentations and projections without having to use a clicker doodad?
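
As a taste of how that could work, here’s a sketch of a swipe detector that could drive a slide deck just as easily as a VR menu. It assumes your tracker reports a palm velocity each frame (Leap Motion exposes one, and a position delta over time works for any tracker); the OnSwipeLeft/OnSwipeRight handlers are placeholders for “previous slide” and “next slide”.

```csharp
using UnityEngine;

// Sketch: a fast, mostly-horizontal hand movement reads as a swipe.
public class SwipeDetector : MonoBehaviour
{
    public float speedThreshold = 1.0f;  // metres per second; tune per device
    public float cooldown = 0.5f;        // don't fire on every frame of one swipe
    float lastSwipeTime;

    // Call this each frame with the tracked palm velocity.
    public void CheckSwipe(Vector3 palmVelocity)
    {
        if (Time.time - lastSwipeTime < cooldown) return;

        if (Mathf.Abs(palmVelocity.x) > speedThreshold &&
            Mathf.Abs(palmVelocity.x) > Mathf.Abs(palmVelocity.y))
        {
            lastSwipeTime = Time.time;
            if (palmVelocity.x > 0) OnSwipeRight(); else OnSwipeLeft();
        }
    }

    void OnSwipeLeft()  { Debug.Log("Swipe left"); }   // e.g. previous slide
    void OnSwipeRight() { Debug.Log("Swipe right"); }  // e.g. next slide
}
```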
