
Interaction with a UI via Handtracking in VR

2020 • 6 months • Bachelor's thesis

XR is coming, and it's coming fast. There is no doubt that in a few years' time people will use AR glasses alongside, or even instead of, their smartphones. For my bachelor's thesis I took a look at this future and built a working 3D environment in Unity for the Oculus Quest.

In several experiments, various forms of interaction via hand tracking were put into practice and the findings documented. In the final product you can start, land, and otherwise control a hot-air balloon solely with hand tracking, without controllers.

Hand tracking on the Oculus Quest has been available since the beginning of 2020 and brings countless possibilities, but also various problems. This thesis was written to analyze these possibilities and problems, and to find out how interfaces in VR can work and what needs to be considered.

 

[Image: hand menu]

Differences from 2D Design

Interface design in 2D has been explored for so many years that countless guidelines, best practices, and use cases exist. In the world of mixed reality (XR), however, a lot is still unexplored and untested, and different things have to be considered. If you build an interface in VR, for example, you don't just think about what is on that interface but also where it is, how big it is, and how far away it is. You have to make sure it doesn't block things behind it, and you have to decide how to direct the user's gaze, since things can always happen outside their view.
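
To make the "where, how big and how far away" point concrete, here is a minimal Unity sketch of a world-space panel that stays at a comfortable viewing distance in front of the headset and always faces the user. The head reference, the follow speed, and the exact distance are illustrative assumptions, not the setup used in the thesis.

```csharp
using UnityEngine;

// Sketch: keep a world-space UI panel at a comfortable distance straight
// ahead of the user and billboard it towards them.
public class FloatingPanel : MonoBehaviour
{
    [SerializeField] private Transform head;                 // e.g. the rig's centre-eye transform
    [SerializeField] private float comfortableDistance = 3f; // non-interactable UI: roughly 3-5 m away
    [SerializeField] private float followSpeed = 2f;

    private void LateUpdate()
    {
        // Target position: straight ahead of the user's view at the chosen distance.
        Vector3 target = head.position + head.forward * comfortableDistance;

        // Follow smoothly instead of snapping, so the panel doesn't feel glued to the head.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);

        // Point the panel's forward axis away from the user so the UI stays readable.
        transform.rotation = Quaternion.LookRotation(transform.position - head.position);
    }
}
```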

 

[Image: slider]

Some of the things I've learned:

  • When building an interface in VR you have to decide how you want the user to interact with the environment. Raycasting lets them interact with things far away, but it has to be explained. With direct interaction, the elements have to be within arm's reach. The so-called "pinch" brings back a form of haptic feedback, but it is not a natural way of pushing buttons, for example. (A sketch of raycasting combined with the pinch follows after this list.)
  • Non-interactable elements should be placed 3-5 metres away from the user in order not to strain the eyes. Interactable elements can be placed within arm's reach.
  • Don't place elements at too many different depths, because switching focus between them is tiring for the eyes.
  • Interfaces can follow the user or even be linked to the movement and position of the hand, so you always have them at the ready (see the hand-menu sketch at the end of this page).
  • Haptic feedback is missing with hand tracking and has to be compensated for in other ways. Auditory and visual feedback can be used more strongly than usual to solve this problem.
  • Tell the user where to look and where the effect of a certain button appears if it lies outside their view.
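
As a concrete illustration of the raycasting and pinch points above, here is a minimal sketch assuming the Oculus Integration for Unity and its OVRHand component. The uiLayer mask and the VrButton component are made-up names for this example, and exact API names can differ between SDK versions.

```csharp
using UnityEngine;

// Sketch: select far-away elements by pointing a ray from the hand and
// confirming with the index-finger pinch.
public class HandRayInteractor : MonoBehaviour
{
    [SerializeField] private OVRHand hand;        // hand to read tracking and pinch state from
    [SerializeField] private float rayLength = 10f;
    [SerializeField] private LayerMask uiLayer;   // hypothetical layer holding interactable elements

    private void Update()
    {
        if (hand == null || !hand.IsTracked)
            return;

        // "Pinch": index fingertip touching the thumb, used here as the selection gesture.
        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);

        // Raycasting: reach elements that are outside arm's reach.
        Transform pointer = hand.PointerPose;     // ray pose the SDK derives from the hand
        if (isPinching && Physics.Raycast(pointer.position, pointer.forward,
                                          out RaycastHit hit, rayLength, uiLayer))
        {
            hit.collider.GetComponent<VrButton>()?.Press();
        }
    }
}

// Minimal placeholder so the sketch compiles. A real button would add strong
// visual and auditory feedback to compensate for the missing haptics.
public class VrButton : MonoBehaviour
{
    public void Press() => Debug.Log($"{name} pressed");
}
```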
[Image: VR button]
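
For the hand-linked interfaces mentioned in the list, here is a sketch of a palm-anchored menu that only appears when the palm is turned towards the headset. The anchor transform, the angle threshold, and the assumption that the palm faces along the hand's negative up axis are illustrative choices, not the thesis implementation.

```csharp
using UnityEngine;

// Sketch: a small menu that travels with the hand and shows itself only
// when the palm is turned towards the user, so it stays out of the way
// during normal interaction.
public class HandMenu : MonoBehaviour
{
    [SerializeField] private Transform handAnchor;   // e.g. the rig's left-hand anchor
    [SerializeField] private Transform head;         // headset transform
    [SerializeField] private GameObject menuRoot;    // the menu UI to show or hide
    [SerializeField] private float showAngle = 40f;  // how directly the palm must face the user

    private void Update()
    {
        // Keep the menu attached to the hand so it is always within arm's reach.
        menuRoot.transform.SetPositionAndRotation(handAnchor.position, handAnchor.rotation);

        // Show the menu only while the palm points roughly towards the headset.
        Vector3 toHead = (head.position - handAnchor.position).normalized;
        float angle = Vector3.Angle(-handAnchor.up, toHead); // assumes -up points out of the palm
        menuRoot.SetActive(angle < showAngle);
    }
}
```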