Explained: How does VR actually work?

How does VR work? How does wearable tech make you think you're standing on Mars when you're actually about to bump into the kitchen counter? Here we'll be explaining how VR headsets work - not the projection systems that transform whole rooms. Let's start with some basics.

The headset set-up used by Oculus, Sony, Samsung and Google usually requires three things: a PC, console or smartphone to run the app or game; a headset which secures a display in front of your eyes (which could be the phone's own display); and some kind of input - head tracking, controllers, hand tracking, voice, or on-device buttons and trackpads.

Total immersion is what everyone making a VR headset, game or app is aiming towards - making the virtual reality experience so real that we forget the computer, headgear and accessories and act exactly as we would in the real world. So how do we get there?

The basics

VR headsets like Oculus Rift and Project Morpheus are often referred to as HMDs and all that means is that they are head mounted displays. Even with no audio or hand tracking, holding up Google Cardboard to place your smartphone's display in front of your face can be enough to get you half-immersed in a virtual world.

The goal of the hardware is to create what appears to be a life size, 3D virtual environment without the boundaries we usually associate with TV or computer screens. So whichever way you look, the screen mounted to your face follows you. This is unlike AR which overlays graphics onto your view of the real world.

Video is sent from the console or computer to the headset via an HDMI cable in the case of headsets such as HTC's Vive and the Rift. For Google Cardboard and the Samsung Gear VR, it's already on the smartphone slotted into the headset.

VR headsets use either one display showing two feeds side by side, or two LCD displays, one per eye. There are also lenses placed between your eyes and the pixels, which is why the devices are often called goggles. In some headsets these can be adjusted to match the distance between your eyes, which varies from person to person.

These lenses focus and reshape the picture for each eye and create a stereoscopic 3D image by angling the two 2D images to mimic how each of our two eyes views the world ever-so-slightly differently. Try closing one eye then the other to see individual objects dance about from side to side and you get the idea behind this.
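As a rough illustration of why that left/right offset creates depth, here's a toy Python sketch. The 63mm eye separation is a typical adult average used for illustration, not a figure from any particular headset:

```python
IPD = 0.063  # interpupillary distance in metres - a typical average, assumed here

def eye_projections(x, depth, ipd=IPD):
    """Project a point at horizontal position x (metres) and the given depth
    onto an image plane one unit away, once per eye. Each eye sits ipd/2
    either side of centre, so each sees the point slightly shifted."""
    left = (x + ipd / 2) / depth
    right = (x - ipd / 2) / depth
    return left, right

# Nearby objects land on very different spots in each eye's image (big
# disparity); distant objects barely differ - that difference is the 3D cue.
near = eye_projections(0.0, 0.5)    # half a metre away
far = eye_projections(0.0, 10.0)    # ten metres away
disparity_near = near[0] - near[1]
disparity_far = far[0] - far[1]
```

The brain reads the large disparity as "close" and the tiny one as "far away" - exactly the effect the closing-one-eye experiment demonstrates.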

One important way VR headsets can increase immersion is to widen the field of view, i.e. how wide the picture is. A 360-degree display would be too expensive and unnecessary; most high-end headsets make do with a 100 or 110-degree field of view, which is wide enough to do the trick.

And for the resulting picture to be at all convincing, a minimum frame rate of around 60 frames per second is needed to avoid stuttering or users feeling sick. The current crop of VR headsets goes well beyond this - the Oculus Rift is capable of 90fps, for instance, and Sony's Project Morpheus of 120fps.
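Those frame rates translate directly into a time budget the game engine has to render each frame in. A quick sketch of the arithmetic:

```python
def frame_time_ms(fps):
    """Time budget per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

# The renderer must finish every frame within this window or the image
# stutters - and at 120fps the whole scene must be drawn in ~8ms, twice
# (once per eye).
budgets = {fps: round(frame_time_ms(fps), 1) for fps in (60, 90, 120)}
```

So moving from 60fps to 120fps halves the rendering budget, which is why VR demands far more graphics power than flat-screen gaming at the same resolution.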

Head tracking

Head tracking means that when you wear a VR headset, the picture in front of you shifts as you look up, down and side to side or angle your head. A system called 6DoF (six degrees of freedom) tracks your head along the x, y and z axes to measure movement forwards and backwards, side to side and up and down, as well as rotation - tilting forwards and backwards, turning left and right and tipping shoulder to shoulder, otherwise known as pitch, yaw and roll.
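To make those six numbers concrete, here's a minimal Python sketch of a head pose and how the yaw and pitch angles decide which way the virtual camera points. This is an illustrative model, not any headset's actual SDK:

```python
import math
from dataclasses import dataclass

@dataclass
class HeadPose:
    # three axes of position: where your head is in the room
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # three axes of rotation, in radians: which way it's pointing
    pitch: float = 0.0  # tilting forwards and backwards
    yaw: float = 0.0    # turning left and right
    roll: float = 0.0   # tipping shoulder to shoulder

def forward_vector(pose: HeadPose):
    """Direction the wearer faces, from yaw and pitch. Roll spins the view
    around this axis, so it doesn't change where you're looking."""
    cp = math.cos(pose.pitch)
    return (math.sin(pose.yaw) * cp, math.sin(pose.pitch), math.cos(pose.yaw) * cp)

# Turning the head 90 degrees (yaw) swings the view from the z axis to the x axis.
fwd = forward_vector(HeadPose(yaw=math.pi / 2))
```

Every frame, the engine reads the latest pose from the sensors and re-renders the scene from that position and direction - that's all head tracking is.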

There are a few different internal components that can be used in a head-tracking system, such as a gyroscope, an accelerometer and a magnetometer. Sony's Project Morpheus also uses nine LEDs dotted around the headset to provide 360-degree head tracking, with an external PS4 camera monitoring these signals; Oculus uses 20 lights, though they are not as bright.

Head-tracking tech needs low latency to be effective - we're talking 50ms or less, or we'll detect the lag between turning our head and the VR environment changing. The Oculus Rift manages an impressively low lag of just 30 milliseconds. Lag can also be a problem for motion-tracking inputs such as PS Move-style controllers that measure our hand and arm movements.

Finally, headphones can increase the sense of immersion. App and game developers can use binaural, or 3D, audio that taps into the headset's head tracking to give the wearer the sense that sound is coming from behind them, to the side, or off in the distance.
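The core trick is simple: as the head turns, the angle between your nose and the sound source changes, and the left/right balance shifts with it. Here's a heavily simplified constant-power panning sketch (real binaural audio also models timing and frequency differences between the ears, which this leaves out):

```python
import math

def stereo_gains(source_angle, head_yaw):
    """Left/right speaker gains for a sound source, given the head's yaw.
    Angles in radians; 0 means straight ahead."""
    rel = source_angle - head_yaw                 # source angle relative to the nose
    pan = max(-1.0, min(1.0, math.sin(rel)))      # -1 = hard left, +1 = hard right
    left = math.cos((pan + 1) * math.pi / 4)      # constant-power pan law
    right = math.sin((pan + 1) * math.pi / 4)
    return left, right

ahead = stereo_gains(0.0, 0.0)                    # source dead ahead: equal in both ears
to_right = stereo_gains(math.pi / 2, 0.0)         # source at your right: right ear only
turned = stereo_gains(math.pi / 2, math.pi / 2)   # turn to face it: equal again
```

That last line is what head-tracked audio adds over ordinary stereo: the mix is recomputed every frame from the head pose, so sounds stay pinned to places in the room rather than to your ears.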

Motion tracking

Head tracking is one big advantage the as-yet-unreleased premium headsets have over the likes of Cardboard, but the big VR players are still working out motion tracking. When you look down with a VR headset on, the first thing you want to do is see your own hands in the virtual space.

For a while, we've seen the Leap Motion accessory - which uses an infrared sensor to track hand movements - strapped to the front of Oculus dev kits. We've also tried a few experiments with Kinect 2 cameras tracking our flailing bodies. But now we have exciting input options from Oculus, Valve and Sony.

Oculus Touch is a set of prototype wireless controllers designed to make you feel like you're using your own hands in VR. You grab each controller and use its buttons, thumbsticks and triggers during VR games - to shoot a gun, for instance, you squeeze the hand trigger. A matrix of sensors on each controller also detects gestures such as pointing and waving.

It's a pretty similar set-up to Valve's Lighthouse positional tracking system and HTC's prototype controllers for its Vive headset. Two base stations around the room sweep the area with lasers, detecting the precise position of your head and both hands from the timing with which the beams hit photocell sensors on the headset and around each handheld controller. Like Oculus Touch, the controllers also feature physical buttons, and you can even run two Lighthouse set-ups in the same space to track multiple users.
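The timing-to-position idea is worth unpacking: a spinning laser sweeps the room at a known rate, a sync flash marks the start of each sweep, and the delay before the beam hits a photodiode tells you that sensor's angle from the base station. A toy Python version (the 60-sweeps-per-second rate is an assumption for illustration, and real systems combine many such angles from two stations to solve for full 3D position):

```python
import math

SWEEP_HZ = 60.0  # assumed rotor speed: one full laser sweep every 1/60 s

def hit_time_to_angle(seconds_after_sync):
    """Convert the delay between the sync flash and the laser hitting a
    photodiode into that diode's angle around the sweep, in radians."""
    return seconds_after_sync * SWEEP_HZ * 2 * math.pi

# A sensor hit 1/240 s after the sync flash sits a quarter-turn
# (90 degrees) into the sweep.
angle = hit_time_to_angle(1 / 240)
```

With angles to several known sensors on the headset and controllers, and two base stations in known spots, the system can triangulate each object's position and orientation many times a second.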

Other input methods include anything from hooking an Xbox controller or joystick up to your PC, to voice controls, smart gloves and treadmills such as the Virtuix Omni, which let you simulate walking around a VR environment with clever in-game redirections.

Eye tracking

Eye tracking is possibly the final piece of the VR puzzle. It's not available on the Rift, Vive or Morpheus but it will feature in FOVE's very promising crowdfunded VR headset. So how does it work?

Well, an infrared sensor inside the headset monitors your eyes, so FOVE knows where you're looking in virtual reality. The main advantage of this - apart from letting in-game characters react more precisely to where you're looking - is making depth of field more realistic.

In standard VR headsets, everything is in pin-sharp focus, which isn't how we're used to experiencing the world. If our eyes focus on an object in the distance, for instance, the foreground blurs, and vice versa. By tracking our eyes, FOVE's graphics engine can simulate this in a 3D space in VR. That's right, blur can be good.
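A crude way to picture what a gaze-driven engine does: pixels whose depth is far from the depth your eyes are fixated on get a bigger blur, while pixels at the gazed depth stay sharp. This is a toy model - real engines use proper circle-of-confusion optics, and the `strength` factor here is invented:

```python
def blur_radius(pixel_depth, gaze_depth, strength=2.0):
    """Toy depth-of-field: blur grows with the gap between a pixel's depth
    and the depth the wearer's eyes are fixated on (both in metres).
    Uses inverse depth, so nearby mismatches blur more than distant ones."""
    return strength * abs(1.0 / gaze_depth - 1.0 / pixel_depth)

# Looking at something 10m away blurs the 0.5m foreground heavily
# while the 10m background stays perfectly sharp.
far_focus_fg = blur_radius(0.5, 10.0)
far_focus_bg = blur_radius(10.0, 10.0)
```

When eye tracking reports a new fixation depth, the engine re-runs something like this across the frame, so refocusing your eyes on the foreground flips which parts of the scene go soft.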

Headsets still need hi-res displays to avoid the effect of looking through a grid, and what our eyes focus on needs to look as life-like as possible. Without eye tracking, everything stays in focus as you move your eyes - but not your head - around a scene, making simulation sickness more likely: your brain knows that something doesn't match up.

