With the Apple Vision Pro out and generating lots of discussion, it seemed like a good time to present an experimental VR prototype we built at IRL415 in 2023.
To introduce the idea, imagine trying to look at something to your side: your brain sends a signal to your neck muscles to contract on one side and release on the other, and then waits for confirmation from your eyes and body that your head is in fact rotating. When you are wearing a VR headset, a gyro in the device senses your head’s rotation and presents the appropriate change on the screen just a moment later. But this tiny delay causes a disagreement between what your eyes see and what a different part of your body senses - the ‘vestibular apparatus’ just inside your ears. This slight error, unfortunately, triggers nausea: the response is widely believed to be an evolved adaptation protecting you from poisons such as alcohol (which our animal ancestors encountered in fermented berries).
Now imagine that instead of having you wear VR goggles, we had you lying comfortably on a headrest that keeps you from turning your head at all (imagine a really deep memory-foam pillow), while surrounded by big monitors. Then, when you try to turn your head, the headrest detects the force your muscles are producing and updates the view on the monitors accordingly, turning it to one side or the other. Your eyes get the visual feedback they expected, which feels very much like turning your head! And because your head isn’t actually moving at all, the secondary system your body uses - the vestibular system mentioned above - doesn’t report the slightly different version of the rotation that makes you sick.
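To make the control loop concrete, here is a minimal sketch in Python of how force-driven view rotation could work. The sensor call read_neck_torque(), the render() call, and all the gains are hypothetical names for illustration - this is not the prototype’s actual code:

```python
import time

GAIN = 2.0        # rad/s of view rotation per N·m of neck torque (illustrative)
DEADBAND = 0.05   # ignore tiny forces from resting muscle tone

def read_neck_torque() -> float:
    """Placeholder for the headrest force sensor (yaw torque in N·m)."""
    return 0.0

def render(yaw: float) -> None:
    """Placeholder for redrawing the scene at the given camera yaw."""
    pass

yaw = 0.0
last = time.monotonic()
while True:
    now = time.monotonic()
    dt, last = now - last, now
    torque = read_neck_torque()
    if abs(torque) > DEADBAND:
        # Treat the measured isometric torque as a commanded angular
        # velocity and integrate it into the camera yaw.
        yaw += GAIN * torque * dt
    render(yaw)
    time.sleep(1 / 90)  # pace the loop at roughly the display rate
```

The key design choice is that muscle effort commands velocity rather than position: push harder and the view turns faster, relax and it stops, which is roughly what your neck expects.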
Extending this idea beyond the head, imagine that your arms and legs were also held steady - say by holding onto a rigid handlebar and wearing ski boots glued to the floor, or even something like the ‘Link Bed’ from the movie Avatar, pictured above. When you try to lift your hand up in front of your face, the forces detected are used to move your avatar’s hand. When you try to take a step, your avatar takes a step even though your physical feet aren’t moving.
Because you aren’t actually moving, heavy weights and solid walls feel exactly the way they should: if you try to punch a hole in a wall, your real arm doesn’t keep moving right through it the way it does in a VR headset. If I hand you a virtual bowling ball, your biceps and shoulder muscles have to generate the exact same forces you’d normally need to keep it from falling, rather than the weightlessness of objects in headset VR. If you hold a heavy broadsword, it takes the same strength, skill, and momentum management to move it that it would take a real swordfighter. It should be possible to correctly experience complex interactions such as dancing, fighting, or gymnastics using this technique. And haptic feedback - such as pressing on the skin to simulate contact with an object or person - would be easy to add because the subject is not actually moving; no gloves needed.
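As a rough illustration of why virtual weights feel right, here is a hedged sketch of the underlying admittance idea: the avatar’s hand only rises if the measured force exceeds the virtual object’s weight. The sensor call read_hand_force_z() and all constants are assumptions for illustration, not measured values from our rig:

```python
G = 9.81           # gravitational acceleration, m/s^2
BALL_MASS = 7.0    # kg, roughly a bowling ball
DT = 1 / 90        # simulation step matched to a 90 Hz display (assumed)

def read_hand_force_z() -> float:
    """Placeholder: upward force (N) the user applies to the handlebar."""
    return 0.0

hand_z = 1.2       # avatar hand height, m
vel_z = 0.0

def step() -> None:
    global hand_z, vel_z
    applied = read_hand_force_z()
    # Net force on the virtual hand+ball: user effort minus virtual weight.
    net = applied - BALL_MASS * G
    accel = net / BALL_MASS
    vel_z += accel * DT
    hand_z += vel_z * DT
```

Driving position from integrated force like this is exactly why a heavy object demands sustained real effort: stop pushing, and the virtual ball accelerates downward at g.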
There are some big challenges that would still need to be faced to commercialize such an approach to VR, but it seems possible that they could be solved:
Optical flow without vestibular input can still cause some nausea, although in our experiments with 20 or so initial testers it already seems substantially less than in a VR headset. For example, many testers were able to navigate a scene with a SpaceNavigator - doing smooth rotation, accelerated linear motion, and jumping - without triggering nausea. Techniques commonly used in existing VR apps to reduce nausea, such as restricting peripheral view during rotation, might work even better with the head fixed in this way.
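As a sketch of what velocity-driven vignetting might look like in this setup - the thresholds below are illustrative guesses, not values we tuned:

```python
ONSET = 0.3          # rad/s below which no vignette is applied (assumed)
FULL = 2.0           # rad/s at which the vignette is strongest (assumed)
MAX_VIGNETTE = 0.6   # maximum fraction of peripheral view to mask

def vignette_strength(angular_speed: float) -> float:
    """Map virtual rotation speed to a 0..MAX_VIGNETTE mask strength."""
    if angular_speed <= ONSET:
        return 0.0
    t = min((angular_speed - ONSET) / (FULL - ONSET), 1.0)
    return MAX_VIGNETTE * t
```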
Proprioceptive feedback from muscles lengthening and shortening (how you know where your arms and legs are with your eyes closed) is absent in our technique, so movements like walking or picking something up feel different and harder - a bit like moving your arms or legs when they have fallen asleep. It seems possible, however, that training and visual feedback may provide an adequate replacement. Additionally, topical vibration applied to muscles is well known to stimulate proprioceptors and create a modified sense of joint angle - a technique we tested briefly that could be used to add kinesthetic awareness.
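As an illustration of how such vibration could be driven from a Raspberry Pi - the pin, gain, and wiring below are assumptions for the sketch, not our lab’s actual circuit:

```python
import RPi.GPIO as GPIO

MOTOR_PIN = 18  # hypothetical BCM pin wired to a motor driver (e.g. a MOSFET)

GPIO.setmode(GPIO.BCM)
GPIO.setup(MOTOR_PIN, GPIO.OUT)
pwm = GPIO.PWM(MOTOR_PIN, 1000)  # 1 kHz PWM carrier; duty cycle sets intensity
pwm.start(0)

def update_vibration(joint_speed: float) -> None:
    """Scale vibration intensity with how fast the avatar's joint is moving."""
    duty = min(abs(joint_speed) * 25.0, 100.0)  # 25 is an arbitrary gain
    pwm.ChangeDutyCycle(duty)
```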
In a sense, this immobilization approach has you exiting reality to enter another reality, rather than trying to be in both places at the same time. And of course this also has many side benefits, like not crushing your pets, looking like an idiot, running into walls, or falling over furniture.
The experiments in our lab reconstructed and extended the head portion of a prototype that was initially built in 2000 during the exploratory R&D stages of building Second Life, including the foot rests shown in the following video:
The original method used to immobilize the arm is shown in the following photo - imagine holding the white beam while your elbow rests on the green pad:
Finally, here is August at the lab explaining the approach and circuitry used in the new version, including the UE5 blueprints and the Raspberry Pi used for telemetry:
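For the curious, here is a hedged sketch of what such a telemetry link might look like: the Pi samples the force sensors and streams readings over UDP to the machine running UE5. The address, port, packet format, and sensor call are all assumptions for illustration, not the lab’s actual wire protocol:

```python
import json
import socket
import time

UE5_HOST = "192.168.1.50"  # hypothetical address of the UE5 machine
UE5_PORT = 7777            # hypothetical listening port

def read_forces() -> dict:
    """Placeholder for sampling the headrest/handlebar force sensors."""
    return {"neck_yaw": 0.0, "hand_z": 0.0}

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    packet = json.dumps({"t": time.time(), **read_forces()})
    sock.sendto(packet.encode(), (UE5_HOST, UE5_PORT))
    time.sleep(1 / 120)    # ~120 Hz update rate
```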
If you are interested in learning more about this approach, trying the demo yourself, or working on it with us - reach out at IRL415.
Philip, very interesting. No nausea is not a small-time deal as far as I’m concerned. - Bob
In the early 90’s, I met an inventor named Barrett Haines who showed off his patent (or application) for what looked like an “Iron Maiden” that worked with arrays of surface strain-gauge sensors distributed throughout the interior of the enclosure.
The device seemed comical at the time, but it was explained to function in a nearly identical manner to what you’ve described. At that point in VR history, an actual prototype wasn’t practicable, nor did a system exist for the interface to drive in real time.
It seemed weird and radical at the time, and I’m wondering if I have any documentation about it, or video of his presentation, somewhere in my unprocessed archives.