Interactions and embodiment in Virtual Reality – Part 1

I’m planning to write a few posts detailing my informal experiments with embodiment in virtual reality. The aim is to arrive at an immersive, high-fidelity setup that can be deployed quickly. To that end, I’ve tried to avoid base stations and wearable trackers. My use case involves traditional classroom environments, where I want to capture natural user behaviour unimpeded by numerous physical attachments. I’m also keenly aware of how outdated what I do now (July 2018) may seem in a few years. As it stands, today’s head-mounted displays and trackers are still somewhat cumbersome, though I have every confidence they’ll continue to undergo miniaturisation and usability improvements.

In the video below I’m using a Windows Mixed Reality headset. These are wonderful pieces of kit, mainly because of what’s referred to as inside-out tracking: the headset’s own cameras work out where it is in the room, so I’m not required to set up base stations in the corners to get accurate positioning data. They also work out much cheaper, with the only caveat being a marginal decrease in tracking accuracy (in terms of the user experience, this remains negligible). I’ve also attached a Leap Motion device to the front of the headset, which allows for extremely accurate hand and finger tracking without having to wear gloves. The next item on my agenda is to see to what extent accurate full-body limb tracking (legs and posture) can be achieved without attachable sensors.
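
To give a sense of the data the Leap Motion hands back, here’s a rough sketch of polling palm positions. It assumes the legacy Leap Motion Python bindings (SDK v2), and the newer Orion/LeapC SDKs expose a different API, so treat it as an illustration of the kind of per-hand data you get rather than the exact code driving my setup.

```python
import time
import Leap  # legacy Leap Motion Python bindings (SDK v2)


def poll_hands(duration_s=5.0):
    """Print palm positions for every tracked hand for a few seconds."""
    controller = Leap.Controller()
    end = time.time() + duration_s
    while time.time() < end:
        frame = controller.frame()  # most recent tracking frame
        for hand in frame.hands:
            side = "left" if hand.is_left else "right"
            # palm_position is a Leap.Vector in millimetres, relative to the device
            pos = hand.palm_position
            print("%s palm: x=%.1f y=%.1f z=%.1f" % (side, pos.x, pos.y, pos.z))
        time.sleep(0.1)


if __name__ == "__main__":
    poll_hands()
```

Each frame also carries per-finger joint data, which is what makes glove-free finger tracking possible; the palm position above is just the simplest thing to log.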

Stay tuned for more information. VR is a medium that continues to advance at a rapid pace!
