Experience Real-World Training in a Virtual World – The Best of Both Realities

A boy driving a POD-racer created with the Tangible VR system.

In the realm of virtual reality (VR) training, Tangible VR stands out as a groundbreaking project that bridges the gap between digital simulations and physical interaction. This innovative approach integrates real physical structures—like buttons, handles, and tables—into the VR environment. The key to this integration is the combination of advanced hand-tracking technology, real/virtual world synchronization, and microcontroller-based sensor feedback.

Tangible VR thereby transforms the training experience, allowing the user to interact with physical elements while fully immersed in a virtual world.

Traditional VR training often relies on controllers, or on hand tracking with users waving their hands in thin air, which limits how natural and intuitive the interactions feel. By contrast, Tangible VR offers a more immersive and realistic experience, making it ideal for skills development in various fields, such as production industries and industries involving heavy machinery.

Trainees can manipulate actual objects, enhancing muscle memory and spatial awareness in a way that traditional VR cannot match. Moreover, Tangible VR combines the flexibility and safety of a virtual environment with the tactile feedback of real-world objects. This dual approach allows for a broader range of training scenarios, from routine crane operations to high-risk maneuver training, without the associated real-world risks or costs.

Tangible VR represents a significant step forward in VR training, promising more effective learning outcomes and a deeper level of engagement for users.

A modular and scalable approach to Tangible VR application development.

The project has developed a new Unity-based development system that automatically handles the synchronization process and auto-generates the necessary microcontroller code based on user input (e.g. which sensor types to use and which objects should be synchronizable).

The only requirement is that users possess both a physical and a virtual version of the object – making the approach ideal for 3D design and printing.
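
To give a sense of what this means in practice, the sketch below is a minimal, hypothetical example of the kind of microcontroller code such a system could generate for two analog sensors: it simply samples the sensors and streams their values over a serial connection for the Unity side to consume. The pin assignments, baud rate and comma-separated line format are illustrative assumptions, not the system's actual generated output.

```cpp
// Hypothetical sketch of auto-generated microcontroller code: sample the
// registered sensors and stream their raw values to the Unity application
// over serial. Pins, baud rate and message format are assumptions.

const int SENSOR_PINS[] = {A0, A1};    // analog sensors registered in the Unity tool (assumed)
const int SENSOR_COUNT  = 2;

void setup() {
  Serial.begin(115200);                // serial link to the PC running Unity
}

void loop() {
  // One comma-separated line per update, e.g. "512,87"
  for (int i = 0; i < SENSOR_COUNT; i++) {
    Serial.print(analogRead(SENSOR_PINS[i]));
    if (i < SENSOR_COUNT - 1) Serial.print(',');
  }
  Serial.println();
  delay(10);                           // roughly 100 updates per second
}
```

On the Unity side, the development system would then map each incoming value to the corresponding synchronizable object, so that the virtual model follows its physical counterpart.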

An overview of the system and its inner workings can be viewed in the following article (published at HCII 2024), and the system is freely available from the following link:

*The system may not be used for any military purposes or applications.
**Version 2.0.0, which is made for the new META SDK, will be released in mid-July – stay tuned.

Example: Robotic Arm (Tangible VR - Development System, v1.0.0).

The video below demonstrates how a simple robotic arm can be synchronized using only a microcontroller and three potentiometers (rotation sensors).
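
As a rough sketch of what that hardware setup could involve on the microcontroller side (the actual auto-generated code is not shown here), the example below reads three potentiometers, converts each 10-bit reading to a joint angle and streams the angles to the host. The pin numbers and the 270° potentiometer range are assumptions.

```cpp
// Illustrative sketch for the robotic-arm example: three potentiometers,
// one per joint, converted to degrees and streamed over serial.
// Pin assignments and the 270-degree potentiometer range are assumptions.

const int JOINT_PINS[3] = {A0, A1, A2};   // base, shoulder and elbow joints (assumed)

void setup() {
  Serial.begin(115200);
}

void loop() {
  // Map each 10-bit reading (0-1023) to an angle and send "a1,a2,a3" per frame.
  for (int i = 0; i < 3; i++) {
    float angle = analogRead(JOINT_PINS[i]) * (270.0 / 1023.0);
    Serial.print(angle);
    Serial.print(i < 2 ? ',' : '\n');
  }
  delay(10);
}
```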

Example: LEGO Go-Cart Steering Wheel (Tangible VR - Development System, v1.0.0).

The video below demonstrates the synchronization of a LEGO steering wheel.

The LEGO model was first designed using the 3D tool Meca Bricks, then assembled in real life and equipped with a potentiometer to detect the steering-wheel rotation and two push-buttons to detect the accelerator and brake inputs.
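
A minimal sketch of this configuration, assuming the potentiometer sits on A0 and the two push-buttons on digital pins 2 and 3 with internal pull-ups, could look as follows; the actual wiring and generated code may differ.

```cpp
// Illustrative sketch for the LEGO go-cart: one steering potentiometer plus
// accelerator and brake push-buttons. All pin choices are assumptions.

const int STEERING_PIN = A0;   // steering-wheel potentiometer (assumed)
const int ACCEL_PIN    = 2;    // accelerator push-button (assumed)
const int BRAKE_PIN    = 3;    // brake push-button (assumed)

void setup() {
  Serial.begin(115200);
  pinMode(ACCEL_PIN, INPUT_PULLUP);   // buttons pull the pin LOW when pressed
  pinMode(BRAKE_PIN, INPUT_PULLUP);
}

void loop() {
  // Centre the steering value around 0 and append the two button states,
  // e.g. "-120,1,0" for "steering left, accelerating, not braking".
  int steering = analogRead(STEERING_PIN) - 512;
  int accel    = digitalRead(ACCEL_PIN) == LOW ? 1 : 0;
  int brake    = digitalRead(BRAKE_PIN) == LOW ? 1 : 0;

  Serial.print(steering); Serial.print(',');
  Serial.print(accel);    Serial.print(',');
  Serial.println(brake);
  delay(10);
}
```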

Example: POD-racing (Tangible VR - Development System, v1.0.0).

The image below is from a POD-racer project in which two tangible handles were 3D modelled in Blender, then 3D printed and outfitted with potentiometers to detect their angular positions.

In addition, an emergency push-button was taken from stock and 3D modelled in Blender.

The components were then registered as synchronizable in the development system and placed inside a POD-racer modelled for the game.
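
For a setup like this, a sketch along the following lines could read the two handle potentiometers and the emergency push-button; here the button is wired to an interrupt-capable pin so a press is latched immediately, though the pins and the interrupt choice are illustrative assumptions rather than the system's generated code.

```cpp
// Illustrative sketch for the POD-racer: two handle potentiometers plus an
// emergency push-button latched via an interrupt. Pins are assumptions.

const int LEFT_HANDLE_PIN  = A0;   // left handle potentiometer (assumed)
const int RIGHT_HANDLE_PIN = A1;   // right handle potentiometer (assumed)
const int EMERGENCY_PIN    = 2;    // emergency push-button on an interrupt pin (assumed)

volatile bool emergencyPressed = false;

void onEmergency() {
  emergencyPressed = true;           // latched: stays set once pressed
}

void setup() {
  Serial.begin(115200);
  pinMode(EMERGENCY_PIN, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(EMERGENCY_PIN), onEmergency, FALLING);
}

void loop() {
  // One line per frame: "<leftAngle>,<rightAngle>,<emergency>", e.g. "134.20,98.71,0"
  float left  = analogRead(LEFT_HANDLE_PIN)  * (270.0 / 1023.0);
  float right = analogRead(RIGHT_HANDLE_PIN) * (270.0 / 1023.0);

  Serial.print(left);  Serial.print(',');
  Serial.print(right); Serial.print(',');
  Serial.println(emergencyPressed ? 1 : 0);
  delay(10);
}
```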

This allowed kids at an open day at the University of Southern Denmark to visit and race around in an “actual” POD-racer, which they could both see and touch.

Project contact: Patricia Lyk and Bjarke Pedersen

Research topics: #Augmented Virtuality #Augmented Reality #VR #AR #simulation #haptic feedback #training #skill development #cyber-physical systems #virtual reality training