System: Holodeck

Project Holodeck is a virtual reality platform built with consumer-facing technology, off-the-shelf DIY components, cutting-edge custom software, and creatively integrated peripherals. The goal of Project Holodeck is to bring 360-degree, 6-DOF, full-body virtual reality out of the research lab and into a fun, accessible consumer gaming platform.

We want to make the dream of a VR play space a reality, and at an affordable cost.

The project is a joint venture between the University of Southern California Viterbi School of Engineering and the School of Cinematic Arts, specifically the GamePipe and Interactive Media programs.

The Holodeck system combines accurate head tracking, limited body tracking, and simple button inputs in a large 3D space with full 360 degrees of movement. This tracked space is combined with vehicular locomotion (piloting an airship) or in-place locomotion (jogging in place to “run”). This way, players can move naturally through their personal “micro” space while also moving through the larger “macro” space. Although you’re in a VR environment, you’ll feel like you’re outside with a whole world to explore.

Two people inside a Holodeck.
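The micro/macro combination boils down to a frame transform: the player's tracked position lives in the vehicle's local frame, and the vehicle carries that frame through the world. A minimal 2D sketch in Python (the function names and the flat-deck simplification are ours, not the project's):

```python
import math

def world_position(vehicle_pos, vehicle_heading_deg, tracked_offset):
    """Compose 'macro' vehicle motion with 'micro' tracked motion.

    vehicle_pos:         (x, z) of the airship in world space
    vehicle_heading_deg: yaw of the airship, in degrees
    tracked_offset:      (x, z) of the player inside the tracked volume,
                         relative to the vehicle's origin
    """
    h = math.radians(vehicle_heading_deg)
    ox, oz = tracked_offset
    # Rotate the local offset by the vehicle heading, then translate
    # by the vehicle's world position.
    wx = vehicle_pos[0] + ox * math.cos(h) - oz * math.sin(h)
    wz = vehicle_pos[1] + ox * math.sin(h) + oz * math.cos(h)
    return (wx, wz)
```

A player who walks one meter across the deck stays one meter across the deck in world space, however far the airship has flown.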

Finally, we will develop the tools needed to create a game specifically for this platform. These include, among other things, a networked first-person prototyping system, pre-warping for the optics, and hardware-agnostic input scripts.
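Pre-warping compensates for the HMD's lenses: the optics introduce pincushion distortion, so the renderer applies the opposite barrel distortion in software. A common way to do this is a radial polynomial; the sketch below is illustrative, and the k1/k2 coefficients are made-up placeholders rather than the project's calibrated values:

```python
def prewarp(u, v, k1=0.22, k2=0.24):
    """Radial 'barrel' pre-warp for one eye's image.

    (u, v) are coordinates relative to the lens center, normalized so
    the edge of the view sits near r = 1. Points are pushed outward
    more the farther they are from the center, canceling the lens's
    pincushion distortion.
    """
    r2 = u * u + v * v
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return (u * scale, v * scale)
```

In practice this runs per-pixel in a fragment shader on each eye's half of the screen; the Python version just shows the math.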

Hardware:

The current hardware design of the Holodeck system uses the Oculus Rift for head-mounted video feedback, the PlayStation Move optical system for head tracking, and the Razer Hydra magnetic system for limited body tracking. Combined, these systems let us create a realistic 3D space that the user can freely move around in and interact with.

The Oculus Rift:

Palmer Luckey, our lead hardware engineer and the pioneer of Oculus, has developed an affordable high-FOV head-mounted display called the Oculus Rift. Each of these HMDs uses two specifically sized and tuned lenses to magnify a 1280×800 screen, one half per eye. Players see a stereoscopic 3D image with a 90-degree horizontal and 105-degree vertical FOV, rivaling high-end HMDs and putting consumer systems such as the Sony HMZ-T1 and Vuzix WRAP 920 to shame. This isn’t like watching a floating television – this is true immersion in a virtual world with simulated peripheral vision.

An early version of the Oculus Rift head mounted display.

Check out more specs on the Oculus Rift at Palmer’s website here.

PlayStation Move:

The PlayStation Move system will be used to provide head-tracking data that can be fed into the Oculus Rift.

The PlayStation Eye camera has a 75-degree horizontal field of view and a 56-degree vertical field of view. Tracking a PlayStation Move wand within that view gives us a rather large playspace to work with.

The large playspace provided by the PlayStation Eye camera.

A 3D representation of the playspace.
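Those camera FOVs translate directly into playspace size: at a distance d from the camera, the visible extent along one axis is 2 · d · tan(FOV/2). A quick sketch (the 3 m standoff is our assumption for illustration, not a project spec):

```python
import math

def view_extent(fov_deg, distance_m):
    """Width (or height) of the camera's view frustum at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# At 3 m from the PlayStation Eye, the 75-degree horizontal FOV covers
# roughly 4.6 m of width, and the 56-degree vertical FOV about 3.2 m.
width_m = view_extent(75.0, 3.0)
height_m = view_extent(56.0, 3.0)
```

The playspace grows linearly with distance from the camera, which is why the tracked volume in the figures fans out like a wedge.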

Razer Hydra:

The Sixense Razer Hydra gives us fast and accurate six-axis tracking along with buttons and analog sticks.

Although the Holodeck will track parts of your body, your hands are still your most important input system and they must be tracked quickly and accurately.

Haptic Feedback

To increase the sense of immersion in the game, ship speed drives the speed of strategically placed box fans, which are linked to the game server through an Arduino controller.

An Arduino box fan setup.

This will allow you to actually feel the wind in your hair as you pilot a vehicle (in the case of Wild Skies, the airship Little Dynamo).
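On the server side, the fan control amounts to mapping ship speed onto a PWM duty cycle for the Arduino. A minimal sketch (the speed range and duty-cycle bounds here are illustrative assumptions, not the project's tuned values):

```python
def fan_pwm(ship_speed, max_speed=30.0, min_duty=60, max_duty=255):
    """Map airship speed to an 8-bit PWM duty cycle for the box fans.

    min_duty keeps the fans from stalling at a crawl, 0 means 'off'
    when the ship is stationary, and anything at or above max_speed
    pins the fans at full blast.
    """
    if ship_speed <= 0:
        return 0
    frac = min(ship_speed / max_speed, 1.0)
    return int(min_duty + frac * (max_duty - min_duty))

# The resulting byte would be sent to the Arduino over serial (e.g.
# with pyserial), where analogWrite() drives the fan's PWM pin.
```

Recomputing this a few times per second is plenty; fans spin up slowly enough that the player never notices the update rate.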

Software:

The main piece of software we’re using for this project is Unity3D.

Unity is a genuinely next-gen game development tool, combining power and flexibility with ease of use. The work we do in Unity can also be easily shared with the larger community of game developers and VR enthusiasts.
