Consumer Goods

Inertial Motion-Tracking Technology for Virtual 3-D

Originally published in 2005

Originating Technology/NASA Contribution

In the 1990s, NASA pioneered virtual reality research. The concept had been around long before, but the technology to build a viable virtual reality system did not yet exist. Scientists had theories and ideas, and they knew the concept had potential, but the computers of the 1970s and 1980s were not fast enough, sensors were heavy and cumbersome, and people had difficulty blending fluidly with the machines. Scientists at Ames Research Center built upon the research of previous decades and supplied the missing technology, turning the theories of virtual reality into reality.

Virtual reality systems depend on complex motion-tracking sensors to convey information between the user and the computer and give the user the feeling of operating in the real world. These motion-tracking sensors measure and report an object’s position and orientation as it changes. A simple example of motion tracking is the cursor on a computer screen moving in correspondence with the shifting of the mouse. Tracking in 3-D, which is necessary to create virtual reality, is much more complex. To be successful, the perspective of the virtual image seen on the computer must be an accurate representation of what would be seen in the real world. As the user’s head or camera moves, turns, or tilts, the computer-generated environment must change accordingly with no noticeable lag, jitter, or distortion. Historically, the lack of smooth and rapid tracking of the user’s motion has thwarted the widespread use of immersive 3-D computer graphics.
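To make the idea concrete, the minimal sketch below shows how a tracked six-degree-of-freedom head pose, a position plus an orientation quaternion, could be turned each frame into the view transform a renderer uses so the virtual scene follows the user's head. It is not drawn from any InterSense or NASA software; the tracker itself is only mentioned as a hypothetical placeholder in the closing comment.

```python
# A minimal sketch, assuming a hypothetical tracker that reports a 6-DOF pose:
# head position (meters) plus orientation (unit quaternion). Each frame, the
# pose is converted into the world-to-camera matrix the renderer uses, so the
# rendered view follows the user's head. Not based on any InterSense API.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) into a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def view_matrix(position, orientation):
    """Build the 4x4 world-to-camera transform from a tracked head pose."""
    R = quat_to_matrix(orientation)            # head orientation in the world frame
    V = np.eye(4)
    V[:3, :3] = R.T                            # inverse rotation
    V[:3, 3] = -R.T @ np.asarray(position)     # inverse translation
    return V

# Example: head 1.7 m above the floor, turned 90 degrees about the vertical axis.
pose_position = [0.0, 0.0, 1.7]
pose_quaternion = [np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)]
print(np.round(view_matrix(pose_position, pose_quaternion), 3))

# In a render loop, the pose would be re-read every frame, e.g.:
#   pos, quat = tracker.read_pose()     # placeholder call, not a real API
#   renderer.set_view(view_matrix(pos, quat))
```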

NASA uses virtual reality technology for a variety of purposes, primarily the training of astronauts. The actual missions are costly and dangerous, so any opportunity the crews have to practice their maneuvers in accurately simulated situations before a mission is valuable and instructive. For that purpose, NASA has funded a great deal of virtual reality research and benefited from the results.

Partnership

Scientists at Ames, led by Elizabeth Wenzel, were looking for a better way to track head motion for use with 3-D audio systems. Dr. Wenzel’s group was conducting research experiments on binaural localization with subjects wearing tracked 3-D audio headsets. Traditional headsets were too slow, suffered from high latency, or were simply too bulky, so the group went looking for a better option.

Eric Foxlin, a graduate student at the Massachusetts Institute of Technology, was conducting research in virtual reality systems when he came up with the idea of developing a miniature tracking device based on the same technology found in large, ship-borne navigation systems. The Inertial Measurement Unit (IMU) uses large precision gyroscopes, accelerometers, and compasses to estimate, with great accuracy, the position and orientation of ships and aircraft. With support from an Ames seed grant, Foxlin used this concept to build a miniature IMU from low-power, low-cost microelectromechanical systems (MEMS) components. The resulting device, the first miniature, body-wearable, sourceless tracker, served as a technology seed to attract venture capital and start InterSense, Inc., in 1996.
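As a rough illustration of the inertial principle behind such a sourceless tracker, the sketch below fuses a fast-but-drifting gyroscope integration with the slower, gravity-referenced tilt estimate from an accelerometer. It is a generic complementary filter, not InterSense's algorithm; the sample rate, gyro bias, and blending factor are assumed values chosen for illustration.

```python
# A minimal sketch of the sourceless-tracking idea: integrate the gyroscope for
# a fast orientation estimate, then lean on the accelerometer's gravity reading
# to cancel the gyro's slow drift. Generic complementary filter, not
# InterSense's algorithm; the 100 Hz rate and 0.98 blend are assumptions.
import math

def fuse_pitch(pitch, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Update a pitch estimate (radians) from one gyro and accelerometer sample."""
    gyro_pitch = pitch + gyro_rate * dt           # responsive, but drifts with bias
    accel_pitch = math.atan2(accel_x, accel_z)    # noisy, but referenced to gravity
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# A sensor lying still for one second at 100 Hz, with a small gyro bias:
pitch = 0.0
for _ in range(100):
    pitch = fuse_pitch(pitch, gyro_rate=0.01, accel_x=0.0, accel_z=9.81, dt=0.01)

# The fused estimate stays within a fraction of a degree, whereas pure gyro
# integration would keep accumulating the bias without bound.
print(round(math.degrees(pitch), 2))
```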

Bedford, Massachusetts-based InterSense now develops precision, miniaturized inertial motion-tracking technology that is extensively deployed in simulation and training, entertainment applications, clinical and medical settings, oil and gas exploration, virtual design and testing, and a variety of research applications. A cutting-edge virtual reality company, InterSense counts NASA among its frequent customers.

InterSense’s IS-900 Virtual Environment Tracking system was recently installed at Glenn Research Center for use in a 3-D virtual immersive display wall. Combined with 3-D fluid flow analysis software, the stereo display wall is used to gain a better understanding of 3-D flow modeling of jets, turbines, and Space Shuttle aerodynamics. Simulated failure analysis is also performed, such as modeling what happens when a jet turbine blade is struck by a foreign object.

Product Outcome

InterSense’s unique technology tracks motion with an unmatched combination of size, cost, and precision. Its users create products and applications that allow anyone—from product designers, computer and Internet game developers and players, to scientists, teachers and students, assembly line workers, and video and film production companies—to interact with virtual 3-D images just as they do with physical objects.

The InterSense products have the added benefit of offering users unlimited range, negligible jitter, high update rates, and low latency. The units have no discernible interference or line-of-sight problems and offer superior motion prediction.
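Motion prediction, one of the features noted above, can be illustrated with a simple hedged sketch: extrapolate the latest measured orientation forward by the expected rendering latency using the gyroscope's angular rate, so the displayed image is already where the head will be when the frame appears. The quaternion math below is generic, and the 15-millisecond latency figure is an assumption, not an InterSense specification.

```python
# Hedged sketch of latency compensation by motion prediction: rotate the most
# recent orientation forward by the expected render latency using the measured
# angular rate. Generic small-angle quaternion math, not InterSense's
# predictor; the 15 ms latency is an assumed value.
import numpy as np

def quat_multiply(a, b):
    """Hamilton product of two quaternions (w, x, y, z)."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw*bw - ax*bx - ay*by - az*bz,
        aw*bx + ax*bw + ay*bz - az*by,
        aw*by - ax*bz + ay*bw + az*bx,
        aw*bz + ax*by - ay*bx + az*bw,
    ])

def predict_orientation(q, angular_rate, latency):
    """Rotate q forward by angular_rate (rad/s, body frame) over `latency` seconds."""
    speed = np.linalg.norm(angular_rate)
    if speed * latency < 1e-9:
        return np.asarray(q)
    axis = np.asarray(angular_rate) / speed
    half_angle = 0.5 * speed * latency
    dq = np.concatenate(([np.cos(half_angle)], np.sin(half_angle) * axis))
    return quat_multiply(q, dq)               # apply the body-frame increment

# Head turning at 90 deg/s about the vertical axis, 15 ms render latency:
q_now = np.array([1.0, 0.0, 0.0, 0.0])        # identity orientation
q_pred = predict_orientation(q_now, angular_rate=[0, 0, np.radians(90)], latency=0.015)
print(np.round(q_pred, 4))                    # a slight yaw ahead of the measurement
```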

InterSense offers standard products that provide orientation-tracking and location-tracking systems. Each product offering has a unique combination of inertial technology sensor-fused with complementary tracking technologies. The InertiaCube product family combines MEMS gyros and accelerometers with magnetometers to offer an accurate, sourceless tracking sensor. The IS-900 product family fuses the MEMS inertial components with an ultrasonic positioning system to offer accurate wide-area tracking with wireless tracking devices. The IS-1200 product family fuses MEMS inertial components with passive or active optical position references, providing an autonomous tracking solution for mobile or moving-vehicle applications.
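To suggest how an absolute position reference such as an ultrasonic constellation complements the inertial sensors, the sketch below solves a textbook trilateration problem: given ranges from the tracked device to beacons at known positions, a linearized least-squares solve recovers the 3-D position, which can then be used to correct inertial drift. The beacon layout and ranges are invented illustration values, not an InterSense configuration.

```python
# Hedged sketch of the acoustic-positioning half of a hybrid tracker: given
# ranges from a tracked device to beacons at known positions, recover 3-D
# position by linearized least squares. Beacon layout and ranges are made-up
# illustration values, not an InterSense configuration.
import numpy as np

def trilaterate(beacons, ranges):
    """Solve for position from >= 4 non-coplanar beacons (Nx3) and ranges (N)."""
    beacons = np.asarray(beacons, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    # Subtracting the first beacon's sphere equation linearizes the system.
    A = 2.0 * (beacons[1:] - beacons[0])
    b = (ranges[0]**2 - ranges[1:]**2
         + np.sum(beacons[1:]**2, axis=1) - np.sum(beacons[0]**2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Four ceiling-mounted beacons (meters, deliberately not coplanar) and the
# ranges measured from a device actually located at (1, 2, 0.5):
beacons = [(0, 0, 3.0), (4, 0, 3.0), (0, 4, 2.5), (4, 4, 2.8)]
true_position = np.array([1.0, 2.0, 0.5])
ranges = [np.linalg.norm(true_position - np.array(b)) for b in beacons]
print(np.round(trilaterate(beacons, ranges), 3))   # recovers ~[1. 2. 0.5]
```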

With its extensive product line, InterSense is the leading provider of head and helmet tracking systems to major government contractors like the Boeing Company, Lockheed Martin Corporation, and L3 Communications, Inc.’s Link Simulation and Training division (Link).

One of the projects it worked on with Link is the Aviation Combined Arms Tactical Trainer (AVCATT), now employed by the U.S. Army for training helicopter pilots. It supports multiple users, so pilots can practice simulated flights as a fleet rather than individually. The AVCATT is the Army’s newest aviation training simulator, a series of networked simulators offering a dynamic, reconfigurable system used for collective training and mission rehearsal. It provides up to five functional cockpits and even allows pilots to choose among the different models currently employed by both the Army and the National Guard.

Link and InterSense have also teamed up to create the F/A-18C Distributed Mission Training System, a project similar to the helicopter simulator. This system, however, models jet aircraft, and, like the AVCATT, it allows pilots to train simultaneously in real-time, 3-D missions in simulated cockpits.

Another project using InterSense’s NASA-funded technology is the Stinger Missile Trainer, a 40-foot dome with projected terrain and aircraft images. The dome gives trainees a full 360-degree surround scene and a 70-degree vertical field of view. The system trains up to three gunners, individually or simultaneously, to identify, acquire, and track airborne targets, and then launch a Stinger missile. An upgraded version of this program was recently developed for domed simulators to improve overall training efficiency and realism. As part of the upgrade, InterSense worked with several other cutting-edge simulation companies to free the trainees from the tethers and cables that had been connected to the training weapons.

Another InterSense partner, Fakespace Systems, Inc., of Marshalltown, Iowa, constructed a reconfigurable visualization system as part of an immersive environment simulator at the Army Research Laboratory (ARL) used to study how soldiers use equipment in combat zones. The RAVE II visualization system consists of three detachable, large-scale stereoscopic display units that ARL uses to project realistic imagery of simulated hostile environments. The immersive environment simulator integrates the RAVE II with InterSense’s motion-tracking device and an omni-directional display system consisting of three self-contained, rear-projected modules that can be arranged to form a flat wall display or an immersive theater environment. The simulator also integrates the motion tracking with an omni-directional treadmill, allowing soldiers to run and move in any direction within the virtual hostile terrain to simulate live combat conditions.

It is not just the military that benefits from InterSense’s work. The company worked to integrate its inertial head tracker into the Microsoft games Flight Simulator 2002 and Flight Simulator 2004: A Century of Flight. Touted by Microsoft as “the world’s most popular game,” Flight Simulator has sold over 21 million copies. In the game’s “Virtual Cockpit” mode, the user becomes fully immersed in the cockpit and views the world as a real pilot does, with a full 360-degree view. Along with the consumer market, numerous military departments and pilot training schools use this program as a cost-efficient training simulator for beginning pilots.

InterSense’s inertial motion-tracking technology can also be found in hospitals, where it is used to supplement ultrasound imagery; in oil fields, where it helps workers identify new locations for wells; at General Motors vehicle design facilities; in television and film studios; and in university research. Its motion-sensing work, funded by NASA, can be found virtually everywhere.

RAVE II™ is a trademark of Fakespace Systems, Inc.

Microsoft® is a registered trademark of the Microsoft Corporation.

Flight Simulator 2004: A Century of Flight® is a registered trademark of Microsoft Corporation.

Abstract
NASA uses virtual reality technology for a variety of purposes, primarily the training of astronauts, and has funded a great deal of virtual reality research. Eric Foxlin, a graduate student at the Massachusetts Institute of Technology, was conducting research in virtual reality systems when he came up with the idea of developing a miniature tracking device based on the same technology found in large, ship-borne navigation systems. With support from an Ames grant, Foxlin used this concept to build a miniature device from low-power, low-cost components. The first miniature, body-wearable, sourceless tracker was developed and used as a technology seed to attract venture capital and start InterSense, Inc. InterSense now develops precision, miniaturized inertial motion-tracking technology extensively deployed in simulation and training, entertainment applications, clinical and medical settings, oil and gas discovery, virtual design and testing, and a variety of research applications.
InterSense IS-900 MiniTrax 6-DOF Hand Tracker

Virtual reality systems depend on motion-tracking sensors to relay information between the user and the computer. Pictured here is the InterSense IS-900 MiniTrax 6-DOF Hand Tracker for immersive environment interaction.

InterSense immersive headset

InterSense’s immersive headsets, like this NVIS Head Mounted Display, bring 3-D to life when integrated with IS-900 MiniTrax 6-DOF Inertial-Acoustic Tracking Technology.

A man testing the IS-900 Wand and Head Trackers

Virtual Design Review in University College London’s Immersive Display Room, using the InterSense IS-900 Wand and Head Trackers.

A man using a virtual reality system

The immersive virtual reality systems that InterSense works on can be found virtually everywhere.

Two designers testing the IS-900 Wand and Head Trackers

Virtual Design Review of Automotive Interior in Peugeot’s Immersive Display Room, where designers use the InterSense IS-900 Wand and Head Trackers.