The Navy requires a virtual haptic instrument panel (VHIP) device that can faithfully replicate the look and feel of aircraft instrument panels for simulation and training. Two technology areas must be addressed for this application. The first is the automated extraction of 3D panel model and texture data, which can be accomplished with image-based modeling or with active methods such as triangulation-based laser range scanning. The second is the device, or set of devices, that presents the panel haptically to the user. The Phase I results indicate that the best approach is one that leverages recent and ongoing advances in, and the increasing ubiquity of, rapid-prototyping technology. With such a device, the proposed system will allow new panels and control sets (i.e., buttons, dials, switches) to be constructed rapidly. We combine this rapid-prototyped hardware with a machine vision system that "reads" the state of the controls and then renders the visual aspect of the cockpit to the pilot using Augmented Reality (AR) or projection. This architecture allows the Navy to configure tactilely accurate panel mockups quickly and cheaply, increasing the availability of useful, accurate simulation. It also facilitates the development and testing of new panel layouts while minimizing the space requirement for the simulation station. The Phase I prototype suite demonstrates that the rapid-prototyping and machine vision aspects of the proposed system perform very well. Phase II will integrate these technologies and implement the development tools needed to create new panels.
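To make the machine-vision step concrete, the sketch below shows one simple way a camera frame could be used to "read" a control's state. It assumes the panel layout gives each control a known region of interest (ROI) in the frame and that the switch lever carries a bright marker; the names (`Roi`, `read_toggle`) and the brightness-comparison heuristic are illustrative assumptions, not the Phase I implementation.

```python
# Minimal sketch of reading a toggle-switch state from a grayscale frame.
# Assumption: the panel model supplies a per-control region of interest,
# and the switch lever carries a bright marker so its position can be
# inferred from where the brightness concentrates within the ROI.

from dataclasses import dataclass

@dataclass
class Roi:
    x: int  # left edge of the control in the frame, in pixels
    y: int  # top edge
    w: int  # width
    h: int  # height

def mean_brightness(frame, x, y, w, h):
    """Average grayscale value over a rectangular patch of the frame."""
    total = sum(frame[r][c] for r in range(y, y + h) for c in range(x, x + w))
    return total / (w * h)

def read_toggle(frame, roi):
    """Classify a toggle as 'up' or 'down' by comparing the mean
    brightness of the top and bottom halves of its ROI."""
    half = roi.h // 2
    top = mean_brightness(frame, roi.x, roi.y, roi.w, half)
    bottom = mean_brightness(frame, roi.x, roi.y + half, roi.w, half)
    return "up" if top > bottom else "down"

# Tiny synthetic 6x4 "frame": the bright marker sits in the top rows.
frame = [
    [200, 210, 205, 198],
    [195, 220, 215, 200],
    [ 20,  25,  18,  22],
    [ 15,  20,  19,  21],
    [ 18,  17,  22,  20],
    [ 16,  19,  21,  18],
]
print(read_toggle(frame, Roi(0, 0, 4, 6)))  # "up"
```

In a deployed system the same ROI-based approach would run on real camera frames (e.g., via a vision library) and extend to dials and buttons with per-control classifiers, but the principle — known geometry plus per-control image analysis — is the same.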
Capability: Human Computer Interaction