The major goal of this project is to develop an integrated system that enhances the warfighter’s experience in the Amputee Virtual Environment Support Space (AVESS). At the conclusion of this Phase I effort, we will deliver a concept demonstration to the Telemedicine and Advanced Technology Research Center (TATRC). The demonstration will show how we have innovatively combined commercial off-the-shelf (COTS) products with advanced behavior recognition techniques. This fusion will enable both verbal and non-verbal communication, deepening immersion in the environment and supporting effective interaction with a physical therapist. As part of this effort, we will integrate the Microsoft Kinect, along with speech and gesture recognition software, into a system that interfaces with Second Life avatars. This system will enable effective, enhanced interactions between physical therapists and their patients in a virtual world. We will work with subject matter expert Professor Pamela Andreatta of the University of Michigan Medical School’s Simulation Center. This project also fits closely with Cybernet’s existing gesture tracking product line, and the technology will be incorporated into a number of our products. Our goal is to use the gesture and behavior recognition methods developed under this effort to build enhanced interface and surveillance systems.
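The Kinect-to-avatar integration described above could be sketched roughly as follows. This is a minimal illustration of the data flow only; the names here (`SkeletonFrame`, `classify_gesture`, the `PLAY_ANIM` commands) are hypothetical and do not represent the actual Cybernet, Kinect SDK, or Second Life APIs.

```python
from dataclasses import dataclass

# Hypothetical skeleton frame: vertical positions (meters) for a few joints
# of interest. A real system would consume full joint data from the Kinect's
# skeletal-tracking stream rather than this simplified structure.
@dataclass
class SkeletonFrame:
    head_y: float
    right_hand_y: float
    left_hand_y: float

def classify_gesture(frame: SkeletonFrame) -> str:
    """Toy rule-based recognizer: map joint geometry to a gesture label."""
    if frame.right_hand_y > frame.head_y and frame.left_hand_y > frame.head_y:
        return "both_hands_raised"
    if frame.right_hand_y > frame.head_y:
        return "right_hand_raised"
    return "neutral"

def avatar_command(gesture: str) -> str:
    """Map a recognized gesture to a hypothetical animation trigger that an
    in-world script could play on the patient's avatar for the therapist."""
    commands = {
        "both_hands_raised": "PLAY_ANIM cheer",
        "right_hand_raised": "PLAY_ANIM wave",
        "neutral": "PLAY_ANIM idle",
    }
    return commands[gesture]

# Right hand above the head, left hand at the side.
frame = SkeletonFrame(head_y=1.6, right_hand_y=1.8, left_hand_y=1.0)
print(avatar_command(classify_gesture(frame)))  # → PLAY_ANIM wave
```

In a full system, the rule-based classifier would be replaced by the behavior recognition techniques described above, and the command string would drive avatar animation in the virtual world, so the therapist sees the patient's actual movements mirrored by the avatar.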