The Shared Reality Lab works to achieve high-fidelity distributed interaction, with both real and virtual data, at levels of presence that support the most demanding applications, despite sensor and bandwidth limitations. Our lab works with audio, video, and haptic technologies, as well as mixed reality and mobile computing, building systems that leverage their capabilities to facilitate and enrich both human-computer and computer-mediated human-human interaction. Active projects include the development of conversational
avatars for therapy and engagement with older adults, rendering
audio-haptic experiences of graphical content for users who are blind,
multimodal immersive walking experiences, telepresence for music and social interaction, and design of the
flight deck of the future.