Jul 4, 2023
We have recently been awarded funding from NSERC Alliance and MEDTEQ+ to pursue a two-year R&D project, Improving Intelligibility of Speech in Noisy Environments Using Cognitive Load Measurements, along with industry partner AAVAA. We are recruiting two post-doctoral candidates to join the project, one with a background in AI and acoustics, and the other in computational neuroscience.
Dec 20, 2022
Thank you once again to the Healthy Brains, Healthy Lives Neuro Commercialization Grants and MEDTEQ, along with industry partner Haply Robotics, for their support of our research to make affordable haptic experiences of web graphics available to members of the blind and low-vision community, an active objective of our ongoing IMAGE Project.
May 11, 2022
We are recruiting! We are delighted to announce that our project on ADvanced Airspace Usability (ADAIR), with colleagues at École Polytechnique and Toronto Metropolitan University and industry partners in the Canadian aviation sector, has been awarded funding from NSERC and CRIAQ!
March 31, 2022
CHI 2022
Congratulations to Hyejin Lee, Ruixi Jiang, Yongjae Yoo, and Max Henry, co-authors of The Sound of Hallucinations, which received an Honorable Mention from ACM CHI.
March 1, 2022
Thank you to HBHL for the wonderful news of the positive funding decision on our submission to the Healthy Brains, Healthy Lives Neuro Commercialization Grants, which will support the next phase of the IMAGE Project.
About us
The Shared Reality Lab works to achieve high-fidelity distributed interaction, with both real and virtual data, at levels of presence that support the most demanding applications, and to do so in spite of sensor and bandwidth limitations. Our lab works with audio, video, and haptic technologies, mixed reality, and mobile computing, building systems that leverage their capabilities to facilitate and enrich both human-computer and computer-mediated human-human interaction. Active projects include the development of conversational avatars for therapy and engagement with older adults, rendering audio-haptic experiences of graphical content for users who are blind, multimodal immersive walking experiences, telepresence for music and social interaction, and design of the flight deck of the future.
For questions about the lab, please contact Prof. Jeremy Cooperstock.
The Shared Reality Lab is currently funded by grants and contracts from the Natural Sciences and Engineering Research Council, MEDTEQ, Healthy Brains, Healthy Lives, CRIAQ, iMD Research, AI Mental Health Precision, Humanware, and Haply Robotics. Past funding sources include the Fonds Nature et technologies, Sécurité publique Québec, the Canadian Internet Registration Authority, the Networks of Centres of Excellence, the Ministère du développement économique, de l'innovation et de l'exportation, the Secrétariat du Conseil du trésor, CANARIE, and Innovation, Science and Economic Development Canada, as well as industrial support from HP Labs, Google Research, Mozilla, and InterDigital Corporation.
Credit for the CSS content of this website goes to the GRAND NCE. Credit for the SRL logo goes to former M.Eng. student Naoto Hieda.