Real-Time Emergency Response

A project for the Mozilla Ignite Challenge

Prototype

Our prototype presents an augmented, immersive street view system. The user is immersed in a 270-degree street view panorama capable of smoothly animating between locations. In the future, with higher bandwidth and greater infrastructure, such a street-level view might itself consist of live video. This projected scene is then augmented with data from both static, historical data sets - accidents, infrastructure conditions, locations of interest - and live data sets - weather conditions, traffic and transit information, EMS status information, and other environmental data. In addition, the system can show multiple real-time video feeds from Android smartphones. These video feeds are embedded with GPS coordinate information, allowing the immersed user to "teleport" to the location of a live video feed in order to react to new situations.
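As a rough illustration of how these layers compose, the following Processing sketch mirrors the rendering order described above; every layer function in it is a placeholder stub standing in for the prototype's actual code, not the Panoia or rtER API.

    // Illustrative rendering order only; each layer function below is a
    // placeholder stub, not part of the actual prototype code.
    void setup() {
      size(1280, 480);
    }

    void draw() {
      drawPanorama();      // street view backdrop for the current location
      drawStaticLayers();  // historical data: accidents, infrastructure, POIs
      drawLiveLayers();    // live data: weather, traffic, transit, EMS status
      drawVideoFeeds();    // picture-in-picture mobile video with GPS tags
    }

    // Empty stubs so the sketch compiles and runs as-is.
    void drawPanorama()     { background(40); }
    void drawStaticLayers() { }
    void drawLiveLayers()   { }
    void drawVideoFeeds()   { }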

Street View Rendering

In order to render large-scale street view images and project augmented objects into the rendered scene, we have developed a new open-source library for Processing called Panoia. The library is structured in a similar manner to the Google Maps v3 API. It provides access both to the recently released planar-projection static photos of the Google Street View Image API and to the raw spherically projected Google Street View tiles, which can be accessed through the v3 Maps API.
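For example, a single planar Street View frame can be pulled into Processing directly from the Image API. The sketch below bypasses Panoia to show the underlying request; the location and parameters are arbitrary examples, and Google may require an API key.

    // Fetch one static Street View frame by URL. The Image API returns a
    // plain JPEG, so Processing's loadImage() can read it directly.
    PImage pano;

    void setup() {
      size(640, 400);
      String url = "https://maps.googleapis.com/maps/api/streetview"
                 + "?size=640x400&location=45.504,-73.577"
                 + "&heading=90&pitch=0&fov=90";
      // The second argument hints the image format, since the URL has no
      // file extension.
      pano = loadImage(url, "jpg");
    }

    void draw() {
      image(pano, 0, 0);
    }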

In addition, we have been developing functionality within the library to project simple spatial features into the rendered street-level scene. Latitude/longitude points can be projected into the images and annotated with floating marker bubbles. Simple trapezoidal shapes have also been mapped onto roads in the scene to provide feature annotation and augmentation. We hope to extend the set of annotatable objects as we move forward with the development of the library.
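The core of this projection step can be sketched as follows; this shows the general technique rather than the library's actual code. The compass bearing from the camera to the target point is mapped into the horizontal pixel range of the rendered view:

    // Compass bearing (degrees, 0 = north, 90 = east) from one lat/lng to
    // another, using the standard great-circle initial-bearing formula.
    float bearingTo(float lat1, float lon1, float lat2, float lon2) {
      float phi1 = radians(lat1);
      float phi2 = radians(lat2);
      float dLon = radians(lon2 - lon1);
      float y = sin(dLon) * cos(phi2);
      float x = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dLon);
      return degrees(atan2(y, x));
    }

    // Map a bearing into an x pixel coordinate for a view centered on
    // cameraHeading with the given horizontal field of view (degrees).
    float bearingToX(float bearing, float cameraHeading, float fovDeg, int w) {
      float delta = bearing - cameraHeading;
      while (delta < -180) delta += 360;   // wrap into [-180, 180]
      while (delta >  180) delta -= 360;
      return w * (0.5 + delta / fovDeg);   // off-screen when |delta| > fov/2
    }

A marker's vertical position can be handled the same way, mapping pitch against the vertical field of view.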

Static Data Set Mining

In this preliminary prototype we have created generic structures for importing large data sets from standard storage formats. Data sets can be parsed to extract key spatial data - latitude, longitude, physical address, etc. - and other descriptive information - time stamps, text fields, names. This data is then spatialized within the visible field of view.
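A minimal sketch of this extraction pattern, assuming a hypothetical accidents.csv file with lat, lon, and date header columns (the real data sets and their field names differ):

    // Parse a CSV data set and pull out the spatial and descriptive fields.
    // The file name and column names here are hypothetical.
    void setup() {
      Table accidents = loadTable("accidents.csv", "header");
      for (TableRow row : accidents.rows()) {
        float lat   = row.getFloat("lat");     // key spatial data
        float lon   = row.getFloat("lon");
        String date = row.getString("date");   // descriptive information
        println(date + ": " + lat + ", " + lon);
      }
    }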

For this prototype we have specifically focused on importing data relevant to our city. We have been drawing on the efforts of Montreal Ouvert and HackTaVille to generate open data sets for sustainable urban design and development. The current prototype includes historical information about bike and car accidents throughout the city.

Live Data Mining

As with static data mining, we have provided generic structures for mining real-time spatial data sets. Currently, weather and environmental information can be overlaid onto the scene to provide context. In addition, simulated real-time traffic information is rendered over roads within the scene (real traffic data will be added shortly).
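The live feeds follow the same pattern as the static importers, except the data is re-fetched periodically. The sketch below illustrates the polling idea; the URL and field names are placeholders, not a real endpoint.

    // Poll a JSON feed roughly once a minute and keep the last result
    // around for rendering. URL and field names are placeholders.
    JSONObject weather = null;
    int lastPoll = -60 * 1000;

    void setup() {
      size(400, 200);
    }

    void draw() {
      if (millis() - lastPoll > 60 * 1000) {
        // loadJSONObject() blocks; a production system would fetch on a
        // separate thread instead of stalling the render loop.
        weather = loadJSONObject("http://example.com/weather.json");
        lastPoll = millis();
      }
      background(0);
      text("temp: " + weather.getFloat("temp") + " C", 20, 30);
    }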

Live Mobile Video Streaming

Finally, our system provides capabilities to stream live video and GPS information from Android smartphones. This allows real-time, crowd-sourced video to be presented picture-in-picture over the immersive street view scene. Using the open-source ipcamera-for-android project along with newly developed GPS streaming code, we can stream video together with JSON-encoded GPS information. Using the GPS information accompanying the video feed, the user of the immersive system can "teleport" the augmented view to the street view location where the video is being recorded. This means the user can quickly assess and manage situations brought to their attention via live video, with the assistance of the augmented system.
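A sketch of the receiving side, assuming an illustrative wire format; the actual field names in the JSON messages may differ, and the panorama-move call is a placeholder.

    // Handle one JSON-encoded GPS message from a video feed. A message
    // might look like: {"lat":45.504,"lon":-73.577,"accuracy":8.0}
    void onGpsMessage(String json) {
      JSONObject fix = parseJSONObject(json);
      float lat = fix.getFloat("lat");
      float lon = fix.getFloat("lon");
      // "Teleport": re-center the street view panorama on the feed's
      // location.
      moveToLocation(lat, lon);
    }

    // Placeholder for the call that moves the rendered panorama.
    void moveToLocation(float lat, float lon) {
      println("teleporting to " + lat + ", " + lon);
    }

    void setup() {
      // Feed in one sample message so the sketch runs as-is.
      onGpsMessage("{\"lat\":45.504,\"lon\":-73.577,\"accuracy\":8.0}");
    }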

Video Demo

rtER - Mozilla Ignite from Shared Reality Lab on Vimeo.