Tools for Exploring and Harnessing Multimodal Sensor Network Data

Homes and offices are being filled with sensor networks to answer specific queries and solve pre-determined problems, but there are no comprehensive visualization or sonification tools for fusing these disparate data to examine relationships across spaces and sensing modalities. DoppelLab is an immersive, cross-reality virtual environment that serves as an active repository of the multimodal sensor data produced by a building and its inhabitants. We transform architectural models into browsing environments for real-time sensor data visualizations, as well as open-ended platforms for building audiovisual applications atop those data. These applications in turn become sensor-driven interfaces to physical world actuation and control. As an interface tool designed to enable rapid parsing, visualization, sonification, and application development, DoppelLab proposes to organize these data by the space from which they originate and thereby provide a platform to make both broad and specific queries about the activities, systems, and relationships in a complex, sensor-rich environment.
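The core organizing idea above, sensor data keyed to the space it originates from, so that both broad and specific queries can cut across sensing modalities, might be sketched minimally as follows. All names, values, and the repository API here are hypothetical illustrations, not DoppelLab's actual implementation:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Reading:
    modality: str   # e.g. "temperature", "humidity", "rfid"
    value: object
    timestamp: float

class SpatialRepository:
    """Index readings by the space they originate from,
    so one query can span rooms and modalities."""
    def __init__(self):
        self._by_space = defaultdict(list)

    def ingest(self, space: str, reading: Reading):
        self._by_space[space].append(reading)

    def query(self, space=None, modality=None):
        spaces = [space] if space else list(self._by_space)
        return [r for s in spaces
                  for r in self._by_space[s]
                  if modality is None or r.modality == modality]

repo = SpatialRepository()
repo.ingest("atrium", Reading("temperature", 22.5, 0.0))
repo.ingest("atrium", Reading("rfid", "badge-42", 1.0))
repo.ingest("lab-3", Reading("temperature", 24.1, 1.5))

# Broad query: all temperature readings across the building
temps = repo.query(modality="temperature")
# Specific query: everything sensed in the atrium
atrium = repo.query(space="atrium")
```

The point of the sketch is only the indexing choice: because readings are grouped by originating space first, a visualization client can ask either "show me one modality everywhere" or "show me everything happening here" against the same store.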

Figures: Lab Overview (Solid Walls); Lab Overview (Transparent Walls); Temperature and Humidity Sensors; RFID Identification.

DoppelLab's contributors are Gershon Dublon, Laurel S. Pardue, Brian D. Mayton, Noah Swartz, Nicholas Joliat, and Patrick Hurst, with past contributions from Anisha Jethwani, Jeffrey Prouty, Turner Bohlen, and Tanya Liu. DoppelLab is a project of the Responsive Environments group at the MIT Media Lab, and builds on the group's previous work on the Dual Reality Lab.