The goal of the project was to support users in understanding scenes, e.g. in a remote monitoring and maintenance scenario for modern factory environments. Such a scenario involves large sets of data in varying formats from different sensors, which must be analyzed to adequately monitor the behavior of the physical world. Moreover, the sensors' readings alone are not sufficient to understand the complete scenario. To connect the data with the actual scene in the factory, we introduced a browser-based approach for multi-reality interfaces. Our approach combines different visualizations by showing the actual scene in two representations: stereoscopic video and 3D computer graphics (CG). In addition, the CG representation is augmented with data extracted from the scene (e.g., sensor data from a machine).

Through these rich visualization capabilities (stereoscopic video and 3D graphics), multi-reality interfaces offer a way to aggregate and structure the widespread variety of information from different sources that is required to analyze such complex scenarios [INCOM 2012]. Implementing this functionality in a browser removes the burden of installing additional software, so the approach integrates smoothly with existing control center setups and provides a modern (non-)industrial human-machine interface (HMI).
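The augmentation step described above — attaching live sensor readings to the objects of the CG scene — can be pictured as a mapping from machine identifiers to annotated scene nodes. The sketch below is purely illustrative: the names `SceneNode` and `attach_reading` are assumptions for this example, not part of the project's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class SceneNode:
    """A CG object in the 3D scene, e.g. a machine on the factory floor."""
    name: str
    readings: dict = field(default_factory=dict)  # sensor id -> latest value

def attach_reading(scene: dict, machine: str, sensor: str, value: float) -> None:
    """Augment the CG node for `machine` with the latest reading of `sensor`.

    Creates the node on first use; later readings overwrite earlier ones,
    so the scene always reflects the current sensor state.
    """
    scene.setdefault(machine, SceneNode(machine)).readings[sensor] = value

# Hypothetical usage: readings arriving from one machine over time.
scene = {}
attach_reading(scene, "press-01", "temperature", 71.5)
attach_reading(scene, "press-01", "temperature", 73.2)  # newer value wins
attach_reading(scene, "press-01", "vibration", 0.04)
print(scene["press-01"].readings)
```

In a browser deployment, the same idea would run client-side (e.g. updating overlays in a WebGL scene from a data stream), but the data-to-scene mapping is the essential step.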

Project Team

Principal Investigator

Prof. Dr.-Ing. Thorsten Herfet