Dreamspace is a collaborative project funded under the European Commission's 7th Framework Programme (FP7; duration 2013–2016). It develops a new platform and tools for collaborative virtual productions in visual-effects work for film and TV and in new immersive experiences. The aim of the project is to enable creative professionals to combine live performances, video, and computer-generated imagery in real time. The seven-partner consortium is led by the leading European post-production software company "The Foundry" and combines leading research and commercial organizations in imaging, visual production, and creative experiences.

The film and TV industry is seeking ways to produce audiovisual media that combine the real world, CGI, and 3D animation for major movies and TV programs. This requires advancing the state of the art in several areas, including the combination of 2D and 3D methods, depth and lighting capture, surround video, real-time depth-based processing and compositing, real-time high-resolution rendering, and new approaches to manipulating CG objects and videos in a virtual environment.

The use of CGI (computer-generated imagery) in movie and TV productions has technically reached a level that makes it hard to identify which parts of a production are real and which are added in post-production. However, the traditional two-phase approach of on-set capture or filming, followed by the integration of visual effects in a later offline post-production phase, has proved to be a major bottleneck in terms of creativity and cost-effectiveness. Advances in computational power and new real-time-capable tools developed by Dreamspace will enable a new workflow that gives creative people full control over the virtual (computer-generated) components on set. This is achieved through real-time visual feedback on the integrated real and virtual objects and actors, driven by real-time camera tracking and compositing of the scene components.

The Intel-VCI developed new on-set editing techniques and real-time global illumination rendering that allow the real-time visualization of realistic lighting of the combined scene. A set of tools to capture the real light set-up in a studio has been developed. These tools estimate a model of the studio lighting from a set of HDR (high dynamic range) images of the studio. The estimated light sources can then be imported into the Dreamspace ray tracing renderer, and virtual objects are rendered seamlessly with the same light setting as found in the studio. An integrated version of this light-editing tool was demonstrated at FMX 2015 and SIGGRAPH 2015 and can be controlled through an intuitive user interface running on a tablet PC.
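The core idea of estimating light sources from HDR captures can be illustrated with a simplified sketch: pick the brightest texels of a latitude-longitude environment map, weight them by solid angle, and convert each into a directional light. The function name and parameters below are illustrative assumptions, not the actual Dreamspace tool's API.

```python
import numpy as np

def estimate_lights(env_map, num_lights=4):
    """Estimate dominant directional lights from a latitude-longitude
    HDR environment map given as an H x W luminance array.

    Simplified sketch: weight texels by solid angle, take the brightest
    ones, and convert each texel position to a world-space direction.
    """
    h, w = env_map.shape
    # Solid-angle weight per row: texels near the poles cover less area.
    theta = (np.arange(h) + 0.5) / h * np.pi          # polar angle per row
    weighted = env_map * np.sin(theta)[:, None]       # broadcast over columns

    # Flat indices of the brightest solid-angle-weighted texels.
    flat = np.argsort(weighted, axis=None)[::-1][:num_lights]
    rows, cols = np.unravel_index(flat, (h, w))

    lights = []
    for r, c in zip(rows, cols):
        th = (r + 0.5) / h * np.pi                    # polar angle
        ph = (c + 0.5) / w * 2.0 * np.pi              # azimuth
        direction = (np.sin(th) * np.cos(ph),         # unit vector
                     np.cos(th),
                     np.sin(th) * np.sin(ph))
        lights.append({"direction": direction,
                       "intensity": float(env_map[r, c])})
    return lights
```

In a production setting the estimated lights would be exported to the renderer's scene description; a real tool would additionally cluster neighbouring bright texels into area lights rather than treating each texel independently.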

The intuitive interfaces developed by a project partner allow multiple users to combine all elements of a production in real time. The project realizes a collaborative space in which creative workers can manipulate graphics elements in real-world scenes to move and control digital characters and scenes in order to explore and develop a story, switch viewpoints effortlessly to see the action from the point of view of a different camera, and create complex visual effects in a fraction of the time and with a fraction of the effort.

A first version of the Dreamspace ray tracing renderer has also been developed by Intel-VCI based on a domain-specific extension of the AnyDSL tools. The ray tracer supports global illumination (using path tracing), transparency, textures, and advanced materials (like glass). With the help of the AnyDSL framework, the renderer can be compiled for CPUs and GPUs. It is currently being integrated into the Dreamspace production pipeline. The performance of the AnyDSL-based renderer matches that of state-of-the-art hand-tuned implementations on both platforms. Compared to Embree’s implementation on the CPU, the AnyDSL description requires one-tenth of the programming effort according to Halstead’s software complexity measures.

The project also investigates new immersive user experiences together with project partners. These techniques use camera arrays to capture real content and head-mounted or projection-based displays to present it.

The Dreamspace project is supported by the EC within the 7th Framework Programme under grant agreement no. FP7-ICT-610005.

Project Team

Principal Investigator
Dr.-Ing. Oliver Grau