3D-Context-Aware Display Interaction

DaaS Software Architecture

The “Display as a Service” (DaaS) line of projects, previously funded through the Intel VCI and still funded through the DFKI, has produced a set of components for flexibly connecting pixel sources to displays, using only IP networks as the communication medium between otherwise independent devices. The primary components produced are a software API specification (“DaaS”), the DaaS reference implementation and plugins (“NetVFB”), and a variety of applications built on top of them. DaaS specifies three kinds of APIs [Fig. 1], each representing a view onto the system: pixel sources (“Virtual Frame Buffers”, VFBs), displays (“Virtual Displays”, VDs), and “Controllers”. A Controller maps VFBs onto VDs and thereby creates and manipulates the mappings we call “Projections”. Any of the three DaaS APIs may be combined within a single application.
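
To illustrate how the three API views fit together, the following minimal C++ sketch shows a Controller mapping a VFB onto a VD to form a Projection. All type and method names here (VirtualFrameBuffer, VirtualDisplay, Projection, Controller::map) are illustrative assumptions, not the actual DaaS specification:

    #include <string>

    // Hypothetical stand-ins for the three DaaS API views; the real DaaS
    // specification defines its own types and calling conventions.
    struct VirtualFrameBuffer { std::string id; int width, height; };   // pixel source (VFB)
    struct VirtualDisplay     { std::string id; float transform[16]; }; // display (VD)

    // A Projection maps a VFB's pixels onto a region of a VD.
    struct Projection {
        const VirtualFrameBuffer* source;
        const VirtualDisplay*     target;
        float x, y, scale; // placement of the VFB's pixels on the VD
    };

    // The Controller view: creates and manipulates Projections.
    class Controller {
    public:
        Projection map(const VirtualFrameBuffer& vfb, const VirtualDisplay& vd,
                       float x, float y, float scale) {
            return Projection{&vfb, &vd, x, y, scale};
        }
    };

    int main() {
        VirtualFrameBuffer slides{"slide-deck", 1920, 1080};
        VirtualDisplay     wall{"display-wall", {}};
        Controller ctrl;
        // Map the slide deck onto the display wall at half size.
        Projection p = ctrl.map(slides, wall, 0.0f, 0.0f, 0.5f);
        (void)p;
    }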

User interaction in DaaS-based applications happens at all three ends of the system and is invoked either directly or indirectly through the DaaS APIs:

  • VFB interaction manipulates pixel-producing applications, e.g., switches slides in PowerPoint or rotates a 3D scene in a rendering application. VFB interaction only changes the pixels continuously fed into DaaS and thus involves the DaaS system only indirectly.
  • VD interaction manipulates the location and/or size of VDs in a DaaS session, e.g., by interactively moving a physical display and updating its respective VD with a new transformation [Figs. 2, 3] (see the sketch after this list). Here, VD properties are directly updated from the outside through the VD API.
  • Controller interaction creates or manipulates Projections, e.g., through direct interaction on a touch screen [Fig. 4]. Every Projection access in such a setting happens directly through the DaaS Controller API.
Fig. 2: Tracked DaaS (virtual) display in front of a projection
Fig. 3: Contextual DaaS VD reconfiguration via pinch gesture
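
As a sketch of the VD interaction above (Figs. 2, 3), the following hypothetical C++ snippet shows a tracked physical display pushing a new transformation to its VD. The VirtualDisplayHandle type and its setTransform call are assumptions for illustration, not the real VD API:

    #include <array>
    #include <cstdio>

    // Hypothetical 4x4 column-major transform, as a tracking system might report it.
    using Mat4 = std::array<float, 16>;

    // Assumed stand-in for the VD-side API: a display updates its own pose
    // in the DaaS session whenever its physical location changes.
    class VirtualDisplayHandle {
    public:
        void setTransform(const Mat4& m) {
            pose_ = m; // in a real system this update would travel over the IP network
            std::puts("VD transform updated");
        }
    private:
        Mat4 pose_{};
    };

    int main() {
        VirtualDisplayHandle vd;
        // New pose from tracking: unchanged orientation, translated in x and z.
        Mat4 tracked = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0.3f,0.0f,1.2f,1};
        vd.setTransform(tracked); // VD interaction: direct update through the VD API
    }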

In this proposed project, we want to investigate a generic solution for user interaction at all ends of DaaS and create a generic interaction framework that offers the same flexibility and network transparency DaaS itself already provides. Interaction is a problem orthogonal to the pixel flow in DaaS; the input for VFB, VD, or Controller interaction may again occur at any of the three conceptual ends of a DaaS session. As such, input events must not be confined to single hosts, but must themselves be network-transparent so they can be sent and received anywhere in a DaaS network. In addition, input from multiple hosts must be interpretable as meta-events (e.g., in the Fig. 3 scenario, two swipes on two devices become a global pinch meta-event). This interaction framework will not be part of NetVFB, but will extend its functionality from outside the DaaS API boundaries, sharing the same IP network for inter-device event transport.
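
A minimal sketch of such cross-host meta-event interpretation, assuming a simple timestamped event format (all names, fields, and thresholds are illustrative): two near-simultaneous swipes on different hosts, moving in roughly opposite directions, are fused into one global pinch meta-event as in the Fig. 3 scenario:

    #include <cstdint>
    #include <cstdio>
    #include <optional>
    #include <string>
    #include <vector>

    // Hypothetical network-transparent input event: any host in a DaaS session
    // may emit one; the framework routes it over the shared IP network.
    struct InputEvent {
        std::string host;   // originating device
        std::string type;   // e.g. "swipe"
        float dx, dy;       // gesture direction
        std::uint64_t t_ms; // timestamp for cross-host correlation
    };

    // Assumed aggregation rule: two near-simultaneous swipes on two different
    // hosts moving apart are interpreted as one global "pinch" meta-event.
    std::optional<std::string> interpret(const std::vector<InputEvent>& window) {
        for (std::size_t i = 0; i < window.size(); ++i)
            for (std::size_t j = i + 1; j < window.size(); ++j) {
                const InputEvent& a = window[i];
                const InputEvent& b = window[j];
                bool concurrent =
                    (a.t_ms > b.t_ms ? a.t_ms - b.t_ms : b.t_ms - a.t_ms) < 200;
                bool opposing = a.dx * b.dx + a.dy * b.dy < 0.0f; // roughly opposite directions
                if (a.host != b.host && a.type == "swipe" && b.type == "swipe"
                    && concurrent && opposing)
                    return "pinch"; // meta-event spanning two devices
            }
        return std::nullopt;
    }

    int main() {
        std::vector<InputEvent> window{
            {"tablet-a", "swipe", -1.0f, 0.0f, 1000},
            {"tablet-b", "swipe",  1.0f, 0.0f, 1050}};
        if (auto meta = interpret(window))
            std::printf("meta-event: %s\n", meta->c_str());
    }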

Fig. 4: Interaction on a touch-screen Controller UI for DaaS

Based on this interaction framework, we want to investigate mobile display interaction in the 3D context of large-screen displays (projections or display walls). This interaction may combine (a) VFB interaction on the touch screen of a mobile device (i.e., operating on the remote pixel source displayed on its screen); (b) VD interaction, invoked by moving the transformation-tracked mobile within the context of the large screen; and (c) Controller interaction, which moves or scales Projections shown on either the mobile or the large screen. All interaction types can involve various network hosts as interaction sources and sinks, and require the capability to globally interpret individual interaction events as meta-interaction. Previous work on layered displays, such as Boring’s “LucidDisplay” and our own simple prototypes [Fig. 2], reconstructs only the 2D position of a tracked screen, as if it were part of the covered larger screen plane behind it.
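
To make this limitation concrete, the following sketch shows the 2D reconstruction such prototypes perform; the coordinate convention (wall plane at z = 0, units in meters) and screen sizes are assumptions:

    #include <cstdio>

    struct Vec3  { float x, y, z; };
    struct Rect2 { float x, y, w, h; };

    // The simple 2D approach of earlier layered-display prototypes: the tracked
    // mobile is treated as if it lay in the large-screen plane, so only its
    // (x, y) position is kept and its distance to the wall is discarded.
    Rect2 flatten(const Vec3& mobileCenter, float screenW, float screenH) {
        return Rect2{mobileCenter.x - screenW / 2.0f,
                     mobileCenter.y - screenH / 2.0f,
                     screenW, screenH}; // z (depth in front of the wall) is ignored
    }

    int main() {
        Vec3 tracked{0.8f, 1.1f, 0.4f};            // mobile held 0.4 m in front of the wall
        Rect2 vd = flatten(tracked, 0.15f, 0.08f); // approx. phone screen size in meters
        std::printf("VD on wall: (%.2f, %.2f) %.2f x %.2f\n", vd.x, vd.y, vd.w, vd.h);
    }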

Fig. 5: Schematic of a tracked mobile in a VR setup

A treatment of full 3D, later including a reconstructed view position for actual “see-through” mobile screens, seems beneficial for a wide range of applications, also in the Virtual Reality (VR) context [Fig. 5]. For VR applications, the potential effects of mixing stereoscopic (large-screen) and monoscopic (mobile) content should also be part of these investigations.
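
A minimal sketch of the full 3D “see-through” case, assuming a tracked mobile and a reconstructed eye position given in wall coordinates (wall plane at z = 0; all values are illustrative): each screen corner is projected from the eye through the corner onto the wall plane, which yields the wall region the mobile visually covers, i.e., the pixels it should display to appear transparent:

    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // Intersect the ray from the eye through a tracked screen corner with the
    // wall plane z = 0: the point the corner visually covers from that viewpoint.
    Vec3 projectThroughScreen(const Vec3& eye, const Vec3& corner) {
        float t = eye.z / (eye.z - corner.z); // ray parameter where z reaches 0
        return Vec3{eye.x + t * (corner.x - eye.x),
                    eye.y + t * (corner.y - eye.y),
                    0.0f};
    }

    int main() {
        Vec3 eye   {0.0f, 1.6f, 2.0f}; // viewer 2 m in front of the wall
        Vec3 corner{0.1f, 1.2f, 0.5f}; // one corner of the mobile's screen
        Vec3 onWall = projectThroughScreen(eye, corner);
        std::printf("covered wall point: (%.2f, %.2f)\n", onWall.x, onWall.y);
        // Repeating this for all four corners yields the covered wall region.
    }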



Further information: http://www.daas.tv/


Project team

Principal Investigator
Alexander Löffler, M.Sc.

Dipl.-Inf. (FH) Luciano Pica
Sebastian Alberternst
Jürgen Grüninger