Computer Graphics
TU Braunschweig

ICG Dome

Abstract

Featuring more than 10 million pixels at a 120 Hz refresh rate, full-body motion capture, as well as real-time gaze tracking, our 5-meter ICG Dome enables us to research peripheral visual perception, to devise comprehensive foveal-peripheral rendering strategies, and to explore multi-user immersive visualization and interaction.


The ICG Dome
DFG Grossgerät INST 188/409-1 FUGG

Specs

5-meter projection dome
Six 120 Hz, 2560×1600-pixel video projectors
Six high-end render nodes plus a master PC
Integrated 120 Hz real-time eye-tracking system
Integrated full-body motion-capture system
3D shutter glasses
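As a quick plausibility check on the headline pixel count, the raw sum follows directly from the projector specs; that the abstract quotes "more than 10 million pixels" rather than the full raw sum is presumably due to edge blending and overlap between adjacent projections (an assumption, not an official figure). A minimal Python sketch of the arithmetic:

    # Raw pixel arithmetic for the six projectors. The effective (blended)
    # resolution on the dome surface is lower because adjacent projections
    # overlap -- an assumption on our part, not an official figure.
    projectors = 6
    width, height = 2560, 1600
    raw_pixels = projectors * width * height
    print(f"raw: {raw_pixels / 1e6:.1f} Mpx")  # 24.6 Mpx before blending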

Exploring the Human Visual System

In the periphery of our field of view, visual perception differs distinctly from foveal vision. Different psychophysical rules apply to peripheral vision, and these need to be re-evaluated with respect to novel wide-field-of-view and immersive display technologies. Our dome enables us to systematically and comprehensively explore and quantitatively model the perceptual properties of the Human Visual System for computer graphics applications.
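A common quantitative starting point, shown here as an illustrative assumption rather than our specific model, is the classic linear falloff of visual acuity with eccentricity: the minimum angle of resolution (MAR) grows roughly linearly away from the fovea, doubling after a few degrees of eccentricity (cf. Geisler and Perry). A minimal Python sketch:

    # Linear MAR model: acuity falloff with retinal eccentricity.
    # Parameter values are typical textbook assumptions, not measured data.
    def minimum_angle_of_resolution(eccentricity_deg: float,
                                    mar0_arcmin: float = 1.0,
                                    e2_deg: float = 2.3) -> float:
        """MAR in arcmin at a given retinal eccentricity (degrees).

        mar0_arcmin: foveal MAR (~1 arcmin for normal vision).
        e2_deg: eccentricity at which the MAR has doubled (~2.3 degrees).
        """
        return mar0_arcmin * (1.0 + eccentricity_deg / e2_deg)

    if __name__ == "__main__":
        for e in (0, 5, 10, 20, 40):
            print(f"{e:2d} deg -> MAR ~ {minimum_angle_of_resolution(e):5.1f} arcmin")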

Peripheral Rendering

With the advent of mass-market head-mounted displays and proliferating immersive VR applications, computer graphics algorithms must simultaneously cater to our consciously perceived foveal vision and to our mostly subconsciously perceived peripheral field of view. If gaze direction is measured in real time, reliably predicted, or actively steered, gaze-contingent rendering methods can exploit a number of perceptual strategies to improve perceived visual quality, to cut down on computation time, and to subconsciously influence our perception of situational atmosphere. Our dome is the perfect tool to develop and evaluate novel gaze-contingent rendering techniques that take our entire field of view into account.
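To make the idea concrete, here is a hypothetical sketch of the core decision in a gaze-contingent renderer: compute each pixel's angular distance from the tracked gaze direction and map it to a local shading rate, so the fovea receives full quality while the periphery is shaded coarsely. The function names, the 5-degree foveal radius, and the falloff curve are illustrative assumptions, not our actual pipeline.

    import math

    def eccentricity_deg(gaze_dir, view_dir):
        """Angle in degrees between normalized gaze and per-pixel view directions."""
        dot = sum(g * v for g, v in zip(gaze_dir, view_dir))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

    def shading_rate(angle_deg: float,
                     foveal_radius_deg: float = 5.0,
                     min_rate: float = 0.125) -> float:
        """Fraction of full shading resolution at a given angular offset from gaze."""
        if angle_deg <= foveal_radius_deg:
            return 1.0  # full quality inside the assumed foveal region
        # hyperbolic falloff outside the fovea, clamped to a minimum rate
        return max(min_rate, foveal_radius_deg / angle_deg)

    # Example: a pixel roughly 15 degrees off-gaze is shaded at ~1/3 resolution.
    gaze = (0.0, 0.0, 1.0)
    pixel = (0.259, 0.0, 0.966)
    e = eccentricity_deg(gaze, pixel)
    print(f"eccentricity {e:.1f} deg -> shading rate {shading_rate(e):.3f}")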

Multi-User Immersive Visualization and Interaction

A digital planetarium is an immersive theatre built to present educational and/or entertainment content to an audience. With its additional eye tracking and motion capture capabilities, our dome facilitates researching novel multi-user interaction paradigms for immersive visualization environments in which the audience takes center stage.

Image gallery: live eye tracking in the dome; entering the digital realm; raising the bar in surround live eye tracking; body tracking with advanced motion capture; science in stereo.

You can check the dome occupancy using the Dome Calendar,
and find the dome itself at our Aufnahmestudio und Visualisierungslabor on the northern campus.

 

Student Projects

Mai Hellmann:
Bachelor's thesis: Implementation of VR-capable dynamic stimuli for conducting perceptual experiments

 

Bana Ahmed, Franziska Bloch, Josias Cierpka, Lucas Elsen, Moritz Küpper, Niklas Mainzer, Tina Omeragic, Jonas Penshorn, Maik Siegesmund, Steffen Smigielski, Johannes Weinert, Tristan Zichy:
SEP: Massively distributed collaborative crowd input system for dome environments
Multi-user input system using participants' own smartphones.

 

Nikkel Heesen, Peter Kramer, Daniel Lars Richard:
Practical course (MA): Creating an interactive VR adventure for the ICG Dome
Continuation of the WS'18/19 practical course: dynamic interaction with, and modification of, virtual environments at runtime.

 

Daniel Andersch, Deniz Schmid, Tobias Schmidt:
Team project: Unser kleines Planetarium
Interactive 3D visualization of the solar system.

 

Nikkel Heesen, Peter Kramer, Daniel Pawelczyk, Lars Richard, David Schultz:
Practical course (BA): Horror Adventure
Interactive VR adventure game with wheelchair controls for immersive locomotion.

 

Daniel Andersch, Pascal Blum, Benedict Laumer, Sabrina Müller, Thanh Tuong Phan, Timo Pust:
SEP: 3D Animation in Gospel
Keyframe animations as an extension of the Gospel rendering framework.

 

Till Affeldt, Nikkel Heesen, Peter Kramer, Daniel Pawelczyk, Lars Richard, David Schultz, Larissa Schwenzfeier, George Wägele:
Team project: A Look Through Floyd Lawton's Mask
AR in VR: augmentation of objects in virtual scenes depending on the user's current gaze direction.

 

Kai Bleeke, Jann-Ole Henningson, Felix Lehner, Jan-Frederick Musiol:
Practical course: World Builder VR Toolkit Extended
Continuation of the WS'17/18 team project: grass generation, terrain deformation, and an updated code base, among other features.

 

Sakinah Arifin, Nikkel Heesen, Domenik Jaspers, Felix Kohrs, Carolin Kohrt, Michelle Schaaf, Melvin Scharke:
SEP: Virtual Glass Cupola
Snow and rain weather simulation behind a glass cupola.

 

Oliver Urbaniak:
Bachelor's thesis: Guiding the Eyes: Development and Implementation of Gaze Guidance Methods for Wide-Field-of-View Immersive Environments

 

Max Hattenbach, Jann-Ole Henningson, Felix Lehner:
Team project: World Builder VR Toolkit Continued
Continuation of the SS'17 SEP: generation of trees and shrubs and simulation of a day-night cycle, among other features.

 

Anton Günther, Max Hattenbach, Jann-Ole Henningson, Jannis Hibben, Dennis Knoll, Felix Lehner, Kim Nguyen, Lanlan Su, Finn Thieme, Dennis Jason Tkocz, Oliver Urbaniak:
SEP: World Builder VR Toolkit
Toolset for creating virtual worlds, modeled after World Builder.

 

Jan-Frederick Musiol:
Bachelor's thesis: A framework for psychophysical experiment design in immersive full-dome environments

 

Markus Wedekind:
Practical course: Particle Game
Gesture-controlled particle simulation (port).

 

Nicholas Gao, Lukas Güldenhaupt, Johannes Ruthmann, Niklas Wrege:
Team project: Hungergames
Fruit Ninja variation with baking ingredients.

 

Malte Klingenberg, Alexander Manegold, Frauke Pommerehne:
Team project: Kosmodome
VR space shooter controlled via real-time eye tracking.

 

Publications

Jan-Philipp Tauscher, Alexandra Witt, Sebastian Bosse, Fabian Wolf Schottky, Steve Grogorick, Susana Castillo, Marcus Magnor:
Exploring Neural and Peripheral Physiological Correlates of Simulator Sickness
in Computer Animation and Virtual Worlds, vol. 31, no. 4-5, John Wiley & Sons, Inc., pp. e1953 ff., August 2020.
electronic ISSN: 1546-427X

Steve Grogorick, Michael Stengel, Elmar Eisemann, Marcus Magnor:
Subtle Gaze Guidance for Immersive Environments
in Proc. ACM Symposium on Applied Perception (SAP), ACM, pp. 4:1-4:7, September 2017.

Martin Weier, Michael Stengel, Thorsten Roth, Piotr Didyk, Elmar Eisemann, Martin Eisemann, Steve Grogorick, André Hinkenjann, Ernst Kruijff, Marcus Magnor, Karol Myszkowski, Philipp Slusallek:
Perception-driven Accelerated Rendering
in Computer Graphics Forum (Proc. of Eurographics EG), vol. 36, no. 2, The Eurographics Association and John Wiley & Sons Ltd., pp. 611-643, April 2017.

Thomas Löwe, Michael Stengel, Emmy-Charlotte Förster, Steve Grogorick, Marcus Magnor:
Gaze Visualization for Immersive Video
in Burch, Michael and Chuang, Lewis and Fisher, Brian and Schmidt, Albrecht and Weiskopf, Daniel (Eds.): Eye Tracking and Visualization, Springer, ISBN 978-3319470238, pp. 57-71, March 2017.

Michael Stengel, Steve Grogorick, Martin Eisemann, Marcus Magnor:
Adaptive Image-Space Sampling for Gaze-Contingent Real-time Rendering
in Computer Graphics Forum (Proc. of Eurographics Symposium on Rendering EGSR), vol. 35, no. 4, pp. 129-139, July 2016.
EGSR'16 Best Paper Award

Benjamin Meyer, Steve Grogorick, Mark Vollrath, Marcus Magnor:
Simulating Visual Contrast Reduction during Night-time Glare Situations on Conventional Displays
in ACM Transactions on Applied Perception, vol. 14, no. 1, pp. 4:1-4:20, July 2016.

Related Projects

Astrophysical Modeling and Visualization

Humans have been fascinated by astrophysical phenomena since prehistoric times. But while measurement and image acquisition devices have evolved enormously, many restrictions still apply when capturing astronomical data. The most notable limitation is our confined vantage point within the solar system, which prevents us from observing distant objects from different points of view.

In an interdisciplinary German-Mexican research project partially funded by the German DFG (Deutsche Forschungsgemeinschaft, grants MA 2555/7-1 and 444 MEX-113/25/0-1) and the Mexican CONACyT (Consejo Nacional de Ciencia y Tecnología, grants 49447 and UNAM DGAPA-PAPIIT IN108506-2), we evaluate different approaches for the automatic reconstruction of plausible three-dimensional models of planetary nebulae. The team comprises astrophysicists working on planetary nebula morphology as well as computer scientists experienced in the reconstruction and visualization of astrophysical objects.

Comprehensive Human Performance Capture from Monocular Video Footage

Photo-realistic modeling and digital editing of image sequences with human actors are common tasks in the movie and games industry. These processes are, however, still laborious, since existing tools only allow basic manipulations. In cooperation with the Institut für Informationsverarbeitung (TNT) of the University of Hannover (http://www.tnt.uni-hannover.de/), this project aims to solve this dilemma by providing algorithms and tools for automatic and semi-automatic digital editing of actors in monocular footage. To enable visually convincing renderings, a digital model of the human actor, detailed spatial scene information, and the scene illumination need to be reconstructed. A plausible look and plausible motion of the digital model are crucial here.

This research project is partially funded by the German Research Foundation (DFG).

Eye-tracking Head-mounted Display

Immersion is the ultimate goal of head-mounted displays (HMDs) for Virtual Reality (VR) in order to produce a convincing user experience. Two important aspects in this context are motion sickness, often due to imprecise calibration, and the integration of reliable eye tracking. We propose an affordable hardware and software solution for drift-free eye tracking and user-friendly lens calibration within an HMD. The use of dichroic mirrors leads to a lean design that provides the full field of view (FOV) while using commodity cameras for eye tracking.

Immersive Digital Reality

Motivated by the advent of mass-market head-mounted immersive displays, we set out to pioneer the technology needed to experience recordings of the real world with the sense of full immersion as provided by VR goggles.

Physics-based Rendering

In this project, we develop novel techniques to measure different light-matter interaction phenomena in order to provide new models, or verify existing ones, for rendering physically correct images.

Simulating Visual Perception

The aim of this work is to simulate glaring headlights on a conventional monitor by first measuring the time-dependent effect of glare on human contrast perception and then integrating the quantitative findings into a driving simulator that adjusts the displayed contrast according to human perception.
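One established way to model the first step, shown here as an illustrative sketch rather than the project's actual calibration, is the Stiles-Holladay disability-glare approximation: a glare source of illuminance E (lux) at angle theta (degrees) from the line of sight adds a veiling luminance of roughly 10*E/theta^2 cd/m^2, which in turn lowers the effective Weber contrast of a target against its background.

    # Stiles-Holladay veiling luminance and the resulting contrast loss.
    # All input values below are made-up examples, not measured data.
    def veiling_luminance(glare_illuminance_lux: float, angle_deg: float) -> float:
        """Veiling luminance in cd/m^2 (valid roughly for 1 < angle_deg < 30)."""
        return 10.0 * glare_illuminance_lux / angle_deg ** 2

    def effective_weber_contrast(target_lum: float, background_lum: float,
                                 veil_lum: float) -> float:
        """Weber contrast after the veil is added to target and background."""
        return (target_lum - background_lum) / (background_lum + veil_lum)

    lv = veiling_luminance(glare_illuminance_lux=5.0, angle_deg=3.0)
    c = effective_weber_contrast(target_lum=120.0, background_lum=100.0, veil_lum=lv)
    print(f"veil: {lv:.1f} cd/m^2 -> effective contrast: {c:.3f}")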

Visual Fidelity Optimization of Displays

The visual experience afforded by digital displays is not identical to our perception of the genuine real world. Display resolution, refresh rate, contrast, brightness, and color gamut match neither the physics of the real world nor the perceptual characteristics of our Human Visual System. With the aid of new algorithms, however, a number of perceptually noticeable degradations on screen can be diminished or even avoided entirely.