Featuring more than 10 million pixels at a 120 Hz refresh rate, full-body motion capture, and real-time gaze tracking, our 5-meter ICG Dome enables us to research peripheral visual perception, devise comprehensive foveal-peripheral rendering strategies, and explore multi-user immersive visualization and interaction.
|The ICG Dome|
DFG Grossgerät INST 188/409-1 FUGG
|5-meter projection dome|
|Six 120 Hz, 2560x1600-pixel video projectors|
|Six high-end render nodes + master PC|
|Integrated 120 Hz real-time eye tracking system|
|Integrated full-body motion capture system|
|3D shutter glasses|
Exploring the Human Visual System
In the periphery of our field of view, our visual sense differs distinctly from our foveal vision. Different psychophysical rules apply to our peripheral vision that need to be re-evaluated with respect to novel wide field-of-view and immersive display technologies. Our Dome enables us to systematically and comprehensively explore and quantitatively model the perceptual properties of our Human Visual System for computer graphics applications.
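A first-order way to quantify this foveal-peripheral difference is a linear acuity-falloff model, in which the minimum angle of resolution (MAR) grows roughly linearly with retinal eccentricity. The following is a minimal sketch; the parameter values are illustrative figures from the foveated-rendering literature, not measurements taken in our dome:

```python
def minimum_angle_of_resolution(eccentricity_deg, mar0_deg=1.0 / 60.0, slope=0.022):
    """Linear acuity-falloff model: MAR grows roughly linearly with
    eccentricity. mar0_deg (~1 arcminute foveal MAR) and slope are
    illustrative values, not dome measurements."""
    return mar0_deg + slope * eccentricity_deg


def relative_acuity(eccentricity_deg):
    """Acuity relative to the fovea: 1.0 at 0 degrees, falling off outward."""
    return minimum_angle_of_resolution(0.0) / minimum_angle_of_resolution(eccentricity_deg)
```

Under these parameters, acuity at 30 degrees of eccentricity drops to a few percent of its foveal value, which is why peripheral image regions tolerate far coarser rendering.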
With the advent of mass-market head-mounted displays and proliferating immersive VR applications, computer graphics algorithms must simultaneously cater to our consciously perceived foveal vision and to our mostly subconsciously perceived peripheral field of view. If gaze direction can be measured in real time, reliably predicted, or actively steered, gaze-contingent rendering methods can exploit a number of perceptual strategies to improve perceived visual quality, to cut down on computation time, and to subconsciously influence our perception of situational atmosphere. Our dome is the perfect tool to develop and evaluate novel gaze-contingent rendering techniques that take our entire field of vision into account.
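One such strategy can be sketched as a gaze-contingent shading-rate map: screen tiles near the tracked gaze point are shaded at full rate, while peripheral tiles receive coarser rates, in the spirit of foveated or variable-rate rendering. The pixels-per-degree constant and the eccentricity thresholds below are hypothetical placeholders, not calibrated dome parameters:

```python
import math

# Hypothetical display calibration: angular resolution at the viewing
# position (placeholder value, not a measured dome parameter).
PIXELS_PER_DEGREE = 40.0


def tile_shading_rate(tile_center, gaze, inner_deg=5.0, mid_deg=15.0):
    """Pick a coarse shading rate per screen tile from its eccentricity
    relative to the tracked gaze point. Thresholds are illustrative."""
    ecc_deg = math.hypot(tile_center[0] - gaze[0],
                         tile_center[1] - gaze[1]) / PIXELS_PER_DEGREE
    if ecc_deg < inner_deg:
        return "1x1"  # full shading rate in the foveal region
    if ecc_deg < mid_deg:
        return "2x2"  # one shading sample per 2x2 pixel block
    return "4x4"      # one shading sample per 4x4 pixel block
```

In a real renderer this decision would feed a hardware variable-rate-shading interface; the sketch only illustrates the eccentricity-to-rate mapping.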
Multi-User Immersive Visualization and Interaction
A digital planetarium is an immersive theatre built to present educational and/or entertainment content to an audience. With its additional eye tracking and motion capture capabilities, our dome facilitates researching novel multi-user interaction paradigms for immersive visualization environments in which the audience takes center stage.
|Live eye tracking in the dome.|
|Entering the digital realm.|
|Raising the bar in surround live eye tracking.|
|Body tracking with advanced motion capture.|
|Science in stereo.|
You can check the dome's occupancy using the Dome Calendar and find the dome itself at our Aufnahmestudio und Visualisierungslabor on the northern campus.
Bachelorarbeit: Implementation of VR-capable dynamic stimuli for conducting perceptual experiments
SEP: Massively distributed collaborative crowd input system for dome environments
A multi-user input system using participants' own smartphones.
Praktikum (MA): Creating an interactive VR-adventure for the ICG Dome
Follow-up to the WS'18/19 Praktikum: dynamic interaction with and modification of virtual environments at runtime.
Teamprojekt: Our Little Planetarium
Interactive 3D visualization of the solar system.
Praktikum (BA): Horror Adventure
An interactive VR adventure game with wheelchair controls for immersive locomotion.
SEP: 3D Animation in Gospel
Keyframe animations as an extension of the Gospel rendering framework.
Teamprojekt: A Look Through Floyd Lawton's Mask
AR in VR: augmentation of objects in virtual scenes depending on the user's current gaze direction.
Praktikum: World Builder VR Toolkit Extended
Follow-up to the WS'17/18 Teamprojekt: among other things, grass generation, terrain deformation, and an updated code base.
SEP: Virtual Glass Cupola
A snow and rain weather simulation behind a glass cupola.
Bachelorarbeit: Guiding the Eyes: Development and Implementation of Gaze Guidance Methods for Wide-Field-of-View Immersive Environments
Teamprojekt: World Builder VR Toolkit Continued
Follow-up to the SS'17 SEP: among other things, generation of trees and shrubs and simulation of a day-night cycle.
Bachelorarbeit: A framework for psychophysical experiment design in immersive full-dome environments
Praktikum: Particle Game
A gesture-controlled particle simulation (port).
A Fruit Ninja variation with baking ingredients.
A VR space shooter controlled via real-time eye tracking.
Exploring Neural and Peripheral Physiological Correlates of Simulator Sickness
in Computer Animation and Virtual Worlds, John Wiley & Sons, Inc., article e1953, August 2020.
electronic ISSN: 1546-427X
Comparison of Unobtrusive Visual Guidance Methods in an Immersive Dome Environment
in ACM Transactions on Applied Perception, vol. 15, no. 4, ACM, pp. 27:1-27:11, October 2018.
Perception-driven Accelerated Rendering
in Computer Graphics Forum (Proc. of Eurographics EG), vol. 36, no. 2, The Eurographics Association and John Wiley & Sons Ltd., pp. 611-643, April 2017.
Gaze Visualization for Immersive Video
in Burch, Michael and Chuang, Lewis and Fisher, Brian and Schmidt, Albrecht and Weiskopf, Daniel (Eds.): Eye Tracking and Visualization, Springer, ISBN 978-3319470238, pp. 57-71, March 2017.
Gaze-Contingent Perceptual Rendering in Computer Graphics
PhD thesis, TU Braunschweig, November 2016.
Adaptive Image-Space Sampling for Gaze-Contingent Real-time Rendering
in Computer Graphics Forum (Proc. of Eurographics Symposium on Rendering EGSR), vol. 35, no. 4, pp. 129-139, July 2016.
EGSR'16 Best Paper Award
Simulating Visual Contrast Reduction during Night-time Glare Situations on Conventional Displays
in ACM Transactions on Applied Perception, vol. 14, no. 1, pp. 4:1-4:20, July 2016.
Humans have been fascinated by astrophysical phenomena since prehistoric times. But while measurement and image-acquisition devices have evolved enormously, many restrictions still apply when capturing astronomical data. The most notable limitation is our confined vantage point within the solar system, which prevents us from observing distant objects from different points of view.
In an interdisciplinary German-Mexican research project partially funded by the German DFG (Deutsche Forschungsgemeinschaft, grants MA 2555/7-1 and 444 MEX-113/25/0-1) and the Mexican CONACyT (Consejo Nacional de Ciencia y Tecnología, grants 49447 and UNAM DGAPA-PAPIIT IN108506-2), we evaluate different approaches for the automatic reconstruction of plausible three-dimensional models of planetary nebulae. The team comprises astrophysicists working on planetary nebula morphology as well as computer scientists experienced in the reconstruction and visualization of astrophysical objects.
Photo-realistic modeling and digital editing of image sequences with human actors are common tasks in the film and games industries. These processes are, however, still laborious, since existing tools allow only basic manipulations. In cooperation with the Institut für Informationsverarbeitung (TNT) of the University of Hannover (http://www.tnt.uni-hannover.de/), this project aims to solve this dilemma by providing algorithms and tools for automatic and semi-automatic digital editing of actors in monocular footage. To enable visually convincing renderings, a digital model of the human actor, detailed spatial scene information, and the scene illumination need to be reconstructed. A plausible look and motion of the digital model are crucial here.
This research project is partially funded by the German Science Foundation DFG.
Immersion is the ultimate goal of head-mounted displays (HMDs) for Virtual Reality (VR) in order to produce a convincing user experience. Two important aspects in this context are motion sickness, often caused by imprecise calibration, and the integration of reliable eye tracking. We propose an affordable hardware and software solution for drift-free eye tracking and user-friendly lens calibration within an HMD. The use of dichroic mirrors leads to a lean design that preserves the full field of view (FOV) while using commodity cameras for eye tracking.
Motivated by the advent of mass-market head-mounted immersive displays, we set out to pioneer the technology needed to experience recordings of the real world with the sense of full immersion as provided by VR goggles.
In this project, novel techniques to measure different light-matter interaction phenomena are developed in order to provide new or verify existing models for rendering physically correct images.
The aim of this work is to simulate glaring headlights on a conventional monitor by first measuring the time-dependent effect of glare on human contrast perception and then integrating the quantitative findings into a driving simulator that adjusts the displayed contrast according to human perception.
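As background, the static optics involved can be sketched with the classic Stiles-Holladay disability-glare approximation: light from a glare source scatters inside the eye and acts like a uniform luminous veil, which lowers the effective Weber contrast of any target. This sketch deliberately omits the time-dependent effects the project actually measures:

```python
def veiling_luminance(glare_illuminance_lux, glare_angle_deg, k=10.0):
    """Stiles-Holladay approximation: equivalent veiling luminance
    L_v = k * E_glare / theta^2 (cd/m^2), valid roughly for glare
    angles between about 1 and 30 degrees."""
    return k * glare_illuminance_lux / glare_angle_deg ** 2


def effective_weber_contrast(delta_l, background_l, veil_l):
    """Weber contrast of a target once the veiling luminance is added
    to the adapting background luminance."""
    return delta_l / (background_l + veil_l)
```

A glare source close to the line of sight (small theta) thus produces a strong veil and a large contrast loss, which is exactly what the driving simulator must reproduce on screen.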
The visual experience afforded by digital displays is not identical to our perception of the genuine real world. Display resolution, refresh rate, contrast, brightness, and color gamut neither match the physics of the real world nor the perceptual characteristics of our Human Visual System. With the aid of new algorithms, however, a number of perceptually noticeable degradations on screen can be diminished or even completely avoided.