Exhibition

Overview of the latest internal projects and partner showcases built with the instantreality framework.

MueGen Driving Simulator by Fraunhofer Austria

Fraunhofer Austria developed an optimized 3D stereoscopic display based on parallax barriers for a driving simulator. The simulator is built and run by the Institute of Automotive Engineering at the Graz University of Technology. The overall purpose of the simulator is to enable user studies in a reproducible environment under controlled conditions to test and evaluate advanced driver assistance systems.

The simulator consists of a modified MINI Countryman chassis with eight liquid-crystal displays (LCDs) mounted around the windscreen and the front side windows. Four 55-inch LCDs are placed radially around the hood of the car at a slanted angle, and two 23-inch LCDs are used for each of the two front side windows. The four LCDs in the front are equipped with parallax barriers made of 2 cm thick acrylic glass to minimize strain caused by the slanted angle. Each barrier is printed with a custom-made striped pattern, which is the result of an optimization process. The displays are connected to a cluster of four “standard” computers with powerful graphics cards; Instant Reality is used for rendering.

The project “MueGen Driving” is funded by the Austrian Ministry for Transport, Innovation and Technology and the Austrian Research Agency within the FEMtech program “Talents”. Furthermore, the work is supported by the government of the Austrian federal state of Styria within the project “CUBA – Contextual User Behavior Analysis”.

 

Project website

 

InstantReality @ SIGGRAPH 2015

Greetings from Los Angeles. Meet the team behind Instant Reality on the SIGGRAPH 2015 exhibition floor. We’re located at the Web3D Consortium’s booth #1018. We’ll show you the latest release 2.5.1 and our new service-oriented architecture Instant3DHub, which enables web-based rendering beyond your browser’s capabilities.


InstantReality and X3DOM at Web3D 2014 and SIGGRAPH 2014

One week until Web3D 2014 and SIGGRAPH 2014 in Vancouver! Meet the X3DOM and InstantReality team at both conferences and on the exhibition floor in the Web3D Consortium booth #1045, where we will present the latest features of X3DOM 1.6.1 and InstantReality 2.4.0. This year brings a major step toward a service-oriented architecture for web-based applications, rendered on the client or streamed from a server from different back-ends, and we’re thrilled to show you all of it.

Car assembly simulation with haptic feedback from DLR

Interactive VR simulations consist of visual and haptic feedback. The haptic signals are displayed to a human operator through a haptic device at a rate of 1 kHz. The DLR Institute of Robotics and Mechatronics’ research on this project comprises the design and control of lightweight robots for haptic interaction, as well as volume-based haptic rendering. Typical applications are assembly simulations, training of mechanics, and skill transfer.

You can visit the project website here.

InstantReality and X3DOM at SIGGRAPH 2013

Join the dev-team of Instant Reality and X3DOM on the exhibition floor of SIGGRAPH 2013 in Anaheim. We’re located in booth #233 and will show you the latest and greatest in X3D web technology, fusing desktop and web environments to create streaming applications. We will also bring along new hardware toys that have been added to Instant Reality, as well as a demonstration of the latest progressive mesh compression technique we are developing right now. See you there!

Web3D 2013

Meet the InstantReality team at Web3D 2013! We’re excited to present a new REST interface integrated into InstantReality, which enables efficient data exchange between (web-)applications. As usual, we have new hardware support up our sleeves and will present an X3DOM scene using an Oculus Rift HMD with headtracking, which is supplied by the InstantIO server component via WebSockets. No plugins required! See you in San Sebastian!
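
To make the data flow concrete, here is a minimal TypeScript sketch of the pattern described above: a browser page receives head-tracking orientation over a WebSocket and writes it to an X3DOM viewpoint. The endpoint URL, the JSON message layout (a quaternion) and the element id are illustrative assumptions, not the actual InstantIO wire format.

```typescript
// Hypothetical sketch: stream head orientation from an InstantIO-style
// WebSocket endpoint into an X3DOM <viewpoint>. URL, message format and
// element id are assumptions for illustration only.

type Quaternion = { x: number; y: number; z: number; w: number };

// Convert a unit quaternion to the axis-angle form X3D(OM) expects.
function quatToAxisAngle(q: Quaternion): [number, number, number, number] {
  const angle = 2 * Math.acos(Math.max(-1, Math.min(1, q.w)));
  const s = Math.sqrt(1 - q.w * q.w);
  return s < 1e-6
    ? [0, 1, 0, 0]                       // no rotation: arbitrary axis
    : [q.x / s, q.y / s, q.z / s, angle];
}

const socket = new WebSocket("ws://localhost:8888/InstantIO/headtracker"); // assumed URL

socket.onmessage = (event: MessageEvent) => {
  const q: Quaternion = JSON.parse(event.data);          // assumed JSON payload
  const [x, y, z, a] = quatToAxisAngle(q);
  const viewpoint = document.getElementById("hmdView");  // <viewpoint id="hmdView"> in the X3DOM scene
  viewpoint?.setAttribute("orientation", `${x} ${y} ${z} ${a}`);
};
```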

InstantReality at SIGGRAPH 2012

Come meet the developer team on the exhibition floor of SIGGRAPH 2012 in Los Angeles at booth #234! We brought some new Kinect goodies with us and will show a fusion of InstantReality as a service for out-of-core rendering with X3DOM. It’ll blow your mind! For those who can’t wait to see it, here is a 14.5-million-triangle model paged to an iPad.

 

Supercomputing 2011 Demo by Virginia Tech

X3D Scientific Visualization Mashups with Commodity Hardware (Virginia Tech’s booth demo @ SC 2011): Kinect + X3D (exports from VMD, mpiBLAST, ParaView, Modo) rendered using InstantPlayer and OpenNI; gesture interface programmed in ECMAScript.

A Future Remote Maintenance Scenario

Fraunhofer IGD presented this demonstrator at this year’s CeBIT in Hannover. The interactive, multi-touch-enabled demonstrator, which was developed within the “Software Cluster”, shows how decision-makers or technicians can integrate 3D documents into their workflow via HTML5, e.g. by selecting an interactive 3D document on the touch table, sending it via e-mail, and then viewing it on a tablet — even at the building site or in the factory.

Altmühltal Nature Park Information Centre: »An interactive Panorama-Wall«

An interactive panorama wall

Design and Systems Institute developed an interactive wall for a guided tour of the Altmühltal Nature Park in Eichstätt. The installation is located in a Baroque dome-shaped hall (a former minster) and is part of a permanent exhibition about the park. Visitors can experience a virtual flight over the unique landscape and get additional information on different topics. The 8-meter-wide curved wall is conceived as a multi-user application: a wide-angle camera tracks the faces of the users, so they can directly interact with the scene.

3D icons distributed across the landscape can be triggered by the user’s position in front of the wall; background information is then shown on info layers with textual and visual content.

Project website

Coperion – Aluminium smelter

This application shows a complete aluminium manufacturing process. The tool was developed by Design and Systems Institute for Coperion.

It helps to communicate the internal operations in the context of the whole process. Coperion’s tasks cover the design and implementation of a plant from project planning through to installation, as well as extremely complex individual process stages for handling alumina for the production of primary aluminium.

The user can navigate through the different parts of the virtual plant via 2D layers, starting at raw-material receiving with a controllable ship unloader and ending at the large hall with electrolysis cells.

Project website

DAVE – Definitely Affordable Virtual Environment

The DAVE is an immersive virtual environment located at Graz University of Technology, Austria. By using mostly standard hardware components, the installation costs are kept low. The DAVE is mainly used for the interactive exploration of architectural models and other virtual worlds. It is also a scientific tool for controlled, repeatable psychological tests and for the analysis of a variety of research questions, for example as a planning tool for the development of efficient signage systems in public spaces.

Project website

Simulation for Training and Maneuvering of Wing in Ground Effect (WIG) Crafts

Wing-in-ground-effect (WIG) craft are suitable as very fast vehicles for sea transport. The instruction and training of WIG craft pilots on the basis of an internationally approved program has so far been an unsolved problem: training on the vehicle itself would be not only expensive but also risky. A realistic simulation familiarizes the operator with the craft’s behavior on the water and in the air, even in extreme weather events and critical operating conditions.

Image by Frank Neumann / Rostock

Project website

Coperion – Virtual compounding plant

Design and Systems Institute developed an augmented reality application, first deployed at the K trade fair on the Coperion group’s booth. A big real extruder was the central element and starting point of a complete virtual compounding plant, which could be explored via several AR viewers arranged at the stand. It enables users to see invisible processes inside the machine and plant components at full scale.

Project website

Gutmann brewery – A guided tour

For generations, the Gutmann brewery has been brewing wheat beer (Weissbier) using traditional procedures. From the selection of the raw materials to storage, many well-defined and necessary steps must be observed to attain the desired result. Design and Systems Institute started to develop an interactive guided tour to explain the many little steps that lead to the treasured drink.

Project website

SONAR – Social Networks in Virtual Reality – An immersive 3D browser for Last.FM

The inherent structure of social networks is hidden in today’s web portals. The task of SONAR, Social Networks in Virtual Reality, is to reveal the network structure, to make it interactively explorable, and to create an immersive user experience of the social network.

YouTube video

Classic 3D User Interaction Techniques for Immersive Virtual Reality Revisited

This project of the Artificial Intelligence Group at Bielefeld University demonstrates classic 3D user interaction techniques for navigation/travel, selection and manipulation, applied to a virtual supermarket scenario. The implementations, a user study and the video were done by our student group “Interaction in Virtual Reality” in fall/winter 2009/2010 for a video submission to the Grand Prize contest of the 3D User Interfaces conference in 2010.

YouTube video

EADS TouchLab

The EADS TouchLab is an outstanding multimedia installation realized and developed by the agency NewMedia Yuppies GmbH. The table is placed within a multimedia environment and used as a next-generation simulation and presentation tool for the client EADS. Together with the cooperation partner Fraunhofer IGD in Darmstadt, NMY put an advanced system on track that enables the user to fully interact within a 3D space in real time using multi-touch gestures. The installation is based on the InstantReality framework.

YouTube video

Display Explorer

An interactive kiosk for Merck Darmstadt that explains the behaviour of liquid crystals in flat screens through a real-time 3D simulation. Ten custom-designed portable terminals with touch screen and 20″ auto-stereoscopic display are used for trade shows and permanent exhibitions worldwide. The 3D scene is rendered and animated with instantplayer and controlled by a Flash application via a TCP/IP socket. Concept, terminal design and 3D application programming by Invirt GmbH for CAPCom AG. Methods for driving the Philips 3D display and a GLSL shader for the simulation of an etching process were developed by Fraunhofer IGD.

Project website

Haptic Rendering for Virtual Reality Simulations

Interactive VR simulations consist of visual and haptic feedback. The haptic signals are displayed to a human operator through a haptic device at a rate of 1 kHz. The DLR Institute of Robotics and Mechatronics’ research on this project comprises the design and control of lightweight robots for haptic interaction, as well as volume-based haptic rendering. Typical applications are assembly simulations, training of mechanics, and skill transfer.

Project website

Space Debris Visualization

With more and more man-made objects orbiting the Earth – most of them junk and high-velocity debris – space travel and maintaining satellite orbits are becoming increasingly problematic. To illustrate the dangers posed by the more than 12,000 tracked (and 600,000 estimated) objects, ESA uses InstantReality to visualize the distribution and temporal variation of these objects. The results can be viewed in real time, and they have also been compiled into an informative video, which is available at ESOC in Darmstadt.

Project website

Augmented Reality Sightseeing

At CeBIT 2009, Fraunhofer IGD presented Augmented Reality technology for assisted living, using the example of Berlin. At the center of the project was a table with a satellite image of Berlin, on which a 3D model of the Berlin Wall and the urban development from 1940 to 2008 were displayed. For this, urban grain plans showing built-up areas are augmented onto the satellite image. The visualization was presented on UMPCs and the iPhone via video see-through.

Furthermore, posters showed the system working outdoors: historic photographs are seamlessly superimposed, showing the development of landmarks. The interface is kept very simple by fading in information and overlays in a context- and location-sensitive manner.

Project website

Educational 3D Visualization of Astronaut Motion in Microgravity

In this joint project of the MIT Department of Aeronautics and Astronautics and the MIT Office of Educational Innovation and Technology, researchers collaborated with visual artists to create three-dimensional visualizations of astronaut motion in microgravity. Through the graphical user interface, which is implemented in X3D, students can interactively explore how astronauts rotate in a microgravity environment without using any external torques, i.e. without contact with the surroundings. These rotations are simulated with results from recent research in astronautics [Stirling, Ph.D. thesis, 2008] using mathematical algorithms that implement core curriculum concepts, such as conservation of linear and angular momentum. The computer character, which was modeled and texture-mapped in Maya®, wears a next-generation astronaut space suit that MIT researchers are developing for NASA.
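
As a reminder of the principle these simulations build on (a textbook statement, not the project’s specific algorithm): with no external torques acting on the astronaut, the total angular momentum is conserved.

```latex
% No external torque => the astronaut's total angular momentum is conserved
\boldsymbol{\tau}_{\mathrm{ext}} = \frac{d\mathbf{L}}{dt} = \mathbf{0}
\qquad\Longrightarrow\qquad
\mathbf{L}(t) = \mathbf{L}(t_0) = \mathrm{const}
```

Coordinated limb motions change the body’s mass distribution, so the astronaut can reorient while the total angular momentum stays constant (zero, if the motion starts from rest).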

Project website

Radiohead’s ‘House of Cards’

Radiohead released the 3D data of their video “House of Cards” under a Creative Commons license. We wrote a converter script that creates an animated ParticleSet from the CSV point data. The result is a real-time rendered music video you can walk through. The new HEyeWall 2.0 renders the data at a resolution of 8160 × 4000 pixels.
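
The converter itself is not published here; the following TypeScript (Node.js) sketch only illustrates the general idea, assuming one CSV file per frame with x,y,z[,intensity] columns and assumed file names. It emits a plain X3D PointSet fragment rather than InstantReality’s animated ParticleSet, so it is a sketch of the approach, not the actual script.

```typescript
// Illustrative sketch only: convert one frame of "House of Cards" CSV point
// data (assumed columns x,y,z[,intensity]) into an X3D PointSet fragment.
// File names and layout are assumptions; the original script targeted
// InstantReality's animated ParticleSet node instead.
import { readFileSync, writeFileSync } from "fs";

function frameToX3D(csvPath: string): string {
  const points = readFileSync(csvPath, "utf8")
    .split(/\r?\n/)
    .filter((line) => line.trim().length > 0)
    .map((line) => line.split(",").map(Number));

  const coords = points.map(([x, y, z]) => `${x} ${y} ${z}`).join(", ");
  // Map the optional intensity column to a grey value per point.
  const colors = points
    .map((p) => {
      const i = p.length > 3 ? Math.min(1, Math.max(0, p[3] / 255)) : 1;
      return `${i} ${i} ${i}`;
    })
    .join(", ");

  return `<PointSet>
  <Coordinate point="${coords}"/>
  <Color color="${colors}"/>
</PointSet>`;
}

writeFileSync("frame0001.x3d.fragment", frameToX3D("frame0001.csv"));
```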

YouTube video

Multi-Touch 3D Architecture Application at CeBIT 2008

IGD’s multi-touch table application for the visualization of 3D architectural models, presented at CeBIT 2008. It features several scalable architectural 2D plans; multiple users can move and zoom these plans as known from other multi-touch applications.

The key feature, however, is the tile with a high-quality 3D view of the building. We implemented the multi-touch 3D camera gestures we introduced last year: you grab a plan with the left hand, one finger of the right hand moves the camera through the 3D model, and a second finger defines the orientation of the camera. This enables cinematic camera movements in 3D. When the second finger points at a certain object on the plan and the first finger moves around it, the 3D camera orbits that object while keeping it in focus.
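
A minimal TypeScript sketch of that two-finger mapping, with assumed plan-to-world scale and eye height (the actual gesture implementation is not published here): the first finger places the camera on the plan, the second defines the point it looks at, so dragging the first finger around a fixed second finger orbits the camera around that object.

```typescript
// Illustrative sketch of the two-finger camera gesture: finger 1 = camera
// position on the 2D plan, finger 2 = look-at target. Scale and eye height
// are assumed values, not taken from the original application.
interface Vec2 { x: number; y: number; }
interface CameraPose { eye: [number, number, number]; target: [number, number, number]; }

function cameraFromFingers(
  posFinger: Vec2,          // first finger: where the camera stands on the plan
  aimFinger: Vec2,          // second finger: the object/point to keep in focus
  planToWorld = 0.05,       // assumed metres per plan pixel
  eyeHeight = 1.7           // assumed eye height in metres
): CameraPose {
  const eye: [number, number, number] =
    [posFinger.x * planToWorld, eyeHeight, posFinger.y * planToWorld];
  const target: [number, number, number] =
    [aimFinger.x * planToWorld, eyeHeight, aimFinger.y * planToWorld];
  // Because the camera always looks from `eye` toward `target`, dragging the
  // first finger on a circle around a fixed second finger orbits the camera
  // around that object while keeping it centred in view.
  return { eye, target };
}
```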

Project website

iTACITUS

iTACITUS is a Sixth Framework Programme project and aims to provide a mobile cultural heritage information system for the individual. By combining itinerary planning, navigation and rich on-site content, based upon a dispersed repository of historical and cultural resources, it provides a complete information system and media experience for travellers interested in history. IGD’s part is the development of a “Mobile Augmented Reality (AR) Guide Framework” for Cultural Heritage (CH) sites. The framework delivers advanced markerless tracking on mobile computers as well as new interaction paradigms in AR featuring touch and motion capabilities. In addition to visual components like annotated landscapes and superimposed environments, the framework will feature a reactive acoustic AR module.

Project website

VRAx

VRAx® denotes a platform for designing machine tools that is based on Virtual Reality (VR) technologies analogous to existing CAx tools. VR technology is used as an active development and design medium in this process, and data created in VR are recirculated into the overall development process. Variants of machine tools with parallel kinematics can readily be created by combining a modular approach with free modeling functionality. This reduces the development times of new machine tools by the direct implementation of the customer’s wishes in an immersive environment and with considerably increased transparency of the development process. This is achieved by consistent integration of VR into the development and design process. Starting from a building block system of components of machine tools with parallel kinematics (PKM), functions for freeform modeling and optimizing the machine structure become available.

Project website

ParaMEDIC at SuperComputing 2007

Virginia Tech’s ParaMEDIC (Parallel Metadata Environment for Distributed I/O and Computing) won the Storage Challenge at the IEEE/ACM International Conference for High Performance Computing, Networking, Storage and Analysis. ParaMEDIC is a high-performance and portable framework to decouple computation and I/O in applications that require large quantities of both resources simultaneously. The visualization at Virginia Tech’s booth was rendered on four monitors with instantplayer and on one monitor in Argonne National Laboratory’s booth.

Project website

Augmented Reality Measuring System

In order to identify differences between physical and digital mock-ups, which do not match exactly, and to transfer those differences back into CAD format, we developed an AR-based measuring system. The system allows matching the CAD data with real mock-ups and documenting the differences between them. Essential functions like measurement and online construction are provided, allowing end users to create information in AR space and feed it back into the CAD model. The application was developed within an authoring tool based on the instantreality framework and was presented at ISMAR ’07. The tool was developed for the needs of the submarine engineering of Howaldtswerke-Deutsche Werft GmbH, which also funded the project.

Coperion Visualization at K Fair

Together with Design and Systems Institute, we developed a technical information visualization based on our multi-touch table and an 8-meter-wide HD projection. The complete interactive plastics production process was presented at the K trade fair on the booth of the Coperion group (global market leader for compounding systems). It enables users to see and understand complex and invisible processes inside the machines via a simple multi-touch interface.

Project website

Virtual Graffiti at the International Automotive Fair (IAA)

Virtual Graffiti was the eye-catcher of the booth of car manufacturer SMART at this year’s IAA. The new design and technology of Virtual Graffiti make it easy to install and easy to use. Many new features, such as automatically sending the painting to your e-mail address, are now available.

InViS – Integrated Virtual Shipbuilding

The BMBF-funded project InViS (Integrated Virtual Shipbuilding) aimed at developing innovative tools for product development and at their integration and introduction in the shipbuilding industry. The focus was on concepts for integrating Virtual Reality (VR), tele-cooperation and simulation into the product development process. On top of the integration platform, the instantreality player was used as the tool that provides interactive operation of the platform’s services.

Architecture Visualisation – Messe Frankfurt GmbH

To validate the architecture of the planned exhibition halls, Messe Frankfurt is using the instantreality player to interactively explore architectural 3D models. Starting from the architectural 2D plans, a high-quality 3D model of the booth area was generated by Mainfeld. This model is visualised with instantplayer within a CAVE environment and on the HEyeWall. For this purpose, a walk mode was implemented that realises intuitive navigation through the VR model: in contrast to the free 3D flying mode, the walk mode prevents penetration of walls and supports walking on floors and stairs. The resulting navigation is intuitive and dedicated to users with no experience in 3D computer graphics. This architecture visualisation supported the decision-making process in the architectural planning, and VR visualisation of architecture has proven to be a valuable planning tool.

Project website

Kaskade

Kaskade is an interactive, stereoscopic X3D installation visualising parasitic behaviour. Using a WiiMote, the user can navigate through a continuously reconfiguring landscape made up of a 7×7 grid whose cells are controlled by a modified Game-of-Life algorithm with four states. Each algorithm state is mapped to one of four visual representations: desert, forest, city and industry. These representations are fully animated for setup and tear-down, making the landscape an unpredictable, ever-changing world to explore. A sketched look was realized using GLSL shaders. The project was developed as a diploma thesis at the CrossMediaLab of the HfG Offenbach.
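
The thesis’s exact rule set is not documented here; the following TypeScript sketch only illustrates the general pattern of a four-state, Game-of-Life-style automaton on a 7×7 grid, using a simple majority-of-neighbours rule as a stand-in.

```typescript
// Illustrative stand-in for the modified Game-of-Life driving Kaskade's
// landscape: a 7x7 grid of four states, each cell adopting the most common
// state among its neighbours. The real thesis rules differ.
type CellState = "desert" | "forest" | "city" | "industry";
const STATES: CellState[] = ["desert", "forest", "city", "industry"];
const SIZE = 7;

type Grid = CellState[][];

function randomGrid(): Grid {
  return Array.from({ length: SIZE }, () =>
    Array.from({ length: SIZE }, () => STATES[Math.floor(Math.random() * STATES.length)])
  );
}

function step(grid: Grid): Grid {
  return grid.map((row, y) =>
    row.map((cell, x) => {
      // Count the states of the (up to eight) neighbouring cells.
      const counts = new Map<CellState, number>();
      for (let dy = -1; dy <= 1; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
          if (dx === 0 && dy === 0) continue;
          const ny = y + dy, nx = x + dx;
          if (ny < 0 || ny >= SIZE || nx < 0 || nx >= SIZE) continue;
          const s = grid[ny][nx];
          counts.set(s, (counts.get(s) ?? 0) + 1);
        }
      }
      // Stand-in rule: switch to the dominant neighbouring state, else keep the own one.
      let best: CellState = cell, bestCount = 0;
      counts.forEach((c, s) => {
        if (c > bestCount) { best = s; bestCount = c; }
      });
      return best;
    })
  );
}

// Each resulting state would then be mapped to its animated visual
// representation (desert, forest, city, industry) in the rendered landscape.
let grid = randomGrid();
grid = step(grid);
```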

Project website

Virtual Graffiti

IGD/CAMTech’s Virtual Graffiti was selected to be exhibited in the Singapore Science Centre’s iFuture exhibition from December 2006 to April 2007. During the iFuture opening on 10 December 2006, Guest of Honour President S. R. Nathan and Dr Tony Tan were present to grace the event.

Project website

Interactive Goal Keeper

The interactive goalkeeper game is a virtual application for audience entertainment at exhibitions and sales promotions. Via a camera, the player becomes integrated into the virtual soccer scene. As the goalkeeper, the player has to catch the computer-animated balls. During the game, a photo of the player is taken and printed afterwards, including the respective company’s branding.

Project website

Visualization of Statistics in Virtual Reality

This project explores new possibilities for the representation of large statistical values by using the endless virtual space. By means of an avatar, one can operate, inspect and thus physically experience huge diagrams in virtual reality. To make the diagrams more graspable, one can compare them with objects such as the Eiffel Tower. The user can operate and navigate different layers with different topics by means of two interface devices, a gamepad and an HMD. Touch and proximity sensors trigger the growth of diagrams and the display of dynamic information. The project was developed as a diploma thesis.

Project website

Cybersaw – A Mixed Reality Chainsaw

Fraunhofer IGD created an entertaining chainsaw simulation for a worldwide leading chainsaw manufacturer. The project simulates a typical working scenario with chainsaws via Mixed Reality technology. The main attraction is its interaction device: a real chainsaw with haptic and tracking capabilities.

Project website

Vivera – Photorealistic Rendering

The installation “Photorealistic Rendering” was presented during CeBIT 2006. It shows a 3D model of a BMW 1 Series car rendered in photorealistic quality. To expose the link between “photography” and “rendering”, a light table with photographs of the car is used as the interaction device. When a user touches a photograph, the virtual camera moves so that the rendered car is shown in the same position as in the photo.

Project website

Coperion – Bulk Material Plant Visualization

The Coperion group is the global market and technology leader for compounding systems, bulk materials plants and components. The instant technology was used to create a real-time visualization of Coperion’s bulk materials plants. instant’s powerful particle system made it possible to simulate the complex internal processes and the flow of different bulk materials in real time.

Project website

MovableScreen

MovableScreen is a new human-computer interface where the display itself acts as both interface and navigation system. It consists of an Apple iMac mounted on a 360° rotating pillar. The vertical rotation of the pillar is directly mapped onto the virtual camera; the horizontal rotation is achieved by tracking the user’s head with the built-in camera. The strength of the system lies in architectural walkthroughs and the presentation of cultural heritage artefacts.

Project website

Virtual Cathedral of Siena

A virtual model of the cathedral was developed on behalf of the Federal Ministry of Research and Technology, including a very detailed model of the cathedral’s interior. Special attractions are the hexagonal cupola as well as the Piccolomini library. Besides the detail-faithful model, complex illumination simulations provide for a photo-realistic representation. The project was first presented at EXPO 2000 in Hannover and has later been adapted for presentations at Fraunhofer IGD.

Project website

Virtual Human

The project Virtual Human addresses completely new questions of autonomous planning of the animation and dialogue behavior of virtual characters, as well as their photo-realistic representation in real time – in contrast to pre-animated virtual actors in films or the avatars in chat rooms or teleconferences. The acceptance of a virtual dialogue partner substantially depends on its visual appearance. The highest quality of computer-graphics representation and natural movements, gestures and facial expressions are therefore the main work topics in the project.

Project website

Archeoguide

With Archeoguide, a new and fascinating approach has become reality, offering the visitor on-demand information in compelling new ways. Equipped with the Archeoguide system, visitors can explore an ancient site on their own while getting a realistic impression of its original shape. They can view virtual reconstructions of monuments appearing in their original surroundings while looking at their remains. Moreover, dynamic images such as virtualised historical sports events can also be incorporated into the visitor’s view, revealing what life was like in the past.

Project website

Interactive Planetarium

The Interactive Planetarium takes you on a journey into orbit, the solar system and the Milky Way. Experience a black hole and the volcanoes on Jupiter’s moons.

Vivera Medical Visualization

The aim of the Vivera competence network is the integration of Virtual and Augmented Reality into industrial and medical applications. This medical volume rendering system has been developed with the instant framework.

Driving Simulator

A driving simulator that constructs the street data (e.g. geometry) in real time from common navigation-CD datasets. It is used in a multi-screen environment with some custom hardware (an extra pipe for the rear-view mirror).