Intuitive & Precise Interaction in Handheld AR

Some of my recent research has focused on developing novel methods for intuitively interacting with 3D content in handheld augmented reality scenarios. In such a scenario, the user usually has only one hand available for interaction, which limits the ability to perform complex finger gestures for precise selection and 3D object manipulation.

DrillSample: Precise Selection in Dense Handheld AR Environments

We introduce DrillSample, a novel selection technique that allows for accurate selection in dense, one-handed handheld AR environments. It requires only single-touch input for selection and preserves the full original spatial context of the selected objects. This allows for disambiguation and selection of strongly occluded objects or of objects with high similarity in visual appearance.
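The core idea can be sketched in a few lines: a single touch casts a pick ray into the scene, and all objects intersected by that ray are collected in depth order, so the user can disambiguate among them without losing their spatial arrangement. The Python sketch below is illustrative only; `screen_point_to_ray` and `raycast_all` are assumed helper APIs, not the actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Hit:
    obj: object      # scene object intersected by the pick ray
    distance: float  # distance from the camera along the ray

def drill_sample(scene, camera, touch_x, touch_y):
    """DrillSample-style sketch: a single touch casts a pick ray and
    *all* intersected objects are returned in depth order, so occluded
    or visually similar objects can be disambiguated while their
    original spatial context is preserved."""
    ray = camera.screen_point_to_ray(touch_x, touch_y)     # assumed helper
    hits = [Hit(o, d) for o, d in scene.raycast_all(ray)]  # assumed helper
    hits.sort(key=lambda h: h.distance)                    # near to far, "drilling" through the scene
    return hits                                            # presented to the user for disambiguation
```

Returning all hits along the ray, rather than only the frontmost one, is what makes strongly occluded objects selectable at all.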








3DTouch & HOMER-S: Intuitive Manipulation for One-Handed Handheld AR

We introduce two novel, intuitive six degree-of-freedom (6DOF) manipulation techniques, 3DTouch and HOMER-S, that provide full 3D transformations of a virtual object in a one-handed handheld augmented reality scene. While 3DTouch uses only very simple touch gestures and decomposes the degrees of freedom, HOMER-S provides integral 6DOF transformations and is decoupled from screen input to overcome physical limitations.
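The contrast between the two approaches can be illustrated with a rough sketch; all object and device methods below are hypothetical placeholders, not the actual implementation. 3DTouch splits the 6DOF task into separate translate/rotate/scale modes driven by one-finger drags, while HOMER-S bypasses the touchscreen and maps the device's own tracked 6DOF pose onto the object.

```python
from enum import Enum

class Mode(Enum):
    TRANSLATE = 1
    ROTATE = 2
    SCALE = 3

def three_d_touch(obj, mode, dx, dy, k=0.01):
    """3DTouch-style sketch: the DOFs are decomposed into modes, so a
    simple one-finger drag (dx, dy in screen pixels) suffices."""
    if mode is Mode.TRANSLATE:
        obj.translate(dx * k, -dy * k, 0.0)   # drag moves the object in a view-aligned plane
    elif mode is Mode.ROTATE:
        obj.rotate(pitch=dy * k, yaw=dx * k)  # each screen axis drives one rotation axis
    elif mode is Mode.SCALE:
        obj.scale(1.0 + dy * k)               # vertical drag scales uniformly

def homer_s(obj, device):
    """HOMER-S-style sketch: the handheld device's tracked 6DOF pose
    directly drives the object, with no screen input involved."""
    obj.set_pose(device.tracked_pose())       # integral 6DOF transformation
```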






Virtual Reality Training for Upper Limb Prosthesis Patients


The initial fitting of an upper limb prosthesis can be a frustrating experience for amputees, impairing the learning of prosthesis control. Therefore, the prosthesis manufacturer Otto Bock, in collaboration with Vienna University of Technology, developed a virtual reality environment in which tasks can be practiced, continuously motivating amputees to train their control skills without taking risks.

The iotracker optical motion capture system, developed at Vienna University of Technology, is used to track the amputee's arm and head movements, allowing for 3D input and correct visualization of the virtual environment in a head-mounted display. Along with the tracking data, electromyography (EMG) is used to generate input for grasping control of the virtual prosthesis, creating a realistic simulation. The iotracker and EMG data are fed into ARTiFICe, an Augmented Reality Framework for Distributed Collaboration. ARTiFICe maps the tracking data to the virtual objects, using the Unity3D game engine for networking, real-time rendering, and the objects' physical behaviour.
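In pseudocode terms, the per-frame update could look roughly like the sketch below. This is an assumed simplification of the pipeline; `read_pose`, `read_level`, the prosthesis methods, and the activation threshold are all illustrative, not ARTiFICe's actual API.

```python
def update_virtual_prosthesis(prosthesis, tracker, emg, close_threshold=0.5):
    """Per-frame sketch: the tracked arm pose drives the virtual
    prosthesis while the EMG signal level drives its grasp. All names
    and the threshold value are illustrative assumptions."""
    pose = tracker.read_pose("arm")     # 6DOF pose from the optical tracker (assumed API)
    prosthesis.set_pose(pose)           # mirror the amputee's arm movement

    activation = emg.read_level()       # normalized muscle activation in [0, 1] (assumed API)
    if activation > close_threshold:
        prosthesis.close_hand(speed=activation)  # stronger signal -> faster grasp
    else:
        prosthesis.open_hand()
```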


ARTiFICe

Augmented Reality Framework for Distributed Collaboration

This project aims at developing a flexible and powerful VR/AR framework built around an off-the-shelf game engine (Unity3D). It offers rapid prototyping to create distributed and collaborative virtual and augmented reality applications. Its flexible design allows the rapid integration of new hardware and software features as well as 3D interaction techniques. It can be used in fully immersive setups with 6DOF interaction devices and HMDs, semi-immersive environments with stereo projector walls, desktop setups, as well as mobile environments. ARTiFICe is currently used for research and teaching purposes at Vienna University of Technology.
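Such extensibility is commonly achieved by hiding devices and interaction techniques behind small abstract interfaces, so new hardware or a new technique plugs in without touching the rest of the framework. The Python sketch below illustrates that pattern only; it is not ARTiFICe's actual class design.

```python
from abc import ABC, abstractmethod

class TrackingDevice(ABC):
    """Abstract input device: new hardware is integrated by subclassing."""
    @abstractmethod
    def poll(self):
        """Return the device's current pose and button state."""

class InteractionTechnique(ABC):
    """Abstract 3D interaction technique: selection/manipulation logic
    can be swapped without touching device or rendering code."""
    @abstractmethod
    def update(self, device_state, scene):
        """Map the current device state to changes in the scene."""

def frame_update(device: TrackingDevice, technique: InteractionTechnique, scene):
    # Any concrete device can be combined with any concrete technique.
    technique.update(device.poll(), scene)
```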


RTMIOT

Real-time Tunneling Measurement based on an Infrared Optical Tracker


This project develops a stereo camera system to perform tunneling measurement in real time based on optical infrared tracking.
The goal is to track and determine the 3D coordinates of several static as well as moving optical targets in a large measurement volume under tough conditions such as vibrations, dust, and interfering light. To achieve this, an infrared optical tracking system, usually used for standard room-sized indoor virtual reality applications, is heavily extended and optimized.
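At the heart of any such stereo system is triangulation: once the same infrared target has been located in both calibrated camera images, its 3D position follows from the two projection matrices. Below is a minimal sketch of the standard linear (DLT) triangulation step, not RTMIOT's actual implementation.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation sketch: P1 and P2 are the 3x4
    projection matrices of the two calibrated cameras; x1 and x2 are
    the 2D image positions of the same target in each view."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Homogeneous least-squares solution: right singular vector of A
    # belonging to the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize to 3D coordinates
```

In practice, the conditions described above mean the real system needs far more than this core step: robust target detection under interfering light, filtering against vibrations and dust, and a calibration that holds over a large measurement volume.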

d*star

Dynamic Spatial Test in Augmented Reality

In this project, we develop a new type of test for the assessment of spatial abilities that differs from conventional spatial ability tests in several respects.

First, traditional spatial ability tests (paper-and-pencil as well as on-screen computer versions) assess 3-dimensional spatial abilities with 2-dimensional means. The new test will measure the ability to visualize and mentally manipulate 3-dimensional objects in actual 3-dimensional space, and should thus have higher ecological validity than previous spatial ability tests. This is made possible by the augmented reality tool Construct3D, which allows projecting virtual 3-dimensional objects into real space, where they can be seen and manipulated by means of a head-mounted display and a 6DOF input device.