Virtual Human Interface


Virtual Robots - The Future of Augmented Reality in Education

A virtual robot that teaches subjects such as physics and geography is deployed in an Augmented Reality (AR) application to break down the final frontier between physical robots and their virtual counterparts. The research was carried out as part of the EASEL research project, an effort to create expressive agents for symbiotic education and learning. The unique robotic control architecture used our Temporal Disc Controllers (TDC) to address two critical aspects: avoiding repetitive behaviors, and allowing high-level commands (such as "wave", "read", "be happy") to be sent to the robots independently of low-level, platform-dependent implementations. To evaluate the robots' true social capabilities, the virtual models were treated and assessed as if they were real human faces, and were subjected to psychological validation and testing using automatic facial analysis tools based on Paul Ekman's Facial Action Coding System (FACS). Public validation included a FORBES Flow event, where a large number of people interacted with one of the virtual facial models to express their mood; none of them could detect that it was not a real human face.
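The split between high-level commands and low-level, platform-dependent execution can be sketched as follows. This is a minimal illustration only: the class and method names (`RobotBackend`, `BehaviorController`, `perform`) are assumptions for the sketch, not the actual EASEL/TDC API, and the variant-cycling stands in for the TDC's richer mechanism for avoiding repetitive behavior.

```python
import itertools
from abc import ABC, abstractmethod

class RobotBackend(ABC):
    """Platform-dependent side: how a behavior primitive is actually executed."""

    @abstractmethod
    def perform(self, primitive: str) -> None:
        ...

class VirtualFaceBackend(RobotBackend):
    """Backend for the AR virtual robot; here it simply records primitives."""

    def __init__(self) -> None:
        self.performed: list[str] = []

    def perform(self, primitive: str) -> None:
        self.performed.append(primitive)

class BehaviorController:
    """Platform-independent side: accepts high-level commands such as "wave"
    or "be happy" and cycles through low-level variants, so that repeating a
    command does not replay the identical motion."""

    def __init__(self, backend: RobotBackend, variants: dict[str, list[str]]):
        self.backend = backend
        self._cycles = {name: itertools.cycle(v) for name, v in variants.items()}

    def send(self, behavior: str) -> None:
        self.backend.perform(next(self._cycles[behavior]))

backend = VirtualFaceBackend()
controller = BehaviorController(backend, {"wave": ["wave_slow", "wave_fast"]})
controller.send("wave")
controller.send("wave")  # second call selects a different low-level variant
```

Because callers only issue high-level commands, swapping `VirtualFaceBackend` for a physical-robot backend would require no change on the caller's side.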

WATCH DEMO VIDEO: Virtual Robots & Cardboard - The Future of Augmented Reality in Education!


* * *



TeleMedicine & Robotics

We developed a robotic concept to care for the elderly in their own homes. Our application relates to Ambient Assisted Living (AAL) and contains many novel interface and interaction elements. We developed an architecture called the Life Style and Health Management System (LHMS) using mobile platforms and deployed it in the context of home-based care. Our platform combines low-cost computing with wearable sensors, bi-directional visual communication, services and applications, and effectors that can be instructed to carry out certain tasks in the patient's own environment. The latter of these technologies introduces simple, special-purpose robots into the home to help patients lead a normal life. These robots incorporate autonomous behavior, local intelligence, and human-centered interaction, such as facial information processing and Ambient Facial Interfaces.
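The sensor-to-effector pipeline described above can be sketched in a few lines. The names here (`SensorReading`, `plan_tasks`, the heart-rate threshold) are illustrative assumptions for the sketch, not the actual LHMS API or clinical logic.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor: str   # e.g. a wearable heart-rate monitor
    value: float

def plan_tasks(readings: list[SensorReading], heart_rate_limit: float = 120.0) -> list[str]:
    """Turn wearable-sensor readings into tasks for home effectors/robots.

    A real LHMS would apply far richer rules; this sketch shows only the
    shape of the loop: measure, decide, instruct an effector.
    """
    tasks = []
    for r in readings:
        if r.sensor == "heart_rate" and r.value > heart_rate_limit:
            # Ask a home robot to check on the patient, then open the
            # bi-directional video link to a caregiver.
            tasks.append("robot: approach patient")
            tasks.append("open video call to caregiver")
    return tasks
```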


* * *



Ambient Facial Interfaces

Ambient Facial Interfaces (AFIs) provide visual feedback and confirmation to the user in a manner independent of age, language, culture, and level of mental alertness. AFIs use photo-realistic animated faces or photographic images of humans to display the emotional facial expressions, non-verbal feedback, and body language most reliably recognized by a person. These digital faces are controlled by the output parameters of physical measurements, or by data derived from the state of the user or of the products and objects he or she is interacting with. The output of an AFI system combines these measurements into a single facial expression that is displayed to the user, thereby allowing overall "quality" to be evaluated at a glance.


* * *


Remote-controlled Robotic Film Camera

See PanoCAST / Gallery for more details.

Papers & Links

IJCSCI 2007, TeleMed 2008

iRobot, Bang&Olufsen