
The Bionics and Cognitive Science Centre

Research Projects

Touching Virtual Objects
The Exograsp
Kinesthetic and Cutaneous Components of Normal Haptic Perception
The Tactile Display System
Active versus Passive Haptic Exploration
The Phantom
Haptic Illusions
Haptic Feedback
Haptics versus Vision
Competing Haptic and Visual Perception of 3D Virtual Objects
Temperature Perception
Presence in Simulations

Touching Virtual Objects

A major project is the addition of tactile cues to the kinesthetic inputs typically made available for the haptic perception of virtual objects. This would represent a major step towards fully immersive virtual reality. We are presently constructing a hand-worn exoskeleton to provide kinesthetic cues, to which we shall add tactile cues.

The Exograsp

A hand-based force-feedback exoskeleton prototype is currently under development. This device allows a user to grasp a virtual object with thumb and forefinger. Temperature and pressure at the fingertip will provide additional cues to simulate contact with the virtual object.

Our team of in-house programmers has created a graphic of a hand that displays the movements made while using the Exograsp, shown at right holding a soft drink can. A short movie segment is available that shows the Exograsp and the hand graphic in action (two quality levels are available depending on your connection speed).

Exograsp movie: 817 KB or 237 KB

This PowerPoint presentation provides an overview of intelligent grasping.

Kinesthetic and Cutaneous Components of Normal Haptic Perception

Using the Tactile Display System, we have been able to show that a stationary fingertip can be used to identify a raised line drawing passed under it, and that identification is as good as when the fingertip is moved along a pathway corresponding to the outline of the drawing. We are now exploring the limits of passive cutaneous processing by testing for information provided by shear force and drag.

The Tactile Display System

The Tactile Display System (TDS) was specifically built to systematically compare active and passive tactile perception. It accurately records an active explorer's finger movements as they investigate a raised line drawing with their fingertip. A passive participant can then be guided over the same path, matching for position and speed so that the passive participant is essentially yoked to the active participant. The device has also been used to separate the components of haptic exploration in two dimensions.
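As a rough illustration of the yoking principle (not the TDS's actual control software), the Python sketch below records an active explorer's timestamped fingertip positions and then replays them so that a passive participant can be guided over the same path at the same speed. The read_fingertip_xy and move_stage_to callables are hypothetical stand-ins for the device interface.

    import time

    def record_active_exploration(read_fingertip_xy, duration_s, rate_hz=200):
        """Sample timestamped (t, x, y) fingertip positions from an active
        explorer. read_fingertip_xy() is a hypothetical stand-in that returns
        the current fingertip position in millimetres."""
        samples = []
        period = 1.0 / rate_hz
        start = time.time()
        while True:
            elapsed = time.time() - start
            if elapsed >= duration_s:
                break
            x, y = read_fingertip_xy()
            samples.append((elapsed, x, y))
            time.sleep(period)
        return samples

    def replay_for_passive_participant(move_stage_to, samples):
        """Guide the passive participant over the recorded path, matching
        position and speed by honouring the original timestamps."""
        start = time.time()
        for t, x, y in samples:
            delay = t - (time.time() - start)
            if delay > 0:
                time.sleep(delay)      # reproduce the original speed
            move_stage_to(x, y)        # hypothetical stage / pattern actuator

The same replay loop can drive either the finger carriage or the pattern beneath a stationary finger, depending on what move_stage_to is wired to.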

Click on the thumbnails on the right for larger photos of the device.

There are also two short movies available showing the TDS operating in its two primary passive-guidance modes:
1. the passive subject is guided over the exploratory pathway taken by the active subject, matching for speed and location;
2. the device holds the passive subject's finger still and moves the pattern underneath their fingertip, again matching the movement to that created by an active explorer.

The TDS has also been used in studies comparing vision and touch. Subjects haptically explore a two-dimensional raised line drawing using the TDS (or a Phantom). In another condition subjects watch these exploratory movements plotted out on a monitor using our Haptics to Vision Translation Program (HVTP). We have found touch and vision to be comparable for these tasks.

Active versus Passive Haptic Exploration

A team of our in-house programmers produced an application that can display any shape visually and haptically in a virtual world and record the movements of an individual as they explore the object with a Phantom. This program can then be used to guide another Phantom user over the same exploration path taken by the "active" (i.e., free) explorer.
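A common way to implement this kind of guidance on a force-feedback device is a virtual spring-damper that pulls the stylus toward the recorded path point at each step of the haptic loop. The sketch below illustrates the idea only; the device calls shown (read_state, set_force, wait) are hypothetical, not the Builder's or the Phantom SDK's actual interface.

    import numpy as np

    def guidance_force(target, pos, vel, k=200.0, b=2.0, f_max=3.0):
        """Spring-damper force (N) pulling the stylus toward the recorded
        target position (m), with a safety clamp on magnitude."""
        force = k * (np.asarray(target) - np.asarray(pos)) - b * np.asarray(vel)
        magnitude = np.linalg.norm(force)
        if magnitude > f_max:
            force *= f_max / magnitude
        return force

    def replay_recorded_path(device, recorded_path, dt=0.001):
        """Step through the active explorer's recorded (x, y, z) path at the
        original sampling rate, commanding a guidance force at each step."""
        for target in recorded_path:
            pos, vel = device.read_state()       # hypothetical device call
            device.set_force(guidance_force(target, pos, vel))
            device.wait(dt)                      # hold the ~1 kHz haptic loop

Yoking two Phantoms in real time follows the same pattern, with the live position of the active stylus taking the place of the stored path.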

A short movie segment is available that shows the Builder and Phantom in action (two quality levels are available depending on your connection speed). Builder movie: 731 KB or 214 KB

Using the Builder program, active and passive-guided exploration in two and three dimensions has been compared using a Phantom (or two Phantoms yoked in real time). Passive exploration was found to be superior for two-dimensional exploration, consistent with our previous results using the TDS. However, active exploration was superior for exploring three-dimensional virtual shapes. Further research along these lines is ongoing.

The Builder has recently been updated to allow for two Phantoms to interact in the same virtual space, where the movements of both can be recorded and "played back". Future research will examine active and passive exploration in three dimensions using more than one finger.

The Phantom

The Phantom is a force-feedback device that allows the exploration of virtual objects in three dimensions. We have four Phantoms (two Desktop and two Omni models) and have used them for a number of projects, including comparing active and passive-guided touch in two and three dimensions and investigating haptic size and shape perception with and without a visual counterpart.

Haptic Illusions

The fact that many illusions are experienced both haptically and visually suggests a common "spatial" mechanism in the brain. Using a Phantom, we have examined instances in which touch is dominated by vision for virtual and real objects.

We are also examining whether touch and kinesthesis mediate haptic illusions in different ways.

Haptic Feedback

We are investigating ways in which "Tadoma-like" cues can be exploited to present speech haptically (through the skin). Remotely controlled robots and other machines are also likely to benefit from the addition of haptic displays, and we are currently working on a project to provide 3D visual and haptic feedback to remotely controlled mining machinery.

Haptics versus Vision

A substantial body of research has pitted haptics and vision against each other in order to determine which sense dominates, or "captures", the other. Generally vision is reported to be the "superior sense". However, in many of these studies haptics is disadvantaged, either because the task is more "natural" or more suited to vision, or because the haptic task simply does not make best use of the abilities of touch. In an ongoing research programme we are ensuring that the haptic and visual tasks are suitable for their respective senses, and are comparable between the senses.

The Haptics to Vision Translation Program (HVTP) was written to convert a recording of haptic movements into a visual analogue. We have been recording the movements of subjects as they explore two-dimensional raised line drawings with the TDS or the Phantom, or three-dimensional virtual objects with the Phantom. These movements are then shown traced out on a monitor, matching the speed of movement. In both the haptic and visual conditions the subject's task is to identify the stimulus as quickly and accurately as possible.

This short movie file demonstrates the output of the HVTP, showing two of its principal functions (a rough code sketch of the two windowing modes follows the list):
1. to create a moving window, equivalent to moving a 1 cm window over the pathway taken by the haptic explorer;
2. to create a stationary window, equivalent to moving the pathway taken by the haptic explorer behind a 1 cm stationary window.
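As a rough sketch of the two windowing modes (assuming the drawing and the recorded pathway are stored as 2-D point arrays in millimetres; the names below are illustrative, not the HVTP's actual code), each visual frame can be generated as follows:

    import numpy as np

    WINDOW_RADIUS_MM = 5.0   # a 1 cm window, taken here as a 5 mm radius

    def moving_window_frame(drawing_pts, explorer_xy):
        """Moving window: show only the part of the drawing within 1 cm of the
        explorer's current position; the window travels along the pathway."""
        drawing_pts = np.asarray(drawing_pts, dtype=float)
        dist = np.linalg.norm(drawing_pts - np.asarray(explorer_xy), axis=1)
        return drawing_pts[dist <= WINDOW_RADIUS_MM]

    def stationary_window_frame(drawing_pts, explorer_xy, centre=(0.0, 0.0)):
        """Stationary window: translate the drawing so the explorer's current
        position sits under a fixed window, then clip to that window."""
        drawing_pts = np.asarray(drawing_pts, dtype=float)
        centre = np.asarray(centre, dtype=float)
        shifted = drawing_pts - np.asarray(explorer_xy) + centre
        dist = np.linalg.norm(shifted - centre, axis=1)
        return shifted[dist <= WINDOW_RADIUS_MM]

Stepping through the recorded (t, x, y) pathway and rendering one frame per sample reproduces the speed of the original haptic exploration.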

Generally, haptics and vision have been found to be equivalent when used to identify capital letters, and the moving window produces superior performance compared to the stationary-window condition.

Publications based on research using the HVTP can be found on the Publications web page.



Competing Haptic and Visual Perception of 3D Virtual Objects

In this study we vary the attributes (e.g., shape and size) of a virtual object presented visually and haptically at the same time. Because the visual and haptic features can be independently controlled, they can match or be discrepant. When they differ, we test for dominance effects and for tolerance of degradation in one modality but not the other. The results inform the design of virtual environments.
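A minimal sketch of how the visual and haptic attribute sets might be represented independently is given below; the structure and names are hypothetical, for illustration only.

    from dataclasses import dataclass

    @dataclass
    class ModalAttributes:
        """Attributes rendered to one sense."""
        shape: str       # e.g. "sphere" or "cube"
        size_mm: float   # characteristic size

    @dataclass
    class VirtualObject:
        visual: ModalAttributes
        haptic: ModalAttributes

        @property
        def discrepant(self) -> bool:
            """True when the two senses receive conflicting information."""
            return self.visual != self.haptic

    # A matched object and a size-discrepant one for a dominance trial.
    matched = VirtualObject(ModalAttributes("sphere", 40.0), ModalAttributes("sphere", 40.0))
    conflict = VirtualObject(ModalAttributes("sphere", 40.0), ModalAttributes("sphere", 48.0))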

Temperature Perception

Since one of the tactile cues we hope to include for virtual objects is temperature change, we are investigating factors affecting the perception of temperature, particularly the role played by self-generated movement in determining perceived temperature (i.e., active versus passive perception), and the likelihood of capture, driven primarily by the visual attributes of virtual stimuli.

Presence in Simulations

The extent to which individuals immersed in a simulated world feel as if they are "really there" can be measured both physically and behaviourally. For virtual stimuli to be perceived successfully, the various visual, haptic and auditory cues provided have to be sufficiently compelling. Some of the required realism need not be physically present but can be achieved through approximation or capture.

The behaviour of individuals in a simulated world may or may not depend on the extent to which that world is perceived as real. Computer games provide an environment that is patently not real, and yet they are sufficiently engaging that individuals' behaviour may still be governed by the same expectations and social norms as in the real world. Our research in this area is at an early stage: we are using the SIMS game to measure various behaviours given different subject characteristics and varying levels of external constraint. We hope to expand this research to games played in a virtual world where the activity of individuals may be more realistic.
