Research Projects

Trust in Virtual Reality-based Intelligent Agent Advisors

In this project we replicated a Treisman & Gelade (1980) visual conjunction search task within a 3D virtual reality environment, using the HTC Vive lab of CRP member Mark Billinghurst at the University of South Australia.  Verbal directional cues ('left', 'right') were provided by a virtual agent at baseline (no cue), 50%, and 100% accuracy, while EEG, HRV, GSR, subjective metrics, and a novel behavioural measure of trust were collected.  The behavioural metric was the congruency or incongruency between the advised direction and the subject's actual head movement.  This work was published at the 2019 VRST conference and followed up with a journal publication in 2020.
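The behavioural metric described above can be sketched as follows. This is a minimal illustration, assuming head movement is logged as yaw deltas per trial; function names, the decision threshold, and the sign convention are my assumptions, not details from the paper.

```python
# Behavioural trust metric sketch: is the subject's first decisive head turn
# congruent with the agent's verbal cue? ('left' = negative yaw here.)

def congruency(cue, head_yaw_deltas, threshold=1.0):
    """Return True if the first head turn exceeding `threshold` degrees
    is in the cued direction; False otherwise."""
    for d in head_yaw_deltas:
        if abs(d) >= threshold:
            return (d < 0) == (cue == "left")
    return False  # no decisive head movement recorded this trial

def trust_score(trials):
    """Proportion of (cue, yaw_deltas) trials congruent with the cue."""
    hits = sum(congruency(cue, yaws) for cue, yaws in trials)
    return hits / len(trials)
```

Aggregated over a block of trials, the proportion of congruent movements serves as the behavioural trust estimate alongside the physiological measures.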

Gamification of Mechatronic Stroke Rehabilitation Platform

The goal of this work was to develop a system for 'gamification' of common rehabilitation techniques in a post-stroke recovery regimen. Because traditional physiotherapy can be monotonous and therefore difficult to sustain, we leveraged 1) an EEG-based BCI, 2) eye tracking, 3) a large LCD display, 4) a motorized arm support, and 5) example games programmed specifically for the project by the 'Serious Games Group' at the University of Tasmania (led by CRP member Dr. Kristy de Salas) to demonstrate a next-generation stroke rehabilitation platform. This project obtained $25,000 in funding from the Royal Hobart Hospital.

Augmented Reality-based spider phobia reduction technique

We devised a system whereby an AR spider was trained to approach or avoid a user according to a novel GSR-based estimation of human physiological stress.  The algorithm controlling the spider adjusts the agent's behaviour so that the user can interact with the AR spider without sharp elevations in physiological stress response.  Possible clinical uses are discussed in the paper.
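The approach/avoid loop can be sketched with a simple threshold controller. This is an illustrative sketch only: the tolerance band, step size, and minimum distance are assumed values, not the published algorithm's parameters.

```python
# Hedged sketch of the approach/avoid control loop: the AR spider advances
# while GSR-derived stress stays near baseline, and retreats when it spikes.

def update_spider_distance(distance, stress, baseline,
                           tolerance=0.15, step=0.05):
    """Return the spider's new distance (metres) from the user.

    `stress` and `baseline` are normalised GSR levels; a relative rise
    above `tolerance` triggers avoidance, otherwise the spider approaches.
    """
    relative_rise = (stress - baseline) / baseline
    if relative_rise > tolerance:
        return distance + step          # back off: stress is climbing sharply
    return max(0.3, distance - step)    # approach, but keep a minimum gap
```

Run once per sensor update, this keeps the spider hovering at whatever distance the user can currently tolerate.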

“Face Switch” open source accessibility software

Here we developed a novel accessibility software suite that enables users to operate a basic GUI system using facial 'gestures'.  The interface was developed in tandem with David Rozado of Otago Polytechnic, NZ, formerly of the ICT Centre at CSIRO, and our students.  It is available for free at:

https://accessibilitysoftwarehub.github.io/software.html
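The core interaction idea can be illustrated as a mapping from detected facial gestures to GUI actions. The gesture and action names below are illustrative placeholders, not Face Switch's actual identifiers or configuration.

```python
# Illustrative Face Switch-style mapping: discrete facial 'gestures'
# (as emitted by some detector) are translated into basic GUI actions.

GESTURE_ACTIONS = {
    "smile": "left_click",
    "eyebrow_raise": "right_click",
    "mouth_open": "scroll_down",
}

def dispatch(gesture):
    """Translate a detected facial gesture into a GUI action, or None."""
    return GESTURE_ACTIONS.get(gesture)
```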

Light sensor glasses for interactive pupillometry in real-world environments 

One of the barriers to using pupillometry as an interactive technology is that the pupils dilate in response to more than cognitive phenomena; they are highly reactive to light sources.  The goal of this work was to develop a pair of glasses equipped with two light sensors that monitor ambient lighting conditions, so that this signal can be used to filter out pupil dilations caused by fluctuations in ambient light.
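One simple form of such a filter is to regress pupil diameter on the ambient-light reading and keep the residual as the light-independent component. This is a minimal least-squares sketch under that assumption; the device's actual filtering method is not specified here.

```python
# Minimal light-correction sketch: remove the part of the pupil signal that
# is linearly predictable from the glasses' ambient-light sensor reading.

from statistics import mean

def light_corrected_pupil(pupil, light):
    """Return residual pupil signal after regressing out ambient light."""
    mx, my = mean(light), mean(pupil)
    # Ordinary least-squares slope of pupil diameter on light level.
    num = sum((x - mx) * (y - my) for x, y in zip(light, pupil))
    den = sum((x - mx) ** 2 for x in light)
    slope = num / den if den else 0.0
    # Residual: pupil dilation not explained by ambient light.
    return [y - (my + slope * (x - mx)) for x, y in zip(light, pupil)]
```

Whatever dilation remains after this correction is a better candidate for a cognitive (e.g. workload-driven) signal.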

PlAntenna (interactive sensing system for commercial plant health)

In a novel human-technology interaction paradigm, we applied machine learning (supervised learning with support vector networks) to high-frequency sensor data to classify events in the life of a plant.  An Arduino was run at full speed (approx. 300 MHz), and this signal was connected to a potted plant.  The capacitance of the RF skin effect on the plant provided the input to a Weka-based machine learning platform.  In this implementation I successfully classified 1) the presence of botrytis (a mould), 2) soil aridification, and 3) animal foraging events.  This work was done in partnership with Janz Vineyards, a prominent Tasmanian vineyard and wine producer, which provided wine-grape vines for research purposes.
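The classification step can be sketched as follows. The project used Weka SVMs; this stand-in uses a nearest-centroid classifier on two assumed per-window features (mean and variance of the capacitive signal), purely to illustrate the pipeline shape, not the actual features or model.

```python
# Sketch of the PlAntenna classification pipeline: windowed sensor readings
# -> simple features -> per-class centroids -> nearest-centroid prediction.

from statistics import mean, pvariance

def features(window):
    """Summarise one window of high-frequency sensor readings."""
    return (mean(window), pvariance(window))

def train(labelled_windows):
    """Per-class feature centroids, e.g. for 'botrytis'/'arid_soil'/'foraging'."""
    return {
        label: tuple(mean(f[i] for f in map(features, windows)) for i in range(2))
        for label, windows in labelled_windows.items()
    }

def classify(window, centroids):
    """Assign the window to the class with the nearest feature centroid."""
    fx = features(window)
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(fx, centroids[c])))
```

In the real system an SVM replaces the centroid step, but the windowing-and-features structure is the same.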

The Trust Watch (a.k.a. ‘The Trust App’)

Application: neonatal care ward, Royal Hobart Hospital.  A smartwatch-based app that allows users (nurses) to manually enter their current trust levels while operating a novel automated neonatal oxygenation control system.  The current methodology employs pen-and-paper tests, which are cumbersome, error-prone, and fundamentally flawed in that data are collected after the fact rather than during task operation.  This device allows measurements to be taken 1) immediately during task operation, 2) without common data-entry errors (e.g. illegible writing), and 3) with rapid retrieval of data (direct to CSV, rather than transcription of hand-written pages).  The project is a collaboration with the UTAS 'Serious Games' group.
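The direct-to-CSV logging idea can be sketched in a few lines: each rating is timestamped at the moment of entry and appended as a row, so no after-the-fact transcription is needed. The field names and rating scale below are illustrative assumptions, not the app's actual schema.

```python
# Sketch of the Trust Watch logging path: timestamp each rating at entry
# and write it straight to CSV (here into an in-memory buffer for demo).

import csv
import io
from datetime import datetime, timezone

def log_trust(writer, nurse_id, rating):
    """Append one timestamped trust rating (e.g. on a 1-7 scale) as a CSV row."""
    stamp = datetime.now(timezone.utc).isoformat()
    writer.writerow([stamp, nurse_id, rating])

buf = io.StringIO()
w = csv.writer(buf)
w.writerow(["timestamp_utc", "nurse_id", "trust_rating"])
log_trust(w, "N042", 5)
```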

Cognitive Workload Estimation Toolkit

A hardware/software system to classify human mental states during 'critical task environment' operations, where task failure could result in the loss of life or property.  We used a number of techniques to measure human physiology (EEG, GSR, eye tracking, heart rate) and used these data to classify the level of workload an operator is undergoing.  We have applied this technology in a number of application domains (maritime shipping; mine exploration; road transportation).  Later work also included classification of trust in autonomous systems.
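The multimodal fusion step can be sketched as baseline z-scoring per channel followed by averaging into a single index. This is a deliberately simplified stand-in: the toolkit's actual classifiers are more sophisticated, and the channel names and thresholds here are assumptions.

```python
# Illustrative fusion sketch: z-score each physiological channel against a
# rest baseline, average into one workload index, threshold into three levels.

from statistics import mean, stdev

def workload_level(sample, baseline):
    """Classify one multimodal sample against per-channel rest baselines.

    `sample` maps channel name -> current value; `baseline` maps the same
    channel names -> list of rest-period values (>= 2 samples each).
    """
    zs = []
    for channel, value in sample.items():
        base = baseline[channel]
        zs.append((value - mean(base)) / stdev(base))
    index = mean(zs)            # equal weighting across channels (assumed)
    if index < 0.5:
        return "low"
    return "medium" if index < 1.5 else "high"
```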

Mineral Exploration in abandoned mine shafts using UAV-based spectroscopy

Our team designed and built a medium-cost autonomous UAV that could enter confined spaces not accessible to humans, specifically abandoned mine shafts on the island of Tasmania, Australia.  We experimented with both four-rotor and six-rotor UAV platforms.  The team consisted of 12 people, and my main tasks were to conduct user testing of the devices and to design 3D-printed camera mounts for the system.


Chess Playing Robot Arm

We used on-screen eye tracking to control the x-y position of a robot arm, which would then pick up and replace pieces on a chessboard.  The control methodology was later employed in our research into human-robot teleoperation, published in the journal HCI.
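The gaze-to-arm control can be sketched as a linear mapping from normalised on-screen gaze coordinates to the chessboard region of the arm's x-y workspace. The workspace bounds below are illustrative values, not the lab's calibration.

```python
# Sketch of the gaze-to-arm mapping: clamp noisy gaze samples to the screen,
# then linearly map [0,1]x[0,1] screen space onto the arm's x-y workspace.

def gaze_to_arm(gx, gy, x_range=(0.10, 0.50), y_range=(-0.20, 0.20)):
    """Map gaze (gx, gy) in unit screen space to arm x-y in metres."""
    gx = min(max(gx, 0.0), 1.0)   # clamp out-of-screen gaze estimates
    gy = min(max(gy, 0.0), 1.0)
    x = x_range[0] + gx * (x_range[1] - x_range[0])
    y = y_range[0] + gy * (y_range[1] - y_range[0])
    return (x, y)
```

A dwell or blink trigger (not shown) would then command the pick-up at the mapped position.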


Eye tracking analysis of molecular chirality decisions

In order to develop intelligent systems that can perform tasks at which humans generally excel, we investigated the eye-tracking behaviours of expert chemists (senior chemistry research scientists at the CSIRO Clayton Research Labs).  Participants viewed a series of molecular structure diagrams and were asked to determine the chirality of each molecule (its handedness, determined by molecular structure).  We obtained detailed behavioural data on the expert decision-making process, which fed directly into the 'E-Science Platform' project listed above.