Industry & Research
Naoko Abe and Fetch, July 2020
In 2020 the Facility hosted a series of experiments by sociologist Dr Naoko Abe, then at the Australian Centre for Robotics. Abe’s experiments explored how a person came to understand that a robot was cooperating, or not cooperating, with them. The experiments involved experiment-specific trajectory planning, in-house customisation of code for the Facility’s Fetch robot platform (ROS Indigo) and augmentation of the robot’s hardware.
Abe, N., Rye, D., & Loke, L. (2022, August). A microsociological approach to understanding the boundary between robot cooperativeness and uncooperativeness in human-robot collaboration. In 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN) (pp. 1085-1092). IEEE.
My Medic Watch, December 2018 and November 2022
My Medic Watch is a companion app for smartwatches designed to detect when its wearer falls or has a seizure and to alert emergency contacts. As part of medical trials to determine the efficacy of their fall-detection approach, they contracted the National Facility for Human-Robot Interaction Research to hire a testing space and assist in the collection of high-resolution, time-stamped video from multiple viewpoints.
The experimenters were able to visit the space, take measurements, view data samples, and address specifics of bump-in, data retrieval, and experimental procedure. On the day of the experiment, they set up crash mats and testing equipment, tested ad-hoc networking, and checked the synchronisation of their devices and ours to network time.
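Checking that two devices agree on network time reduces to estimating the offset between their timestamp streams. A minimal sketch of that idea (hypothetical helpers, not the Facility's actual tooling), assuming the same events have been stamped by both clocks:

```python
def estimate_clock_offset(local_ts, reference_ts):
    """Estimate the constant offset (seconds) between two clocks that
    stamped the same events, as the mean of the pairwise differences."""
    if len(local_ts) != len(reference_ts) or not local_ts:
        raise ValueError("need equal-length, non-empty timestamp lists")
    diffs = [r - l for l, r in zip(local_ts, reference_ts)]
    return sum(diffs) / len(diffs)

def within_tolerance(local_ts, reference_ts, tol=0.05):
    """True if the estimated offset is below `tol` seconds (50 ms default)."""
    return abs(estimate_clock_offset(local_ts, reference_ts)) < tol
```

A device running 120 ms behind the reference clock, for example, would fail a 50 ms tolerance check and need re-synchronisation before trials begin.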
Various benefits of the Facility showed in even small details: dedicated waiting rooms and interview spaces ensured a streamlined process for participants; the ease of use of the Facility's experimental interface allowed the experimenters to manage the trial lifecycle themselves, without the need for dedicated technical staff; and high-speed networking allowed trial data to be retrieved in a matter of seconds.
In all, the My Medic Watch team completed their experiment, featuring dozens of participants, with set-up, integration, trials, and tear-down all in a single day.
Academic Research and Teaching
COMP3431/COMP9434: Robotic Software Architecture major group project, Sep – Nov 2022
A group of students explored the use of modern gaze and pose detection algorithms in human-robot collaboration. A Fetch robot was given the ability to react to gaze and pose interactions detected through its onboard cameras. This project was done in collaboration with NVIDIA.
#robotics, #computer vision, #detection algorithms
Spatially Distributed Robot Sound: A Case Study
The potential of spatial sound to immerse, guide, and affect humans is well-known, but has so far received little attention in the field of human-robot interaction. In this study, we therefore explored how a robot emitting spatial sound affected study participants' impressions and behaviour. We created two immersive sound designs for robot Diamandini which were emitted across the robot’s body and through speakers in the environment. We deployed the robot in the National Facility for Human-Robot Interaction Research and conducted semi-structured interviews with 20 participants, after they had interacted with the system.
#robotics, #distributed sound
Robinson, F. A., Brown, O., & Velonaki, M. (2023). Spatially distributed robot sound: A case study. In Proceedings of the DIS ’23: Designing Interactive Systems Conference. Pittsburgh, Pennsylvania: ACM.
Blake, M., ongoing 2023
Michael is an artist who works with VR and electro-mechanical devices, and has a background in film, television, and computer games. He is at the Creative Robotics Lab doing a PhD investigating the possibilities of using a companion robot to help support people with emotional regulation issues.
Chand, A., ongoing 2023
Avish’s research is focused on developing a human-robot collaboration scenario in which a robot reads the gaze and body posture of a person to infer their intent during a task. The aim of his study is to assess how effective gaze and pose are at inferring a person's intent, and how these modalities affect the human during the collaboration.
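One common way to operationalise gaze-based intent inference is to pick the candidate object whose direction best aligns with the estimated gaze ray. A minimal 2-D sketch of that geometric idea (illustrative only; the study's actual method is not described here):

```python
import math

def infer_gaze_target(head_pos, gaze_dir, objects, max_angle_deg=20.0):
    """Return the name of the object whose bearing from the head is closest
    to the gaze direction, or None if nothing lies within `max_angle_deg`.

    head_pos: (x, y); gaze_dir: (dx, dy); objects: {name: (x, y)}.
    """
    gnorm = math.hypot(gaze_dir[0], gaze_dir[1])
    best, best_angle = None, max_angle_deg
    for name, (ox, oy) in objects.items():
        vx, vy = ox - head_pos[0], oy - head_pos[1]
        vnorm = math.hypot(vx, vy)
        if vnorm == 0 or gnorm == 0:
            continue
        cos_a = (vx * gaze_dir[0] + vy * gaze_dir[1]) / (vnorm * gnorm)
        # Clamp to guard against floating-point values just outside [-1, 1].
        angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
        if angle < best_angle:
            best, best_angle = name, angle
    return best
```

In practice such a geometric cue would be fused with pose features and smoothed over time, but even this toy version shows how an angular threshold turns a gaze estimate into a discrete intent hypothesis.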
Christos, August 2019
Researchers from the Brain and Mind Centre, University of Sydney, were looking to track symptoms of cognitive decline through the tracking of movements about the home, attempting to identify uncharacteristic pauses or “forgetting”. Early, privacy-preserving detection of cognitive decline could lead to early intervention, delaying or preventing the onset of dementia and extending the time an individual remains cognitively independent. As such, the researchers were eager to validate technology claims about the accuracy and repeatability of indoor location tracking, which such predictive behavioural models require.
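A behavioural marker like an "uncharacteristic pause" can be framed as a dwell time that exceeds a person's own baseline for a given location. A toy sketch of that framing (hypothetical thresholding, not the Centre's model), assuming a time-ordered trace of (timestamp, zone) samples:

```python
def flag_unusual_pauses(trace, baseline_dwell, factor=3.0):
    """Flag dwells longer than `factor` times the person's baseline per zone.

    trace: time-ordered list of (timestamp_seconds, zone) samples.
    baseline_dwell: {zone: typical dwell in seconds} for this individual.
    Returns a list of (zone, dwell_seconds) flags.
    """
    flags, i = [], 0
    while i < len(trace):
        t_start, zone = trace[i]
        j = i
        # Extend the segment while consecutive samples stay in the same zone.
        while j + 1 < len(trace) and trace[j + 1][1] == zone:
            j += 1
        dwell = trace[j][0] - t_start
        if dwell > factor * baseline_dwell.get(zone, float("inf")):
            flags.append((zone, dwell))
        i = j + 1
    return flags
```

The accuracy and repeatability the researchers wanted to validate matter precisely because a model like this compares dwell times against a personal baseline: noisy localisation would inflate or fragment dwells and produce spurious flags.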
Priscila Duarte Guerra, 2022-23
Postgraduate Program in Visual Arts of the School of Communications and Arts of the University of São Paulo (PPGAV-ECA-USP)
The project, entitled 安 (2023), was developed mainly during an internship at the Creative Robotics Lab, UNSW Sydney, Australia, under the co-supervision of Professor Mari Velonaki, funded by the CAPES-PRINT program (process 88887.695175/2022-00). It consists of an interactive agent whose movement is the fundamental characteristic of its formal and gestural design. Subtlety is an important quality of the agent's movement. Questions such as the degree to which that subtlety is perceived by interactors, and what it suggests about the nature of the artwork, began to guide its elaboration. The ambiguity created by the nearly imperceptible movement may raise doubts as to whether 安 is alive or not, and raises questions regarding the link between movement and life.
#social-robotics #interactive art
Potential Use Cases
Automated translation system
A group is working on developing an automated translation system for a sign language (say, Auslan) and is looking for higher-quality footage on which to train their models. In searching around, they contact the NFHRI about acquiring video from multiple angles, as well as performance capture of body and hand poses. The NFHRI offers them some dates and quotes a single day rate. They arrive, the data are recorded and then handed over to the group for their exclusive use and ownership. They return a couple more times for further recording sessions set up the same way.
Hotel concierge robots
A hotel chain had previously acquired a number of concierge/room-service robots, but the manufacturer of the robots went out of business a few years later. They contact the NFHRI unsure what to do about the fleet of robots, hoping to repurpose them in some way. The NFHRI advises that it is unable to help directly with reintegrating the robots, but connects the hotel chain with a number of partners and clients they may be able to work with on doing so.
Indoor autonomous aerial vehicle
Researchers are trialling a build of an indoor autonomous aerial vehicle (say, a quadrotor) but have received complaints about the noise it produces. They come to the NFHRI to do sound-level testing, made possible by the well-engineered low-noise profile of the main experimental space. Satisfied, they later return to run a series of trials in a simulated office environment with their now-quieter AAV.
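Sound-level testing of this kind ultimately comes down to measuring levels against a reference. As a toy illustration (not the researchers' actual procedure), the RMS level of a normalised digital recording can be expressed in dB relative to full scale:

```python
import math

def rms_dbfs(samples):
    """RMS level of float samples in [-1, 1], in dB relative to full scale.
    A full-scale constant signal measures 0 dBFS; quieter signals are negative."""
    if not samples:
        raise ValueError("empty signal")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0.0:
        return float("-inf")
    return 20 * math.log10(rms)
```

Relating a dBFS figure to an absolute sound pressure level still requires a calibrated microphone, which is where a quiet, acoustically characterised space earns its keep.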
Medical companion robot
A small seed-stage company is building a prototype of a medical companion robot. They have a talented team working on a whole host of compassionate behaviours but are struggling to build something that invites initial engagement and trust. Initial discussions with NFHRI researchers and partners help guide them through the stages of product design, and the NFHRI works with them through experiment design and ethics approval for a trial to evaluate a series of shells for their robot.