The University of Sydney and the University of Technology Sydney are developing novel, multimodal auditory sensory augmentation technologies to assist people who are visually impaired, based on wearable glasses with machine vision (https://sites.google.com/view/masa-dec). They have a large team exploring: (1) using sensors and artificial intelligence to extract information for a targeted objective; (2) rendering this information via the auditory channel as sound; and (3) enabling feedback control via hand/wrist or other sensors. They will be conducting psychophysical experiments to explore behavioural performance and coupling these experiments with EEG and/or fMRI studies. Experiments are typically run using motion capture and the latest AR/VR/XR equipment.

The team has a PhD position available for a student with an audio and music background and an interest or background in cognitive science. The project will explore primarily non-verbal audio signals and their real-time rendering to convey spatial and navigational information about the surrounding environment.

This is a fully funded 3.5-year PhD scholarship, open to both domestic and international applicants.

Further details and contact information for David Alais and Craig Jin can be found here: https://sites.google.com/view/masa-dec/openings