
Audio Target Calibration and Localisation in Augmented Reality (AR)

Details

Deadline
Applications accepted all year round
Research Field
Professions and applied sciences

About

Outline

Beyond the initial benefit to the general idea of ‘immersion’, the uses of audio within AR technology are many and varied. There are cognitive, user-experience and user-interface benefits to using audio more deeply in AR. Adding audio to moving cues (e.g. in skill-based or shooting games) can reduce reaction time by up to 50% (Barde, Ward, Helton & Billinghurst, 2016). Localising sound-emitting targets as users travel through space in games, whether a whirring engine, a shuffling footstep or a smooth servo sound, adds to the immersive experience; audio cues help users understand the world around them before they have time to turn their head towards the source of the sound.

For highly immersive spatial audio, the spectral cues produced by a listener’s head, torso and outer ears can be mimicked by calibrating headphone audio to head-related transfer functions (HRTFs). Since we all have differently sized heads and torsos, delivering audio through head-tracked headphones calibrated to our own, individually measured HRTFs is a timely challenge for AR and VR technologies.
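
In practice, HRTF-based rendering amounts to filtering a mono source with the head-related impulse responses (HRIRs, the time-domain form of the HRTFs) for the desired direction. The sketch below is purely illustrative and not part of the project description; the array names are hypothetical and assumed to hold HRIRs already measured or selected for one direction.

```python
# Minimal sketch of HRTF-based binaural rendering (illustrative only).
# 'hrir_left' and 'hrir_right' are assumed to be head-related impulse
# responses for the desired source direction, and 'mono' a mono source
# signal at the same sample rate; the names are hypothetical.
import numpy as np
from scipy.signal import fftconvolve

def render_binaural(mono: np.ndarray,
                    hrir_left: np.ndarray,
                    hrir_right: np.ndarray) -> np.ndarray:
    """Convolve a mono signal with left/right HRIRs to produce a
    two-channel binaural signal for headphone playback."""
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    return np.stack([left, right], axis=-1)
```

In a head-tracked system the HRIR pair would be re-selected as the user’s head orientation changes, so that the virtual source stays fixed in the world rather than in the head.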

One approach to customising the HRTF for an individual AR headset wearer is to capture HRTF data directly: microphones are placed inside the user’s ears and record how sound waves reach the body as audio is played from all directions around the user. Because this process is too onerous to do at scale, “generalist” HRTFs are encoded into the algorithms of many spatial audio technologies. (The generalist HRTF data comes from publicly available data sets, measured on a representative set of people’s heads and torsos.) However, emerging AR technologies are equipped with sensors designed to support a custom “HRTF anatomy calibration” process.
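
For illustration, a “generalist” HRTF data set is commonly used by looking up the measured HRIR pair closest to the requested source direction. The sketch below assumes hypothetical arrays dataset_dirs (azimuth/elevation pairs in degrees) and dataset_hrirs (left/right impulse responses per measured direction) loaded from such a public collection; it is a sketch, not a definitive implementation.

```python
# Illustrative sketch: pick the nearest measured HRIR pair from a
# "generalist" HRTF data set for a requested source direction.
# 'dataset_dirs' (N x 2, azimuth/elevation in degrees) and
# 'dataset_hrirs' (N x 2 x taps) are assumed, hypothetical inputs.
import numpy as np

def nearest_hrir(azimuth_deg: float, elevation_deg: float,
                 dataset_dirs: np.ndarray, dataset_hrirs: np.ndarray):
    """Return the left/right HRIRs measured closest to the target direction."""
    target = np.radians([azimuth_deg, elevation_deg])
    dirs = np.radians(dataset_dirs)
    # Cosine of the angular distance between the target and each measured direction.
    cos_dist = (np.sin(dirs[:, 1]) * np.sin(target[1]) +
                np.cos(dirs[:, 1]) * np.cos(target[1]) *
                np.cos(dirs[:, 0] - target[0]))
    idx = int(np.argmax(cos_dist))
    return dataset_hrirs[idx, 0], dataset_hrirs[idx, 1]
```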

This proposal aims to investigate the ability to acoustically conform the sound in an AR experience to the environment in which the device places it. All currently available AR technologies (handheld and head-worn) are limited in their field of view and in their awareness of the room the device is in, whether through inside-out tracking or stereo cameras capable of depth perception. Mobile AR can detect surfaces and allow movement around AR objects using the dual-camera array included on the iPhone X and AR-ready iPads (2017). HoloLens, Meta and other AR headsets can detect the extent and depth of a room. Lower-end mobile devices with only one camera can still detect surfaces, but do not offer the same level of depth detection.

This work will investigate the efficiency and efficacy of capturing or calibrating the space and using this data for advanced acoustic rendering (similar to how a user must calibrate a VR setup so that the external cameras ‘know’ where the headset is in space). Such a calibration can either be stored on a mobile device and used for later processing, or a real-time geometry-based rendering algorithm can be executed on untethered devices. Currently, physically based acoustic rendering is only possible with a small number of software spatialisers (such as Steam Audio and NVIDIA’s).
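
As a concrete (and deliberately simplified) example of geometry-based rendering, the sketch below builds an early-reflection impulse response for a rectangular “shoebox” room using first-order image sources. In the scenario described above the room dimensions would come from the device’s depth sensing; here they are plain arguments, and all names and values are illustrative assumptions rather than part of the proposal.

```python
# Minimal sketch of geometry-based acoustic rendering for a "shoebox"
# room using first-order image sources (direct path + six wall
# reflections). Parameter values are illustrative assumptions.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def shoebox_early_ir(room, source, listener, fs=48000,
                     wall_reflection=0.8, ir_length=0.1):
    """Return an early-reflection impulse response for a rectangular room."""
    room = np.asarray(room, float)
    source = np.asarray(source, float)
    listener = np.asarray(listener, float)

    # Direct sound plus one image source per wall: reflect one coordinate
    # about 0 or about the room dimension on that axis.
    images = [source]
    for axis in range(3):
        for wall in (0.0, room[axis]):
            img = source.copy()
            img[axis] = 2.0 * wall - source[axis]
            images.append(img)

    ir = np.zeros(int(ir_length * fs))
    for i, img in enumerate(images):
        dist = np.linalg.norm(img - listener)
        delay = int(round(dist / SPEED_OF_SOUND * fs))
        if delay < len(ir):
            gain = 1.0 / max(dist, 1e-3)   # spherical spreading
            if i > 0:
                gain *= wall_reflection    # one wall bounce
            ir[delay] += gain
    return ir
```

The resulting impulse response could then be convolved with the source signal (and, per reflection direction, with HRIRs as above) to conform the rendered sound to the detected room.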

This idea could be expanded to:

  • Sonification of the world visible through the AR window
  • Haptic feedback in AR / Mobile AR
  • Understanding and modelling the contents in the AR window

Supervisory team:

What is funded

Self-Funding Students Only

Eligibility

Academic Criteria:

A 2:1 Honours undergraduate degree or a master’s degree in computing or a related subject. Applicants with appropriate professional experience are also considered. Degree-level mathematics (or equivalent) is required for research in some project areas.

Applicants for whom English is not their first language must demonstrate proficiency by obtaining an IELTS score of at least 6.5 overall, with a minimum of 6.0 in each skills component.

How to Apply

To apply, please complete the online application at https://www.cardiff.ac.uk/study/postgraduate/research/programmes/progra…, select the ‘self-funding’ option, and state the project title and supervisor name.

Organisation

Organisation name
Cardiff University
Organisation Country
United Kingdom

Disclaimer:

The responsibility for the funding offers published on this website, including the funding description, lies entirely with the publishing institutions. Applications are handled solely by the employer, which is also fully responsible for the recruitment and selection processes.