Understanding Personalised Auditory-Visual Associations in Multi-Modal Interactions

Research output: Chapter in Book/Report/Conference proceedings › Conference proceeding › peer-review

Abstract

Can we sharpen our auditory and visual senses, and better understand the relationship between these modalities, to benefit our interactions with human-computer interfaces? This paper proposes a framework for understanding auditory-visual associations and for exploring how emotion, personality, age and gender shape the understanding of information from both modalities. Studies of emotion and personality, and of their association with the auditory and visual senses, have increased within psychology, neuroscience, affective computing and human-computer interaction (HCI). From an HCI perspective, advances in technologies and machine learning techniques provide new ways to understand people and to develop systems in which computers work alongside people, supporting efficient interactions and clearer perceptions of our environment. The proposed framework will be developed alongside a personalised auditory-visual interface that provides intelligent interactions with users and helps them learn from efficient associations between their senses. This research can be used to create personalised auditory-visual-emotion-personality profiles for use in adaptive music teaching platforms, as well as in mental health and wellness applications for more personalised care programs.

Original language: English
Title of host publication: ICMI 2021 - Proceedings of the 2021 International Conference on Multimodal Interaction
Publisher: Association for Computing Machinery, Inc
Pages: 812-816
Number of pages: 5
ISBN (Electronic): 9781450384810
DOIs
Publication status: Published - 18 Oct 2021
Event: 23rd ACM International Conference on Multimodal Interaction, ICMI 2021 - Virtual, Online, Canada
Duration: 18 Oct 2021 - 22 Oct 2021

Publication series

Name: ICMI 2021 - Proceedings of the 2021 International Conference on Multimodal Interaction

Conference

Conference: 23rd ACM International Conference on Multimodal Interaction, ICMI 2021
Country/Territory: Canada
City: Virtual, Online
Period: 18/10/21 - 22/10/21

Keywords

  • auditory-visual associations
  • machine learning
  • multi-modal interactions
  • music
  • synaesthesia
