TY - GEN
T1 - Understanding Personalised Auditory-Visual Associations in Multi-Modal Interactions
AU - O'Toole, Patrick
N1 - Publisher Copyright:
© 2021 Owner/Author.
PY - 2021/10/18
Y1 - 2021/10/18
N2 - Can we sharpen our auditory and visual senses and better understand the relationship between these modalities to benefit our interactions in human-computer interfaces? This research paper proposes a framework to understand auditory-visual associations and explore the impact of emotion, personality, age and gender in understanding information from both modalities. Studies into the areas of emotion and personality as well as their association with the auditory and visual senses have increased within the fields of psychology, neuroscience, affective computing and human-computer interaction (HCI). From an HCI perspective, advances in technologies and machine learning techniques provide a new way to understand people and to develop systems where computers work alongside people to help develop efficient interactions and clearer perceptions of our environment. The proposed framework will be developed alongside a personalised auditory-visual interface that can be used to provide intelligent interactions with users that can help them learn from efficient associations between their senses. This research can be used to create personalised auditory-visual-emotion-personality profiles that can be used in adaptive musical teaching platforms, as well as mental health and wellness applications for more personalised care programs.
AB - Can we sharpen our auditory and visual senses and better understand the relationship between these modalities to benefit our interactions in human-computer interfaces? This research paper proposes a framework to understand auditory-visual associations and explore the impact of emotion, personality, age and gender in understanding information from both modalities. Studies into the areas of emotion and personality as well as their association with the auditory and visual senses have increased within the fields of psychology, neuroscience, affective computing and human-computer interaction (HCI). From an HCI perspective, advances in technologies and machine learning techniques provide a new way to understand people and to develop systems where computers work alongside people to help develop efficient interactions and clearer perceptions of our environment. The proposed framework will be developed alongside a personalised auditory-visual interface that can be used to provide intelligent interactions with users that can help them learn from efficient associations between their senses. This research can be used to create personalised auditory-visual-emotion-personality profiles that can be used in adaptive musical teaching platforms, as well as mental health and wellness applications for more personalised care programs.
KW - auditory-visual associations
KW - machine learning
KW - multi-modal interactions
KW - music
KW - synaesthesia
UR - https://www.scopus.com/pages/publications/85118973598
U2 - 10.1145/3462244.3481277
DO - 10.1145/3462244.3481277
M3 - Conference proceeding
AN - SCOPUS:85118973598
T3 - ICMI 2021 - Proceedings of the 2021 International Conference on Multimodal Interaction
SP - 812
EP - 816
BT - ICMI 2021 - Proceedings of the 2021 International Conference on Multimodal Interaction
PB - Association for Computing Machinery, Inc
T2 - 23rd ACM International Conference on Multimodal Interaction, ICMI 2021
Y2 - 18 October 2021 through 22 October 2021
ER -