

Centre Interdisciplinaire
de Recherche et d’Innovation
en Cybersécurité et Société
1.
Galaup, C.; Séoud, L.; Renaud, P.
Multimodal HCI: A Review of Computational Tools and Their Relevance to the Detection of Sexual Presence (Journal Article)
In: Applied Human Factors and Ergonomics International, vol. 119, pp. 137–143, 2024, ISSN: 2771-0718.
Abstract | Links | BibTeX | Tags: EEG analysis, Multimodal HCI, Physiological signal analysis
@article{galaup_multimodal_2024,
title = {Multimodal HCI: A Review of Computational Tools and Their Relevance to the Detection of Sexual Presence},
author = {C. Galaup and L. Séoud and P. Renaud},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-105031152209&doi=10.54941%2Fahfe1004477&partnerID=40&md5=90b218deab22b451986584c149aab11c},
doi = {10.54941/ahfe1004477},
issn = {2771-0718},
year = {2024},
date = {2024-01-01},
journal = {Applied Human Factors and Ergonomics International},
volume = {119},
pages = {137–143},
abstract = {Cybersexuality, referring to sexual interactions facilitated by or involving sexual technologies, for better or worse, is poised to play an increasingly significant role in people’s lives. The psychophysiological states stemming from such interactions with sexual technologies, and especially virtual reality (VR) scenarios, are termed “sexual presence” (SP). This work aims to review the different methods used to analyse and algorithmically evaluate multimodal electroencephalography (EEG)-centric physiological signals through a multimodal human-computer interface (HCI) and to pinpoint those that are relevant to the detection of SP. Multimodal HCI is defined as the processing of combined natural modalities with a multimedia system or environment. Each modality engages different human capabilities (cognitive, sensory, motor, perceptual). These capabilities, in response to the multimedia environment, can be quantified through psychophysiological signals such as EEG, electrocardiography (ECG), skin conductance, skin temperature, respiration rate, eye gaze, and head movements, to name only the most common. While existing surveys have focused on the specific use of EEG to analyse emotions, or on the measurement techniques and methods used to record psychophysiological signals, this work reviews the computational tools, mostly based on machine and deep learning, used to process, analyse, and combine various physiological signals in HCI. Papers published in the last 10 years that combine at least two psychophysiological signals in an HCI system were collected and reviewed, regardless of the field of application. The focus was mostly on methodological aspects such as signal synchronization and calibration, fusion approach, model architecture, and learning strategy. We put an emphasis on methods that can be used to detect a subject’s condition in real time.
In light of this review, we identify a research gap in terms of computational tools for multimodal data classification and prediction. This review will allow us to draw on existing work in other fields of application to address our specific application: analysing EEG, oculometry, and sexual plethysmography (penile for men and vaginal for women) signals together, using deep learning, to detect SP in subjects immersed in a VR environment presenting sexual content. © 2024. Published by AHFE Open Access. All rights reserved.},
keywords = {EEG analysis, Multimodal HCI, Physiological signal analysis},
pubstate = {published},
tppubtype = {article}
}



