

Centre Interdisciplinaire
de Recherche et d’Innovation
en Cybersécurité et Société
1.
Joyal, C. C.; Jacob, L.; Cigna, M. -H.; Guay, J. -P.; Renaud, P.
Virtual faces expressing emotions: An initial concomitant and construct validity study Journal Article
In: Frontiers in Human Neuroscience, vol. 8, no. SEP, pp. 1–6, 2014, ISSN: 1662-5161, (Publisher: Frontiers Media S.A.).
Abstract | Links | BibTeX | Tags: adult, anger, article, computer program, construct validity, corrugator supercilii muscle, disgust, Electromyography, emotion, emotionality, face muscle, Facial Expression, Fear, female, gaze, happiness, human, human experiment, male, Middle Aged, muscle contraction, normal human, positive feedback, sadness, surprise, task performance, virtual reality, Young Adult, zygomatic major muscle
@article{joyal_virtual_2014,
title = {Virtual faces expressing emotions: An initial concomitant and construct validity study},
author = {C. C. Joyal and L. Jacob and M. -H. Cigna and J. -P. Guay and P. Renaud},
url = {https://www.scopus.com/inward/record.uri?eid=2-s2.0-84933679803&doi=10.3389%2ffnhum.2014.00787&partnerID=40&md5=c51b26765fb1e2152cede99adcd519b0},
doi = {10.3389/fnhum.2014.00787},
issn = {1662-5161},
year = {2014},
date = {2014-01-01},
journal = {Frontiers in Human Neuroscience},
volume = {8},
number = {SEP},
pages = {1–6},
abstract = {Objectives: The goal of this study was to initially assess concomitants and construct validity of a newly developed set of virtual faces expressing six fundamental emotions (happiness, surprise, anger, sadness, fear, and disgust). Recognition rates, facial electromyography (zygomatic major and corrugator supercilii muscles), and regional gaze fixation latencies (eyes and mouth regions) were compared in 41 adult volunteers (20 ♂, 21 ♀) during the presentation of video clips depicting real vs. virtual adults expressing emotions. Background: Facial expressions of emotions represent classic stimuli for the study of social cognition. Developing virtual dynamic facial expressions of emotions, however, would open up possibilities, both for fundamental and clinical research. For instance, virtual faces allow real-time Human–Computer retroactions between physiological measures and the virtual agent. Results: Emotions expressed by each set of stimuli were similarly recognized, both by men and women. Accordingly, both sets of stimuli elicited similar activation of facial muscles and similar ocular fixation times in eye regions from male and female participants. Conclusion: Further validation studies can be performed with these virtual faces among clinical populations known to present social cognition difficulties. Brain–Computer Interface studies with feedback–feedforward interactions based on facial emotion expressions can also be conducted with these stimuli. © 2014 Joyal, Jacob, Cigna, Guay and Renaud.},
note = {Publisher: Frontiers Media S. A.},
keywords = {adult, anger, article, computer program, construct validity, corrugator supercilii muscle, disgust, Electromyography, emotion, emotionality, face muscle, Facial Expression, Fear, female, gaze, happiness, human, human experiment, male, Middle Aged, muscle contraction, normal human, positive feedback, sadness, surprise, task performance, virtual reality, Young Adult, zygomatic major muscle},
pubstate = {published},
tppubtype = {article}
}