
dc.contributor.author  O'Dwyer, Jonny
dc.contributor.author  Murray, Niall
dc.contributor.author  Flynn, Ronan
dc.date.accessioned  2020-04-14T10:02:03Z
dc.date.available  2020-04-14T10:02:03Z
dc.date.copyright  2019-09
dc.date.issued  2019-12
dc.identifier.citation  O'Dwyer, J., Murray, N., Flynn, R. (2019). Eye-based continuous affect prediction. Paper in 2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), 3rd-6th September 2019, Cambridge, UK.  en_US
dc.identifier.other  Other - Faculty of Engineering & Informatics AIT  en_US
dc.identifier.uri  http://research.thea.ie/handle/20.500.12065/3092
dc.description.abstract  Eye-based information channels include the pupils, gaze, saccades, fixational movements, and numerous forms of eye opening and closure. Pupil size variation indicates cognitive load and emotion, while a person's gaze direction is said to be congruent with the motivation to approach or avoid stimuli. The eyelids are involved in facial expressions that can encode basic emotions. Additionally, eye-based cues can have implications for human annotators of affect. Despite these facts, the use of eye-based cues in affective computing is in its infancy, and this work is intended to begin addressing that gap. Eye-based feature sets that can be estimated from video, incorporating data from all of the aforementioned information channels, are proposed. The feature sets are refined through continuous arousal and valence learning and prediction experiments on the RECOLA validation set. The refined eye-based features are then combined with a speech feature set to confirm their usefulness and to compare affect prediction performance with group-of-humans-level performance on the RECOLA test set. The core contribution of this paper, a refined eye-based feature set, is shown to provide benefits for affect prediction. It is hoped that this work stimulates further research into eye-based affective computing.  en_US
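The abstract describes a feature-engineering-plus-regression pipeline: per-frame eye-based features estimated from video are fed to a model that predicts continuous arousal and valence. The snippet below is a minimal sketch of that general idea, not the authors' implementation; the feature names, the synthetic data, and the Ridge regressor are illustrative assumptions, while the concordance correlation coefficient (CCC) is the metric commonly used for continuous affect prediction on RECOLA.

```python
# Hedged sketch: continuous arousal prediction from eye-based features.
# Feature names, data, and model choice are assumptions for illustration,
# not the paper's actual feature set or method.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

def ccc(y_true, y_pred):
    """Concordance correlation coefficient, standard for RECOLA-style tasks."""
    mu_t, mu_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = np.mean((y_true - mu_t) * (y_pred - mu_p))
    return 2 * cov / (var_t + var_p + (mu_t - mu_p) ** 2)

rng = np.random.default_rng(0)

# Hypothetical per-frame eye features estimated from video:
# [pupil_diameter, gaze_yaw, gaze_pitch, eye_closure, saccade_rate]
X = rng.normal(size=(5000, 5))
# Synthetic continuous arousal labels in [-1, 1], loosely tied to the features.
y = np.tanh(0.6 * X[:, 0] - 0.3 * X[:, 3] + 0.1 * rng.normal(size=5000))

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Train a simple linear regressor and score it on the held-out split.
model = Ridge(alpha=1.0).fit(X_train, y_train)
print(f"validation CCC: {ccc(y_val, model.predict(X_val)):.3f}")
```

In the paper's setting this regression step would be run separately for arousal and for valence, with feature-set refinement guided by validation-set CCC.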
dc.format  PDF  en_US
dc.language.iso  en  en_US
dc.publisher  IEEE Xplore  en_US
dc.relation.ispartof  2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW)  en_US
dc.rights  Attribution-NonCommercial-NoDerivs 3.0 Ireland
dc.rights.uri  http://creativecommons.org/licenses/by-nc-nd/3.0/ie/
dc.subject  Eye gaze  en_US
dc.subject  Pupillometry  en_US
dc.subject  Eye closure  en_US
dc.subject  Affective computing  en_US
dc.subject  Feature engineering  en_US
dc.title  Eye-based continuous affect prediction  en_US
dc.type  Other  en_US
dc.description.peerreview  yes  en_US
dc.identifier.conference  2019 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW), 3rd-6th September 2019, Cambridge, UK.
dc.identifier.doi  10.1109/ACII.2019.8925470
dc.identifier.orcid  https://orcid.org/0000-0002-6073-567X
dc.identifier.orcid  https://orcid.org/0000-0002-5919-0596
dc.identifier.orcid  https://orcid.org/0000-0002-6475-005X
dc.rights.access  Open Access  en_US
dc.subject.department  Faculty of Engineering & Informatics AIT  en_US


