Guidelines for Collecting Automatic Facial Expression Detection Data Synchronized with a Dynamic Stimulus in Remote Moderated User Tests
2021
Because of the COVID-19 pandemic, telework policies have required many user experience (UX) labs to restrict their research activities to remote user testing. Automatic Facial Expression Analysis (AFEA) is an accessible psychophysiological measurement that can be easily implemented in remote user tests. However, to date, the literature on Human-Computer Interaction (HCI) has provided no guidelines for remote moderated user tests that collect facial expression data and synchronize them with the state of a dynamic stimulus such as a webpage. To address this research gap, this article offers guidelines for effective AFEA data collection based on a methodology developed in a concrete research context and on the lessons learned from applying it in four remote moderated user testing projects. Since researchers have less control over the test environment, we maintain that they should pay greater attention to factors that can affect face detection and/or emotion classification prior to, during, and after remote moderated user tests. Our study contributes to the development of methods for including psychophysiological and neurophysiological measurements in remote user tests, which offer promising opportunities for information systems (IS) research, UX design, and even digital health research.
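To make the synchronization idea concrete, the sketch below shows one possible way to align AFEA output with a log of webpage state changes recorded on the same clock. It is an illustrative assumption, not the article's prescribed method: the file names, column names, and emotion labels are hypothetical, and the alignment uses pandas' merge_asof as a stand-in for whatever synchronization mechanism a given AFEA tool provides.

```python
# Hypothetical sketch: aligning per-frame AFEA emotion scores with logged
# webpage state changes on a shared timeline. File and column names are
# illustrative only; the article does not prescribe a specific tool or format.
import pandas as pd

# AFEA export: one row per analyzed video frame (timestamp in seconds plus emotion scores).
afea = pd.read_csv("afea_frames.csv")        # assumed columns: timestamp, joy, anger, ...
# Stimulus log: one row per webpage state change, recorded against the same clock.
events = pd.read_csv("stimulus_events.csv")  # assumed columns: timestamp, page_state

# merge_asof requires both tables to be sorted on the key.
afea = afea.sort_values("timestamp")
events = events.sort_values("timestamp")

# Attach the most recent webpage state to each facial-expression frame,
# so emotion scores can later be aggregated per stimulus state.
synced = pd.merge_asof(afea, events, on="timestamp", direction="backward")

# Example aggregation: mean emotion scores per webpage state.
summary = synced.groupby("page_state")[["joy", "anger"]].mean()
print(summary)
```

In practice, the critical precondition is that the AFEA recording and the stimulus log share a common time reference (or a known offset); the merge step itself is then straightforward.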