
BEAT: Berkeley Emotion and Affect Tracking Dataset
Zhihang Ren and Jefferson Ortega and Yunhui Guo and Stella X. Yu and David Whitney
IEEE/CVF CVPR Workshop on Populating Empty Cities: Virtual Humans for Robotics and Autonomous Driving, Seattle, Washington, 17 June 2024
Paper | Poster

Abstract
Recognizing the emotions of others is critical to everyday human life. With the rapid development of robotics, it has also become crucial to enable machines to recognize human emotion. Many previous studies have focused on designing automatic emotion perception algorithms to understand the emotions of human characters. Limited by dataset curation procedures and small annotator pools, these algorithms rely heavily on facial expressions and fail to accurately capture diverse emotional states. In this work, we build the Berkeley Emotion and Affect Tracking (BEAT) dataset, the first large video-based dataset that contains not only facial expressions but also rich contextual information. BEAT comprises 124 videos, including Hollywood movie cuts, documentaries, and homemade videos, annotated with continuous arousal and valence ratings as well as 11 categorical emotional states. We recruited 245 annotators, ensuring the robustness of our annotations. The emotional annotations of BEAT span a wide range of arousal and valence values and cover a variety of emotion categories. BEAT will benefit both psychology research on human emotion perception mechanisms and the computer vision community in developing socially aware intelligent machines that can perceive human emotions.
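To make the annotation structure described above concrete, here is a minimal Python sketch of one plausible way to represent per-frame ratings from many annotators and fuse them into a single label. The record fields, value ranges, and aggregation rule are hypothetical illustrations for a BEAT-style dataset, not the released format.

from dataclasses import dataclass
from statistics import mean
from typing import Dict, List

# Hypothetical record for one video frame: each annotator supplies a
# continuous valence/arousal pair and one of the 11 categorical emotions.
@dataclass
class FrameAnnotation:
    annotator_id: int
    valence: float   # continuous rating, assumed here to lie in [-1, 1]
    arousal: float   # continuous rating, assumed here to lie in [-1, 1]
    emotion: str     # one of the 11 categories, e.g. "joy"

def aggregate(frame: List[FrameAnnotation]) -> Dict[str, object]:
    """Fuse multiple annotators' ratings for a single frame by averaging
    the continuous dimensions and majority-voting the category."""
    votes: Dict[str, int] = {}
    for a in frame:
        votes[a.emotion] = votes.get(a.emotion, 0) + 1
    return {
        "valence": mean(a.valence for a in frame),
        "arousal": mean(a.arousal for a in frame),
        "emotion": max(votes, key=votes.get),
    }

if __name__ == "__main__":
    frame = [
        FrameAnnotation(0, 0.6, 0.4, "joy"),
        FrameAnnotation(1, 0.5, 0.5, "joy"),
        FrameAnnotation(2, 0.2, 0.7, "surprise"),
    ]
    print(aggregate(frame))  # averaged valence/arousal + majority emotion

Averaging the continuous dimensions while majority-voting the category is only one possible fusion choice; the paper's actual aggregation procedure may differ.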

Keywords
emotion, visual context, affect recognition