
NeckFace: Continuously Tracking Full Facial Expressions on Neck-mounted Wearables

Chen, Tuochao, Yaxuan Li, Songyun Tao, Hyunchul Lim, Mose Sakashita, Ruidong Zhang, Francois Guimbretiere, and Cheng Zhang.

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5, no. 2 (2021)


Facial expressions are highly informative for computers seeking to understand and interpret a person’s mental and physical activities. However, continuously tracking facial expressions, especially when the user is in motion, is challenging. This paper presents NeckFace, a wearable sensing technology that can continuously track full facial expressions using a neckpiece embedded with infrared (IR) cameras. A customized deep learning pipeline called NeckNet, based on ResNet-34, learns from the captured IR images of the chin and face and outputs 52 parameters representing the facial expression. We demonstrated NeckFace in two common neck-mounted form factors, a necklace and a neckband (e.g., neck-mounted headphones), and evaluated both in a user study with 13 participants. The results showed that NeckFace performed well while participants were sitting or walking, and after the device was remounted. We discuss the challenges and opportunities of using NeckFace in real-world applications.
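The abstract describes NeckNet as a ResNet-34-based model that maps IR frames to 52 expression parameters. As a rough illustration only (not the authors' implementation), a minimal PyTorch sketch of such a regressor might look like the following; the single-channel IR input, 224x224 resolution, and the specific layer replacements are assumptions.

import torch
import torch.nn as nn
from torchvision.models import resnet34

class NeckNetSketch(nn.Module):
    """Hypothetical ResNet-34 regressor: IR frame -> 52 expression parameters."""
    def __init__(self, num_params=52):
        super().__init__()
        backbone = resnet34(weights=None)  # trained from scratch on IR data (assumption)
        # Swap the RGB stem for a single-channel one, since the input is an IR image (assumption).
        backbone.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
        # Replace the 1000-way classification head with a 52-dimensional regression head.
        backbone.fc = nn.Linear(backbone.fc.in_features, num_params)
        self.backbone = backbone

    def forward(self, ir_frames):
        # ir_frames: (batch, 1, H, W) IR images of the chin and face region
        return self.backbone(ir_frames)

model = NeckNetSketch()
params = model(torch.randn(1, 1, 224, 224))  # -> tensor of shape (1, 52)

One parameter vector per frame makes continuous tracking a matter of running the regressor on each incoming IR frame; how the 52 parameters are defined and supervised follows the paper, not this sketch.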
