Research Projects
Tracking Body
My research focuses on understanding human motion, including facial expressions, body poses, and finger poses, using innovative wearable sensing techniques. To achieve this, I have developed new wearable devices, such as a neck-mounted device, a wristband, and a ring, to track human movements.
- Facial expression: NeckFace (IMWUT'21)
- Body poses: BodyTrak (IMWUT'22)
- Finger poses: Pose-a-Ring (submitted to CHI'24)
Applications
- Activity recognition: D-Touch (IUI'23)
- Authentication: C-Auth (ISWC'23)
- Silent speech: SpeeChin (IMWUT'21)
- Intelligent user interface: HandyTrak (UIST'21)
I specialize in designing, implementing, and evaluating wearable sensing systems that track human body movement using various sensors (e.g., optical, acoustic, inertial). My expertise spans rapid prototyping, signal processing, computer vision, deep learning, and HCI methodology. Leveraging human tracking data, I have also developed applications for novel input techniques and accessibility. Employing both qualitative and quantitative HCI methods, my goal is to enhance user experience and connectivity in ubiquitous and spatial computing environments.
BodyTrak (IMWUT'22)
Novel Input Techniques
Human body movement serves as an input medium for interacting with computer systems in ubiquitous computing environments. My projects aim to harness these movements as novel inputs. Additionally, given the growing prevalence of multi-device use, I have explored how multiple devices can be integrated to enable innovative interactions.
- Input techniques
- Multi-device interactions: Touch+Finger (UIST'18)
Accessibility
American Sign Language (ASL) is the primary language for many North Americans who are deaf. ASL is a complex visual language encompassing various human motions, including facial expressions, face orientation, and hand pose. However, many existing wearable systems cannot track all of these types of information, resulting in unsatisfactory ASL translation performance. Therefore, I am currently developing a low-power, acoustic sensor-based wearable system that tracks these motions to facilitate communication between hearing and deaf individuals by translating ASL. This project empowers deaf individuals to use an ASL translation system in real-world scenarios, enabling clear communication with those who may not be familiar with sign language.
- ASL fingerspelling recognition system (to be submitted to IMWUT'24)
- Emotional sign language translation system (to be submitted to ASSETS'24)