Research Projects
Body Tracking
My research focuses on understanding human motion, including facial expressions, body poses, and finger poses, using novel wearable sensing techniques. To this end, I have developed new wearable devices, such as a neck-mounted device, a wristband, and a ring, to track human movements.
- Facial expression: NeckFace (IMWUT'21)
- Body poses: BodyTrak (IMWUT'22)
- Finger poses: Ring-a-Pose (IMWUT'25, to appear)

Applications
- Activity recognition: D-Touch (IUI'23)
- Authentication: C-Auth (ISWC'23)
- Silent Speech: SpeeChin (IMWUT'21)
- Intelligent User Interface: HandyTrak (UIST'21)
I specialize in designing, implementing, and evaluating wearable sensing systems that track human body movement with a variety of sensors (e.g., optical, acoustic, and inertial). My expertise includes rapid prototyping, signal processing, computer vision, deep learning, LLMs, and HCI methodology. Building on human tracking data, I have also developed applications for novel input techniques and accessibility. Using both qualitative and quantitative HCI methods, I aim to enhance user experience and connectivity in ubiquitous and spatial computing environments.
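To make this concrete, below is a minimal sketch of the general signal-to-pose flow that such systems share: a window of raw sensor data is band-pass filtered, then a small deep model regresses joint positions. The sampling rate, channel count, filter band, and the PoseRegressor model are illustrative assumptions, not the implementation of any project listed above.

```python
# Hypothetical sketch: wearable sensor frames -> signal processing -> deep
# model -> pose estimate. Shapes, rates, and the model are assumptions.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt

def bandpass(x, low_hz, high_hz, fs, order=4):
    """Band-pass filter one channel of raw sensor data (e.g., acoustic echoes)."""
    b, a = butter(order, [low_hz / (fs / 2), high_hz / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

class PoseRegressor(nn.Module):
    """Tiny 1D CNN mapping a window of filtered signal to 3D joint positions."""
    def __init__(self, n_channels=2, n_joints=14):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, stride=2, padding=3), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.head = nn.Linear(64, n_joints * 3)

    def forward(self, x):                      # x: (batch, channels, window)
        return self.head(self.backbone(x))     # (batch, n_joints * 3)

if __name__ == "__main__":
    fs, window = 50_000, 512                   # assumed sampling rate and window size
    raw = np.random.randn(2, window)           # stand-in for two microphone channels
    filtered = np.stack([bandpass(ch, 15_000, 21_000, fs) for ch in raw])
    model = PoseRegressor()
    pose = model(torch.tensor(filtered[None], dtype=torch.float32))
    print(pose.shape)                          # torch.Size([1, 42]): 14 joints x 3
```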
Novel Input Techniques
Human body movement can serve as an input medium for interacting with computer systems in a ubiquitous computing environment. My projects aim to harness these movements as novel inputs. In addition, given the growing prevalence of multi-device use, I have explored how devices can be combined to create new interactions; a minimal sketch of this idea follows the list below.
- Input techniques
- Multi-device interactions: Touch+Finger (UIST'18)
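As a rough illustration of the multi-device idea above, the sketch below routes a recognized gesture to different application commands depending on which device reported it. The GestureRouter class, device names, and gesture labels are hypothetical, not the interaction mappings of the published systems.

```python
# Hypothetical sketch: tracked body movement as input. A recognized gesture
# label is dispatched to an application command, and the reporting device
# determines which command runs (multi-device interaction).
from typing import Callable, Dict, Tuple

class GestureRouter:
    """Routes (device, gesture) pairs to application callbacks."""
    def __init__(self):
        self._handlers: Dict[Tuple[str, str], Callable[[], None]] = {}

    def bind(self, device: str, gesture: str, handler: Callable[[], None]) -> None:
        self._handlers[(device, gesture)] = handler

    def dispatch(self, device: str, gesture: str) -> None:
        handler = self._handlers.get((device, gesture))
        if handler is not None:
            handler()

if __name__ == "__main__":
    router = GestureRouter()
    # The same pinch gesture triggers different actions on different devices.
    router.bind("smartwatch", "pinch", lambda: print("watch: confirm selection"))
    router.bind("ar_glasses", "pinch", lambda: print("glasses: place object"))
    router.dispatch("smartwatch", "pinch")
    router.dispatch("ar_glasses", "pinch")
```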
Accessibility
American Sign Language (ASL) is the primary language for many North Americans who are deaf. ASL is a complex visual language that combines several kinds of human motion, including facial expressions, face orientation, and hand pose. However, many existing wearable systems cannot track all of these signals, which limits their ASL translation performance. I am therefore developing a low-power, acoustic-sensor-based wearable system that tracks these motions to facilitate communication between hearing and deaf individuals by translating ASL. This project enables deaf individuals to use an ASL translation system in real-world scenarios, improving communication with people who are not familiar with sign language; a minimal sketch of the fingerspelling-recognition component appears after the project list below.
- ASL Translation on AR glasses with LLM (to be submitted to IMWUT'25)
- Real-Time ASL Fingerspelling Recognition System (submitted to an HCI conference)
- Emotional Voice for ASL Translation System
  - Emotional voice integration in sign-to-speech translators for deaf-to-hearing communication (submitted to an HCI conference)
  - Recognizing emotional and linguistic facial expressions to facilitate communication between hearing and deaf individuals (to be submitted to IMWUT'25)
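As a rough illustration of the fingerspelling-recognition component mentioned above, the sketch below classifies a short window of estimated hand-pose frames into one of 26 letters. The feature layout, window length, and the FingerspellingClassifier model are illustrative assumptions, not the submitted system.

```python
# Hypothetical sketch: real-time ASL fingerspelling recognition. A window of
# per-frame hand-pose features (e.g., joint angles estimated by the wearable)
# is classified into a letter. All shapes and rates are assumptions.
import torch
import torch.nn as nn

class FingerspellingClassifier(nn.Module):
    """GRU over per-frame hand-pose features, followed by a letter classifier."""
    def __init__(self, n_features=21, hidden=64, n_letters=26):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.classifier = nn.Linear(hidden, n_letters)

    def forward(self, frames):                   # frames: (batch, time, n_features)
        _, last_hidden = self.gru(frames)        # last_hidden: (1, batch, hidden)
        return self.classifier(last_hidden[-1])  # (batch, n_letters)

if __name__ == "__main__":
    model = FingerspellingClassifier()
    window = torch.randn(1, 30, 21)              # ~0.5 s of pose frames at 60 Hz (assumed)
    logits = model(window)
    letter = chr(ord("A") + int(logits.argmax(dim=-1)))
    print(f"predicted letter: {letter}")
```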