Xiyuan Shen

I am a second-year PhD student at the University of Washington, jointly advised by Prof. Jacob O. Wobbrock in the ACE Lab and Prof. Shwetak Patel in the Ubicomp Lab. My research aims to develop next-generation sensing technologies that unobtrusively perceive and interpret human behavior in everyday environments. I am particularly interested in sensing for interaction and health, and in creating systems that integrate seamlessly into daily life.

Before joining UW, I earned my B.Eng. in Intelligence Engineering and Creative Design and my M.S. in Computer Science at Tsinghua University, where I had the privilege of working with Prof. Chun Yu and Prof. Yuanchun Shi.

Outside of research, I enjoy singing, playing tennis, and playing volleyball.

I’m excited to continue developing human-centered sensing systems and to collaborate with researchers across HCI and ubiquitous computing. If you’re interested in working together or discussing shared research directions, feel free to reach out!

Selected Publications

Touchscreens in Motion: Quantifying the Impact of Cognitive Load on Distracted Drivers

Xiyuan Shen, Seokhyun Hwang, Junhan Kong, Alexandre L. S. Filipowicz, Andrew Best, Jean Costa, Scott Carter, James Fogarty, Jacob O. Wobbrock

UIST '25: Proceedings of the 38th Annual ACM Symposium on User Interface Software and Technology, Article No. 74, Pages 1–21, 2025

We use a multimodal sensing system—encompassing pupil diameter, electrodermal activity, touchscreen telemetry, and driving kinematics—to examine how cognitive load shapes driver performance. The results show that increasing cognitive load degrades touchscreen interactions, disrupts motor–visual coordination, and reveals a characteristic hand-before-eye pattern under constrained attentional states.

WritingRing: Enabling Natural Handwriting Input with a Single IMU Ring

Zhe He, Zixuan Wang, Chun Yu, Chengwen Zhang, Xiyuan Shen, Yuanchun Shi

CHI '25: Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, Article No. 731, Pages 1–15, 2025

WritingRing is a single-IMU ring system that enables natural, real-time 2D handwriting input. The system achieves 1.63 mm trajectory accuracy, 88.7% letter recognition accuracy, and up to 84.36% word recognition accuracy. The approach is lightweight and can be integrated into existing ring devices, supporting a range of hands-free input applications.

MouseRing: Always-available Touchpad Interaction with IMU Rings

Xiyuan Shen, Chun Yu, Xutong Wang, Chen Liang, Haozhan Chen, Yuanchun Shi

CHI '24: Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems

We find that finger-motion patterns and the inherent structure of finger joints provide useful physical knowledge, which leads us to enhance motion-perception accuracy by integrating physical priors into machine learning models. We thus propose MouseRing, a novel ring-shaped IMU device that enables continuous finger-sliding on unmodified physical surfaces, as on a touchpad.

ClenchClick: Hands-Free Target Selection Method Leveraging Teeth-Clench for Augmented Reality

Xiyuan Shen, Yuxuan Yan, Chun Yu, Yuanchun Shi

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 6, Issue 3, Article No. 139, Pages 1–26, 2022

We explore teeth-clenching-based target selection in augmented reality, as the subtlety of the interaction can benefit applications that occupy the user's hands or that are sensitive to social norms. To support this investigation, we implemented ClenchClick, a novel EMG-based teeth-clench detection system that enables hands-free interaction in AR.

View all publications