Xiyuan Shen
I am a second-year PhD student at the University of Washington, jointly advised by Prof. Jacob O. Wobbrock in the ACE Lab and Prof. Shwetak Patel in the Ubicomp Lab. My research aims to develop next-generation sensing technologies that unobtrusively perceive and interpret human behavior in everyday environments. I am particularly interested in sensing for interaction and health, and in creating systems that integrate seamlessly into daily life.
Before joining UW, I earned my B.Eng. in Intelligence Engineering and Creative Design and my M.S. in Computer Science at Tsinghua University, where I had the privilege of working with Prof. Chun Yu and Prof. Yuanchun Shi.
Outside of research, I enjoy singing, playing tennis, and playing volleyball.
I’m excited to continue developing human-centered sensing systems and to collaborate with researchers across HCI and ubiquitous computing. If you’re interested in working together or discussing shared research directions, feel free to reach out!
Selected Publications
We use a multimodal sensing system—encompassing pupil diameter, electrodermal activity, touchscreen telemetry, and driving kinematics—to examine how cognitive load shapes driver performance. The results show that increasing cognitive load degrades touchscreen interactions, disrupts motor–visual coordination, and reveals a characteristic hand-before-eye pattern under constrained attentional states.
WritingRing is a single-IMU ring system that enables natural, real-time 2D handwriting input. The system achieves 1.63 mm trajectory accuracy, 88.7% letter recognition accuracy, and up to 84.36% word recognition accuracy. The approach is lightweight and can be integrated into existing ring devices, supporting a range of hands-free input applications.
We find that finger-motion patterns and the inherent structure of joints provide beneficial physical knowledge, which leads us to enhance motion-perception accuracy by integrating physical priors into ML models. Thus, we propose MouseRing, a novel ring-shaped IMU device that enables continuous finger sliding on unmodified physical surfaces, much like a touchpad.
We explore teeth-clenching-based target selection in Augmented Reality, since the subtlety of the interaction benefits applications that occupy the user's hands or are sensitive to social norms. To support this investigation, we implemented ClenchClick, a novel EMG-based teeth-clenching detection system that enables hands-free interactions in AR.