About me

★ I am currently looking for academic and industry positions. Feel free to reach out if you are interested in chatting or collaborating. I am always open to new ideas and opportunities.

I am a PhD candidate at the Human-Centered Computing & Intelligent Sensing Lab (HiLab) in Electrical and Computer Engineering (ECE) at the University of California, Los Angeles (UCLA), advised by Professor Yang Zhang. My research lies at the intersection of Human-Computer Interaction (HCI) and Sensing, focusing on how the energy and information that flow during physical human-object interactions can enhance the context-awareness, connectivity, and ubiquity of embedded computing systems. I take a system-level research approach to creating innovative designs that make these systems low-cost, power-efficient, intelligent, and easy to use, with the ultimate goal of seamlessly blending technology into everyday life.

Research illustration

I am a future technology imaginer, passionate about being part of the community of researchers and engineers bringing J.A.R.V.I.S. (Just A Rather Very Intelligent System) closer to reality. My PhD training in HCI lets me go beyond daydreaming, turning ideas into tangible, functional prototypes by blending creativity, research, and engineering. My work often integrates computer vision, machine learning, electronics, and hardware to deliver innovative solutions. I envision software intelligence and physical devices being developed jointly to truly support everyday tasks and enrich real-life experiences through enjoyable and meaningful interactions.

Research

My research has been published in top-tier HCI venues, including:

ACM Conference on Human Factors in Computing Systems (CHI)

ACM Symposium on User Interface Software and Technology (UIST)

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT)

Technical summaries for all research projects:

LuxAct: Enhance Everyday Objects for Visual Sensing with Interaction-Powered Illumination (UIST 2025)

augmented reality · visible light communication · computer vision

Xiaoying Yang, Qian Lu, Jeeeun Kim and Yang Zhang

  • Designed information encoding and decoding schemes for camera-LED communication (sketched below)
  • Designed and implemented a real-time image processing pipeline in C++ (OpenCV), with spatiotemporal feature engineering and graph-based LED tracking
  • Engineered custom hardware modules across ten AR use cases
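
For a flavor of the decoding side, here is a deliberately simplified Python/OpenCV sketch of reading one blinking LED's on/off pattern out of a camera stream. The published pipeline runs in C++ and tracks many LEDs with a graph-based tracker; the brightness threshold, fixed region of interest, and frames-per-bit ratio below are illustrative assumptions only.

    # Hedged sketch, not the paper's pipeline: decode a single blinking
    # LED's on/off pattern from a video into bits with OpenCV.
    import cv2
    import numpy as np

    THRESH = 200        # assumed brightness threshold for an "on" LED
    FRAMES_PER_BIT = 5  # assumed ratio of camera frame rate to blink rate

    def led_brightness(frame, roi):
        """Mean grayscale brightness inside a fixed region of interest."""
        x, y, w, h = roi
        gray = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)
        return gray.mean()

    def decode_stream(video_path, roi):
        cap = cv2.VideoCapture(video_path)
        samples = []
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            samples.append(led_brightness(frame, roi) > THRESH)
        cap.release()
        # Majority-vote each group of FRAMES_PER_BIT samples into one bit.
        return [int(np.mean(samples[i:i + FRAMES_PER_BIT]) > 0.5)
                for i in range(0, len(samples), FRAMES_PER_BIT)]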

Hapt-Aids: Self-Powered, On-Body Haptics for Activity Monitoring (IMWUT 2025)

wearable sensing · energy harvesting · haptics

Xiaoying Yang, Vivian Shen, Chris Harrison and Yang Zhang (Equal Contribution)

  • Implemented custom analog circuits that convert harvested energy into haptics
  • Developed and validated four real-world applications across diverse body activities

Interaction-Power Stations: Turning Environments into Ubiquitous Power Stations for Charging Wearables (CHI 2024, Late-Breaking Work)

energy harvesting · wireless power transfer · wearables

Xiaoying Yang, Jacob Sayono, Jess Xu and Yang Zhang

  • Engineered hardware prototypes to harvest, modulate, and transmit energy through the body to wirelessly charge wearables
  • Investigated an equivalent capacitive coupling model for wireless power transfer
  • Built activity classification applications using power signals from interaction events (sketched below)
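
As a rough illustration of that last bullet, here is a minimal scikit-learn sketch that classifies interaction events from hand-crafted features of their power signals. The feature set, classifier choice, and data layout are assumptions for illustration, not the paper's method.

    # Hedged sketch: classify interaction events from power-signal windows.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def featurize(window):
        """window: 1-D array of power samples covering one interaction."""
        return [window.max(), window.sum(), window.mean(), window.std(),
                np.argmax(window) / len(window)]  # where the peak occurs

    def train_classifier(windows, labels):
        X = np.array([featurize(w) for w in windows])
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X, labels)
        return clf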

Headar: Sensing Head Gestures for Confirmation Dialogs on Smartwatches with Wearable Millimeter-Wave Radar (IMWUT 2023)

mmWave · machine learning · gesture sensing

Xiaoying Yang, Xue Wang, Gaofeng Dong, Zihan Yan, Mani Srivastava, Eiji Hayashi and Yang Zhang

  • Conducted signal simulation and analysis in MATLAB to extract head gesture features
  • Characterized individual gesture features through OptiTrack motion capture
  • Developed a real-time mmWave+IMU signal acquisition and processing pipeline in Python
  • Trained a spatiotemporal, multimodal deep learning model to recognize head gestures (sketched below)
  • Built a Wear OS application for smartwatch-laptop data communication
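
To make the model bullet concrete, here is a minimal PyTorch sketch of a two-branch spatiotemporal, multimodal classifier. The architecture, tensor shapes, and gesture count are illustrative assumptions, not Headar's actual network.

    # Hedged sketch: fuse per-frame radar maps (CNN) with IMU sequences (GRU).
    import torch
    import torch.nn as nn

    class MultimodalGestureNet(nn.Module):
        def __init__(self, n_gestures=5):
            super().__init__()
            # Radar branch: assumed 1 x 32 x 32 range-Doppler map per frame.
            self.radar = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten())  # -> 32 features
            # IMU branch: assumed 6-axis samples summarized by a GRU.
            self.imu = nn.GRU(input_size=6, hidden_size=32, batch_first=True)
            self.head = nn.Linear(32 + 32, n_gestures)

        def forward(self, radar_frames, imu_seq):
            # radar_frames: (batch, time, 1, 32, 32); average features over time.
            b, t = radar_frames.shape[:2]
            r = self.radar(radar_frames.flatten(0, 1)).view(b, t, -1).mean(1)
            _, h = self.imu(imu_seq)  # h: (1, batch, 32)
            return self.head(torch.cat([r, h[-1]], dim=1))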

CubeSense++: Smart Environment Sensing with Interaction-Powered Corner Reflector Mechanisms (UIST 2023)

mmWave backscatter · battery-free · activity sensing

Xiaoying Yang, Jacob Sayono and Yang Zhang

  • Designed reflectors using optimization techniques to uniquely respond to interactions
  • Developed a real-time signal processing pipeline in Python for object and activity recognition (sketched below)
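
The core of such a pipeline fits in a few lines: an FFT over one FMCW chirp's samples yields a range profile, and the strongest bin maps to a reflector at a known distance. The sample rate and chirp slope below are illustrative assumptions, not CubeSense++'s radar configuration.

    # Hedged sketch: beat-frequency FFT -> range profile for an FMCW radar.
    import numpy as np

    C = 3e8          # speed of light (m/s)
    FS = 2e6         # assumed ADC sample rate (Hz)
    SLOPE = 30e12    # assumed chirp slope (Hz/s)

    def range_profile(chirp_samples):
        windowed = chirp_samples * np.hanning(len(chirp_samples))
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(chirp_samples), d=1 / FS)
        return freqs * C / (2 * SLOPE), spectrum  # beat frequency -> meters

    def strongest_range(chirp_samples):
        ranges, spectrum = range_profile(chirp_samples)
        return ranges[np.argmax(spectrum)]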

MiniKers: Interaction-Powered Smart Environment Automation (IMWUT 2022)

energy harvesting · home automation

Xiaoying Yang, Jacob Sayono, Jess Xu, Jiahao "Nick" Li, Josiah Hester and Yang Zhang

  • Designed a custom energy management circuit that integrates energy harvesting, sensing, and actuation using the nRF52
  • Characterized power signal profiles from user interactions for activity sensing in Python

CubeSense: Wireless, Battery-Free Interactivity through Low-Cost Corner Reflector Mechanisms (CHI 2021, Late-Breaking Work)

mmWave backscatter · battery-free · input interfaces

Xiaoying Yang and Yang Zhang

  • Investigated radar signal transmission and reflection properties
  • Designed reflector mechanisms in Fusion 360 for interactive sensing and built real-time signal processing pipelines in Python

Other Publications

LumosX: 3D Printed Anisotropic Light-Transfer (CHI 2025)

fabrication · battery-free sensing

Qian Lu, Xiaoying Yang, Xue Wang, Jacob Sayono, Yang Zhang, and Jeeeun Kim

  • Developed information encoding and decoding methods using cameras, light, and reflectors

ForceSight: Non-Contact Force Sensing with Laser Speckle Imaging (UIST 2022)

computer vision · gesture sensing · 🏅 Honorable Mention for Demo

Siyou Pei, Pradyumna Chari, Xue Wang, Xiaoying Yang, Achuta Kadambi, and Yang Zhang

  • Implemented image processing algorithms for input sensing using Python (OpenCV) (sketched below)
  • Built prototypes with cameras, lasers, and 3D-printed models
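
For intuition, laser speckle activity is often quantified by local speckle contrast (standard deviation over mean in a sliding window), which box filters compute efficiently. This LASCA-style sketch is a generic building block with an assumed window size, not ForceSight's actual force-inference algorithm.

    # Hedged sketch: per-pixel speckle contrast K = sigma / mu over a window.
    import cv2
    import numpy as np

    def speckle_contrast(gray, win=7):
        img = gray.astype(np.float32)
        mean = cv2.boxFilter(img, -1, (win, win))
        mean_sq = cv2.boxFilter(img * img, -1, (win, win))
        var = np.clip(mean_sq - mean * mean, 0, None)
        return np.sqrt(var) / (mean + 1e-6)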

Industry Projects

Vibration Sensing with Communication-Centric mmWave (2024)

joint communication and sensing · machine learning
  • Implemented signal denoising and synthesis through target and channel feature disentanglement
  • Implemented data collection and signal processing pipelines in C#
  • Designed and trained machine learning models for sensing applications using PyTorch
  • Deployed pre-trained models for real-time inference using ONNX (sketched below)
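
The deployment step is straightforward to sketch with onnxruntime. The project's pipelines were in C#, so this Python version is purely illustrative, and the model path, input shape, and argmax post-processing are placeholders.

    # Hedged sketch: real-time inference with a model exported to ONNX.
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("model.onnx")  # hypothetical model path
    input_name = session.get_inputs()[0].name

    def infer(window):
        """window: one pre-processed signal window as a float array."""
        x = window.astype(np.float32)[None, ...]  # add a batch dimension
        (logits,) = session.run(None, {input_name: x})
        return int(np.argmax(logits))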

Investigation of Model Predictive Control Strategy for Autonomous Vehicles (2019)

autonomous driving · control
  • Implemented model predictive control and sliding mode control for trajectory tracking in Python with CARLA (sketched below)
  • Integrated the control algorithms into an autonomous car codebase running on Linux
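
As a minimal illustration of receding-horizon MPC, here is a cvxpy sketch that tracks reference positions with a linear double-integrator model. The dynamics, horizon, weights, and acceleration limit are illustrative assumptions, far simpler than a vehicle model running in CARLA.

    # Hedged sketch: linear MPC that returns the first control of the horizon.
    import cvxpy as cp
    import numpy as np

    DT, N = 0.1, 20                   # time step (s) and horizon length
    A = np.array([[1, DT], [0, 1]])   # state x = [position, velocity]
    B = np.array([[0], [DT]])         # input u = acceleration command

    def mpc_step(x0, ref_positions):
        x = cp.Variable((2, N + 1))
        u = cp.Variable((1, N))
        cost, constraints = 0, [x[:, 0] == x0]
        for k in range(N):
            cost += cp.square(x[0, k + 1] - ref_positions[k])  # tracking error
            cost += 0.1 * cp.square(u[0, k])                   # control effort
            constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                            cp.abs(u[0, k]) <= 3.0]            # accel limit
        cp.Problem(cp.Minimize(cost), constraints).solve()
        return u.value[0, 0]  # receding horizon: apply only the first input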

NAO Robot Kicking Ball (2019)

robotics · computer vision · object detection
  • Implemented speech recognition, object detection, and localization in Python

CV

My "HCI"

I am an HCI researcher by day, aiming to create technologies that shape a better future. By night, I switch to a different kind of HCI: Human-Cat Interaction, where my furry supervisors set the agenda.

They are key contributors to both the breakthroughs and the breakdowns of my research: snoring their opinions into my Zoom meetings, shutting down my computer with a single paw while I am on Overleaf, waking me up at 5 am to make sure I never miss an AoE deadline, and chewing through my breadboard jumper wires for quality control. True collaborators.
