I am a Senior User Experience Researcher at Synaptics. My work is situated at the intersection of Human-Computer Interaction and Human Augmentation. I design, implement and study novel input technologies in the areas of mobile, tangible & non-visual interaction, assistive augmentation and automotive UX.
Beyond code & electronics, I am into guitars, volleyball and cooking. I am a bike nerd, a so-so photographer and a snowboard newbie.
- Sep 06 Article on finger augmentation accepted to ACM Computing Surveys.
- Sep 04 I will act as program co-chair for ACM TVX 2016 in Chicago.
- Jul 20 ACM Computing Surveys article on video interaction accepted.
- Jun 15 Joined Synaptics' UX design team. Thrilled!
- Mar 06 A bunch of new projects and AH '15 are about to kick off. Busy times.
FingerReader: A Wearable Device to Explore Printed Text on the Go
Shilkrot*, Huber*, Wong, Maes and Nanayakkara
Full Paper. In Proceedings of CHI '15. [* equal contribution]
EarPut: Augmenting Ear-worn Devices for Ear-based Interaction
Lissermann, Huber, Hadjakos, Nanayakkara and Mühlhäuser
Full Paper. In Proceedings of OzCHI '14.
A Research Overview of Mobile Projected User Interfaces
Journal Article. Informatik Spektrum. 2014.
Permulin: Mixed-Focus Collaboration on Multi-View Tabletops
Lissermann, Huber, Schmitz, Steimle and Mühlhäuser
Full Paper. In Proceedings of CHI '14.