Hand Tracking Interaction Research
Tools: C++, C#, JavaScript, Python, Unity, Processing, openFrameworks, depth cameras (Kinect, etc.)
In the Intel RealSense group, we were bringing many natural user interface technologies to the public (e.g. gestures, speech, face tracking and recognition, eye tracking, and emotion recognition). Since these technologies did not yet have well-defined interaction paradigms, we first had to understand their potential and challenges ourselves, in the context of desktop and mobile PCs. This work was then used to inform and empower the community to develop better experiences based on the technology.
The following are some of the projects I worked on.
User research
User research was instrumental in answering many of the following questions. We must have run several hundred user studies in the first few years.
- How does the public imagine interacting through gestures? What are the expectations?
- What kind of gestures and body postures emerge?
- What kind of feedback mechanisms do we need to provide?
- What are the behavioural challenges we need to design for?
- What are the technological requirements to provide a good experience (FOV, latency, accuracy…)?
- What are the technical limitations in real world use?
Generating, prototyping and evaluating ideas
We produced a number of ideas and prototypes: some for apps, some for interaction concepts, technical evaluations, and demos. Some prototypes failed spectacularly, forcing us to revisit our expectations (like my Windows 8 Metro hand-interaction attempt). Others were tested with users to validate assumptions, evaluate appeal, or improve interactions.
Outreach Events
Outreach events like hackathons and competitions were crucial for gathering public feedback, teaching developers about the technology, and generating ideas.
Games and Apps
We developed dozens of games and apps, all of which were user-tested regularly during development. In the beginning we were optimistic about productivity apps, but after realizing the difficulties of gesture interaction we mostly ended up building entertainment and communication applications (e.g. with background segmentation).
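Background segmentation of this kind can be done by thresholding the depth image. Below is a minimal sketch of that idea, assuming the color and depth frames arrive as aligned NumPy arrays; the function name and the millimetre thresholds are illustrative, not part of the RealSense SDK.

```python
import numpy as np

def segment_person(color, depth, near_mm=300, far_mm=1200):
    """Keep color pixels whose depth falls inside [near_mm, far_mm];
    everything else is treated as background. On most depth cameras a
    depth of 0 means 'no data', so those pixels are dropped as well.
    color: H x W x 3 uint8, depth: H x W in millimetres (aligned)."""
    mask = (depth >= near_mm) & (depth <= far_mm)  # H x W boolean mask
    out = np.zeros_like(color)                     # black background
    out[mask] = color[mask]                        # copy foreground pixels
    return out, mask
```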
Prototyping
Since no established NUI design paradigms existed at the time, we quickly realized that rapid prototyping was necessary to test our own ideas and assumptions. This included developing our own computer vision algorithms while the underlying SDKs were still being developed.
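As an illustration of the kind of interaction logic such prototypes contained, here is a hedged sketch of a pinch-click detector driven by fingertip positions from any hand tracker. The distance thresholds are assumptions for illustration, not values from our studies; the hysteresis between press and release keeps tracking jitter near the threshold from causing flicker.

```python
class PinchDetector:
    """Toggle a 'pinch' state from the thumb-index fingertip distance,
    with hysteresis so jitter near the threshold does not flicker."""

    def __init__(self, press_mm=25.0, release_mm=40.0):
        self.press_mm = press_mm      # close fingers below this to press
        self.release_mm = release_mm  # open fingers beyond this to release
        self.pinching = False

    def update(self, thumb_tip, index_tip):
        # thumb_tip / index_tip: (x, y, z) positions in millimetres
        d = sum((a - b) ** 2 for a, b in zip(thumb_tip, index_tip)) ** 0.5
        if self.pinching and d > self.release_mm:
            self.pinching = False
        elif not self.pinching and d < self.press_mm:
            self.pinching = True
        return self.pinching
```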
High-level Findings
Comfort
- Gestures that do not demand precision are more comfortable and less fatiguing
- Use gestures with breaks or resting periods
- Allow elbow resting if possible
- Arc or horizontal menus are easier than vertical ones, as they follow the hand's natural movement path (see the layout sketch after this list)
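Since the arc-menu finding is geometric, here is a small sketch of how such a menu can be laid out around a pivot such as the user's elbow, so that selection follows the forearm's natural sweep. The angles and radius are illustrative defaults, not measured ergonomics data.

```python
import math

def arc_menu_positions(n_items, pivot, radius, start_deg=-30.0, end_deg=30.0):
    """Place n_items along a circular arc around `pivot` (a screen-space
    point, e.g. the projected elbow), sweeping from start_deg to end_deg.
    Returns a list of (x, y) screen positions."""
    positions = []
    for i in range(n_items):
        t = i / max(n_items - 1, 1)  # 0..1 along the arc
        angle = math.radians(start_deg + t * (end_deg - start_deg))
        x = pivot[0] + radius * math.sin(angle)
        y = pivot[1] - radius * math.cos(angle)  # screen y grows downward
        positions.append((x, y))
    return positions
```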
Prototyping
- Imagining something is very different from trying it!
- Lack of tactile feedback, fatigue, tech issues… none of these can be anticipated without trying
- Low-fidelity prototypes go a long way
Gratuitous gestures
- Novelty wears off very quickly
- Subtle but well-designed experiences go a long way
- Still need good storytelling
Children interact differently
- They are more literal
- They make faster and bigger gestures
Realism linked to expectations
- More realistic visuals => higher functionality expectations => more disappointment if unmet
Importance of good feedback
- The whole feedback loop matters: affordances, actions, feedback
- Tracking problems and how to guide the user to fix them (see the sketch below):
  - Hand outside the camera's FOV
  - Hand too close to or too far from the camera
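A large part of good feedback is telling the user why tracking degraded and how to recover. The sketch below maps the two failure modes above to user-facing hints; the pixel margin and depth range are assumptions for illustration and would depend on the actual camera.

```python
def tracking_feedback(hand_xy, frame_w, frame_h, depth_mm,
                      margin_px=40, near_mm=200, far_mm=1000):
    """Return a user-facing hint when the hand is about to be lost:
    near the image border (leaving the FOV) or outside the camera's
    reliable depth range. Returns None while tracking is healthy."""
    x, y = hand_xy
    if (x < margin_px or x > frame_w - margin_px or
            y < margin_px or y > frame_h - margin_px):
        return "Move your hand toward the center of the view"
    if depth_mm < near_mm:
        return "Move your hand away from the camera"
    if depth_mm > far_mm:
        return "Move your hand closer to the camera"
    return None
```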
Design Guidelines
Ultimately, we gathered the best practices and recommendations from our experience to share with the developer community in the human interface design guidelines document.