Advancements in prototyping technologies – haptics and extended reality – are creating exciting new environments that enhance stakeholder and user interaction with design concepts. These interactions can now occur earlier in the design process, transforming feedback mechanisms and enabling more, and faster, design iterations. This is essential for bringing right-first-time products to market as quickly as possible.
While existing feedback tools, such as think-aloud protocols, surveys and questionnaires, are a useful means of capturing user feedback and reflections on interactions, there is a desire to explicitly map user feedback to users' physical interactions with prototypes. Over the past decade, several hand-tracking tools have been developed that can, in principle, capture product user interaction.
In this paper, we explore the capability of the Leap Motion Controller, MediaPipe and Manus Prime X Haptic gloves to capture user interaction with prototypes. A broad perspective on capability is adopted, encompassing accuracy as well as practical aspects such as required knowledge, skills, and ease of use. In this study, challenges relating to accuracy, occlusion and data processing were identified in capturing user interaction and translating it into design insights.
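To illustrate the kind of translation step involved, the sketch below shows how raw hand-landmark output (as produced by tools like MediaPipe, which reports 21 normalized (x, y, z) landmarks per hand, with the thumb tip at index 4 and the index fingertip at index 8) might be converted into a simple interaction event such as a pinch on a prototype surface. The threshold value is an illustrative assumption, not a figure from the study.

```python
import math

# MediaPipe-style hand output: 21 landmarks per hand with
# normalized (x, y, z) coordinates. Thumb tip is landmark 4;
# index-finger tip is landmark 8.
THUMB_TIP, INDEX_TIP = 4, 8

def pinch_distance(landmarks):
    """Euclidean distance between the thumb and index fingertips."""
    return math.dist(landmarks[THUMB_TIP], landmarks[INDEX_TIP])

def is_pinching(landmarks, threshold=0.05):
    # The 0.05 threshold (in normalized image units) is a
    # hypothetical value chosen for illustration; a real pipeline
    # would calibrate it per camera setup and hand size.
    return pinch_distance(landmarks) < threshold
```

A per-frame stream of such events could then be timestamp-aligned with think-aloud audio or survey responses, which is one way the mapping between feedback and physical interaction discussed above might be realised.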