
Gesture interaction

R&D UX Internship

Digital Planning Lab, URBAN REDEVELOPMENT AUTHORITY

Research & development of two vastly different gesture interaction technologies for the in-house map data planner.

The two technologies are fundamentally different: the Myo Armband relies on accelerometer, gyroscope, and electromyography (EMG) sensors, while the Leap Motion uses infrared computer vision.

This fundamental difference in technology demanded completely different implementations to deliver an appropriate user experience.

The Myo Armband is velocity based, with gestures acting as state switches. Your arm becomes the controller, which makes it rather intuitive. However, the native gestures and commands were not sensitive enough, so I experimented with the raw sensor data and programmed my own custom interactions, as sketched below.
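A minimal sketch of that velocity-based, state-switched interaction, assuming access to raw gyroscope readings and pose events from the armband. The names here (GyroSample, on_pose, the pan callback) are hypothetical stand-ins, not the actual Myo SDK API.

```python
# Sketch: velocity-based map panning driven by raw gyroscope data,
# with a "fist" pose acting as a state switch that engages panning.
# All names are hypothetical stand-ins, not the Myo SDK.

from dataclasses import dataclass

@dataclass
class GyroSample:
    yaw_rate: float    # degrees per second, around the vertical axis
    pitch_rate: float  # degrees per second, around the lateral axis

class VelocityPanController:
    def __init__(self, pan_map, sensitivity=0.5, dead_zone=5.0):
        self.pan_map = pan_map          # callback: (dx, dy) in map units
        self.sensitivity = sensitivity  # scales angular velocity to pan speed
        self.dead_zone = dead_zone      # ignore small tremors (deg/s)
        self.engaged = False            # state switch toggled by a pose

    def on_pose(self, pose: str):
        # Gestures act as state switches rather than one-off commands:
        # a fist engages panning, relaxing to rest disengages it.
        if pose == "fist":
            self.engaged = True
        elif pose == "rest":
            self.engaged = False

    def on_gyro(self, sample: GyroSample, dt: float):
        # While engaged, arm velocity maps directly to pan velocity.
        if not self.engaged:
            return
        dx = self._filtered(sample.yaw_rate) * self.sensitivity * dt
        dy = self._filtered(sample.pitch_rate) * self.sensitivity * dt
        if dx or dy:
            self.pan_map(dx, dy)

    def _filtered(self, rate: float) -> float:
        # A dead zone suppresses jitter from small, unintended movements.
        return rate if abs(rate) > self.dead_zone else 0.0


# Example wiring: print pan deltas instead of moving a real map view.
controller = VelocityPanController(pan_map=lambda dx, dy: print(f"pan {dx:.2f}, {dy:.2f}"))
controller.on_pose("fist")
controller.on_gyro(GyroSample(yaw_rate=40.0, pitch_rate=-12.0), dt=0.02)
```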

The Leap Motion, in contrast, is position based and tethered to a fixed spot, unlike the wearable Myo. Navigating an invisible 3D space proved far less intuitive, which demanded a guiding visual on the screen; a sketch of that mapping follows.
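A comparable sketch of the position-based mapping, assuming the device reports a palm position normalised to a box above it. The PalmFrame structure, screen dimensions, and touch-plane threshold are assumptions for illustration, not the Leap Motion SDK.

```python
# Sketch: mapping a palm tracked above the device to an on-screen guiding
# cursor, so users can see where they are in the otherwise invisible
# 3D interaction space. Names and values are hypothetical.

from dataclasses import dataclass

SCREEN_W, SCREEN_H = 1920, 1080  # assumed display resolution

@dataclass
class PalmFrame:
    # Palm position normalised to [0, 1] within the tracked box:
    # x = left/right, y = height above device, z = depth toward the screen.
    x: float
    y: float
    z: float

@dataclass
class GuideCursor:
    screen_x: int
    screen_y: int
    radius: int      # shrinks as the hand pushes "into" the screen
    active: bool     # True once the hand crosses the virtual touch plane

def palm_to_cursor(frame: PalmFrame, touch_plane: float = 0.4) -> GuideCursor:
    # Horizontal and vertical position map directly to screen coordinates.
    sx = int(frame.x * SCREEN_W)
    sy = int((1.0 - frame.y) * SCREEN_H)  # invert: higher hand = higher on screen
    # Depth drives the visual feedback: the cursor shrinks as the hand
    # approaches a virtual touch plane, then "activates" on crossing it.
    depth = max(0.0, min(1.0, frame.z))
    radius = int(30 - 20 * depth)
    return GuideCursor(sx, sy, radius, active=depth >= touch_plane)


# Example: a hand centred over the device, partway toward the touch plane.
print(palm_to_cursor(PalmFrame(x=0.5, y=0.6, z=0.3)))
```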