I’d like to persuade you that gestures are a fundamental building block of human language and thought. This begins a series of blog posts on gestures and how physical movement in VR & AR affects cognition.
Part one of this series will deal with why gestures provide a shortcut to human thought.
But first, on the tech front:
Devices that capture small hand gestures are already available (such as Microsoft HoloLens), and more are on the way. Google’s Project Soli uses radar to track micro-motions and twitches: the radar senses how users move their hands and interprets the intent behind each motion. Link to the full Project Soli video here.
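To make the idea concrete, here is a minimal sketch of what "interpreting intent" from a micro-gesture sensor might look like. This is purely illustrative: the function name, the per-frame motion-energy input, and the thresholds are all my assumptions, not the actual Soli API or algorithm.

```python
# Hypothetical sketch: turning raw per-frame motion energy (0.0-1.0)
# from a radar-style sensor into a discrete gesture label.
# All names and thresholds are illustrative assumptions, not Soli's.

def detect_micro_gesture(energy_frames, tap_threshold=0.8):
    """A brief, sharp spike reads as a 'tap'; sustained moderate
    energy reads as a 'swipe'; anything else is 'idle'."""
    peak = max(energy_frames)
    active = [e for e in energy_frames if e > 0.2]  # frames with real motion
    if peak >= tap_threshold and len(active) <= 2:
        return "tap"
    if len(active) >= len(energy_frames) // 2 and peak < tap_threshold:
        return "swipe"
    return "idle"

print(detect_micro_gesture([0.0, 0.05, 0.9, 0.1, 0.0]))  # brief spike -> tap
print(detect_micro_gesture([0.3, 0.4, 0.5, 0.4, 0.3]))   # sustained -> swipe
print(detect_micro_gesture([0.0, 0.1, 0.0, 0.05, 0.0]))  # nothing -> idle
```

Real systems like Soli use machine-learned classifiers over rich radar signal features rather than hand-tuned thresholds, but the shape of the problem is the same: continuous sensor readings in, discrete intent out.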
Why are gestures powerful shortcuts to cognition?
I’m reposting an article from Scientific American that answers the question “Why is talking with gestures so much easier than trying to talk without gesturing?” Psychology professor Michael P. Kaschak responds:
Takeaways for VR/AR Designers:
- People process information more deeply when they are gesturing
- Verbal areas of the brain are more active when speech accompanies gestures
- The tech exists for picking up human micro-gestures