Yet another cool interface comes out of Carnegie Mellon: Scratch Input. Using nothing more than a microphone whose input is filtered to remove low-frequency sounds (like voices), scratches on surfaces like walls or tables can be captured and analyzed as gestures.
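The core idea is simple enough to sketch: high-pass filter the microphone signal so low-frequency sounds like voices fall away, then check whether what remains carries enough energy to count as a scratch. Here's a minimal, self-contained illustration of that pipeline; the cutoff frequency, threshold, and synthetic test signals are my own hypothetical values, not anything from the actual system.

```python
import math

def high_pass(samples, sample_rate, cutoff_hz):
    """First-order high-pass filter: attenuates low-frequency content
    (e.g. voices) while passing the high-frequency energy of a scratch."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / sample_rate
    alpha = rc / (rc + dt)
    out = [samples[0]]
    for i in range(1, len(samples)):
        out.append(alpha * (out[-1] + samples[i] - samples[i - 1]))
    return out

def scratch_detected(samples, sample_rate, cutoff_hz=3000.0, threshold=0.1):
    """Crude trigger: does the filtered signal carry enough RMS energy?
    Cutoff and threshold here are illustrative guesses, not tuned values."""
    filtered = high_pass(samples, sample_rate, cutoff_hz)
    rms = math.sqrt(sum(x * x for x in filtered) / len(filtered))
    return rms > threshold

# Synthetic demo: a low-frequency "voice" tone vs. a high-frequency "scratch".
rate = 44100
t = [i / rate for i in range(2048)]
voice = [0.5 * math.sin(2 * math.pi * 200 * ti) for ti in t]
scratch = [0.5 * math.sin(2 * math.pi * 6000 * ti) for ti in t]
```

A real implementation would of course have to classify the filtered signal into distinct gestures (taps, long strokes, double scratches) rather than just thresholding it, but the filtering step is what lets such a cheap sensor ignore ambient speech.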
I am amazed at how well this works on regular walls in the home. The video demonstrates that scratches made in the corner and above/beside doors work just as well as those made right beside the mic!
The demo shows a user controlling a music player with his scratch gestures, which is definitely cool, but I'm wondering if this might be better employed in attempts to recreate those Minority Report immersive environments. Why use video to capture human hand movements as gestures, which arguably requires much more resolution, when some simple sounds would suffice? It does mean the user would be required to actually touch a surface to interact with it, but let's face it, that's much more natural than waving our hands in the air anyway.