Skinput

The Skinput system rendering a series of buttons on the arm. Users can press the buttons directly with their fingers, much like on a touch screen.

Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. When augmented with a pico-projector, the device can provide a direct-manipulation graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris at Microsoft Research's Computational User Experiences Group.[1] Skinput represents one way to decouple input from electronic devices, with the aim of allowing devices to become smaller without simultaneously shrinking the surface area on which input can be performed. While other systems, such as SixthSense, have attempted this with computer vision, Skinput employs acoustics, which take advantage of the human body's natural sound-conductive properties (e.g., bone conduction).[2] This allows the body to be annexed as an input surface without the need for the skin to be invasively instrumented with sensors, tracking markers, or other items.
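The sensing works because a tap sends mechanical vibrations through the arm that arrive at a worn sensor array with location-dependent character; the published paper reports classifying these signals with a support vector machine trained on per-channel acoustic features.[2] The following is a minimal sketch of that general pattern in Python with scikit-learn. The channel count, the specific features, and the synthetic training data are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch (not the authors' actual pipeline) of bio-acoustic tap
# localization: reduce each sensor channel's vibration window to a few
# scalar features, then classify the tap location with an SVM.
import numpy as np
from sklearn.svm import SVC

NUM_CHANNELS = 10  # assumption: a ten-channel vibration-sensor armband

def extract_features(waveforms: np.ndarray) -> np.ndarray:
    """Reduce a (NUM_CHANNELS, num_samples) tap window to a feature vector."""
    amplitude = np.abs(waveforms).max(axis=1)   # peak amplitude per channel
    energy = (waveforms ** 2).sum(axis=1)       # total energy per channel
    # Cross-channel amplitude ratios reflect how the vibration attenuates
    # as it travels through the arm, which is what localizes the tap.
    ratios = amplitude / (amplitude.sum() + 1e-9)
    return np.concatenate([amplitude, energy, ratios])

# Hypothetical training data: 20 recorded taps at each of 5 labeled
# arm locations, simulated here with synthetic waveforms.
rng = np.random.default_rng(0)
X_train = np.stack([
    extract_features(rng.normal(loc, 1.0, size=(NUM_CHANNELS, 256)))
    for loc in range(5) for _ in range(20)
])
y_train = np.repeat(np.arange(5), 20)

clf = SVC(kernel="rbf").fit(X_train, y_train)

# At run time, the window around a detected tap is featurized and classified.
tap_window = rng.normal(2, 1.0, size=(NUM_CHANNELS, 256))
print("predicted tap location:", clf.predict([extract_features(tap_window)])[0])
```

The point of the sketch is that localization reduces to supervised classification: each tap becomes a feature vector describing how its vibration appears across the sensor channels, and a short labeled training pass maps those vectors to body locations.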

Microsoft has not commented on the future of the project, other than to say that it is under active development. In 2010, it was reported that the technology would not appear in commercial devices for at least two years.[3]

  1. ^ "Skinput:Appropriating the Body as an Input Surface". Microsoft Research Computational User Experiences Group. Retrieved 26 May 2010.
  2. ^ Harrison, Dan (10–15 April 2010). "Skinput:Appropriating the Body as an Input Surface" (PDF). Proceedings of the ACM CHI Conference 2010.
  3. ^ Goode, Lauren (26 April 2010). "The Skinny on Touch Technology". Wall Street Journal.