The future of computers is wearable sensors: Xbox chief
Microsoft's Xbox head believes that people will be wearing up to 10 sensors on their body to interact with technology, and that this change will come in less than a decade.
Business | Updated: Mar 09, 2013 15:31 IST
Speaking at Microsoft's annual TechForum event, Don Mattrick told The Verge: "My personal belief, 10 years from now, we'll be wearing 10 sensors on our body collecting data and applying that data to things that are valuable to us as users." These sensors would not only collect data about a person's physical well-being but also help translate movements and gestures into commands understood by anything from a games console to an automated home device.
At the same event, Microsoft also suggested that the future of its Bing search engine lies in becoming a platform that can make use of the data generated by these sensors. In terms of wearable sensors, it is not difficult to see how the technology would work in practice.
In February, a company called Thalmic Labs demoed the MYO, an armband worn below the elbow that translates the electrical activity generated by the wearer's arm muscles into commands for operating smartphones, tablets or full desktop PCs. With the MYO, wearers can click their fingers to skip a track on a media player or lift a hand to turn up the volume.
However, redeveloping a search engine to analyse sensor data suggests Microsoft is betting big on the Internet of Things, and although the company wasn't clear in its explanation, this future platform evokes memories of Gander. Gander is a project currently underway at the University of Texas that uses static motion and sound sensors, installed at bus terminals, coffee shops and study spaces, to analyse occupancy. With the Gander app, a user can find out how many people are standing in line at a cafe, how busy an area is based on noise levels, or how long it will take for a bus to arrive.
All of these technologies are designed to remove the computer itself from the equation, either by using gesture or voice controls to interact with a device or service, or by pushing information to users automatically when they need it. Eliminating both the smartphone and the touch interface is the ultimate aim of Google Glass, for example: simply say what you want and the headset takes care of everything.
However, just as the Google Glass headset's ability to automatically record everything the wearer sees has drawn scrutiny, allowing a search engine to query data collected from a person's body sensors raises a great many privacy issues.