Microsoft Explains Why Motion Sensor Fusion is Important for Windows 8 and Tablets

By Wesley Fenlon

Sensor fusion processes the information from accelerometers, gyroscopes and magnetometers to make the data more useful for application developers.

When the Wii was released in 2006, the Wii Remote's accelerometers brought something completely new to the gaming world. In the half decade since, accelerometers have become old hat. Every smartphone and tablet on the market has a motion sensor, and most include gyroscopes and magnetometers for more accurate motion control. The latest Building Windows 8 blog post delves into the specifics of how those technologies work together to improve motion sensing in mobile devices.

"Sensor fusion," as Microsoft calls it, isn't unique to Windows tablets. A Google TechTalk from 2010 explains how sensor data from gyros, accelerometers and compasses are combined into one accessible stream of information. That just makes Microsoft's breakdown more interesting, since it applies to the entire field of mobile motion sensing--sensor fusion is the way of the future.

The post begins with a problem: getting test Windows 8 tablet hardware to work with an accelerometer and a magnetometer. Microsoft struggled with noise in the sensor data; smoothing the jitter with a filter made motion response too sluggish. The team also had trouble combining horizontal and vertical movements stably. Enter the gyroscope and sensor fusion.
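That jitter-versus-sluggishness trade-off is easy to see with a simple exponential low-pass filter, one common way to tame sensor noise. This is a minimal illustrative sketch, not Microsoft's actual filter; the sample data and smoothing factor are assumptions:

```python
def low_pass(samples, alpha):
    """Exponentially smooth a stream of accelerometer readings.

    alpha near 1.0 -> little smoothing (jittery but responsive);
    alpha near 0.0 -> heavy smoothing (stable but sluggish).
    """
    filtered = []
    prev = samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        filtered.append(prev)
    return filtered

# Noisy readings around a sudden tilt from ~0 to ~1 g:
readings = [0.02, -0.01, 0.03, 1.05, 0.97, 1.02, 0.99, 1.01]
smooth = low_pass(readings, 0.2)  # heavy smoothing
print(smooth[3])  # still far below 1.0 right after the tilt -> sluggish
```

With a small alpha the jitter disappears, but the filtered value lags well behind the real motion, which is exactly the sluggishness the post describes.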

As Microsoft's Gavin Gear writes, using all three sensors together creates a "9-axis sensor fusion" that's more than the sum of its parts. In some cases, data from one or two sensors will be passed directly to software with no interpretation. Microsoft uses a pedometer as an example: raw accelerometer data can be handed off to an exercise app for the simple function of counting steps. For more complex applications, the accelerometer, gyroscope and magnetometer data is combined and processed:

The "magic" of sensor fusion is to mathematically combine the data from all three sensors to produce more sophisticated outputs, including a tilt-compensated compass, an inclinometer (exposing yaw, pitch, and roll), and more advanced representations of device orientation. With this kind of data, more sophisticated apps can produce fast, fluid, and responsive reactions to natural motions.
...Sensor fusion in Windows solves the problems of jittery movement and jerky transitions, reduces data integrity issues, and provides data that allows a seamless representation of full device motion in 3D space (without any awkward transitions).
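One classic way to get that "more than the sum of its parts" behavior is a complementary filter: the gyroscope tracks fast rotation but drifts over time, while the accelerometer gives an absolute (but noisy) tilt reference from gravity. Blending the two yields a pitch estimate that is both stable and responsive. This sketch is an assumption for illustration, not the algorithm Windows 8 uses; the 0.98/0.02 blend and the sample data are made up:

```python
import math

def fuse_pitch(gyro_rates, accels, dt=0.02, blend=0.98):
    """Estimate pitch (degrees) from gyro rate (deg/s) and accel (ax, az in g)."""
    pitch = 0.0
    for rate, (ax, az) in zip(gyro_rates, accels):
        gyro_pitch = pitch + rate * dt                  # integrate angular rate
        accel_pitch = math.degrees(math.atan2(ax, az))  # gravity-based tilt
        pitch = blend * gyro_pitch + (1 - blend) * accel_pitch
    return pitch

# Device held steady at ~30 degrees: gyro reads ~0, accel sees gravity's split.
rates = [0.0] * 500
accels = [(math.sin(math.radians(30)), math.cos(math.radians(30)))] * 500
print(round(fuse_pitch(rates, accels), 1))  # -> 30.0
```

The heavy gyro weighting keeps the output fluid during fast motion, while the small accelerometer term continuously pulls the estimate back toward the true orientation, canceling drift.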

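The simpler pedometer hand-off mentioned earlier needs none of that math: raw accelerometer magnitudes go straight to the app, which can count steps with basic threshold crossing. The threshold and trace below are illustrative assumptions, not Microsoft's API:

```python
def count_steps(magnitudes, threshold=1.2):
    """Count upward crossings of a g-force threshold as steps."""
    steps = 0
    above = False
    for m in magnitudes:
        if m > threshold and not above:
            steps += 1
            above = True
        elif m <= threshold:
            above = False
    return steps

# Simulated magnitude trace (in g) for three strides:
trace = [1.0, 1.3, 1.0, 0.9, 1.4, 1.0, 1.1, 1.5, 1.0]
print(count_steps(trace))  # -> 3
```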
Unsurprisingly, Microsoft has made a Metro app API for accessing sensor fusion data. Compared to the information available about sensor fusion on Windows 8 and Android, there's little public documentation of how Apple has implemented its own sensor solutions. Fusion is still a new technology finding its legs: 9-axis sensors with hardware-level fusion only hit manufacturing in September 2011.

The Google TechTalk linked above dates back to 2010, so the theory behind sensor fusion has clearly existed for some time. Windows 8 tablets will be among the first to take fusion from theory to widespread application, with cutting-edge hardware and the APIs to match.