The sensor-driven user interface (UI) is emerging as a defining theme of the next wave of mobile device innovation -- turning objects, locations, and people into networked, interactive elements.
By 2013, according to the latest market study by ABI Research, 85 percent of smartphones will ship with GPS, over 50 percent will ship with accelerometers, and almost 50 percent will have gyroscopes.
"The growth of sensors in smartphones will be driven by applications such as gaming, location awareness, and augmented reality, as well as the expansion of motion-based commands," says senior analyst Victoria Fodale at ABI.
A smartphone's high-level operating system, with its open application programming interfaces (APIs), has made it far easier for developers to tap data from cameras, sensors, and GPS receivers.
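To see what those open APIs look like in practice, here is a minimal sketch written against Android's SensorManager. The SensorManager calls are the platform's own; the activity name and the handling logic are purely illustrative assumptions, not anything prescribed by the ABI study.

```kotlin
import android.app.Activity
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import android.os.Bundle

// Hypothetical activity showing how an app reads raw accelerometer data
// through the operating system's open sensor API (Android's SensorManager).
class MotionActivity : Activity(), SensorEventListener {
    private lateinit var sensorManager: SensorManager
    private var accelerometer: Sensor? = null

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        sensorManager = getSystemService(SENSOR_SERVICE) as SensorManager
        accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    }

    override fun onResume() {
        super.onResume()
        // A delivery rate suited to games and gesture-driven UIs.
        accelerometer?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    override fun onPause() {
        super.onPause()
        sensorManager.unregisterListener(this) // stop sampling to save battery
    }

    override fun onSensorChanged(event: SensorEvent) {
        // Acceleration in m/s^2 along the device's x, y, and z axes.
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        // A real app would feed x, y, z into its tilt or gesture logic here.
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {
        // Not needed for this sketch.
    }
}
```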
When an accelerometer is combined with a gyroscope, developers can build applications that sense motion across six axes: up and down, left and right, forward and backward, plus roll, pitch, and yaw rotations.
This interactive capability gives a mobile device functionality similar to a motion-sensing game controller -- such as the Nintendo Wii Remote.
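Turning those two raw sensor streams into a usable orientation typically involves a sensor-fusion step. The sketch below shows one common approach, a complementary filter that blends integrated gyroscope rates with accelerometer-derived tilt; the axis conventions, the 0.98 weighting, and the class name are assumptions for illustration only.

```kotlin
import kotlin.math.atan2
import kotlin.math.sqrt

// Minimal complementary filter: one common way to fuse accelerometer and
// gyroscope readings into a stable pitch/roll estimate.
class OrientationFilter(private val alpha: Double = 0.98) {
    var pitch = 0.0 // rotation about the device's x axis, in radians
        private set
    var roll = 0.0  // rotation about the device's y axis, in radians
        private set

    // ax, ay, az: accelerometer reading (m/s^2)
    // gx, gy:     gyroscope rates about x and y (rad/s)
    // dt:         time since the previous sample (s)
    fun update(ax: Double, ay: Double, az: Double, gx: Double, gy: Double, dt: Double) {
        // Tilt implied by gravity alone: accurate long-term, but noisy.
        val accelPitch = atan2(ay, sqrt(ax * ax + az * az))
        val accelRoll = atan2(-ax, az)

        // Integrated gyro rate is smooth short-term but drifts over time,
        // so blend the two estimates and let each cover the other's weakness.
        pitch = alpha * (pitch + gx * dt) + (1 - alpha) * accelPitch
        roll = alpha * (roll + gy * dt) + (1 - alpha) * accelRoll
    }
}

fun main() {
    val filter = OrientationFilter()
    // Device held flat with a constant 0.05 rad/s gyroscope bias: pure
    // integration would drift without bound, but the blended estimate
    // stays anchored near the accelerometer's gravity reading.
    repeat(100) {
        filter.update(ax = 0.0, ay = 0.0, az = 9.81, gx = 0.05, gy = 0.0, dt = 0.02)
    }
    println("pitch=%.3f rad, roll=%.3f rad".format(filter.pitch, filter.roll))
}
```

The design point is simple: the gyroscope is responsive but drifts, the accelerometer is drift-free but noisy, and blending the two is what gives motion-based UIs an orientation estimate that is both quick and stable.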
Prompted by Apple's UI innovations with the iPhone, smartphone OEMs have poured resources into UI design and development. Many OEMs, particularly those using Google's Android OS, developed their own custom UI overlays.
Sensors will also help OEMs innovate beyond the touchscreen UI and differentiate in the marketplace. However, added functionality must be balanced with ease of use.
"There is an inherent paradox with technology," says Fodale. "As mobile devices integrate more technology, the UI must be kept simple enough to be intuitive for the user."