
Exploring the Development of Multi-Modal Computing

Smartphones have changed how users interact with devices, making touch the primary method of interaction. Meanwhile, the development of wearables that provide virtual reality (VR) and augmented reality (AR) technology is helping to advance gesture-control applications.

While most app developers are primarily focused on providing control methods for VR and AR computing devices, such as head-mounted displays, an emerging ecosystem of companies is working to bring this technology to smartphones and personal computers as well.

According to the latest worldwide market study by Juniper Research, gesture and motion control will become vital components for certain forms of human-computer interaction during the next decade.

New User Interface Market Development

The application of gesture and motion control technology in smartphone-based VR will be particularly important in promoting new usage. Juniper forecasts 128 million devices equipped with the technology by the end of 2016, rising to 492 million by 2020 -- that's a growth of over 280 percent.

The Juniper study found that progress has already occurred in the development of gesture and motion interfaces, from vendors such as Leap Motion and Thalmic Labs. Juniper expects nearly 50 percent of all wearables and almost all VR devices to use the technology by 2021.

However, for more established platforms like PCs and smartphones, Juniper believes that usage will remain low -- with less than 5 percent of such devices using gesture control by that time.
The arrival of motion control for smartphone VR in 2017 will start a shift towards multi-modal computing -- utilizing peripherals, motion and gesture control. However, this will simply extend current functionalities, holding back adoption across devices as a whole, unless the user interface (UI) paradigm changes.

Gesture and motion control is currently an add-on for most consumer electronic devices, despite many smartphones and media tablets having sensors which can enable the technology.

"VR and wearables have shown the way that gesture and haptics can provide fresh ways to interact with technology," said James Moar, senior analyst at Juniper Research. "The game changer for other platforms will be when technology firms are brave enough to reinvent their UIs to incorporate gesture and motion control, rather than considering it an optional add-on."
