Monday, 16 May 2016

Microsoft Research can (almost) read your mind!

In today's demanding technological era, mobile devices have created new ways of interacting. On a modern smartphone, the capacitive touchscreen makes interaction easier than ever before. For the last few years, however, Google and Apple have been trying to treat finger input differently depending on context, and new research from Microsoft moves beyond the touchscreen's "one tap, one touch" paradigm.

Microsoft's prototype goes a step beyond the pressure sensitivity of Apple's Force Touch: the device can predict where the user's finger is about to touch the screen. The new touchscreen works by sensing electrical disruption in two ways: first, when a finger approaches close to the screen, and second, through the grip pressure detected along the edges of the device.
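As a rough illustration of the idea, here is a minimal sketch of how a self-capacitance grid might be read to anticipate a touch. The thresholds, grid shape, and function names are invented for this example, not taken from Microsoft's implementation:

```python
# Hypothetical sketch of pre-touch sensing: a self-capacitance grid reports
# a capacitance delta per electrode, and a finger hovering above the screen
# raises nearby deltas before any contact occurs. Thresholds are assumed.

HOVER_THRESHOLD = 0.15   # finger near the screen but not touching (assumed)
TOUCH_THRESHOLD = 0.60   # finger in contact with the glass (assumed)

def predict_touch(grid):
    """Return ('touch' | 'hover' | None, (row, col)) for the peak response."""
    peak, pos = 0.0, None
    for r, row in enumerate(grid):
        for c, delta in enumerate(row):
            if delta > peak:
                peak, pos = delta, (r, c)
    if peak >= TOUCH_THRESHOLD:
        return "touch", pos
    if peak >= HOVER_THRESHOLD:
        return "hover", pos   # finger approaching: the UI can react early
    return None, None
```

A "hover" result is what lets the interface react before the finger actually lands.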

Microsoft's prototype uses a self-capacitance sensor that is extremely sensitive to finger movements: a finger can be detected while it is still an inch or two away from the screen. This predictive touch feature lets the device pull up controls based on how it is being held or where a finger is hovering. For instance, it can reveal hyperlinks on a page or pull up video playback controls as the finger approaches. It can also reduce multi-step actions by combining hover and touch, for example to open context menus.
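The "reveal controls on approach" behavior can be pictured as a small state machine driven by the hover signal. This is only a sketch with invented names, assuming the sensor delivers a per-frame hover flag:

```python
class HoverUI:
    """Minimal sketch of anticipatory UI: controls appear when a finger
    hovers near the screen and disappear when it withdraws.
    (Class and method names are hypothetical, not Microsoft's API.)"""

    def __init__(self):
        self.controls_visible = False

    def on_sensor_frame(self, hover_detected):
        if hover_detected and not self.controls_visible:
            self.controls_visible = True    # e.g. reveal the video scrubber
        elif not hover_detected and self.controls_visible:
            self.controls_visible = False   # finger withdrawn: hide again
        return self.controls_visible
```

The same hover signal, combined with an actual touch, is what would let a single gesture stand in for the usual multi-tap sequence of opening a context menu.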

Microsoft's predictive touch system can also distinguish a rapid tap from a precise one, so that a precise tap maps to the exact button the user intends to hit. For now Microsoft has only built a demo, presented at the 2016 Conference on Human Factors in Computing Systems (CHI), but the technique may find its way into real mobile devices in the future.
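One plausible way to use that distinction, sketched here with assumed thresholds and names (none of this is from Microsoft's demo), is to treat a fast, ballistic approach as a casual tap that may be snapped to the nearest button, while a slow approach is treated as deliberate and must land on its target:

```python
# Hypothetical sketch: classify a tap by its pre-touch approach speed.

FAST_APPROACH = 40.0   # assumed speed threshold, arbitrary units per frame

def resolve_tap(approach_speed, touch_xy, buttons):
    """buttons: {name: (x, y)}; return the button this tap should map to."""
    nearest = min(buttons, key=lambda b: (buttons[b][0] - touch_xy[0]) ** 2
                                       + (buttons[b][1] - touch_xy[1]) ** 2)
    if approach_speed >= FAST_APPROACH:
        return nearest                      # ballistic tap: forgive small misses
    # slow, precise tap: only accept a hit very close to the button centre
    bx, by = buttons[nearest]
    if abs(bx - touch_xy[0]) <= 5 and abs(by - touch_xy[1]) <= 5:
        return nearest
    return None
```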
