Tuesday, February 22, 2011

Android Leapfrogs iPhone on Gesture UI

    Gesture user-interface features were included in Google's recently unveiled Gingerbread release of the Android operating system. As a result, Android smartphones have an opportunity to one-up Apple's iPhone with advanced gesture recognition, such as the "lift-to-answer" feature in LG's Optimus Black smartphone, just announced at the recent Consumer Electronics Show in Las Vegas. Android programmers can now leverage the new motion-processing application programming interface (API) in Gingerbread (version 2.3) to create natural-acting gesture algorithms that leapfrog Apple's iPhone. OEMs can use any gesture to activate any feature on their phones--for instance, LG's Optimus Black uses a "shake" to invoke camera mode and a "tap" to invoke its music player.
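    To make that concrete, below is a minimal sketch of how an Android 2.3 app might wire a shake gesture to the camera, using the linear-acceleration sensor type that Gingerbread introduced. The threshold value and the launchCamera() hook are illustrative assumptions, not LG's actual code.

import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class ShakeGestureActivity extends Activity implements SensorEventListener {
    // Illustrative threshold in m/s^2; real products tune this per device.
    private static final float SHAKE_THRESHOLD = 12f;
    private SensorManager sensorManager;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // TYPE_LINEAR_ACCELERATION (new in API level 9) reports acceleration
        // with the gravity component already removed by the platform.
        Sensor linear = sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION);
        if (linear != null) {
            sensorManager.registerListener(this, linear, SensorManager.SENSOR_DELAY_GAME);
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        if (magnitude > SHAKE_THRESHOLD) {
            launchCamera();
        }
    }

    // Hypothetical hook: a real app would fire a camera intent here.
    private void launchCamera() { }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}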
    In its official pronouncements, Google has focused on Gingerbread's performance enhancements--such as its concurrent garbage collector, faster event distribution and updated video drivers. But deep in the code was also new native support for micro-electro-mechanical system (MEMS) sensors: support was added for gyroscopes and barometers to complement the existing support for accelerometers and magnetometers (eCompass).
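    Since not every handset ships every MEMS part, an app can confirm that the gyroscope and barometer are actually present before relying on them. A hedged sketch (the helper class and method names are my own, not part of the platform):

import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorManager;

// Illustrative helper for probing the sensors an OEM actually built in.
public final class SensorProbe {
    public static boolean hasGyroAndBarometer(Context context) {
        SensorManager sm = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        Sensor gyro = sm.getDefaultSensor(Sensor.TYPE_GYROSCOPE); // angular rate in rad/s
        Sensor baro = sm.getDefaultSensor(Sensor.TYPE_PRESSURE);  // atmospheric pressure in hPa
        return gyro != null && baro != null;
    }
}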


     

    [Photo caption] The world's slimmest smartphone, at 9.2 millimeters, is also the first based on a gesture user interface.


    Using these MEMS sensors together allows apps to locate users precisely by longitude and latitude as well as in the third dimension--their height above the ground, or the floor they are on inside a building. But beyond location-based services (LBS), the real-time tracking of motion and orientation in 3D space also enables recognition of complex gestures, including ones that Apple's iPhone does not yet support--such as answering the phone by merely lifting it to your ear.
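    For that third dimension, Gingerbread added a static helper, SensorManager.getAltitude(), that converts a barometer reading into meters above sea level. A minimal sketch follows; using the standard-atmosphere constant is an approximation, and the onAltitude() hook is hypothetical.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Register this listener for a Sensor.TYPE_PRESSURE sensor to receive readings.
public class AltitudeListener implements SensorEventListener {
    @Override
    public void onSensorChanged(SensorEvent event) {
        float pressureHpa = event.values[0]; // barometer reading in hectopascals
        // getAltitude() is new in API level 9. A current local sea-level
        // pressure, where available, gives better absolute height than the
        // standard-atmosphere constant used here.
        float meters = SensorManager.getAltitude(
                SensorManager.PRESSURE_STANDARD_ATMOSPHERE, pressureHpa);
        onAltitude(meters);
    }

    // Hypothetical hook: e.g., work out which floor of a building the user is on.
    protected void onAltitude(float meters) { }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}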
    Apple's iPhone pioneered the use of MEMS sensors in smartphones when it began using an accelerometer to switch automatically between portrait and landscape orientation. But once an API was available to access the raw accelerometer data, developers started using it to control almost everything, such as steering cars in video games.
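    The same idea carries over directly to Android. A tilt-steering listener might look like the sketch below; the 30-degree full-lock mapping and the applySteering() hook are illustrative choices, not any particular game's code.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Register for Sensor.TYPE_ACCELEROMETER; in landscape, rolling the phone
// like a steering wheel shifts gravity onto the device's Y axis.
public class TiltSteeringListener implements SensorEventListener {
    // Illustrative mapping: about 30 degrees of roll (g * sin 30 = 4.9 m/s^2)
    // corresponds to full steering lock.
    private static final float MAX_TILT = 4.9f;

    @Override
    public void onSensorChanged(SensorEvent event) {
        float roll = event.values[1]; // gravity component along the Y axis
        float steering = Math.max(-1f, Math.min(1f, roll / MAX_TILT)); // clamp to -1..+1
        applySteering(steering);
    }

    // Hypothetical hook: feed the value into the game's vehicle physics.
    protected void applySteering(float steering) { }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}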
    For Android programmers, a whole new array of MEMS sensor APIs in Gingerbread enables Android 2.3 smartphones to leapfrog the iPhone with user-friendly gesture recognition, such as hanging up the phone by merely placing it face down. The APIs now cover rotation vectors, linear acceleration, gravity, barometric pressure and magnetic heading.
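    The face-down hang-up reduces to a one-axis check once the new gravity sensor has isolated gravity from hand motion. A hedged sketch, with an assumed threshold and a hypothetical onFaceDown() hook:

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;

// Register for Sensor.TYPE_GRAVITY (new in Android 2.3): it isolates gravity
// from hand motion, unlike the raw accelerometer.
public class FaceDownListener implements SensorEventListener {
    // Illustrative threshold in m/s^2; -9.81 on the Z axis is perfectly face down.
    private static final float NEARLY_FLAT = -9.0f;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.values[2] < NEARLY_FLAT) {
            onFaceDown();
        }
    }

    // Hypothetical hook: e.g., silence the ringer or end the call.
    protected void onFaceDown() { }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}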
    By performing sensor fusion on these multiple MEMS sensor data streams, apps can track 3D location and orientation with enough accuracy and precision to recognize gestures as complex as a unique signature. For security purposes, such as unlocking your phone, an air-signature is nearly impossible to duplicate, even if you perform it in front of other people. Android 2.3 Gingerbread can recognize a wide variety of such gestures, limited only by the creativity of the programmer; Google cites tilt, spin, thrust and slice as examples in its documentation.
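    A plausible starting point for such recognition is the new rotation-vector sensor, which delivers the platform's own fusion of the accelerometer, gyroscope and eCompass streams. The sketch below converts it to azimuth, pitch and roll; the gesture-matching step mentioned in the comment is an assumption, not a platform API.

import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Register for Sensor.TYPE_ROTATION_VECTOR (new in Android 2.3), the
// platform's fused estimate of device orientation in 3D space.
public class OrientationTracker implements SensorEventListener {
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3]; // azimuth, pitch, roll (radians)

    @Override
    public void onSensorChanged(SensorEvent event) {
        // getRotationMatrixFromVector() is new in API level 9.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        onOrientation(orientation[0], orientation[1], orientation[2]);
    }

    // Hypothetical hook: successive samples would feed a gesture matcher, e.g.
    // comparison against a recorded air-signature template (not a platform API).
    protected void onOrientation(float azimuth, float pitch, float roll) { }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}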
    Mimicking gaming controllers like Nintendo's Wii and Sony's Move (which use light to locate users), the 2.3 release of Android also offers a new camera API. By accessing the attached cameras and microphone, gesture recognition can be enhanced even further--for example, turning off the ringer when you snap your fingers over a ringing phone.
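    On the camera side, the new API enumerates multiple cameras and opens one by index, both additions in API level 9. A minimal sketch for grabbing the front-facing camera (the helper class name is my own):

import android.hardware.Camera;

// Illustrative helper: the class name is an assumption, but the multi-camera
// calls (getNumberOfCameras, getCameraInfo, open(int)) are all new in API level 9.
public final class FrontCameraHelper {
    public static Camera openFrontCamera() {
        Camera.CameraInfo info = new Camera.CameraInfo();
        int count = Camera.getNumberOfCameras();
        for (int i = 0; i < count; i++) {
            Camera.getCameraInfo(i, info);
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT) {
                return Camera.open(i);
            }
        }
        return null; // this handset has no front-facing camera
    }
}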
     
