Since Android 9.0 Pie, Google has taken important steps in how you navigate the Android OS. Starting with Android 12, Google is applying a limited form of machine learning to adapt gesture navigation to the way someone uses their phone.
Gesture Navigation in Android 12
Since the introduction of gesture navigation in Android, replacing the buttons of older releases, complaints have persisted about how the swipe-based navigation method works. In many applications, common in-app gestures are interpreted as system navigation actions, with the result that you are suddenly kicked out of the application or unintentionally returned to the previous page. Android 10 introduced a solution: developers can manually set exclusion areas.
Within these exclusion areas, system gesture navigation is disabled. Android was also given sensitivity settings: from the settings you can determine how quickly the system responds to the navigation gesture. For Android 12, Google is working on solutions that tailor gesture navigation to the way users actually use it, as developer Quinny899 discovered on XDA Developers. The developer found two of his apps in a list of 43,000 apps that the new OS monitors for navigation actions.
Current Customization Options for Android 12 Navigation, Photo: Android Police.
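As a rough illustration of how such an exclusion area works: an app declares rectangles in which edge swipes should go to the app rather than trigger the system gesture. The real mechanism is Android's `View.setSystemGestureExclusionRects` API (available since Android 10); the `Rect` class, coordinates, and function names below are simplified stand-ins to show the idea, not the platform implementation.

```kotlin
// Toy model of gesture exclusion areas. All names and values here are
// hypothetical; the real API is View.setSystemGestureExclusionRects.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left until right && y in top until bottom
}

// Exclusion rects an app might declare, e.g. around a drawer handle
// on the left edge of the screen.
val exclusionRects = listOf(Rect(0, 400, 48, 600))

// Decide whether a swipe starting at (x, y) should be delivered to the
// app instead of triggering the system back gesture.
fun shouldDeliverToApp(x: Int, y: Int): Boolean =
    exclusionRects.any { it.contains(x, y) }

fun main() {
    println(shouldDeliverToApp(10, 500)) // inside the excluded area: app gets it
    println(shouldDeliverToApp(10, 100)) // outside: system back gesture fires
}
```

Note that since Android 10 the system caps how much of the edge an app may exclude, precisely so apps cannot disable the back gesture entirely.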
Google uses a "TensorFlow Lite" model for this purpose, which allows the machine learning to run on the phone itself. According to Quinny899, EdgeBackGestureHandler, the class that handles gesture navigation in Android 12, contains a specific reference to a file in which back-gesture data is saved. With a machine learning model it is possible to recognize specific behavior and adjust gesture navigation based on the model's results.
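It is not known how Google's model actually maps gesture history to behavior, but the general idea, adapting navigation to observed use, can be sketched in plain Kotlin. Everything below (the class, the 50-event window, the width thresholds) is a hypothetical stand-in for the TensorFlow Lite model, purely to illustrate the concept.

```kotlin
// Hypothetical sketch: adapt the back-gesture trigger area to the user.
// The real Android 12 code feeds gesture data to a TensorFlow Lite model;
// this simple heuristic just stands in for that model.
class BackGestureAdapter(private var edgeWidthPx: Int = 30) {
    private val outcomes = ArrayDeque<Boolean>() // true = swipe was intended

    fun record(intended: Boolean) {
        outcomes.addLast(intended)
        if (outcomes.size > 50) outcomes.removeFirst() // keep a rolling window
        val intendedRate = outcomes.count { it }.toDouble() / outcomes.size
        // Widen the trigger area if recent back swipes were mostly intended,
        // narrow it if many were accidental.
        edgeWidthPx = when {
            intendedRate > 0.8 -> minOf(edgeWidthPx + 2, 60)
            intendedRate < 0.4 -> maxOf(edgeWidthPx - 2, 10)
            else -> edgeWidthPx
        }
    }

    fun edgeWidth() = edgeWidthPx
}

fun main() {
    val adapter = BackGestureAdapter()
    repeat(10) { adapter.record(intended = true) } // user swipes deliberately
    println(adapter.edgeWidth()) // prints 50: the trigger area has grown
}
```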
Quick gesture navigation actions
Google has made another change to gesture navigation in Android 12, as Android Police describes. In Android 12, the gestures to return to the previous screen or the home screen work from full screen in one motion; in Android 11, you first have to tap the screen once and then perform the navigation gesture. The change does appear to require an app-side tweak: it does not work in the Twitter app, for example.
Given this latest change to the gestures, you can expect that Google will actually bring it to the stable version of Android 12. Whether Google will also ship the machine learning model in the stable release, which will likely launch toward the end of the third quarter, is a different story. For now, a flag has to be changed in Android 12 to enable the machine learning, so you will not notice the changes automatically.
Are you hoping that Google continues to develop the machine learning features of Android 12, or are you satisfied with the way gesture navigation works on your phone now? Let us know in the comments, and don't forget to mention which Android version you are using.