Google recently released the first Android 12 developer preview, giving developers an early look at the new system. Android 12 adds support for AVIF images, introduces compatible media transcoding for AVC video, and refines the look of the notification interface.
According to Tom's Guide, Android 10 was the first version to support full gesture navigation, but the scheme still has rough edges in practice. Because most current phones have full-screen displays with no physical or on-screen buttons, the back action became an inward swipe from the left edge of the screen. That swipe, however, can accidentally trigger in-app controls and functions that also respond to edge gestures.
Google has previously acknowledged this problem. According to developers on the XDA forums, the Android 12 preview appears to use machine learning to predict the user's intent and avoid these false touches. A developer named Quinny899 found a file in a TensorFlow Lite module listing the names of 43,000 applications.
Developers suspect this is the list of applications Google used to train an on-device learning model. The system records the start and end points of the user's swipe, looks for patterns, and decides whether the user intended to go back or to interact with something inside the app.
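To make the idea concrete, here is a minimal sketch of how a swipe's start and end points could separate a back gesture from an in-app interaction. This is purely illustrative: the function name, thresholds, and rules are assumptions for the example, not Google's actual model, which reportedly learns these patterns per app rather than using fixed rules.

```python
# Hypothetical sketch: classify an edge swipe as "back" vs. "in-app"
# from its start and end points, the kind of signal the article says
# the system considers. All thresholds are illustrative assumptions.

def classify_swipe(start, end, edge_px=40, min_travel=120):
    """Return 'back' if the swipe starts at the left edge and travels
    far enough horizontally; otherwise treat it as an in-app gesture."""
    (x0, y0), (x1, y1) = start, end
    starts_at_edge = x0 <= edge_px                    # began near the left edge
    horizontal_travel = x1 - x0                        # rightward distance covered
    mostly_horizontal = abs(horizontal_travel) > abs(y1 - y0)
    if starts_at_edge and horizontal_travel >= min_travel and mostly_horizontal:
        return "back"
    return "in-app"

# A long rightward swipe from the edge reads as "back"; a short,
# mostly vertical drag from the same edge reads as an in-app gesture.
print(classify_swipe((10, 500), (300, 520)))  # → back
print(classify_swipe((10, 500), (60, 900)))   # → in-app
```

A learned model would replace these hard-coded thresholds with per-app patterns inferred from real usage, which is presumably why a training list of app names exists at all.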
Strings in the Android 12 system UI confirm that this "smart gesture prediction" feature is included. Once it is turned on, the system judges each gesture against multiple signals.
The feature is currently disabled by default in Android 12, reportedly because the system is still in an early beta. Once Android 12 is stable enough for public testing, smart gesture prediction is expected to be enabled by default.