Android P blocks camera access for applications running in the background
According to a recent commit on the Android Open Source Project (AOSP), Google engineers have implemented a new security feature that prevents idle applications or services running in the background, identified by their UID (User ID), from accessing the camera.
More precisely, after a certain period of inactivity (either user-defined or the system default), the security feature no longer allows idle applications or background services to access the camera. This measure is likely intended to prevent spyware or other malicious programs from spying on you through the camera.
“If the UID becomes inactive, we generate an error and close the cameras for this UID,” reads the commit. “If an application in an idle UID tries to use the camera, we immediately generate an error. Because applications must already handle these errors, it is possible to apply this policy to all applications to protect user privacy.”
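The policy described in the commit can be sketched in plain Java. This is a minimal, hypothetical model (the class and method names below are invented for illustration, not the actual AOSP camera service code): when a UID is marked idle, all of its open cameras are closed, and any further open attempt from that UID immediately fails.

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

/**
 * Hypothetical sketch of the idle-UID camera policy described in the
 * AOSP commit. Not real framework code: names are illustrative only.
 */
public class CameraPolicy {
    private final Set<Integer> idleUids = new HashSet<>();
    private final Map<Integer, Set<String>> openCameras = new HashMap<>();

    /** Attempt to open a camera; returns false (an "error") for idle UIDs. */
    public boolean openCamera(int uid, String cameraId) {
        if (idleUids.contains(uid)) {
            return false; // immediately generate an error for an idle UID
        }
        openCameras.computeIfAbsent(uid, k -> new HashSet<>()).add(cameraId);
        return true;
    }

    /** Called when the system marks a UID idle: close all its cameras. */
    public void onUidIdle(int uid) {
        idleUids.add(uid);
        openCameras.remove(uid); // "close the cameras for this UID"
    }

    /** Called when the UID becomes active again (e.g. app foregrounded). */
    public void onUidActive(int uid) {
        idleUids.remove(uid);
    }

    /** Whether the given UID still holds any open camera. */
    public boolean hasOpenCameras(int uid) {
        return openCameras.containsKey(uid);
    }

    public static void main(String[] args) {
        CameraPolicy policy = new CameraPolicy();
        System.out.println(policy.openCamera(10042, "0")); // active UID: allowed
        policy.onUidIdle(10042);                           // cameras closed
        System.out.println(policy.openCamera(10042, "0")); // idle UID: error
        policy.onUidActive(10042);
        System.out.println(policy.openCamera(10042, "0")); // allowed again
    }
}
```

Since the commit notes that apps must already handle camera errors (disconnects, for example), enforcing the idle check at open time, as modeled here, lets the system apply the policy to all apps without new API surface.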
This new security feature, which will ship with Android P to block camera access for inactive apps in the background, shows that Google wants to take stronger steps to protect Android users. Andrew Ahn, Product Manager on the Google Play team, said in late January that “Google Play is committed to providing a safe experience for billions of Android users to find and discover such applications. Over the years, this commitment has made Google Play a more reliable and secure place.”
According to him, last year Google more than halved the probability that a user installs a bad application, “protecting people and their devices, and making the task more difficult for those who seek to abuse the app ecosystem on Google Play for their own benefit.”
He continued: “In 2017 we removed more than 700,000 apps that violated the rules of Google Play, 70% more than the applications removed in 2016. Not only have we removed more bad apps, but we could also identify them sooner. In fact, 99% of applications with abusive content were identified and rejected before anyone could install them. This has been possible because of significant improvements in our ability to detect abuse, such as impersonation, inappropriate content, or malware, through new models and machine learning techniques.”
In addition, given that app installs from its store have reached record highs (more than 19 billion in the fourth quarter of 2017 alone), it is important for Google to redouble its efforts to better protect its users.