Google is getting serious about app privacy. This week, the tech giant emailed developers across the globe, warning that apps that violate the company’s User Data policy on privacy may be removed from the Play Store, as reported by The Next Web.

“Google Play requires developers to provide a valid privacy policy when the app requests or handles sensitive user or device information,” the email message stated. “Your app requests sensitive permissions (e.g. camera, microphone, accounts, contacts, or phone) or user data, but does not include a valid privacy policy.”
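For developers wondering which requests fall into that bucket: on Android, permissions such as camera and microphone are declared in the app manifest and requested at runtime. The Kotlin sketch below is a minimal, hypothetical illustration of that kind of request, using the standard AndroidX permission APIs; the activity name and request code are illustrative, not taken from Google’s email.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Hypothetical activity used only to illustrate a "sensitive permission" request.
// CAMERA and RECORD_AUDIO are among the permissions Google's email calls out as
// triggering the privacy-policy requirement. They must also be declared in
// AndroidManifest.xml.
class CaptureActivity : AppCompatActivity() {

    private val sensitivePermissions = arrayOf(
        Manifest.permission.CAMERA,
        Manifest.permission.RECORD_AUDIO
    )

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)

        // Check which sensitive permissions have not yet been granted.
        val missing = sensitivePermissions.filter {
            ContextCompat.checkSelfPermission(this, it) != PackageManager.PERMISSION_GRANTED
        }

        // Requesting any of these at runtime is the kind of access that now
        // obligates the app to link a privacy policy in its Play Store listing.
        if (missing.isNotEmpty()) {
            ActivityCompat.requestPermissions(this, missing.toTypedArray(), REQUEST_CODE)
        }
    }

    companion object {
        private const val REQUEST_CODE = 42
    }
}
```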

The message goes on to state that developers must include a link to a valid privacy policy both on their app’s Store Listing page and within the app itself. The privacy policy must “comprehensively disclose how your app collects, uses and shares user data, including the types of parties with whom it’s shared,” according to Google’s User Data policy.

Alternatively, developers can choose to opt out of this requirement by removing any requests for sensitive permissions or user data from the app, the message stated.

SEE: Dozens of iOS apps vulnerable to data theft, despite ATS mandate

Developers must meet these policy requirements by March 15, 2017. If they do not meet that deadline, they risk having Google “limit the visibility of your app,” or even remove it from the Play Store, according to the email.

“I think it is a great thing that Google is putting more focus on users’ privacy,” said Engin Kirda, professor of computer science at Northeastern University. It is especially important in light of past cases in which apps available in the Play Store collected large volumes of sensitive data from users without their knowledge, including the URLs they visited, he added.

“By enforcing Google’s own user data policies, and making app developers provide privacy policies, Google is trying to improve the security and safety of the app store,” Kirda said. “It is a step in the right direction.”

Besides requiring a privacy policy, Google’s User Data policy mandates that apps handling personal or sensitive user information handle that data securely, “including transmitting it using modern cryptography (for example, over HTTPS).”
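In practice, that means sending user data over TLS rather than plain HTTP. The Kotlin sketch below shows one minimal way to do that with HttpsURLConnection; the endpoint URL is hypothetical, and a production app would more likely use a networking library such as OkHttp and layer certificate pinning on top of TLS.

```kotlin
import java.net.URL
import javax.net.ssl.HttpsURLConnection

// Minimal sketch of transmitting user data over HTTPS rather than plain HTTP,
// in line with the policy's "modern cryptography" wording. The endpoint is
// hypothetical and used only for illustration.
fun sendUserData(payload: ByteArray) {
    val connection = URL("https://api.example.com/v1/profile")
        .openConnection() as HttpsURLConnection
    try {
        connection.requestMethod = "POST"
        connection.doOutput = true
        connection.setRequestProperty("Content-Type", "application/json")

        // The TLS handshake performed by HttpsURLConnection encrypts the
        // payload in transit; an http:// URL here would send it in cleartext.
        connection.outputStream.use { it.write(payload) }

        check(connection.responseCode in 200..299) {
            "Upload failed with HTTP ${connection.responseCode}"
        }
    } finally {
        connection.disconnect()
    }
}
```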

Google’s push to protect app users from cybercrime follows Apple’s plan to require all iOS apps to use HTTPS connections by a yet-to-be-determined deadline. Once Apple sets that deadline, app developers must adopt the App Transport Security (ATS) feature, which forces connections to use HTTPS instead of HTTP in order to improve privacy.

However, sometimes even these protections are not sufficient. A verify.ly report released this week found that 76 popular iOS apps are vulnerable to data theft, regardless of whether developers use ATS.

If you are an app developer, you should always make sure you have a privacy policy in place as a best practice, not just because it’s required by Google or Apple. And, if you find your app disappears from the Play Store next month, you’ll know why.

The Next Web story notes that the coming purge of apps that lack privacy policies will likely help rid the Play Store of “zombie apps” that contain security vulnerabilities, making it easier for users to find the safe apps they need.

The 3 big takeaways for TechRepublic readers

1. This week, Google emailed a number of developers to notify them that their apps request sensitive user information but lack a valid privacy policy, and to warn that, unless they add one, those apps may be removed from the Play Store.

2. If an app requests sensitive permissions, such as camera, microphone, accounts, contacts, or phone, or handles user data, its privacy policy must disclose how the app collects, uses and shares user data, including the types of parties with whom it’s shared, according to Google’s User Data policy.

3. Google’s move is meant to increase user security; however, app developers should be vigilant about security protections whether or not they are required by a provider.