Google's new approach to machine learning relies on data stored on mobile devices to train models, without the need to centrally store the data.
On Thursday, Google introduced a new approach to machine learning that uses the data stored on smartphones to build predictive models. Officially dubbed Federated Learning, the method allows the data to remain on the device, sending only encrypted updates to the model as needed.
Google scientists introduced the Federated Learning concept in a Google Research blog post. By bringing machine learning training locally to the device, all users can collectively contribute to improving a model without giving up their data, the post said.
According to the post, a user starts contributing by first downloading the current machine learning model that they want to contribute to. The model will learn from the data on the user's device, summarize the changes as a "small focused update," and then encrypt that update and send it to the cloud.
Once the update makes it to the cloud to join the parent model, "it is immediately averaged with other user updates to improve the shared model," the post said. "All the training data remains on your device, and no individual updates are stored in the cloud."
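The averaging step described above can be sketched in a few lines. This is an illustrative example, not Google's implementation; the function and parameter names (`average_updates`, `client_weights`) are assumptions for the sketch.

```python
import numpy as np

def average_updates(client_updates, client_weights):
    """Weighted-average per-parameter deltas from several clients.

    client_updates: list of dicts mapping parameter name -> delta array
    client_weights: list of weights, e.g. each client's number of
        local training examples
    """
    total = sum(client_weights)
    averaged = {}
    for name in client_updates[0]:
        # Each client's delta contributes in proportion to its weight;
        # no individual update needs to be stored after this step.
        averaged[name] = sum(
            w * upd[name] for upd, w in zip(client_updates, client_weights)
        ) / total
    return averaged
```

Once the averaged delta is applied to the shared model, the individual client updates can be discarded, which is the property the post highlights: only the aggregate survives.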
Because the updates are encrypted, never stored individually in the cloud, and used only for averaging with other updates, Google's post said the Federated Learning model will help ensure privacy in machine learning. Also, due to a special protocol, "no individual phone's update can be inspected before averaging," the post said. The approach could also help cut power use and latency.
Once the shared model makes it to a user's device, it is immediately usable and has the potential to personalize certain services, the post said. Federated Learning is being tested in Gboard on Android, the Google keyboard.
"When Gboard shows a suggested query, your phone locally stores information about the current context and whether you clicked the suggestion," the post said. "Federated Learning processes that history on-device to suggest improvements to the next iteration of Gboard's query suggestion model."
Because the data on the smartphones used in Federated Learning is unevenly distributed, and users typically deal with high latency and low throughput, Google relies on its Federated Averaging algorithm. The algorithm uses on-board mobile processors to compute higher-quality updates to the model, and compresses those updates so they are easier to upload.
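The key idea behind Federated Averaging is that each client runs several local training steps before communicating, so fewer (and smaller) round trips are needed over a slow mobile link. Below is a minimal sketch of one round on a toy linear model; the helper names and hyperparameters are illustrative, not Google's API.

```python
import numpy as np

def local_train(weights, X, y, lr=0.1, steps=5):
    """Run a few local SGD steps on a linear least-squares model;
    return only the weight delta (the 'small focused update')."""
    w = weights.copy()
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w - weights

def federated_round(weights, client_data):
    """One round: every client trains locally on its own data,
    then the server averages the deltas weighted by dataset size."""
    deltas = [local_train(weights, X, y) for X, y in client_data]
    sizes = [len(y) for _, y in client_data]
    avg_delta = sum(d * n for d, n in zip(deltas, sizes)) / sum(sizes)
    return weights + avg_delta
```

Doing multiple local steps per round is what distinguishes Federated Averaging from naive distributed SGD: it trades extra on-device computation for far less communication, a sensible trade on phones.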
A scaled-down version of TensorFlow runs on the device to help deploy the system and to compute and collect the updates. Certain safeguards ensure the updates are created only when the phone isn't being used, and in a way that won't severely impact performance, the post said.
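The kind of eligibility check implied by those safeguards might look like the following. This is a hypothetical sketch: the `DeviceState` fields and the specific conditions (idle, charging, on Wi-Fi) are assumptions about what "won't severely impact performance" means, not a documented Google policy.

```python
from dataclasses import dataclass

@dataclass
class DeviceState:
    # Hypothetical device signals a training scheduler might consult.
    idle: bool      # screen off, no foreground use
    charging: bool  # plugged in, so training doesn't drain the battery
    on_wifi: bool   # unmetered network for uploading the update

def eligible_for_training(state: DeviceState) -> bool:
    """Permit a training pass only when it cannot noticeably
    affect the user's experience or data plan."""
    return state.idle and state.charging and state.on_wifi
```

Gating training on device conditions like these is a common pattern for background work on Android, which is why it is a plausible reading of the post's "safeguards."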
As noted by the post, Google will continue to work on traditional and cloud-based machine learning models as well.
The 3 big takeaways for TechRepublic readers
- Google's Federated Learning is a new method for training machine learning models, using data stored on users' smartphones to update the system.
- Updates from the phones are encrypted and used only to average into the shared model, which Google said could help ensure user privacy.
- Google is currently testing Federated Learning in Gboard on Android, which provides query suggestions as the user types.
- Why AI and machine learning are so hard, Facebook and Google weigh in (TechRepublic)
- iPhone, Android hit by Broadcom Wi-Fi chip bugs: Now Apple, Google plug flaws (ZDNet)
- How Google's AI breakthroughs are putting us on a path to narrow AI (TechRepublic)
- PAX: Android patent protection consortium formed (ZDNet)
- Google DeepMind: The smart person's guide (TechRepublic)