On Monday, Google announced that customers can now use Cloud TPUs on the Cloud Machine Learning Engine (ML Engine) in beta to speed the training of machine learning models. Cloud TPU quota is also now available to all Google Cloud Platform (GCP) customers, the company said in a blog post.
Google's Cloud ML Engine allows businesses to train and deploy machine learning models on datasets of various types and sizes using TensorFlow. "As a managed service, ML Engine handles the infrastructure, compute resources, and job scheduling on your behalf, allowing you to focus on data and modeling," the post said. The move could help organizations better take advantage of machine learning capabilities.
Google first launched Cloud ML Engine in March 2017 as a managed TensorFlow service, allowing customers to scale machine learning workloads with distributed training and GPU acceleration, the post noted. Since its release, Google has added features including support for NVIDIA V100 GPUs and an online prediction deployment capability.
SEE: IT leader's guide to the future of artificial intelligence (Tech Pro Research)
Google Cloud TPUs were recently launched in beta, and are built for machine learning. With support for Cloud TPUs, ML Engine customers can train a number of high-performance, open-source reference models with "differentiated performance per dollar," the post said. Customers can also accelerate their own models written with TensorFlow APIs, it noted.
ML Engine automatically handles the provisioning and management of Cloud TPU nodes, so customers can use them the same way they use CPUs and GPUs, the post said. Users can also tap ML Engine's hyperparameter tuning feature in Cloud TPU jobs to improve model performance at scale. Models can be deployed with ML Engine to serve prediction requests or run batch prediction jobs, according to the post.
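For illustration, requesting a Cloud TPU for an ML Engine training job came down to selecting a TPU scale tier in the job configuration. The sketch below shows a minimal config file under beta-era assumptions; the job name and trainer package are placeholders, and field values may have changed since the beta.

```yaml
# config.yaml — sketch of an ML Engine job configuration that requests a
# Cloud TPU via the BASIC_TPU scale tier (beta-era settings; placeholders,
# not a definitive setup). Passed to the CLI roughly like:
#   gcloud ml-engine jobs submit training my_tpu_job --config config.yaml \
#     --module-name trainer.task --package-path trainer/
trainingInput:
  scaleTier: BASIC_TPU      # one master VM plus a Cloud TPU for training
  runtimeVersion: "1.6"     # a TensorFlow runtime version with TPU support
  region: us-central1       # a region where Cloud TPUs were available in beta
```

Because ML Engine provisions the TPU node itself, switching a job between CPU, GPU, and TPU hardware is largely a matter of changing the scale tier rather than rewriting the training code.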
To learn more about using Cloud TPUs with ML Engine for training jobs, you can check out this guide.
The big takeaways for tech leaders:
- Google announced that customers can now use Cloud TPUs on the Cloud Machine Learning Engine in beta to speed the training of machine learning models.
- With support for Cloud TPUs, ML Engine customers can train a number of high-performance, open-source reference models with differentiated performance per dollar.
Also see:
- Special report: How to implement AI and machine learning (free PDF) (TechRepublic)
- Google makes it easier to incorporate machine learning into mobile apps (ZDNet)
- Machine learning: A cheat sheet (TechRepublic)
- Google preps TPU 3.0 for AI, machine learning, model training (ZDNet)
- Cloud AutoML highlights Google's ability to build open ecosystems and monetize them (TechRepublic)
Alison DeNisco Rayome has nothing to disclose. She does not hold investments in the technology companies she covers.
Alison DeNisco Rayome is a Staff Writer for TechRepublic. She covers CXO, cybersecurity, and the convergence of tech and the workplace.