Hogwild for Machine Learning on Multicore

Stochastic Gradient Descent (SGD) is a popular optimization algorithm for solving data-driven machine learning problems such as classification, model selection, sequence labeling, and recommendation. In this webcast, the presenter gives theoretical and experimental evidence that SGD can be run in parallel without any locking (the "Hogwild" scheme) and still achieve linear speedups on multicore workstations across several benchmark optimization problems.
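A minimal sketch of the lock-free update pattern is below, assuming a toy least-squares objective; the data generator, helper names, learning rate, and step counts are all illustrative, not from the webcast. CPython's GIL also means plain Python threads will not reproduce real multicore speedups, so this only demonstrates the mechanics of unsynchronized shared-parameter updates.

```python
import threading
import numpy as np

def make_data(n=1000, d=20, seed=0):
    """Generate a toy least-squares problem (assumed setup, for illustration)."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + 0.01 * rng.normal(size=n)
    return X, y

def worker(w, X, y, steps, lr, seed):
    """Run SGD steps against the shared weight vector w without any locking."""
    rng = np.random.default_rng(seed)
    n = len(y)
    for _ in range(steps):
        i = rng.integers(n)
        # Gradient of 0.5 * (x_i . w - y_i)^2 for a single sampled example.
        g = (X[i] @ w - y[i]) * X[i]
        w -= lr * g  # in-place, unsynchronized update: the "Hogwild" part

X, y = make_data()
w = np.zeros(X.shape[1])  # shared parameter vector, no lock around it
threads = [threading.Thread(target=worker, args=(w, X, y, 5000, 0.01, s))
           for s in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("final loss:", 0.5 * np.mean((X @ w - y) ** 2))
```

The lock-free approach works best when individual updates are sparse, so that concurrent workers rarely touch the same coordinates; this dense toy example ignores that and only shows the update pattern.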

Provided by: University of Washington
Topic: Data Centers
Date Added: Jun 2012
Format: Webcast

