Hogwild for Machine Learning on Multicore

Provided by: University of Washington
Topic: Data Centers
Format: Webcast
Stochastic Gradient Descent (SGD) is a popular optimization algorithm for solving data-driven machine learning problems such as classification, model selection, sequence labeling, and recommendation. In this webcast, the presenter provides both theoretical and experimental evidence that Hogwild, a lock-free parallel SGD scheme, achieves linear speedups on multicore workstations on several benchmark optimization problems.