
Hogwild for Machine Learning on Multicore


Executive Summary

Stochastic Gradient Descent (SGD) is a popular optimization algorithm for solving data-driven machine learning problems such as classification, model selection, sequence labeling, and recommendation. In this webcast, the presenter provides both theoretical and experimental evidence that Hogwild, a lock-free scheme for running SGD in parallel, achieves linear speedups on multicore workstations on several benchmark optimization problems.
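
The pattern behind those speedups is easy to sketch: workers apply SGD updates to a shared parameter vector with no locking at all, accepting that updates occasionally overwrite one another. Below is a minimal Python sketch of that pattern on a synthetic least-squares problem. It is not the presenter's code; all names and hyperparameters (N_WORKERS, LR, STEPS_PER_WORKER, and so on) are illustrative assumptions.

```python
import numpy as np
from multiprocessing import Process, Array

# Illustrative hyperparameters (assumptions, not from the webcast).
N_WORKERS = 4
N_SAMPLES = 10_000
DIM = 20
LR = 0.01
STEPS_PER_WORKER = 20_000


def worker(shared_w, X, y, seed):
    """Run SGD against the shared weight vector with no locking.

    Every process reads and writes shared_w directly; updates may
    overwrite one another, which Hogwild tolerates, especially when
    each example touches only a few coordinates (sparse gradients).
    """
    rng = np.random.default_rng(seed)
    # A writable numpy view onto the shared buffer; no copy, no lock.
    w = np.frombuffer(shared_w, dtype=np.float64)
    for _ in range(STEPS_PER_WORKER):
        i = rng.integers(len(y))
        xi, yi = X[i], y[i]
        grad = (w @ xi - yi) * xi  # least-squares gradient, one example
        w -= LR * grad             # racy in-place update, by design


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(N_SAMPLES, DIM))
    true_w = rng.normal(size=DIM)
    y = X @ true_w
    # lock=False requests raw shared memory with no synchronization.
    shared_w = Array("d", DIM, lock=False)
    procs = [Process(target=worker, args=(shared_w, X, y, s))
             for s in range(N_WORKERS)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    w = np.frombuffer(shared_w, dtype=np.float64)
    print("distance to true weights:", np.linalg.norm(w - true_w))
```

The `lock=False` flag is the essential choice: taking a lock on every update would serialize the workers and erase the speedup, and the Hogwild analysis shows the resulting races are benign when individual updates touch few coordinates.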

  • Format: Webcast