• Hogwild for Machine Learning on Multicore

    Webcasts // Jun 2012 // provided by University of Washington

    Stochastic gradient descent (SGD) is a popular optimization algorithm for solving data-driven machine learning problems such as classification, model selection, sequence labeling, and recommendation. In this webcast, the presenter gives both theoretical and experimental evidence that a lock-free parallel variant of SGD achieves linear speedups on multicore workstations across several benchmark optimization problems.

  • Adventures in Scaling the Multicore Memory Wall

    Webcasts // Mar 2012 // provided by University of Washington

    In this webcast, the presenter covers three approaches to multicore cache management that can help bridge the "memory wall." If both the application's thread mapping and the cache topology are static (i.e., they do not change at runtime), then compiler enhancements that support cache topology-aware code optimization can be used...

  • GraphLab: A Distributed Abstraction for Machine Learning

    Webcasts // Feb 2012 // provided by University of Washington

    Today, machine learning (ML) methods play a central role in industry and science. In this webcast, the presenter describes the GraphLab framework, which naturally expresses the asynchronous, dynamic graph computations that are key to state-of-the-art ML algorithms.
