Asynchronous Distributed Learning of Topic Models

Provided by: University of California, Davis
Topic: Hardware
Format: PDF
Distributed learning is a problem of fundamental interest in machine learning and cognitive science. In this paper, the authors present asynchronous distributed learning algorithms for two well-known unsupervised learning frameworks: Latent Dirichlet Allocation (LDA) and Hierarchical Dirichlet Processes (HDP). In the proposed approach, the data are distributed across P processors; each processor independently performs Gibbs sampling on its local data and exchanges its local information asynchronously with other processors. The authors demonstrate that the asynchronous algorithms learn global topic models that are statistically as accurate as those learned by the standard LDA and HDP samplers, with significant savings in computation time and memory.
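
To make the scheme concrete, below is a minimal sketch in Python of the core loop: each processor runs collapsed Gibbs sampling for LDA over its own data shard using a possibly stale view of the global word-topic counts, and occasionally merges counts with a randomly chosen peer. The class and function names, the toy corpus, and the simple two-peer merge rule are illustrative assumptions, not the paper's exact count-reconciliation update.

import numpy as np

K, V = 4, 50             # number of topics, vocabulary size (toy values)
ALPHA, BETA = 0.1, 0.01  # symmetric Dirichlet hyperparameters
rng = np.random.default_rng(0)

class Processor:
    """One of P processors holding a shard of the corpus (illustrative)."""
    def __init__(self, docs):
        self.docs = docs                                        # local docs: lists of word ids
        self.z = [rng.integers(K, size=len(d)) for d in docs]   # topic assignments
        self.nwt = np.zeros((V, K))                             # local word-topic counts
        self.ndt = np.zeros((len(docs), K))                     # local doc-topic counts
        for d, (doc, zs) in enumerate(zip(docs, self.z)):
            for w, k in zip(doc, zs):
                self.nwt[w, k] += 1
                self.ndt[d, k] += 1
        self.global_nwt = self.nwt.copy()   # this processor's (stale) view of global counts

    def gibbs_sweep(self):
        """One collapsed Gibbs sweep over the local shard."""
        for d, doc in enumerate(self.docs):
            for i, w in enumerate(doc):
                k = self.z[d][i]
                # remove the current assignment from all counts
                self.nwt[w, k] -= 1; self.ndt[d, k] -= 1; self.global_nwt[w, k] -= 1
                # sample a new topic from the collapsed conditional,
                # using the possibly stale view of the global counts
                p = ((self.global_nwt[w] + BETA)
                     / (self.global_nwt.sum(axis=0) + V * BETA)
                     * (self.ndt[d] + ALPHA))
                k = rng.choice(K, p=p / p.sum())
                self.z[d][i] = k
                self.nwt[w, k] += 1; self.ndt[d, k] += 1; self.global_nwt[w, k] += 1

def gossip_step(a, b):
    """Asynchronous pairwise exchange: each peer rebuilds its view of the
    global counts from both local count matrices (hypothetical merge rule,
    a simplification of the paper's reconciliation scheme)."""
    merged = a.nwt + b.nwt
    a.global_nwt = merged.copy()
    b.global_nwt = merged.copy()

# Toy corpus split across P processors; no global synchronization step.
P = 2
corpus = [list(rng.integers(V, size=20)) for _ in range(8)]
procs = [Processor(corpus[i::P]) for i in range(P)]

for it in range(50):
    for proc in procs:
        proc.gibbs_sweep()                     # local sampling, no barrier
    i, j = rng.choice(P, size=2, replace=False)
    gossip_step(procs[i], procs[j])            # occasional pairwise exchange

With only two peers this merge is exact; with more processors a merge rule would also need to avoid double-counting counts received in earlier exchanges, which is the bookkeeping problem the paper's algorithms address.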