
Parallel Boosted Regression Trees for Web Search Ranking


Executive Summary

Gradient Boosted Regression Trees (GBRT) are the current state-of-the-art learning paradigm for machine-learned web-search ranking, a domain notorious for very large data sets. In this paper, the authors propose a novel method for parallelizing the training of GBRT. Their technique parallelizes the construction of the individual regression trees and operates in a master-worker paradigm as follows. The data are partitioned among the workers. At each iteration, each worker summarizes its data partition using histograms. The master uses these histograms to build one layer of a regression tree, then sends that layer back to the workers, which use it to build the histograms for the next layer.
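
The layer-by-layer histogram exchange described above can be sketched roughly as follows. This is a minimal, single-process simulation of a master and two workers, not the authors' implementation: the names (`worker_histograms`, `master_find_splits`, `grow_layer`, `NUM_BINS`), the squared-error split criterion, and the fixed quantile binning are all illustrative assumptions.

```python
import numpy as np

NUM_BINS = 16  # assumed number of histogram bins per feature


def bin_index(x, edges):
    """Map feature values to bin indices 0..NUM_BINS-1 using fixed edges."""
    return np.digitize(x, edges)


def worker_histograms(X, residual, leaf_id, num_leaves, bin_edges):
    """Worker step: summarize the local data partition as per-leaf,
    per-feature histograms of (example count, residual sum)."""
    n_features = X.shape[1]
    counts = np.zeros((num_leaves, n_features, NUM_BINS))
    sums = np.zeros((num_leaves, n_features, NUM_BINS))
    for j in range(n_features):
        bins = bin_index(X[:, j], bin_edges[j])
        for leaf in range(num_leaves):
            mask = leaf_id == leaf
            np.add.at(counts[leaf, j], bins[mask], 1.0)
            np.add.at(sums[leaf, j], bins[mask], residual[mask])
    return counts, sums


def master_find_splits(counts, sums):
    """Master step: from the merged histograms, pick for each leaf the
    (feature, bin) split that maximizes squared-error reduction."""
    num_leaves, n_features, _ = counts.shape
    splits = []
    for leaf in range(num_leaves):
        total_n = counts[leaf, 0].sum()
        total_s = sums[leaf, 0].sum()
        best_gain, best = 1e-12, None
        if total_n >= 2:
            parent = total_s ** 2 / total_n
            for j in range(n_features):
                n_left = np.cumsum(counts[leaf, j])[:-1]
                s_left = np.cumsum(sums[leaf, j])[:-1]
                n_right, s_right = total_n - n_left, total_s - s_left
                ok = (n_left > 0) & (n_right > 0)
                gain = np.where(ok,
                                s_left ** 2 / np.maximum(n_left, 1)
                                + s_right ** 2 / np.maximum(n_right, 1)
                                - parent, -np.inf)
                b = int(np.argmax(gain))
                if gain[b] > best_gain:
                    best_gain, best = gain[b], (j, b)
        splits.append(best)
    return splits


def grow_layer(partitions, leaf_ids, num_leaves, bin_edges):
    """One protocol round: workers send histograms, the master merges them
    and chooses splits, and each worker reassigns its examples locally."""
    merged = None
    for (X, r), leaf_id in zip(partitions, leaf_ids):
        c, s = worker_histograms(X, r, leaf_id, num_leaves, bin_edges)
        merged = (c, s) if merged is None else (merged[0] + c, merged[1] + s)
    splits = master_find_splits(*merged)
    new_leaf_ids = []
    for (X, _), leaf_id in zip(partitions, leaf_ids):
        new_id = 2 * leaf_id  # unsplit leaves keep their left-child id
        for leaf, split in enumerate(splits):
            if split is None:
                continue
            j, b = split
            mask = leaf_id == leaf
            go_right = bin_index(X[mask, j], bin_edges[j]) > b
            new_id[mask] = 2 * leaf + go_right
        new_leaf_ids.append(new_id)
    return new_leaf_ids


if __name__ == "__main__":
    # Toy usage: two simulated workers each hold half of a synthetic data set.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=1000)
    residual = y - y.mean()  # residuals of a constant initial model
    edges = [np.quantile(X[:, j], np.linspace(0, 1, NUM_BINS + 1)[1:-1])
             for j in range(X.shape[1])]
    partitions = [(X[:500], residual[:500]), (X[500:], residual[500:])]
    leaf_ids = [np.zeros(500, dtype=int), np.zeros(500, dtype=int)]
    num_leaves = 1
    for _ in range(3):  # grow one tree, layer by layer
        leaf_ids = grow_layer(partitions, leaf_ids, num_leaves, edges)
        num_leaves *= 2
    print(np.bincount(np.concatenate(leaf_ids), minlength=num_leaves))
```

The appeal of this style of scheme is that per-layer communication scales with the histogram summaries (leaves x features x bins) rather than with the number of training examples, which is what makes it attractive for the very large ranking data sets the paper targets.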

  • Format: PDF
  • Size: 438.55 KB