Comparative Study of Network Boosting With AdaBoost and Bagging

Executive Summary

The goal of machine learning is to program computers to use example data or past experience to solve a given problem, and in recent years ensemble learning methods have become an active topic in the machine learning community. Network Boosting (NB) is an ensemble learning method that combines weak learners over a network and can learn the target hypothesis asymptotically. Experimental results show that NB significantly improves classification accuracy compared with Bagging and AdaBoost; it draws on the merits of both methods and exhibits higher generalization ability. To explore the influence of network topology on the algorithm's performance, random graphs, small-world networks, and scale-free networks are employed.
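The summary compares NB against two standard baselines. Network Boosting itself is not available in common libraries, but the kind of Bagging-versus-AdaBoost baseline comparison described above can be sketched with scikit-learn on a synthetic dataset (the dataset and parameters here are illustrative assumptions, not those of the study):

```python
# Sketch: Bagging vs. AdaBoost baselines on a toy classification task.
# This reproduces only the baseline comparison mentioned in the summary;
# it does not implement Network Boosting itself.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic data (not the datasets used in the paper).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for name, clf in [
    ("Bagging", BaggingClassifier(n_estimators=50, random_state=0)),
    ("AdaBoost", AdaBoostClassifier(n_estimators=50, random_state=0)),
]:
    clf.fit(X_tr, y_tr)
    print(name, "test accuracy:", round(clf.score(X_te, y_te), 3))
```

Both ensembles aggregate many weak learners; Bagging trains them independently on bootstrap samples, while AdaBoost reweights examples sequentially to focus on previous mistakes.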
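The three topologies named above (random graph, small-world network, scale-free network) can be generated with standard graph models; a minimal sketch using networkx (a library choice assumed here, with illustrative sizes and parameters not taken from the study):

```python
# Sketch of the three network topologies the study places learners on.
# The node count and edge parameters below are illustrative assumptions.
import networkx as nx

n = 30  # hypothetical number of nodes (weak learners)
topologies = {
    "random graph": nx.erdos_renyi_graph(n, p=0.2, seed=0),        # Erdos-Renyi
    "small-world": nx.watts_strogatz_graph(n, k=4, p=0.1, seed=0),  # Watts-Strogatz
    "scale-free": nx.barabasi_albert_graph(n, m=2, seed=0),         # Barabasi-Albert
}
for name, g in topologies.items():
    print(name, "nodes:", g.number_of_nodes(), "edges:", g.number_of_edges())
```

In such a setup, the topology determines which learners exchange information, which is why the study varies it to measure its effect on accuracy.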
