Comparative Study of Network Boosting With AdaBoost and Bagging

The goal of machine learning is to program computers to use example data or past experience to solve a given problem, and in recent years ensemble learning methods have become a hot topic in the machine learning community. Network Boosting (NB) is an ensemble learning method that combines weak learners over a network and can learn the target hypothesis asymptotically. Experimental results show that NB improves classification accuracy significantly compared with Bagging and AdaBoost; it draws on the merits of both methods and exhibits higher generalization ability. To explore the influence of network topology on the algorithm's performance, random graphs, small-world networks, and scale-free networks are employed.
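The abstract does not give the exact Network Boosting procedure, so the following is only a minimal sketch of the comparison it describes: scikit-learn's standard Bagging and AdaBoost baselines, plus a toy network-boosting-style ensemble in which each node of a graph trains a weak learner and shares its sample weights with its neighbours. The synthetic dataset, the weight-update and voting rules, and the networkx topology parameters are all illustrative assumptions, not the paper's settings.

```python
import numpy as np
import networkx as nx
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic data stands in for the paper's (unspecified) benchmark datasets.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Baselines: Bagging and AdaBoost with scikit-learn's default tree-based weak learners.
for name, clf in [("Bagging", BaggingClassifier(n_estimators=50, random_state=0)),
                  ("AdaBoost", AdaBoostClassifier(n_estimators=50, random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(f"{name:8s} accuracy: {clf.score(X_te, y_te):.3f}")


def network_boost(graph, X, y, rounds=10, seed=0):
    """Toy network-boosting-style ensemble (an assumption, not the paper's exact
    procedure): every graph node keeps its own sample-weight vector, trains a
    decision stump each round, up-weights its misclassified samples, and then
    averages its weights with those of its neighbours."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    weights = {v: np.full(n, 1.0 / n) for v in graph.nodes}
    learners = []
    for _ in range(rounds):
        updated = {}
        for v in graph.nodes:
            stump = DecisionTreeClassifier(max_depth=1,
                                           random_state=int(rng.integers(1 << 31)))
            stump.fit(X, y, sample_weight=weights[v])
            learners.append(stump)
            missed = (stump.predict(X) != y).astype(float)
            w = weights[v] * np.exp(missed)          # AdaBoost-like up-weighting
            updated[v] = w / w.sum()
        # The "network" step: each node averages its weights with its neighbours'.
        weights = {v: np.mean([updated[u] for u in list(graph.neighbors(v)) + [v]], axis=0)
                   for v in graph.nodes}
    return learners


def majority_vote(learners, X):
    preds = np.array([clf.predict(X) for clf in learners])
    return (preds.mean(axis=0) >= 0.5).astype(int)


# The three topologies named in the abstract; sizes and parameters are illustrative.
topologies = {
    "random graph": nx.erdos_renyi_graph(20, 0.2, seed=0),
    "small world":  nx.watts_strogatz_graph(20, 4, 0.1, seed=0),
    "scale free":   nx.barabasi_albert_graph(20, 2, seed=0),
}
for name, g in topologies.items():
    ensemble = network_boost(g, X_tr, y_tr)
    acc = (majority_vote(ensemble, X_te) == y_te).mean()
    print(f"NB-style ensemble on {name:12s} accuracy: {acc:.3f}")
```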

Provided by: International Journal of Engineering Science and Technology (IJEST) | Date Added: Aug 2010 | Format: PDF

