Provided by: Institute of Electrical and Electronics Engineers
Topic: Big Data
Date Added: Mar 2009
In this paper, the authors empirically study the effect of sampling and threshold-moving in training cost-sensitive neural networks. Both over-sampling and under-sampling are considered. These techniques modify the distribution of the training data so that the costs of the examples are conveyed explicitly by how often the examples appear. Threshold-moving instead shifts the output threshold toward inexpensive classes, so that examples with higher costs become less likely to be misclassified. Moreover, hard-ensemble and soft-ensemble, i.e., combinations of the above techniques via hard or soft voting schemes, are also tested.
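To make the threshold-moving idea concrete, here is a minimal sketch, not taken from the paper: the network's class probabilities are rescaled by per-class misclassification costs before taking the argmax, so a costly (e.g., minority) class needs less raw probability to be predicted. The cost values and function name below are illustrative assumptions.

```python
import numpy as np

def threshold_move(probs, costs):
    """Illustrative threshold-moving (not the paper's exact procedure).

    probs: (n_samples, n_classes) predicted class probabilities.
    costs: (n_classes,) misclassification cost per class.
    Returns cost-adjusted predicted class indices.
    """
    adjusted = probs * costs  # weight each class by its cost
    adjusted = adjusted / adjusted.sum(axis=1, keepdims=True)  # renormalize
    return adjusted.argmax(axis=1)

probs = np.array([[0.7, 0.3],   # plain argmax would pick class 0
                  [0.6, 0.4]])
costs = np.array([1.0, 3.0])    # assume class 1 is 3x costlier to miss
print(threshold_move(probs, costs))  # cost-weighting flips both predictions to class 1
```

With equal costs the adjustment is a no-op and the usual argmax decision is recovered.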