Training deep neural networks on imbalanced data sets
- Publication Type: Conference Proceeding
- Citation: Proceedings of the International Joint Conference on Neural Networks, 2016, pp. 4368-4374
- Issue Date: 2016-10-31
Closed Access
Filename | Description | Size
---|---|---
07727770.pdf | Published version | 187.89 kB
This item is closed access and not available.
© 2016 IEEE. Deep learning has become increasingly popular in both academia and industry in recent years. Various domains, including pattern recognition, computer vision, and natural language processing, have witnessed the great power of deep networks. However, current studies on deep learning mainly focus on data sets with balanced class labels, and its performance on imbalanced data is not well examined. Imbalanced data sets exist widely in the real world and pose great challenges for classification tasks. In this paper, we focus on the problem of classification using deep networks on imbalanced data sets. Specifically, a novel loss function called mean false error, together with its improved version mean squared false error, is proposed for the training of deep networks on imbalanced data sets. The proposed method can effectively capture classification errors from the majority class and the minority class equally. Experiments and comparisons demonstrate the superiority of the proposed approach over conventional methods in classifying imbalanced data sets with deep neural networks.
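The core idea in the abstract is to average classification error per class before combining, so the minority class is not drowned out by the majority. A minimal NumPy sketch of that idea follows; the function names `mfe_loss` and `msfe_loss`, the use of plain squared error per sample, and the 0/1 label convention (1 = minority) are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def mfe_loss(y_true, y_pred):
    """Sketch of a mean-false-error-style loss (assumed form): average the
    squared error separately over the majority (negative) and minority
    (positive) classes, then sum, so each class contributes equally."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    sq_err = (y_true - y_pred) ** 2
    fne = sq_err[y_true == 1].mean()  # mean error on the minority class
    fpe = sq_err[y_true == 0].mean()  # mean error on the majority class
    return fpe + fne

def msfe_loss(y_true, y_pred):
    """Sketch of a mean-squared-false-error-style variant (assumed form):
    square each per-class mean error before summing, which penalizes a
    large error on either class more heavily."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    sq_err = (y_true - y_pred) ** 2
    fne = sq_err[y_true == 1].mean()
    fpe = sq_err[y_true == 0].mean()
    return fpe ** 2 + fne ** 2

# With a 1:3 imbalanced batch, missing the single minority example
# dominates the loss, unlike a plain mean squared error:
y_true = [1, 0, 0, 0]
y_pred = [0, 0, 0, 0]          # predicts majority class everywhere
plain_mse = np.mean((np.array(y_true) - np.array(y_pred)) ** 2)  # 0.25
print(plain_mse, mfe_loss(y_true, y_pred), msfe_loss(y_true, y_pred))
```

Under this sketch, the all-majority prediction yields a per-class-averaged loss of 1.0 (the full minority error) instead of the diluted 0.25 from ordinary MSE, which is the behavior the abstract attributes to the proposed losses.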