Parameter Distribution Balanced CNNs.
- Publisher:
- Institute of Electrical and Electronics Engineers (IEEE)
- Publication Type:
- Journal Article
- Citation:
- IEEE Transactions on Neural Networks and Learning Systems, 2020, 31, (11), pp. 4600-4609
- Issue Date:
- 2020-11
Closed Access
| Filename | Description | Size |
|---|---|---|
| 08960268.pdf | Published Version | 3.55 MB |
This item is closed access and not available.
The convolutional neural network (CNN) is the primary technique that has greatly promoted the development of computer vision technologies. However, there is little research on how to allocate parameters across different convolution layers when designing CNNs. We focus mainly on revealing the relationship between the CNN parameter distribution, i.e., the allocation of parameters across convolution layers, and the discriminative performance of the CNN. Unlike previous works, we do not append more elements to the network, such as additional convolution layers or denser short connections. Instead, we enhance the discriminative performance of a CNN by varying its parameter distribution under a strict size constraint. We propose an energy function to represent the CNN parameter distribution, which establishes a connection between the allocation of parameters and the discriminative performance of the CNN. Extensive experiments with shallow CNNs on three public image classification data sets demonstrate that a parameter distribution with a higher energy value leads the model to better performance. Motivated by this observation, the problem of finding the optimal parameter distribution can be transformed into the optimization problem of maximizing the energy value. We present a simple yet effective guideline that uses a balanced parameter distribution to design CNNs. Extensive experiments on ImageNet with three popular backbones, i.e., AlexNet, ResNet34, and ResNet101, demonstrate that the proposed guideline yields consistent improvements over different baselines under a strict size constraint.
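The abstract does not give the paper's actual energy function, but the core idea of comparing how a fixed parameter budget is spread across convolution layers can be illustrated with a small sketch. Below is a hypothetical, entropy-style balance score over per-layer parameter fractions (the functions `conv_param_count` and `balance_energy` and the toy layer configurations are illustrative assumptions, not the paper's definitions): a skewed allocation and a more balanced one with a roughly equal total budget are compared, and the more balanced one scores higher.

```python
import math

def conv_param_count(c_in, c_out, k):
    """Parameters in a k x k convolution layer (weights + biases)."""
    return c_in * c_out * k * k + c_out

def balance_energy(layer_params):
    """Hypothetical energy score: entropy of per-layer parameter fractions.
    Higher means the budget is spread more evenly across layers; the
    paper's actual energy function may differ."""
    total = sum(layer_params)
    fractions = [p / total for p in layer_params]
    return -sum(f * math.log(f) for f in fractions if f > 0)

# Two toy 3-layer CNN configurations (in_channels, out_channels, kernel)
# with roughly the same total parameter budget: one back-loaded
# (parameters concentrated in the last layer) and one more balanced.
skewed   = [(3, 16, 3), (16, 64, 3), (64, 256, 3)]
balanced = [(3, 64, 3), (64, 104, 3), (104, 104, 3)]

for name, cfg in [("skewed", skewed), ("balanced", balanced)]:
    params = [conv_param_count(ci, co, k) for ci, co, k in cfg]
    print(f"{name:8s} total={sum(params):7d} "
          f"per-layer={params} energy={balance_energy(params):.3f}")
```

Under this toy measure, the balanced configuration receives a noticeably higher energy value than the skewed one at a comparable total size, which mirrors the abstract's claim that higher-energy (more balanced) distributions correlate with better performance.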