AB - © 2019 IEEE. We aim to divide the problem space of fine-grained recognition into specific regions. To achieve this, we develop a unified framework based on a mixture of experts. Due to the limited data available for fine-grained recognition, it is not feasible to learn diverse experts with a data-division strategy. To tackle this problem, we promote diversity among experts by combining an expert gradually-enhanced learning strategy with a Kullback-Leibler divergence based constraint. The strategy learns new experts on the dataset with the prior knowledge from former experts and adds them to the model sequentially, while the introduced constraint forces the experts to produce diverse prediction distributions. Together, these drive the experts to learn the task from different aspects, making them specialized in different subspace problems. Experiments show that the resulting model improves classification performance and achieves state-of-the-art results on several fine-grained benchmark datasets.
AU - Zhang, L
AU - Huang, S
AU - Liu, W
AU - Tao, D
CY - Piscataway, USA
DA - 2020
DO - 10.1109/ICCV.2019.00842
EP - 8339
JO - IEEE/CVF International Conference on Computer Vision
PB - IEEE
PY - 2020
SP - 8330
TI - Learning a mixture of granularity-specific experts for fine-grained categorization
VL - 2019-October
Y1 - 2020
Y2 - 2024/03/29
ER -
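
Note: the abstract mentions a Kullback-Leibler divergence based constraint that pushes experts toward diverse prediction distributions. Below is a minimal sketch of what such a diversity term could look like, assuming PyTorch; the function name kl_diversity_loss, the weight lambda_div, and the two-expert setup are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn.functional as F

    def kl_diversity_loss(logits_new, logits_prev):
        # Hypothetical form of the KL-based constraint: encourage the new
        # expert's prediction distribution to diverge from a previous
        # (frozen) expert's distribution on the same inputs.
        log_p_new = F.log_softmax(logits_new, dim=1)
        p_prev = F.softmax(logits_prev, dim=1).detach()
        # F.kl_div expects log-probabilities as input and probabilities as
        # target; the negative sign turns "minimize KL" into "maximize
        # divergence" when this term is added to the training loss.
        return -F.kl_div(log_p_new, p_prev, reduction="batchmean")

    # Illustrative use while sequentially adding a new expert
    # (lambda_div is an assumed trade-off weight):
    # loss = F.cross_entropy(logits_new, labels) \
    #        + lambda_div * kl_diversity_loss(logits_new, logits_prev)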