Adaptive tree-like neural network: Overcoming catastrophic forgetting to classify streaming data with concept drifts
- Publisher: Elsevier
- Publication Type: Journal Article
- Citation: Knowledge-Based Systems, 2024, 293
- Issue Date: 2024-06-07
Filename | Description | Size
---|---|---
1-s2.0-S0950705124002715-main.pdf | Published version | 3.3 MB
This item is closed access and not available.
With the development of deep neural networks (DNNs), classifying streaming data with concept drifts using DNN-based models has become increasingly effective. However, the continuous and unbounded nature of streaming data makes it difficult to set an appropriate depth for a DNN-based model in advance. Moreover, improving a model's adaptability to concept drifts while overcoming catastrophic forgetting remains a difficult problem. To address these issues, an Adaptive Tree-like Neural Network (ATNN) is proposed in this paper. ATNN adaptively increases the depth of its active branch according to the weight of the deepest node in that branch. Once a new concept is detected, it chooses a suitable position on its trunk at which to grow a branch for the new concept, according to the relation between the Fisher information and the gradients of the parameters. Experiments demonstrate the rationality of ATNN's strategies for adaptively deepening a branch and for choosing where to grow a new branch, and show that ATNN adapts to concept drifts more quickly and continuously improves classification performance on recurring concepts. The code of the proposed algorithm is available at https://github.com/mlmmwym/ATNN.
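The sketch below is a minimal, hypothetical illustration of the trunk-and-branch idea described in the abstract, assuming a PyTorch-style implementation. The class names (`TreeLikeNet`, `Branch`), the growth threshold, and the squared-gradient proxy for the Fisher information are illustrative assumptions, not the authors' released code (see the GitHub link above).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Branch(nn.Module):
    """A growable branch: stacked hidden layers, each with its own classifier head.

    Head outputs are combined with softmax-normalised node weights; a dominant
    weight on the deepest head is used as a cue to add one more layer (a hedged
    reading of "according to the weight of the deepest node in the active branch").
    """

    def __init__(self, in_dim: int, hidden_dim: int, n_classes: int):
        super().__init__()
        self.hidden_dim, self.n_classes = hidden_dim, n_classes
        self.layers = nn.ModuleList([nn.Linear(in_dim, hidden_dim)])
        self.heads = nn.ModuleList([nn.Linear(hidden_dim, n_classes)])
        self.node_logits = nn.Parameter(torch.zeros(1))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.node_logits, dim=0)
        out = 0.0
        for w, layer, head in zip(weights, self.layers, self.heads):
            h = torch.relu(layer(h))
            out = out + w * head(h)
        return out

    def maybe_deepen(self, threshold: float = 0.6) -> None:
        # Add one hidden layer and head when the deepest node dominates the vote.
        if F.softmax(self.node_logits, dim=0)[-1] > threshold:
            self.layers.append(nn.Linear(self.hidden_dim, self.hidden_dim))
            self.heads.append(nn.Linear(self.hidden_dim, self.n_classes))
            self.node_logits = nn.Parameter(
                torch.cat([self.node_logits.data, torch.zeros(1)])
            )


class TreeLikeNet(nn.Module):
    """Shared trunk plus one branch per concept; only the active branch is trained,
    which leaves earlier branches intact (the anti-forgetting idea in the abstract)."""

    def __init__(self, in_dim: int, hidden_dim: int, n_classes: int, trunk_depth: int = 3):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * trunk_depth
        self.trunk = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(trunk_depth)
        )
        self.branches = nn.ModuleList([Branch(hidden_dim, hidden_dim, n_classes)])
        self.attach_at = [trunk_depth]  # trunk depth each branch hangs from
        self.active = 0

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = x
        for depth, layer in enumerate(self.trunk, start=1):
            h = torch.relu(layer(h))
            if depth == self.attach_at[self.active]:
                break
        return self.branches[self.active](h)

    def grow_branch(self, x: torch.Tensor, y: torch.Tensor) -> None:
        """On a detected drift, score each trunk depth by gradient magnitude divided
        by a crude diagonal-Fisher proxy (squared gradients on the same batch, purely
        for illustration), and attach a fresh branch where new-concept gradients are
        large relative to how important the parameters already are."""
        loss = F.cross_entropy(self.forward(x), y)
        scores = []
        for layer in self.trunk:
            grads = torch.autograd.grad(loss, list(layer.parameters()),
                                        retain_graph=True, allow_unused=True)
            grads = [g for g in grads if g is not None]
            if not grads:  # layer sits below the current attach point, unused here
                scores.append(0.0)
                continue
            grad_norm = sum(g.abs().mean() for g in grads)
            fisher = sum((g ** 2).mean() for g in grads)  # assumed Fisher proxy
            scores.append((grad_norm / (fisher + 1e-8)).item())
        best_depth = scores.index(max(scores)) + 1
        self.branches.append(Branch(self.branches[0].hidden_dim,
                                    self.branches[0].hidden_dim,
                                    self.branches[0].n_classes))
        self.attach_at.append(best_depth)
        self.active = len(self.branches) - 1
```

In a streaming loop, one would train only `self.branches[self.active]` (plus, depending on the design, the trunk layers above its attach point), call `maybe_deepen` periodically, and call `grow_branch` when an external drift detector fires; because growth changes the parameter set, the optimizer would need to be re-created afterwards.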