Fast Evolutionary Neural Architecture Search by Contrastive Predictor with Linear Regions
- Publisher:
- Association for Computing Machinery
- Publication Type:
- Conference Proceeding
- Citation:
- GECCO 2023 - Proceedings of the 2023 Genetic and Evolutionary Computation Conference, 2023, pp. 1257-1266
- Issue Date:
- 2023-07-15
Closed Access
Filename | Description | Size
---|---|---
3583131.3590452.pdf | Published version | 1.32 MB
This item is closed access and not available.
Evolutionary neural architecture search (ENAS) has emerged as a promising approach to finding high-performance neural architectures. However, its widespread application has been limited by the expensive computational costs inherent to evolutionary algorithms. In this study, we aim to significantly reduce the computational costs of ENAS by incorporating a training-free performance metric. Specifically, network performance can be estimated by the training-free metric with only a single forward pass. However, training-free metrics have their own challenges, in particular an insufficient correlation with ground-truth performance. We adopt a Graph Convolutional Network (GCN) based contrastive predictor, which can leverage the low cost of the training-free performance metric while improving the correlation between the estimated and true performance of candidate architectures. Combining a training-free metric, the number of linear regions, with the GCN-based contrastive predictor and an active learning scheme, we propose Fast-ENAS, which achieves superior search efficiency and performance on the benchmark NAS-Bench-201 and DARTS search spaces. Furthermore, searching the DARTS space on a single GPU, Fast-ENAS requires only 0.02 GPU days (29 minutes) and 0.026 GPU days (37 minutes) to achieve test error rates of 2.50% on CIFAR-10 and 24.30% on ImageNet, respectively.
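The number-of-linear-regions metric referenced in the abstract exploits the fact that a ReLU network is piecewise linear: each distinct pattern of active/inactive ReLU units corresponds to a linear region of the input space, and networks that carve the input into more regions tend to be more expressive. Below is a minimal NumPy sketch of this idea for a toy fully-connected ReLU network, counting distinct activation patterns over random input samples as a proxy for the number of linear regions. The function names and the sampling scheme are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def activation_pattern(x, weights, biases):
    """Forward an input through a ReLU MLP and record which units fire.

    Returns a hashable tuple of 0/1 flags, one per hidden unit; two inputs
    share a pattern iff they lie in the same linear region of the network.
    """
    pattern = []
    h = x
    for W, b in zip(weights, biases):
        z = W @ h + b
        pattern.append((z > 0).astype(np.uint8))  # ReLU on/off mask
        h = np.maximum(z, 0)
    return tuple(np.concatenate(pattern).tolist())

def count_linear_regions(weights, biases, n_samples=1000, seed=0):
    """Estimate the number of linear regions by sampling random inputs
    and counting distinct activation patterns (a single forward pass each)."""
    rng = np.random.default_rng(seed)
    input_dim = weights[0].shape[1]
    patterns = {
        activation_pattern(rng.standard_normal(input_dim), weights, biases)
        for _ in range(n_samples)
    }
    return len(patterns)
```

Because each sample needs only one forward pass and no gradients or training, a score like this can rank thousands of candidate architectures cheaply; the paper's predictor then corrects for the metric's imperfect correlation with true accuracy.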