PRE-NAS: Evolutionary Neural Architecture Search With Predictor

Publisher:
Institute of Electrical and Electronics Engineers (IEEE)
Publication Type:
Journal Article
Citation:
IEEE Transactions on Evolutionary Computation, 2023, 27, (1), pp. 26-36
Issue Date:
2023-02-01
Abstract:
Neural architecture search (NAS) aims to automate architecture engineering for neural networks. It often incurs a high computational overhead, since many candidate networks drawn from the search space must be evaluated. Predicting a network's performance can alleviate this overhead by removing the need to evaluate every candidate. However, training such a predictor typically requires a large number of evaluated architectures, which may be difficult to obtain. We address this challenge by proposing a novel evolutionary NAS strategy, predictor-assisted evolutionary NAS (PRE-NAS), which performs well even with an extremely small number of evaluated architectures. PRE-NAS leverages new evolutionary search strategies and integrates high-fidelity weight inheritance over generations. Unlike one-shot strategies, whose evaluations may be biased by weight sharing, offspring candidates in PRE-NAS are topologically homogeneous, which circumvents this bias and leads to more accurate predictions. Extensive experiments on the NAS-Bench-201 and DARTS search spaces show that PRE-NAS can outperform state-of-the-art NAS methods. Searching for 0.6 days on a single GPU, PRE-NAS finds a competitive architecture that achieves 2.40% and 24% test error rates on CIFAR-10 and ImageNet, respectively.
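The predictor-assisted loop described in the abstract can be sketched at a very high level: a small archive of genuinely evaluated architectures trains a cheap surrogate, the surrogate ranks each generation's offspring, and real evaluation budget is spent only on the top-ranked child. The sketch below is illustrative only and is not the paper's implementation; the toy search space, the `true_accuracy` stand-in for training, and the 1-nearest-neighbour surrogate are all assumptions made for brevity.

```python
import random

OPS = 4    # hypothetical: operation choices per edge in a toy cell
EDGES = 6  # hypothetical: number of edges in the cell
random.seed(0)

def true_accuracy(arch):
    # Toy stand-in for the expensive train-and-evaluate step.
    return sum(op == 3 for op in arch) / EDGES

def predict(arch, evaluated):
    # 1-nearest-neighbour surrogate (an assumption, not the paper's
    # predictor): score by the closest architecture in the archive.
    nearest = max(evaluated,
                  key=lambda a: sum(x == y for x, y in zip(a, arch)))
    return evaluated[nearest]

def mutate(arch):
    # Change one edge's operation; offspring keep the parent's
    # topology, mirroring the homogeneity the abstract describes.
    child = list(arch)
    child[random.randrange(EDGES)] = random.randrange(OPS)
    return tuple(child)

def predictor_assisted_search(generations=20, candidates_per_gen=8):
    parent = tuple(random.randrange(OPS) for _ in range(EDGES))
    evaluated = {parent: true_accuracy(parent)}  # tiny archive
    for _ in range(generations):
        offspring = [mutate(parent) for _ in range(candidates_per_gen)]
        # Rank all offspring cheaply with the surrogate...
        best_child = max(offspring, key=lambda a: predict(a, evaluated))
        # ...and spend real evaluation budget on the top pick only.
        evaluated[best_child] = true_accuracy(best_child)
        if evaluated[best_child] >= evaluated[parent]:
            parent = best_child
    return parent, evaluated[parent]
```

The key saving is visible in the loop: each generation produces `candidates_per_gen` offspring but triggers only one real evaluation, so the archive (and thus the surrogate's training data) stays extremely small, as the method is designed for.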