PRE-NAS: Evolutionary Neural Architecture Search With Predictor
2022; Institute of Electrical and Electronics Engineers; Volume: 27; Issue: 1; Language: English
10.1109/tevc.2022.3227562
ISSN: 1941-0026
Authors: Yameng Peng, Andy Song, Vic Ciesielski, Haytham M. Fayek, Xiaojun Chang
Topic(s): Metaheuristic Optimization Algorithms Research
Abstract: Neural architecture search (NAS) aims to automate architecture engineering for neural networks. This often incurs a high computational overhead, since many candidate networks from the search space must be evaluated. Predicting a network's performance can alleviate this overhead by removing the need to evaluate every candidate. However, developing such a predictor typically requires a large number of evaluated architectures, which may be difficult to obtain. We address this challenge by proposing a novel evolutionary NAS strategy, predictor-assisted evolutionary NAS (PRE-NAS), which performs well even with an extremely small number of evaluated architectures. PRE-NAS leverages new evolutionary search strategies and integrates high-fidelity weight inheritance across generations. Unlike one-shot strategies, whose evaluations may be biased by weight sharing, offspring candidates in PRE-NAS are topologically homogeneous, which circumvents this bias and leads to more accurate predictions. Extensive experiments on the NAS-Bench-201 and DARTS search spaces show that PRE-NAS can outperform state-of-the-art NAS methods. Searching with a single GPU for only 0.6 days, PRE-NAS finds a competitive architecture that achieves 2.40% and 24% test error rates on CIFAR-10 and ImageNet, respectively.