Differentiable neural architecture search in equivalent space with exploration enhancement

Publication Type:
Conference Proceeding
Citation:
Advances in Neural Information Processing Systems, December 2020
Issue Date:
2020-01-01
Recent works on One-Shot Neural Architecture Search (NAS) mostly adopt a bilevel optimization scheme that alternately optimizes the supernet weights and the architecture parameters after relaxing the discrete search space into a differentiable one. However, the non-negligible incongruence introduced by these relaxation methods makes it hard to guarantee that optimization in the continuous space is equivalent to optimization in the original discrete space. In contrast, this paper utilizes a variational graph autoencoder to injectively transform the discrete architecture space into an equivalent continuous latent space, resolving the incongruence. A probabilistic exploration enhancement method is accordingly devised to encourage intelligent exploration during architecture search in the latent space and to avoid local optima. Since catastrophic forgetting in differentiable One-Shot NAS deteriorates the supernet's predictive ability and makes the bilevel optimization inefficient, this paper further proposes an architecture complementation method to relieve this deficiency. We analyze the effectiveness of the proposed method, and a series of experiments compares it with state-of-the-art One-Shot NAS methods.
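
To make the bilevel scheme the abstract refers to concrete, the following is a minimal sketch of the alternating update loop used by differentiable One-Shot NAS methods such as DARTS (first-order variant): architecture parameters are updated on validation data, supernet weights on training data. The `supernet(x, alpha)` interface and all names here are illustrative assumptions, not the authors' code.

```python
import torch

def bilevel_step(supernet, alpha, w_opt, a_opt, train_batch, val_batch, loss_fn):
    """One alternating (first-order) bilevel update.

    Assumed setup, e.g.:
        w_opt = torch.optim.SGD(supernet.parameters(), lr=0.025)
        a_opt = torch.optim.Adam([alpha], lr=3e-4)
    """
    # Step 1: update architecture parameters alpha on the validation split.
    x_val, y_val = val_batch
    a_opt.zero_grad()
    loss_fn(supernet(x_val, alpha), y_val).backward()
    a_opt.step()

    # Step 2: update supernet weights on the training split, alpha held fixed.
    x_tr, y_tr = train_batch
    w_opt.zero_grad()
    loss_fn(supernet(x_tr, alpha), y_tr).backward()
    w_opt.step()
```

The incongruence the paper targets arises because `alpha` lives in a relaxed continuous space, while the final architecture is obtained by a discretization step (e.g., argmax over operations) that the optimization above never sees.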
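The paper's alternative is to search in a latent space learned by a variational graph autoencoder. Below is a minimal sketch, assuming a cell-based search space where an architecture is a DAG given by an adjacency matrix `A` and a one-hot node-operation matrix `X`; the layer sizes, the single-step graph convolution, and the `explore` helper are all assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ArchVGAE(nn.Module):
    """Encode a DAG (adjacency A, one-hot op matrix X) into a latent vector z
    and decode z back to per-node operation logits."""

    def __init__(self, n_nodes, n_ops, hidden=64, latent=16):
        super().__init__()
        self.gcn = nn.Linear(n_ops, hidden)            # node feature transform
        self.mu = nn.Linear(hidden, latent)            # posterior mean head
        self.logvar = nn.Linear(hidden, latent)        # posterior log-variance head
        self.dec = nn.Linear(latent, n_nodes * n_ops)  # op-logit decoder
        self.n_nodes, self.n_ops = n_nodes, n_ops

    def encode(self, A, X):
        # One graph-convolution step: aggregate transformed neighbor features.
        H = torch.relu(A @ self.gcn(X))   # (n_nodes, hidden)
        h = H.mean(dim=0)                 # pool over nodes
        return self.mu(h), self.logvar(h)

    def reparameterize(self, mu, logvar):
        # z = mu + sigma * eps, with eps ~ N(0, I)
        return mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

    def decode(self, z):
        return self.dec(z).view(self.n_nodes, self.n_ops)

def explore(z, radius=0.1, k=8):
    # Probabilistic exploration in the latent space: propose k perturbed
    # candidates around z, to be decoded and evaluated, so the search is
    # less prone to settling into local optima.
    return [z + radius * torch.randn_like(z) for _ in range(k)]
```

Because every discrete architecture maps to a distinct latent point (an injective encoding), gradient-based search over `z` operates on a space equivalent to the discrete one, rather than on a lossy relaxation of it.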
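Finally, one plausible reading of the architecture complementation idea, sketched here purely as an illustration rather than the paper's exact procedure: each supernet update also trains the complement of the sampled architecture, so operations outside the current candidate still receive gradients and previously learned weights are not forgotten.

```python
import torch

def complemented_step(supernet, w_opt, arch_mask, batch, loss_fn):
    # arch_mask: binary {0,1} tensor selecting the candidate's operations;
    # (1 - arch_mask) activates exactly the operations it leaves out.
    x, y = batch
    w_opt.zero_grad()
    loss = loss_fn(supernet(x, arch_mask), y) \
         + loss_fn(supernet(x, 1.0 - arch_mask), y)
    loss.backward()
    w_opt.step()
```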