
Toward gradient bandit-based selection of candidate architectures in AutoGAN

Journal Article


Abstract


  • Neural architecture search (NAS) offers a new approach to designing generative adversarial network (GAN) architectures. The existing state-of-the-art algorithm, AutoGAN, discovers GAN architectures with reinforcement learning (RL)-based NAS, but it does not take performance differences between candidate architectures into account. In this paper, a new algorithm based on AutoGAN, ImprovedAutoGAN, is proposed. We found that the choice of candidate architectures can affect the performance of the final network. Whereas AutoGAN selects candidate architectures at random during the architecture search, our method uses a gradient bandit algorithm to increase the probability of selecting better-performing networks. We also introduce a temperature coefficient into the algorithm to prevent the search from getting trapped in a local optimum. Searching the same search space as AutoGAN, the discovered GAN achieves a Fréchet inception distance (FID) score of 11.60 on CIFAR-10, the best result among current RL-based NAS methods. Experiments also show that the transferability of this GAN is satisfying.
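  The selection rule the abstract describes can be illustrated with a minimal sketch of a gradient bandit with a temperature-scaled softmax. This is not the authors' implementation; the class name, step size, and reward signal are assumptions for illustration. The temperature plays the role of the paper's temperature coefficient: a higher value flattens the selection distribution, encouraging exploration and helping the search avoid a local optimum.

  ```python
  import math
  import random

  class GradientBandit:
      """Softmax (gradient bandit) selection over candidate indices.

      Illustrative sketch only: each candidate keeps a numeric preference;
      selection probabilities come from a temperature-scaled softmax, and
      preferences are nudged toward candidates whose observed reward (e.g.,
      a validation score for the sampled architecture) beats the running
      average baseline.
      """

      def __init__(self, n_candidates, step_size=0.1, temperature=1.0):
          self.h = [0.0] * n_candidates   # preference per candidate
          self.alpha = step_size
          self.tau = temperature          # higher tau -> more exploration
          self.baseline = 0.0             # running-average reward baseline
          self.t = 0

      def probabilities(self):
          scaled = [p / self.tau for p in self.h]
          m = max(scaled)                 # subtract max for numeric stability
          exps = [math.exp(s - m) for s in scaled]
          z = sum(exps)
          return [e / z for e in exps]

      def select(self):
          return random.choices(range(len(self.h)),
                                weights=self.probabilities())[0]

      def update(self, chosen, reward):
          self.t += 1
          self.baseline += (reward - self.baseline) / self.t
          probs = self.probabilities()
          for i in range(len(self.h)):
              indicator = 1.0 if i == chosen else 0.0
              self.h[i] += self.alpha * (reward - self.baseline) * (indicator - probs[i])
  ```

  In a NAS loop, `reward` would be the evaluated quality of the architecture drawn from candidate `chosen`; over time the softmax concentrates probability on candidates that consistently score above the baseline, replacing the uniform random selection used by AutoGAN.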

Publication Date


  • 2021

Citation


  • Fan, Y., Zhou, G., Shen, J., & Dai, G. (2021). Toward gradient bandit-based selection of candidate architectures in AutoGAN. Soft Computing, 25(6), 4367-4378. doi:10.1007/s00500-020-05446-x

Scopus Eid


  • 2-s2.0-85098719666

Start Page


  • 4367

End Page


  • 4378

Volume


  • 25

Issue


  • 6
