ABSTRACT
The automated design of generative adversarial networks (GANs) is now handled effectively by neural architecture search (NAS), but open issues remain. First, the vast majority of NAS methods for GANs rely on a single evaluation metric or a linear combination of several metrics. Second, conventional evolutionary neural architecture search (ENAS) cannot adjust its mutation probabilities as the search progresses, so it easily settles into a local optimum. To address these issues, we first design a two-factor cooperative mutation mechanism that controls the mutation probability using information such as the population's current iteration round and its fitness. Second, we divide the evolutionary process into three stages according to the properties of NAS, so that each stage can adaptively adjust the mutation probability based on the population state and the development goals expected at that stage. Finally, we incorporate multiple optimization objectives of GANs for image generation tasks into ENAS, yielding an adaptive multiobjective ENAS built on the two-factor cooperative mutation mechanism. We evaluate and ablate our algorithm on the STL-10 and CIFAR-10 datasets, and the experimental results show that our method outperforms the majority of existing NAS-GAN approaches.
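The idea of coupling two factors (search progress and fitness information) to adapt the mutation probability can be illustrated with a minimal sketch. Note this is not the paper's actual formula: the function name `adaptive_mutation_prob`, the bounds `p_min`/`p_max`, and the linear annealing schedule are all assumptions introduced here for illustration, in the spirit of classic adaptive genetic operators (Srinivas and Patnaik, 1994).

```python
def adaptive_mutation_prob(
    generation, max_generations,
    fitness, avg_fitness, best_fitness,
    p_min=0.05, p_max=0.5,
):
    """Illustrative two-factor adaptive mutation probability (not the
    paper's exact mechanism; names and constants are assumptions).

    Factor 1: search progress -- the base rate anneals from p_max toward
    p_min as the population approaches the final iteration round.
    Factor 2: relative fitness -- above-average individuals mutate less,
    preserving promising architectures, while below-average individuals
    keep the full rate to encourage exploration.
    """
    # Factor 1: linearly anneal the base rate over the run.
    progress = generation / max_generations
    p_stage = p_max - (p_max - p_min) * progress

    # Factor 2: scale by how close this individual is to the best one.
    if best_fitness > avg_fitness and fitness >= avg_fitness:
        scale = (best_fitness - fitness) / (best_fitness - avg_fitness)
    else:
        scale = 1.0  # below-average individuals keep the full rate

    return max(p_min, p_stage * scale)
```

Under this sketch, a below-average individual in generation 0 mutates at the full `p_max` rate, while the current best individual is protected at the floor `p_min`; a stage-wise scheme like the paper's three-stage division could be obtained by replacing the linear annealing with a piecewise schedule over `progress`.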
Index Terms
- Adaptive Multiobjective Evolutionary Neural Architecture Search for GANs based on Two-Factor Cooperative Mutation Mechanism