Hyperparameter Optimization for CNNs Using Genetic Algorithms
This project explores the application of various deep neural network architectures to practical problems and seeks to automate the selection of suitable hyperparameters (dropout rate, number of hidden layers, etc.) for these tasks. We carry out this optimization with evolutionary algorithms, in particular genetic algorithms, to reduce the human error inherent in manual trial-and-error tuning. Evolutionary algorithms have also been shown to be computationally efficient, so we evaluate how effectively selected deep learning problems can be solved using hyperparameters chosen by such techniques. We also compare our models against existing evolutionary architectures and against modified implementations of the same.
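As a minimal sketch of the idea, the loop below evolves a population of hyperparameter genomes (here, a dropout rate and a hidden-layer count) with truncation selection, one-point crossover, and random-reset mutation. The search space and the `fitness` function are illustrative assumptions: in a real run, `fitness` would train a CNN with the given hyperparameters and return its validation accuracy, rather than the synthetic surrogate used here.

```python
import random

random.seed(0)

# Hypothetical search space: each genome is [dropout_rate, num_hidden_layers].
DROPOUT_CHOICES = [0.1, 0.2, 0.3, 0.4, 0.5]
LAYER_CHOICES = [1, 2, 3, 4]

def random_genome():
    return [random.choice(DROPOUT_CHOICES), random.choice(LAYER_CHOICES)]

def fitness(genome):
    # Surrogate objective standing in for validation accuracy of a trained CNN;
    # it peaks at dropout=0.3 with 2 hidden layers purely for demonstration.
    dropout, layers = genome
    return -(dropout - 0.3) ** 2 - 0.1 * (layers - 2) ** 2

def evolve(pop_size=10, generations=20, mutation_rate=0.2):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]      # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            child = [a[0], b[1]]                     # one-point crossover
            if random.random() < mutation_rate:      # random-reset mutation
                i = random.randrange(len(child))
                child[i] = random_genome()[i]
            children.append(child)
        population = survivors + children
    return max(population, key=fitness)

best = evolve()
```

Swapping the surrogate for actual model training is the expensive part in practice; each fitness evaluation then costs a full (or early-stopped) training run, which is why population size and generation count are kept small in GA-based hyperparameter search.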
- X. Xiao, M. Yan, S. Basodi, C. Ji, and Y. Pan, "Efficient Hyperparameter Optimization in Deep Learning Using a Variable Length Genetic Algorithm," 2020.
- N. M. Aszemi and P. D. D. Dominic, "Hyperparameter Optimization in Convolutional Neural Network using Genetic Algorithms," International Journal of Advanced Computer Science and Applications (IJACSA), 10(6), 2019.
- N. Gorgolis, I. Hatzilygeroudis, Z. Istenes, and L.-G. Gyenne, "Hyperparameter Optimization of LSTM Network Models through Genetic Algorithm," 2019 10th International Conference on Information, Intelligence, Systems and Applications (IISA), Patras, Greece, 2019, pp. 1-4, doi: 10.1109/IISA.2019.8900675.
- C. Erden, "Genetic algorithm-based hyperparameter optimization of deep learning models for PM2.5 time-series prediction," International Journal of Environmental Science and Technology, 20, 2959–2982, 2023.
- D. E. Puentes G., C. J. Barrios H., and P. O. A. Navaux, "Hyperparameter Optimization for Convolutional Neural Networks with Genetic Algorithms and Bayesian Optimization," 2022 IEEE Latin American Conference on Computational Intelligence (LA-CCI), Montevideo, Uruguay, 2022, pp. 1-5, doi: 10.1109/LA-CCI54402.2022.9981104.
- E. Real, S. Moore, A. Selle, S. Saxena, Y. L. Suematsu, J. Tan, Q. Le, and A. Kurakin, "Large-Scale Evolution of Image Classifiers," Proceedings of the 34th International Conference on Machine Learning - Volume 70, pp. 2902–2911, Sydney, NSW, Australia, 2017.
- J. Bergstra and Y. Bengio, "Random Search for Hyper-Parameter Optimization," Journal of Machine Learning Research, 13:281–305, 2012.
- S. R. Young, D. C. Rose, T. P. Karnowski, S.-H. Lim, and R. M. Patton, "Optimizing deep learning hyper-parameters through an evolutionary algorithm," Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments (MLHPC '15), pp. 1–5, Austin, TX, USA, 2015. ACM Press.