Jeongcheol Lee, Sunil Ahn*, Hyunseob Kim, Jongsuk Ruth Lee
Intelligent Automation & Soft Computing, Vol.31, No.1, pp. 255-277, 2022, DOI:10.32604/iasc.2022.018558
Published: 03 September 2021
Abstract Automated hyperparameter optimization (HPO) is a crucial and time-consuming part of automatically generating efficient machine learning models. Previous studies can be classified into two major categories in terms of reducing training overhead: (1) sampling promising hyperparameter configurations and (2) pruning non-promising configurations. Adaptive sampling and resource scheduling are combined to reduce cost: evaluation effort is concentrated on the more promising configurations so that the best model can be found within a given time budget. In other words, these strategies aim to identify the best-performing models at an early stage, before a given deadline. Although…
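To illustrate the second category, pruning non-promising configurations, here is a minimal sketch of successive halving, a common resource-scheduling scheme for HPO. The `evaluate` function, the `lr` hyperparameter, and all parameter values are illustrative assumptions, not the paper's method: in practice `evaluate` would train a model for `budget` epochs and return a validation score.

```python
import random

def evaluate(config, budget):
    # Stand-in objective (assumption): larger budgets yield less noisy
    # scores. A real evaluator would train for `budget` epochs.
    rng = random.Random(hash((config["lr"], budget)))
    return config["lr"] * 0.1 + rng.uniform(0.0, 1.0 / budget)

def successive_halving(configs, min_budget=1, eta=2, rounds=3):
    """Each round, evaluate all survivors at a growing budget and keep
    only the top 1/eta fraction, pruning non-promising configurations."""
    budget = min_budget
    survivors = list(configs)
    for _ in range(rounds):
        scored = sorted(survivors,
                        key=lambda c: evaluate(c, budget),
                        reverse=True)  # higher score = more promising
        survivors = scored[:max(1, len(scored) // eta)]
        budget *= eta  # spend more on the configurations that remain
    return survivors[0]

# Hypothetical search space: 8 learning-rate candidates.
configs = [{"lr": 10 ** random.uniform(-4, -1)} for _ in range(8)]
best = successive_halving(configs)
```

With 8 configurations, `eta=2`, and 3 rounds, the survivor count shrinks 8 → 4 → 2 → 1 while the per-configuration budget doubles each round, so most of the total compute goes to the configurations that looked promising early.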