Paper Information

Material type
Academic journal
Authors
Abbas Jafar (Myongji University), Myungho Lee (Myongji University)
Journal
The Korean Institute of Electrical Engineers · The Transactions of the Korean Institute of Electrical Engineers, Vol. 72, No. 5
Publication year
May 2023
Pages
607 - 620 (14 pages)
DOI
10.5370/KIEE.2023.72.5.607

Abstract · Keywords

Machine learning (ML) has proven to be highly effective in solving complex problems in various domains, thanks to its ability to identify specific data tasks, perform feature engineering, and learn quickly. However, designing and training ML models is a complicated task that requires optimization. The effectiveness of ML models depends heavily on the selection of the hyperparameters that determine their performance. Hyperparameter optimization (HPO) is a systematic search process for finding the combination of hyperparameters that achieves robust performance. Traditional HPO methods such as grid and random search take a lot of computing time when used in large-scale applications. Recently, various automated search strategies, such as Bayesian optimization (BO) and evolutionary algorithms, have been developed to significantly reduce the computing time. In this paper, we use state-of-the-art HPO frameworks, namely BO, Optuna, HyperOpt, and Keras Tuner, to optimize ML and deep learning models for classification tasks and evaluate their comparative performance in two sets of experiments. The first uses different ML classifiers to solve the optimal parameter selection problem with HPO. The second attempts to optimize a convolutional neural network (CNN) architecture with the HPO frameworks to improve its performance on an image classification task. We use four publicly available real-world datasets, including one image dataset. The experimental results show that HyperOpt-TPE outperforms the other HPO frameworks for the ML classifiers, achieving up to 94.12% accuracy with 30 minutes of optimization time. Similarly, for the CNN model, HyperOpt-TPE outperforms the other HPO frameworks, improving the classification accuracy by 34% while taking 2 hours and 24 minutes of computing time.
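To illustrate the kind of TPE-based search the abstract describes, the sketch below tunes a scikit-learn classifier with HyperOpt. It is a minimal sketch only: the dataset (load_breast_cancer), the RandomForestClassifier search space, and the 50-trial budget are illustrative assumptions, not the paper's actual experimental setup.

# Minimal sketch: HyperOpt-TPE tuning an ML classifier (illustrative setup,
# not the paper's experiments).
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

# Search space: each hyperparameter is a distribution that TPE samples from.
space = {
    "n_estimators": hp.quniform("n_estimators", 50, 500, 50),
    "max_depth": hp.quniform("max_depth", 2, 20, 1),
    "min_samples_split": hp.quniform("min_samples_split", 2, 10, 1),
}

def objective(params):
    clf = RandomForestClassifier(
        n_estimators=int(params["n_estimators"]),
        max_depth=int(params["max_depth"]),
        min_samples_split=int(params["min_samples_split"]),
        random_state=0,
    )
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    # HyperOpt minimizes the objective, so return the negative accuracy.
    return {"loss": -acc, "status": STATUS_OK}

trials = Trials()
best = fmin(
    fn=objective,
    space=space,
    algo=tpe.suggest,   # Tree-structured Parzen Estimator
    max_evals=50,       # optimization budget (illustrative)
    trials=trials,
)
print("Best hyperparameters:", best)
print("Best CV accuracy:", -min(t["result"]["loss"] for t in trials.trials))

The same pattern extends to the paper's second experiment in principle: building and training a CNN inside the objective function and returning its negative validation accuracy, at a correspondingly higher computing cost per trial.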

Table of Contents

Abstract
1. Introduction
2. Hyperparameter Optimization
3. Hyperparameter Optimization Frameworks
4. Comparative Performance Evaluation of HPO Frameworks
5. Previous Research
6. Conclusion
References

UCI(KEPA) : I410-ECN-0101-2023-560-001445819