[Academic Journal]

Nahyun Ryu (Korea University), Hyungseok Kim (Korea University), Pilsung Kang (Korea University)

UCI(KEPA) : I410-ECN-0101-2017-530-001340005


Abstract

The purpose of variable selection techniques is to select a subset of relevant variables for a particular learning algorithm, in order to improve the accuracy of the prediction model and the efficiency of model building. We conduct an empirical analysis to evaluate and compare seven well-known variable selection techniques for the multiple linear regression model, one of the most commonly used regression models in practice. The variable selection techniques we apply are forward selection, backward elimination, stepwise selection, the genetic algorithm (GA), ridge regression, the lasso (Least Absolute Shrinkage and Selection Operator), and the elastic net. Based on experiments with 49 regression data sets, we find that the GA yields the lowest error rates, while the lasso reduces the number of variables most significantly. In terms of computational efficiency, forward selection, backward elimination, and the lasso require less time than the other techniques.
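The penalized techniques compared in the abstract differ in how aggressively they zero out coefficients. As a minimal sketch (using scikit-learn on a synthetic dataset, not the paper's actual experimental code or data), the following contrasts ridge regression, the lasso, and the elastic net: ridge only shrinks coefficients, while the L1 penalty in the lasso and elastic net can drive irrelevant coefficients exactly to zero, performing variable selection.

```python
# Sketch: how ridge, lasso, and elastic net differ as variable selectors.
# Synthetic data with 20 candidate variables, only 5 of which are informative.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import ElasticNet, Lasso, Ridge

X, y = make_regression(n_samples=100, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)

models = {
    "ridge": Ridge(alpha=1.0),             # L2 penalty: shrinks, never zeroes
    "lasso": Lasso(alpha=1.0),             # L1 penalty: zeroes some coefficients
    "elastic net": ElasticNet(alpha=1.0, l1_ratio=0.5),  # mix of L1 and L2
}

for name, model in models.items():
    model.fit(X, y)
    n_selected = int(np.sum(np.abs(model.coef_) > 1e-6))
    print(f"{name}: {n_selected} of {X.shape[1]} variables retained")
```

On data like this, ridge retains all 20 variables, whereas the lasso discards most of the 15 uninformative ones, which is consistent with the abstract's finding that the lasso reduces the number of variables most significantly.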

Table of Contents

1. Introduction
2. Multiple Linear Regression
3. Variable Selection
4. Experimental Design
5. Evaluation of Variable Selection Techniques
6. Conclusion
References
