Abstract · Keywords
The purpose of variable selection techniques is to select a subset of relevant variables for a particular learning algorithm in order to improve the accuracy and efficiency of the prediction model. We conduct an empirical analysis to evaluate and compare seven well-known variable selection techniques for the multiple linear regression model, one of the most commonly used regression models in practice. The techniques we apply are forward selection, backward elimination, stepwise selection, genetic algorithm (GA), ridge regression, lasso (Least Absolute Shrinkage and Selection Operator), and elastic net. Based on experiments with 49 regression data sets, we find that GA yields the lowest error rates, while lasso reduces the number of variables most significantly. In terms of computational efficiency, forward selection, backward elimination, and lasso require less time than the other techniques.
#Forward Selection
#Backward Elimination
#Stepwise Selection
#Genetic Algorithm
#Ridge Regression
#Lasso
#Elastic Net
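The abstract compares several variable selection techniques for multiple linear regression. A minimal sketch of a few of them is shown below, using scikit-learn on a synthetic data set; this is an illustrative assumption of one possible setup, not the paper's actual experimental design (the data set, selection sizes, and cross-validation choices here are hypothetical).

```python
# Illustrative sketch (not the paper's setup): forward selection, lasso,
# and elastic net on a synthetic regression data set.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import ElasticNetCV, LassoCV, LinearRegression

# Synthetic data: 20 candidate variables, only 5 truly informative.
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)

# Forward selection: greedily add the variable that most improves the fit.
forward = SequentialFeatureSelector(LinearRegression(),
                                    n_features_to_select=5,
                                    direction="forward").fit(X, y)

# Lasso: the L1 penalty shrinks some coefficients exactly to zero,
# so selection happens as a by-product of fitting.
lasso = LassoCV(cv=5).fit(X, y)

# Elastic net: mixes L1 and L2 penalties.
enet = ElasticNetCV(cv=5, l1_ratio=0.5).fit(X, y)

print("forward selection kept:", int(forward.get_support().sum()), "variables")
print("lasso kept:", int(np.sum(lasso.coef_ != 0)), "variables")
print("elastic net kept:", int(np.sum(enet.coef_ != 0)), "variables")
```

Stepwise selection, GA-based selection, and ridge regression (which shrinks but does not zero out coefficients) would be evaluated analogously, comparing prediction error, number of retained variables, and runtime as the paper does.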
Table of Contents
- 1. Introduction
- 2. Multiple Linear Regression
- 3. Variable Selection
- 4. Experimental Design
- 5. Evaluation of Variable Selection Techniques
- 6. Conclusion
- References
UCI(KEPA) : I410-ECN-0101-2017-530-001340005