Abstract · Keywords

The Error Back-Propagation (EBP) algorithm is widely applied to train the multi-layer perceptron, a neural network model frequently employed to solve complex problems such as pattern recognition and adaptive control. However, it suffers from two major problems: local minima and network structure design.
This paper presents a modified error back-propagation algorithm that addresses the local-minima problem and the network-structure-design problem in a unified and efficient way. Our algorithm is basically the same as the conventional EBP algorithm, except that it applies stochastic perturbation to escape a local minimum. In our algorithm, when a local minimum is detected, the weights associated with hidden units are probabilistically reinitialized, and normal EBP learning is continued with the new set of weights. The addition of a new hidden unit can also be viewed as a special case of stochastic perturbation, i.e., reinitializing the all-zero weights of a virtually existing unit. The results of our experiments on several benchmark test problems, including the parity problem, the two-spirals problem, and the "credit-screening" data (a practical credit-card-approval problem), demonstrate that our algorithm is very efficient.
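
The abstract describes the EBP/SP procedure only at a high level, so the following Python sketch is an illustration under stated assumptions, not the authors' implementation: the plateau-based local-minimum test, the reinitialization probability p_reinit, the network size, and all hyperparameters are hypothetical choices made for this example.

# Sketch of EBP with stochastic perturbation (EBP/SP), per the abstract.
# The plateau test, p_reinit, and all hyperparameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# 2-bit parity (XOR) as a toy benchmark; the paper also reports results
# on larger parity problems and the two-spirals problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 2, 1
W1 = rng.normal(scale=0.5, size=(n_in, n_hid))   # input -> hidden
W2 = rng.normal(scale=0.5, size=(n_hid, n_out))  # hidden -> output

lr, p_reinit, window = 0.5, 0.25, 500
prev_err = np.inf

for epoch in range(20000):
    # Forward pass.
    h = sigmoid(X @ W1)
    o = sigmoid(h @ W2)
    err = 0.5 * np.sum((y - o) ** 2)

    # Conventional EBP updates (gradient descent on squared error).
    delta_o = (o - y) * o * (1 - o)
    delta_h = (delta_o @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ delta_o
    W1 -= lr * X.T @ delta_h

    # Crude local-minimum test: error has stopped improving over a
    # window while still high (an assumed criterion, not the paper's).
    if epoch % window == 0:
        if prev_err - err < 1e-4 and err > 0.05:
            # Stochastic perturbation: each hidden unit's incoming and
            # outgoing weights are reinitialized with probability
            # p_reinit, then normal EBP learning simply continues.
            # Adding a new hidden unit is the same operation applied to
            # a virtual unit whose weights are all zero.
            for j in range(n_hid):
                if rng.random() < p_reinit:
                    W1[:, j] = rng.normal(scale=0.5, size=n_in)
                    W2[j, :] = rng.normal(scale=0.5, size=n_out)
        prev_err = err

print("final error:", err, "outputs:", o.ravel().round(2))

Because each hidden unit keeps its weights with probability 1 - p_reinit, the perturbed network stays close to the previous solution, so EBP resumes from a nearby point in weight space rather than restarting training from scratch.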

Table of Contents

Abstract

Ⅰ. Introduction

Ⅱ. The EBP/SP Learning Algorithm

Ⅲ. Simulation Results and Discussions

Ⅳ. Conclusions

Acknowledgement

References

UCI(KEPA) : I410-ECN-0101-2009-569-017766502