Paper Information

Document type
Academic journal
Authors
지인영 (한국체육대학교), 김희동 (한국외국어대학교)
Journal
통번역학연구, Vol. 24, No. 3 (한국외국어대학교 통번역연구소)
Year of publication
2020.1
Pages
191 - 223 (33 pages)

Abstract · Keywords

There has been major technical progress in the field of machine translation: the dominant approach has shifted from statistical machine translation (SMT) to neural machine translation (NMT), leading to dramatic improvements in translation quality. More recently, a further shift has taken place from recurrent neural network (RNN)-based NMT to transformer-based NMT (T-NMT). As the performance of NMT has improved, many papers on machine translation have been published in the field of interpretation and translation studies. Their main focus is on whether machine translation can replace human translation and on analyzing the quality of its output. In this paper, we briefly review the history of machine translation research and explain the mechanism of NMT. NMT is basically composed of three parts: an encoder, an attention mechanism, and a decoder. We then discuss the transformer architecture, which builds on the encoder-decoder model. We also discuss the remaining challenges in NMT and outline research directions and possible solutions. Particular attention is given to mistranslation, translation quality, and robustness against noise in both the training data and the test sentences. To test the performance of transformer-based NMT, we used the Google NMT (GNMT) service for four languages: Korean, English, German, and Japanese. We confirmed its robustness to noisy sentences. However, we also found unexpected volatility of the NMT models even when the input sentence is semantically and syntactically correct, resulting in critical degradation of translation quality.
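The abstract describes NMT as consisting of an encoder, an attention mechanism, and a decoder, with the transformer (T-NMT) as the current architecture of choice. Below is a minimal illustrative sketch of that encoder-decoder structure using PyTorch's nn.Transformer; the vocabulary sizes, model dimensions, and toy batch are assumptions made for illustration only (they are not taken from the paper), and positional encodings are omitted for brevity.

# Minimal sketch of a transformer-based encoder-decoder translation model.
# All hyperparameters and the toy batch below are illustrative assumptions.
import torch
import torch.nn as nn

class ToyTransformerNMT(nn.Module):
    def __init__(self, src_vocab=8000, tgt_vocab=8000, d_model=256,
                 nhead=8, num_layers=3):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d_model)   # source token embeddings
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)   # target token embeddings
        # Encoder-decoder stack; multi-head self- and cross-attention is the
        # "attention mechanism" connecting the two sides.
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.generator = nn.Linear(d_model, tgt_vocab)     # logits over target vocab

    def forward(self, src_ids, tgt_ids):
        src = self.src_emb(src_ids)
        tgt = self.tgt_emb(tgt_ids)
        # Causal mask so each target position attends only to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        out = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.generator(out)

# Toy usage: a batch of 2 token-id "sentences" (source length 7, target length 5).
model = ToyTransformerNMT()
src = torch.randint(0, 8000, (2, 7))
tgt = torch.randint(0, 8000, (2, 5))
logits = model(src, tgt)
print(logits.shape)  # torch.Size([2, 5, 8000])

In an actual system, the decoder would be run autoregressively at inference time, feeding back its own previous outputs; the single forward pass above only shows how the encoder, attention, and decoder fit together.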

Table of Contents

No information has been registered.

References (34)
