Abstract · Keywords

Paralyzed people have difficulty communicating their daily basic needs, and their caretakers have difficulty understanding those needs. Developing and implementing a handheld-device-based brain-computer interface system with machine learning can solve this problem. On the other hand, a simple handheld device cannot satisfy the demands of computation-hungry ML algorithms and suffers from higher latency. This paper overcomes these limitations by processing the data in the cloud. The handheld device reads and preprocesses the electroencephalogram (EEG) data and forwards it to an IoT-based cloud server. The cloud server applies the machine-learning algorithm and classifies the signal into text representing the word thought by the user. This text result is sent back to the handheld device, informing the caretaker of the patient's needs. Evaluation results for the proposed system on ten words covering basic needs highlight the feasibility of implementing it in practice.

Keywords: AWS Lambda, Brain-computer interface, Cloud computing, CNN, EEG signal, IoT, Imagined speech to text
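The device-to-cloud flow described above can be sketched as follows. This is a minimal illustrative sketch only: the word list, `preprocess`, and `classify_in_cloud` are hypothetical stand-ins (the paper itself serves a CNN via AWS Lambda; here a simple nearest-centroid rule stands in for the cloud-side classifier).

```python
# Hypothetical sketch of the pipeline in the abstract: the handheld
# device preprocesses one EEG window and "sends" it to the cloud,
# which returns the classified word. Names and the classifier are
# illustrative assumptions, not the paper's implementation.

WORDS = [f"word_{i}" for i in range(10)]  # placeholder for the 10 need-words

def preprocess(samples):
    """Device side: zero-mean, unit-variance normalization of an EEG window."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    std = var ** 0.5 or 1.0  # guard against a flat (zero-variance) window
    return [(s - mean) / std for s in samples]

def classify_in_cloud(features, centroids):
    """Cloud side: nearest-centroid stand-in for the CNN over 10 classes."""
    def dist(c):
        return sum((f - x) ** 2 for f, x in zip(features, c))
    best = min(range(len(centroids)), key=lambda i: dist(centroids[i]))
    return WORDS[best]

# Simulated round trip: raw EEG window -> preprocess -> classify -> word.
raw = [0.2, 0.9, 0.4, 0.7, 0.1, 0.8]
features = preprocess(raw)
centroids = [[float(i)] * len(features) for i in range(10)]
print(classify_in_cloud(features, centroids))
```

In the actual system the `classify_in_cloud` step would be an HTTPS call from the handheld device to the Lambda endpoint, with only the compact text label returned to keep latency low.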

Abstract
1. Introduction
2. Related Work
3. Materials and Methods
4. Performance Analysis and Result Discussion
5. Opportunities and Challenges
6. Conclusion
References
