Paper Information

Material Type: Conference Paper (학술대회자료)
Author Information
Journal Information: 대한인간공학회 학술대회논문집 (Proceedings of the Ergonomics Society of Korea Conference), 대한인간공학회 2011 춘계학술대회 및 워크샵 (2011 Spring Conference and Workshop)
Publication Date: May 2011
Pages: 355 - 358 (4 pages)

Abstract & Keywords

Objective: The purpose of this study was to classify three negative emotions, i.e., anger, fear, and surprise, using physiological signals. Background: Previous research shows that anger, fear, and surprise can cause diseases such as cardiovascular disease. If we can recognize which negative emotion a person is experiencing, it will help them deal with that emotion or with the disease it causes. Method: One hundred twenty-nine adults and 88 adolescents participated in this study. Anger, fear, and surprise were induced using audio-visual film clips. Physiological signals were measured for 30 seconds each during the rest period and the emotional state. The obtained signals were then analyzed, yielding 18 physiological-signal features. The three emotions were classified by two classifiers, Discriminant Analysis and a Support Vector Machine (SVM). Results: The classification rate for the three emotions was 72.6% with Discriminant Analysis and 56.9% with the SVM. Conclusion: Although this study showed only moderate accuracy, it collected emotional responses from a large and varied group of participants, in contrast to other studies that collected data from a single participant. In future studies, emotion classification methods need to incorporate additional signals, such as facial expressions, to improve the classification rate. Application: Using these data and algorithms, we can recognize a user's emotion and provide appropriate feedback.
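
As a rough illustration of the classification step described in the abstract (18 physiological features, three emotion classes, Discriminant Analysis versus SVM), the following Python sketch compares the two classifiers with scikit-learn. This is not the authors' implementation: the random placeholder data, the 5-fold cross-validation, and the RBF-kernel SVM are assumptions made only for this example.

# Hypothetical sketch of the two-classifier comparison in the abstract.
# X stands in for an (n_samples, 18) matrix of physiological-signal features;
# y holds the emotion labels {"anger", "fear", "surprise"}. Real data would
# replace the random placeholders below.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples, n_features = 217, 18               # 129 adults + 88 adolescents, 18 features
X = rng.normal(size=(n_samples, n_features))
y = rng.choice(["anger", "fear", "surprise"], size=n_samples)

classifiers = {
    "Discriminant Analysis": LinearDiscriminantAnalysis(),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}

for name, clf in classifiers.items():
    # 5-fold cross-validation; the paper's evaluation protocol may differ.
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")

With random placeholder data the accuracies hover around chance (about 33% for three classes); the point is only to show how the two classifiers reported in the paper would be compared on the same feature set.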

Table of Contents

ABSTRACT
1. Introduction
2. Method
3. Results
4. Conclusion
Acknowledgements
References
Author listings

UCI(KEPA) : I410-ECN-0101-2013-530-001644693