Basic Thesis Information

Material type: Thesis (학위논문)
Author: 양성민 (가천대학교, 가천대학교 일반대학원)
Advisor: 정옥란
Year of publication: 2020
Copyright: Theses from 가천대학교 are protected by copyright.

Abstract · Keywords

Recent work in natural language processing (NLP) has centered on language representation models pre-trained on massive corpora. Named entity recognition (NER), one of the core NLP tasks, is mostly approached with supervised learning, whose drawback is that it requires a sufficiently large labeled training dataset and a large amount of training computation.
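
For context on the supervised setup described above, NER training data carries token-level labels. The sketch below shows one example in the widely used BIO tagging scheme; the sentence, tokenization, and tag set are illustrative assumptions, not drawn from the thesis' actual dataset:

```python
# One supervised NER training example in the common BIO scheme.
# Sentence and tags are illustrative; they are not from the thesis' dataset.
sentence = ["양성민", "은", "가천대학교", "에서", "공부한다"]
tags     = ["B-PER", "O", "B-ORG", "O", "O"]  # B-PER: person begins, B-ORG: organization begins, O: outside
```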
Reinforcement learning, by contrast, learns from trial-and-error experience without any initial data, making it somewhat closer to how humans learn than other machine learning methodologies; it has so far seen little application in NLP and is used mainly in simulatable game environments such as Atari games and AlphaGo.
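
The abstract only names DQN; as a rough illustration of the trial-and-error learning it builds on, here is a minimal tabular Q-learning sketch. The environment interface (reset/step/actions) and hyperparameters are generic assumptions, not the thesis' experimental setup; DQN approximates the table below with a neural network:

```python
import random
from collections import defaultdict

# Minimal tabular Q-learning loop: the trial-and-error update that DQN
# approximates with a neural network. The environment interface and the
# hyperparameters are assumptions for illustration only.
def q_learning(env, episodes=500, alpha=0.1, gamma=0.99, epsilon=0.1):
    q = defaultdict(float)  # Q[(state, action)] -> estimated return
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            # Epsilon-greedy action selection: explore with probability epsilon.
            if random.random() < epsilon:
                action = random.choice(env.actions)
            else:
                action = max(env.actions, key=lambda a: q[(state, a)])
            next_state, reward, done = env.step(action)
            # Bellman update: nudge Q(s, a) toward r + gamma * max_a' Q(s', a').
            best_next = max(q[(next_state, a)] for a in env.actions)
            q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
            state = next_state
    return q
```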
BERT is a general-purpose language model developed by Google, trained on a massive corpus with heavy computation. It has recently shown strong performance in NLP research and achieves high accuracy on many downstream NLP tasks. This thesis combines the two models, DQN and BERT, and proposes a new BERT-based DQN architecture that applies them to the named entity recognition problem using a small dataset and little computation. The NER performance of the proposed architecture is then evaluated and validated through experiments.
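
The thesis text does not include code, so the following is only a minimal sketch of one plausible reading of the proposed structure: a pre-trained BERT encoder produces token representations that serve as states, and a small Q-network scores each candidate NER tag as an action. The model name, tag count, and head layout are all assumptions:

```python
import torch
from transformers import BertModel, BertTokenizerFast

# Hypothetical sketch: BERT encodes the sentence; a small Q-network treats
# each candidate NER tag as an action and scores Q(token, tag).
# Model name, tag set size, and head layout are assumptions, not the thesis' code.
NUM_TAGS = 9  # e.g. BIO tags B-PER, I-PER, B-ORG, I-ORG, ..., O (assumed tag set)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")
encoder = BertModel.from_pretrained("bert-base-multilingual-cased")
q_head = torch.nn.Sequential(                     # DQN-style value head
    torch.nn.Linear(encoder.config.hidden_size, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, NUM_TAGS),               # one Q-value per tag action
)

inputs = tokenizer("양성민 은 가천대학교 에서 공부한다", return_tensors="pt")
with torch.no_grad():
    states = encoder(**inputs).last_hidden_state  # token representations as states
q_values = q_head(states)                         # shape: (1, seq_len, NUM_TAGS)
pred_tags = q_values.argmax(dim=-1)               # greedy action = predicted tag per token
```

In this reading, greedy action selection over the Q-values yields one predicted tag per token; how rewards are shaped from the gold labels during DQN training is specified in the thesis itself, not here.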

Table of Contents

Ⅰ. Introduction
 1. Research background
 2. Research objective
Ⅱ. Related Work
 1. Research trends
 2. Named entity recognition
 3. Reinforcement learning
Ⅲ. Proposed Model
 1. Model architecture
 2. BERT-based preprocessing model
 3. DQN-based reinforcement learning model
Ⅳ. Implementation and Experiments
 1. Dataset
 2. Implementation and experimental environment
 3. Experimental results
Ⅴ. Conclusion
References
ABSTRACT
Acknowledgments
