Paper Information

Material Type
Conference paper
Author Information
Seoung-Ho Choi (Hansung University), Kim Kyoungyeon (Hansung University)
Journal Information
Korean Institute of Communications and Information Sciences (KICS), Proceedings of the 2020 KICS Winter Conference
Publication Date
February 2020
Pages
384-387 (4 pages)

Abstract · Keywords

Regularization methods have been studied for efficient learning in deep learning. Regularization is quite critical for achieving better performance in training. However, most existing regularization methods use relatively simple linear combinations of L1 and L2 regularization. We observed that this linear combination of L1 and L2 regularization is limited in that its regularization effect is not sufficient for fast convergence and for finding good solutions. We introduce a novel combination of L1 and L2 regularization that adopts the exponential function, called exponential nonlinear regularization. In exponential nonlinear regularization, the L1 regularization term is multiplied with the L2 regularization term after the exponential function is applied. With this, deep neural networks can find a better training path of gradients than with linear regularization, because the loss with exponential nonlinear regularization can focus on the more critical part of the regularization loss. In addition, an experiment with exponential moving average regularization was conducted. From two experiments, on generation and segmentation tasks, we found that our method shows excellent performance. Further modifications that we experimented with, however, resulted in a drop in performance.
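
The abstract describes the penalty only in words, so the following is a minimal PyTorch-style sketch of one plausible reading, in which the L1 penalty is multiplied by the exponential of the L2 penalty. The function name, the coefficient lam, and the exact placement of exp() are illustrative assumptions, not the authors' published formulation.

# Sketch of an "exponential nonlinear regularization" term, assuming the
# form L1 * exp(L2); this is one reading of the abstract, not a confirmed
# reproduction of the paper's formula.
import torch

def exponential_nonlinear_reg(params, lam=1e-4):
    # Accumulate the L1 (sum of absolute values) and L2 (sum of squares)
    # penalties over all trainable parameters.
    params = list(params)
    l1 = sum(p.abs().sum() for p in params)
    l2 = sum(p.pow(2).sum() for p in params)
    # Couple the two terms nonlinearly: L1 * exp(L2). In practice l2 would
    # need scaling to keep exp() from overflowing on large networks.
    return lam * l1 * torch.exp(l2)

# Hypothetical usage inside a training step:
#   loss = task_loss + exponential_nonlinear_reg(model.parameters())
#   loss.backward()

Because exp() is convex and rapidly increasing, the gradient contributed by the L1 term is amplified whenever the L2 penalty is large, which matches the abstract's claim that the combined loss "can focus on the more critical part of the regularization loss".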

Table of Contents

Abstract
I. INTRODUCTION
II. OUR PROPOSAL
III. CONCLUSION
REFERENCES
