An Object Detection Network through Feature and Prediction Distillation with Convolution Block Attention Module
합성곱 블록 어텐션 모듈을 활용한 특징 및 예측 증류 객체 검출 네트워크

Article Information

Type
Proceeding
Author
Jaehong Yoon (Chung-Ang University), Heegwang Kim (Chung-Ang University), Chanyeong Park (Chung-Ang University), Junbo Jang (Chung-Ang University), Jiyoon Lee (Chung-Ang University), Joonki Paik (Chung-Ang University)
Journal
The Institute of Electronics and Information Engineers (IEIE), Proceedings of the 2024 IEIE Summer Conference
Published
2024.6
Pages
2,766-2,769 (4 pages)


Abstract · Keywords

Traditional knowledge distillation methods in object detection face challenges due to feature discrepancies between Teacher and Student networks. Many current approaches rely exclusively on response-based techniques, in which the Student network emulates the Teacher's detection predictions; however, this can inadvertently transfer erroneous predictions from Teacher to Student. To address this, we present a knowledge distillation network that employs the Convolutional Block Attention Module (CBAM) alongside the Teacher network's features to transfer knowledge to the Student network. Unlike conventional methods that focus solely on the Teacher model's output predictions, our approach uses CBAM to emphasize what and where the Teacher and Student networks should focus on, ultimately enhancing the Student's performance. Experiments on the COCO 2017 dataset demonstrate that our method achieves superior results compared to existing techniques.
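The abstract describes attention-guided feature distillation: CBAM's channel attention captures "what" a network attends to and its spatial attention captures "where", and the Student's features are pulled toward the attention-refined Teacher features. Below is a minimal NumPy sketch of this idea, not the authors' implementation: the real CBAM uses a learned shared MLP for channel attention and a 7×7 convolution over pooled maps for spatial attention, while here both are reduced to parameter-free pooling plus a sigmoid, and the distillation loss is assumed to be a simple MSE between the CBAM-refined Teacher and Student feature maps.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feat):
    # feat: (C, H, W). Pool over spatial dims to score each channel ("what").
    avg = feat.mean(axis=(1, 2))                 # (C,)
    mx = feat.max(axis=(1, 2))                   # (C,)
    return sigmoid(avg + mx)[:, None, None]      # (C, 1, 1), broadcastable

def spatial_attention(feat):
    # feat: (C, H, W). Pool over channels to score each location ("where").
    avg = feat.mean(axis=0)                      # (H, W)
    mx = feat.max(axis=0)                        # (H, W)
    return sigmoid(avg + mx)[None, :, :]         # (1, H, W), broadcastable

def cbam(feat):
    # Sequential channel-then-spatial refinement, as in CBAM.
    feat = feat * channel_attention(feat)
    return feat * spatial_attention(feat)

def feature_distillation_loss(teacher_feat, student_feat):
    # MSE between the attention-refined Teacher and Student features,
    # so the Student is penalized where the Teacher's attention is high.
    t = cbam(teacher_feat)
    s = cbam(student_feat)
    return float(((t - s) ** 2).mean())
```

In a full detector this term would be added to the usual detection losses at each distilled feature level; identical Teacher and Student features give a loss of zero.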

Contents

Abstract
Ⅰ. Introduction
Ⅱ. Proposed Network
Ⅲ. Experiments and Results
Ⅳ. Conclusion and Future Work
References

