CBAM: Convolutional Block Attention Module
집사 몽이
2021. 1. 13. 17:04
| Item | Details |
| --- | --- |
| Date read | 2021.01.13 |
| Venue | Proceedings of the European Conference on Computer Vision (ECCV) |
| Title | CBAM: Convolutional Block Attention Module |
| Authors | Sanghyun Woo, Jongchan Park, Joon-Young Lee, and In So Kweon |
| One-line summary | Develops CBAM, a module that handles visual information with better performance than previously developed models. |
| Abstract | We propose Convolutional Block Attention Module (CBAM), a simple yet effective attention module for feed-forward convolutional neural networks. Given an intermediate feature map, our module sequentially infers attention maps along two separate dimensions, channel and spatial, then the attention maps are multiplied to the input feature map for adaptive feature refinement. Because CBAM is a lightweight and general module, it can be integrated into any CNN architectures seamlessly with negligible overheads and is end-to-end trainable along with base CNNs. We validate our CBAM through extensive experiments on ImageNet-1K, MS COCO detection, and VOC 2007 detection datasets. Our experiments show consistent improvements in classification and detection performances with various models, demonstrating the wide applicability of CBAM. The code and models will be publicly available. |
| Keywords | Object recognition, attention mechanism, gated convolution |
| Significance | Successfully improved classification (ImageNet-1K) and detection (MS COCO, VOC 2007) accuracy on several image datasets. |
| Critique | |
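The abstract describes the mechanism concretely: channel attention is applied to the intermediate feature map first, then spatial attention, and each attention map is multiplied back into the features. Below is a minimal PyTorch sketch of that two-step refinement, assuming the defaults reported in the paper (reduction ratio 16 for the channel MLP, a 7×7 convolution for the spatial map); it is an illustrative reimplementation, not the authors' released code.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Channel attention: a shared MLP over average- and max-pooled descriptors."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        avg = self.mlp(x.mean(dim=(2, 3)))   # global average pooling -> MLP
        mx = self.mlp(x.amax(dim=(2, 3)))    # global max pooling -> same MLP
        scale = torch.sigmoid(avg + mx).view(b, c, 1, 1)
        return x * scale                     # rescale each channel


class SpatialAttention(nn.Module):
    """Spatial attention: a 7x7 conv over channel-wise average and max maps."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)        # average over channels
        mx, _ = x.max(dim=1, keepdim=True)       # max over channels
        scale = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * scale                         # rescale each spatial location


class CBAM(nn.Module):
    """Channel attention followed by spatial attention, as the abstract describes."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.sa(self.ca(x))
```

For example, `CBAM(64)(torch.randn(8, 64, 32, 32))` returns a tensor of the same shape, which is why the module can be dropped after any convolutional block with negligible overhead, as the abstract claims.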