Hard & Soft Attention

    • Xu, Kelvin, et al. "Show, Attend and Tell: Neural Image Caption Generation with Visual Attention." arXiv (2015). [pdf]
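Soft attention in Xu et al. is the deterministic variant: a softmax-weighted average of annotation vectors, differentiable end to end (hard attention instead samples one location and needs REINFORCE-style training). A minimal NumPy sketch of the soft case, with dot-product scoring and shapes assumed for illustration:

```python
import numpy as np

def soft_attention(features, query):
    """Soft (deterministic) attention: softmax-weighted average of
    annotation vectors. features: (n, d) per-location annotation
    vectors; query: (d,) decoder state. Shapes are illustrative."""
    scores = features @ query                 # (n,) alignment scores
    weights = np.exp(scores - scores.max())   # stable softmax
    weights = weights / weights.sum()         # attention distribution
    context = weights @ features              # (d,) expected context vector
    return context, weights
```

Hard attention would replace the weighted average with a single sampled row of `features`, drawn from `weights`.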

Global & Local Attention

    • Minh-Thang Luong, Hieu Pham, Christopher D. Manning. "Effective Approaches to Attention-based Neural Machine Translation." arXiv (2015). [pdf]

Self Attention

    • Zhouhan Lin, Minwei Feng, Cicero Nogueira dos Santos, Mo Yu, Bing Xiang, Bowen Zhou, Yoshua Bengio. "A Structured Self-Attentive Sentence Embedding." arXiv (2017). [pdf] [github]
    • Jianpeng Cheng, Li Dong, Mirella Lapata. "Long Short-Term Memory-Networks for Machine Reading." arXiv (2016). [pdf]
    • Ashish Vaswani, et al. "Attention Is All You Need." arXiv (2017). [pdf]
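The common thread of these papers is that a sequence attends to itself. A minimal NumPy sketch of single-head scaled dot-product self-attention in the style of "Attention Is All You Need" (projection matrices and shapes are illustrative, not taken from any of the papers' code):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.
    X: (n, d) input sequence; Wq, Wk, Wv: (d, d_k) projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # (n, n) pairwise scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # (n, d_k) outputs
```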

Detection

    • Jianfeng Wang, Ye Yuan, Gang Yu. "Face Attention Network: An Effective Face Detector for the Occluded Faces." arXiv (2017). [pdf] (FAN)

Segmentation

    • He, Kaiming, et al. "Mask R-CNN." ICCV (2017). [pdf] (Mask R-CNN)