<!DOCTYPE html>
<html lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
<meta name="description" content="Deng-Ping Fan's home page">
<link rel="shortcut icon" href="./images/logo-ethz2.png">
<link href='https://fonts.googleapis.com/css?family=Roboto:400,500,400italic,300italic,300,500italic,700,700italic,900,900italic' rel='stylesheet' type='text/css'>
<link rel="stylesheet" href="./assets/jemdoc.css" type="text/css">
<!--<link href="static/bootstrap/css/bootstrap.css" rel="stylesheet">
<link href="static/xin.css" rel="stylesheet">-->
<script async defer src="https://buttons.github.io/buttons.js"></script>
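<!-- Google Analytics (legacy analytics.js): loads the tracker asynchronously and records a pageview -->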
<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','https://www.google-analytics.com/analytics.js','ga');
ga('create', 'UA-88572407-1', 'auto');
ga('send', 'pageview');
</script>
<meta name="google-site-verification" content="F0Q0t5oLq1pGwXGMf_38oA2MxW_zfiMRsQTYD4_GJoQ"/>
<title>Deng-Ping Fan</title>
</head>
<body>
<div id="layout-content" style="margin-top:25px">
<div id="toptitle">
<h1>Homepage</h1>
</div>
<div class="navbar-collapse collapse">
<h2>
<a href="https://dengpingfan.github.io/index.html">Home|</a>
<a href="pages/People.html">People|</a>
<a href="pages/Publication.html">Publications|</a>
<a href="pages/Services.html">Services|</a>
<a href="pages/Accept.html">AcceptRate</a>
</h2>
</div>
<!--<p>
<a href="https://vision.ee.ethz.ch/the-institute.html">Computer Vision Lab</a>, ETH Zürich<br>
ETF C113.2, Sternwartstrasse 7, 8092 Zürich, Switzerland<br>
Email: dengpfan(AT)gmail(DOT)com<br>
</p>-->
<!-- News -->
<h2>
<a id="News" class="anchor" href="#News">News</a>
</h2>
<ul>
<!--<li>(2024/11), <a href="https://mp.weixin.qq.com/s/qRPOp0XFpYYUtiCha-I59A">《如何精准识别伪装场景》</a>一文被中央党校权威刊物《学习时报》(全党唯一一份专门讲学习的中央级报纸)的科技前沿版收录
<a href="https://dengpingfan.github.io/papers/[2024][Newspaper]-COD-20241127A6.pdf">[PDF]</a>。<br></li>-->
<li>(2025/01), I am serving as an Area Chair for <a href="https://conferences.miccai.org/2025/en/IMPORTANT-DATES.html">MICCAI 2025</a>.<br></li>
<li>(2024/09), I am serving as an Area Chair for <a href="https://cvpr.thecvf.com/Conferences/2025">CVPR 2025</a>.<br></li>
<!--<li>(2024/05), I am serving as an Area Chair for <a href="https://neurips.cc/">NeurIPS 2024</a>.<br></li>
<li>(2024/04), I am serving as an Area Chair for <a href="https://openreview.net/group?id=NeurIPS.cc/2024/Datasets_and_Benchmarks_Track">NeurIPS Datasets and Benchmarks 2024</a>.<br></li>
<li>(2024/02), I am serving as an Area Chair for <a href="https://conferences.miccai.org/2024/en/">MICCAI 2024</a>.<br></li>
<li>(2023/12), I am serving as an Senior Program Committee for <a href="https://ijcai24.org/call-for-papers/">IJCAI 2024</a>.<br></li>
<li>(2023/06), I am serving as an Area Chair for <a href="https://cvpr.thecvf.com/Conferences/2024">CVPR 2024</a>.<br></li>
<li>(2023/02), Call for paper: <a href="https://mp.weixin.qq.com/s/G8LjgU3GdEehOyaSIlrUbw">Multi-Modal Representation Learning</a>, Special Issue in Machine Intelligence Research (MIR), 2023 (Submission Deadline: July 1, 2023)</li>-->
<li>(2022/08),
Dichotomous Image Segmentation (DIS) Benchmark: <a href="https://xuebinqin.github.io/dis/index.html">link</a>;
Video SOD Benchmark: <a href="https://github.com/DengPingFan/DAVSOD"><u>link</u></a>;
Video Camouflaged Object Segmentation (VCOS) Benchmark: <a href="https://xueliancheng.github.io/SLT-Net-project/"><u>link</u></a>;
Camouflaged Object Detection (COD) Benchmark: <a href="https://dengpingfan.github.io/pages/COD.html"><u>link</u></a>;
Camouflaged Instance Segmentation (CIS) Benchmark: <a href="https://blog.patrickcty.cc/OSFormer-Homepage/"><u>link</u></a>;
Co-SOD Benchmark: <a href="https://github.com/DengPingFan/CoSOD3K"><u>link</u></a>;
RGB-D SOD Leaderboards: <a href="https://github.com/DengPingFan/D3NetBenchmark"><u>link</u></a>;
Light Field SOD Benchmark: <a href="https://github.com/kerenfu/LFSOD-Survey"><u>link</u></a>;
RGB SOD Leaderboards: <a href="https://github.com/DengPingFan/SODBenchmark"><u>link</u></a>
<br>
</li>
<li>(2021/05), <strong>Synced (机器之心) "Visiting the World's Top AI Labs" series</strong>: <a href="https://app6ca5octe2206.pc.xiaoe-tech.com/detail/v_60a36389e4b0adb2d8652c35/3"><u>Inception Institute of Artificial Intelligence (IIAI), Episode 3: Camouflaged Object Detection: Challenges, Methods, and Applications</u></a>.<br>
</li>
</ul>
<!-- Open Positions -->
<h2>
<a id="Positions" class="anchor" href="#Position">Open Positions</a>
</h2>
<ul>
<!--<li>
<strong>特别研究助理:</strong> 中国科学院大学<a href="https://scholar.google.com/citations?user=z84rLjoAAAAJ&hl=zh-CN">邵岭</a>教授团队招聘
<a href="https://mp.weixin.qq.com/s/pbdzDrnhLNi59NfKOUC5kA">特别研究助理</a>,欢迎邮件([email protected])咨询并附上简历。<br>
</li>
<li>
<strong>Ph.D. Student:</strong> We are looking for <a href="https://vision.ee.ethz.ch/people.html">guest</a> Ph.D. students at Computer Vision Lab, ETH Zurich under the support of <a href="https://www.csc.edu.cn/chuguo">CSC</a> or other fundings.
If you are interested in working with Prof. <a href="https://vision.ee.ethz.ch/people-details.OTAyMzM=.TGlzdC8zMjcxLC0xOTcxNDY1MTc4.html">Luc Van Gool</a> and me, please drop me an email with your CV.<br>
</li>-->
<li>
<strong>Ph.D. Student:</strong> We are looking for visiting Ph.D. students from China at MBZUAI,
supported by <a href="https://www.csc.edu.cn/chuguo">CSC</a> or other funding sources.
If you are interested in working with Prof. <a href="https://scholar.google.com/citations?user=zvaeYnUAAAAJ&hl=zh-CN">Fahad Shahbaz Khan</a> and me, please drop me an email with your CV.<br>
</li>
</ul>
<h2><a id="Publications" class="anchor" href="#Publications">Recent Publications</a> (<a href="https://scholar.google.com/citations?user=kakwJ5QAAAAJ&hl=zh-CN&oi=ao">Full</a>)</h2>
<ul>
<li>
<p>
<b>CamoFormer: Masked Separable Attention for Camouflaged Object Detection</b><br>
Bowen Yin, Xuying Zhang, <u>Deng-Ping Fan</u>, Shaohui Jiao, Ming-Ming Cheng, Luc Van Gool, Qibin Hou*<br>
<i>IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)</i>, 2024. (IF: 20.8)<br>
<a href="https://arxiv.org/abs/2212.06570">[PDF]</a>
[Chinese Version]
<a href="https://github.com/HVision-NKU/CamoFormer">[Code]</a>
[Official Version]
</p>
</li>
<li>
<p>
<b>Latent Semantic Consensus For Deterministic Geometric Model Fitting</b><br>
Guobao Xiao, Jun Yu*, Jiayi Ma, <u>Deng-Ping Fan</u> and Ling Shao<br>
<i>IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)</i>, 2024, 46 (9): 6139-6153. (IF: 20.8)<br>
<a href="https://arxiv.org/pdf/2403.06444">[PDF]</a>
[Chinese Version]
<a href="https://github.com/guobaoxiao/LSC">[Code]</a>
<a href="https://ieeexplore.ieee.org/document/10472101">[Official Version]</a>
</p>
</li>
<li>
<p>
<b>Vanishing-Point-Guided Video Semantic Segmentation of Driving Scenes</b><br>
Diandian Guo, <u>Deng-Ping Fan*</u>, Tongyu Lu, Christos Sakaridis, Luc Van Gool<br>
<i>IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</i>, Seattle, USA, June 17-21, 2024<br>
[<strong><font color="red">Poster (Highlight)</font></strong>, Accept rate = 324 /11532 = 2.8%]<br>
<a href="https://arxiv.org/html/2401.15261v1">[PDF]</a>
[Chinese Version]
<a href="https://github.com/RascalGdd/VPSeg">[Code]</a>
<a href="https://openaccess.thecvf.com/content/CVPR2024/papers/Guo_Vanishing-Point-Guided_Video_Semantic_Segmentation_of_Driving_Scenes_CVPR_2024_paper.pdf">[Official Version]</a>
</p>
</li>
<li>
<p>
<b>BA-SAM: Scalable Bias-Mode Attention Mask for Segment Anything Model</b><br>
Yiran Song, Qianyu Zhou, Xiangtai Li, <u>Deng-Ping Fan</u>, Xuequan Lu, Lizhuang Ma*<br>
<i>IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</i>, Seattle, USA, June 17-21, 2024<br>
<a href="https://arxiv.org/abs/2401.02317">[PDF]</a>
[Chinese Version]
<a href="https://github.com/zongzi13545329/BA-SAM">[Code]</a>
<a href="https://openaccess.thecvf.com/content/CVPR2024/papers/Song_BA-SAM_Scalable_Bias-Mode_Attention_Mask_for_Segment_Anything_Model_CVPR_2024_paper.pdf">[Official Version]</a>
<a href="https://openaccess.thecvf.com/content/CVPR2024/supplemental/Song_BA-SAM_Scalable_Bias-Mode_CVPR_2024_supplemental.pdf">[Supplemental]</a>
</p>
</li>
<li>
<p>
<b>VSCode: General Visual Salient and Camouflaged Object Detection with 2D Prompt Learning</b><br>
Ziyang Luo, Nian Liu*, Wangbo Zhao, Xuguang Yang, Dingwen Zhang, <u>Deng-Ping Fan</u>, Fahad Khan, Junwei Han<br>
<i>IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</i>, Seattle, USA, June 17-21, 2024<br>
<a href="https://arxiv.org/pdf/2311.15011.pdf">[PDF]</a>
[Chinese Version]
<a href="https://github.com/Sssssuperior/VSCode">[Code]</a>
<a href="https://openaccess.thecvf.com/content/CVPR2024/papers/Luo_VSCode_General_Visual_Salient_and_Camouflaged_Object_Detection_with_2D_CVPR_2024_paper.pdf">[Official Version]</a>
<a href="https://openaccess.thecvf.com/content/CVPR2024/supplemental/Luo_VSCode_General_Visual_CVPR_2024_supplemental.pdf">[Supplemental]</a>
</p>
</li>
<li>
<p>
<b>LAKE-RED: Camouflaged Images Generation by Latent Background Knowledge Retrieval-Augmented Diffusion</b><br>
Pancheng Zhao, Peng Xu, Pengda Qin, <u>Deng-Ping Fan</u>, Zhicheng Zhang, Guoli Jia, Bowen Zhou, Jufeng Yang<br>
<i>IEEE Conference on Computer Vision and Pattern Recognition (CVPR)</i>, Seattle, USA, June 17-21, 2024<br>
<a href="http://arxiv.org/abs/2404.00292">[PDF]</a>
<a href="https://dengpingfan.github.io/papers/[2024][CVPR]LAKE-RED_Chinese.pdf">[中译版]</a>
<a href="https://github.com/PanchengZhao/LAKE-RED">[Code]</a>
<a href="https://openaccess.thecvf.com/content/CVPR2024/papers/Zhao_LAKE-RED_Camouflaged_Images_Generation_by_Latent_Background_Knowledge_Retrieval-Augmented_Diffusion_CVPR_2024_paper.pdf">[Official Version]</a>
</p>
</li>
<li>
<p>
<b>Concealed Object Detection</b><br>
<strong>Deng-Ping Fan</strong>, Ge-Peng Ji, Ming-Ming Cheng*, Ling Shao<br>
<i>IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI)</i>, 2022, 44 (10): 6024-6042. (IF: 24.314)<br>
[<strong><font color="red">ESI Highly Cited Paper (1%)</font></strong> | Extension of <a href="https://openaccess.thecvf.com/content_CVPR_2020/html/Fan_Camouflaged_Object_Detection_CVPR_2020_paper.html" style="color:#2E41DC;"><u>CVPR 2020</u></a> | <strong><font color="red"><a href="https://dengpingfan.github.io/papers/SINet-V2-Award.pdf"><u>JDC 2021</u></a> Distinguish Paper</font></strong>]<br>
<a href="https://arxiv.org/abs/2102.10274">[PDF]</a>
<a href="https://dengpingfan.github.io/papers/[2021][PAMI]SINetV2_Chinese.pdf">[中译版]</a>
<a href="https://dengpingfan.github.io/pages/COD.html">[Project Page]</a>
<a href="http://mmcheng.net/cod/">[Online Demo]</a>
<a href="https://dengpingfan.github.io/papers/[2022][TPAMI]ConcealedOD_supp.pdf">[Supplementary Material]</a>
<a href="https://github.com/GewelsJI/SINet-V2">[Code-Python]</a>
<a class="github-button" href="https://github.com/GewelsJI/SINet-V2" data-icon="octicon-star" data-show-count="true">Star</a>
<a href="https://cg.cs.tsinghua.edu.cn/jittor/news/2021-06-11-00-00-cod/">[Code-Jittor]</a>
<a href="https://pan.baidu.com/s/1EtH2tUdbBt16w5dgve7JhQ">[2.25G COD10K_All (Baidu: w3up)]</a>
<a href="https://drive.google.com/file/d/1vRYAie0JcNStcSwagmCq55eirGyMYGm5/view?usp=sharing">[2.25G COD10K_All (Google)]</a>
<a href="">[微信群 (WeChat: CVer222)]</a>
<a href="https://ieeexplore.ieee.org/document/9444794">[Official Version]</a>
</p>
</li>
</ul>
<div id="footer">
<div id="footer-text"></div>
</div>
© Deng-Ping Fan
<div class="container">
<div style="display:inline-block;width:200px;">
<script type="text/javascript" src="//rf.revolvermaps.com/0/0/1.js?i=5d4snb3frum&s=220&m=0&v=true&r=false&b=000000&n=false&c=ff0000" async="async"></script>
</div>
</div>
</div>
</body>
</html>