
VALSE Webinar 2024-07-17, Session 20 (No. 355 overall): Object Detection and Tracking with Event Cameras

Published 2024-7-12 10:51 by 程一-计算所


Speaker: Xiao Wang (Anhui University)

Title: Visual Object Tracking using an Event Camera


Speaker: Jianing Li (Qiyuan Lab)

Title: Object Detection with Neuromorphic Cameras


Speaker: Xiao Wang (Anhui University)

Time: July 17, 2024 (Wednesday), 20:00 (Beijing time)

Title: Visual Object Tracking using an Event Camera


Speaker bio:

Xiao Wang (Member, IEEE) received the Ph.D. degree in computer science from Anhui University, Hefei, China, in 2019, supervised by Professors Jin Tang and Bin Luo. He was a visiting student at Sun Yat-sen University in 2016 and at The University of Sydney in 2019, and completed postdoctoral research at Pengcheng Laboratory (Shenzhen, China). He is currently an Associate Professor at the School of Computer Science and Technology, Anhui University, Hefei 230601, China. His research interests are mainly computer vision, event-based vision, machine learning, and pattern recognition. He has published more than 30 papers in leading conferences and journals, including CCF-A conferences (CVPR, AAAI) and top international journals (TPAMI, IJCV, TIP, TNNLS, TMM, TCSVT, TCYB, TGRS, PR, etc.), with 2000+ citations on Google Scholar. He has led projects funded by the National Natural Science Foundation of China (Youth Program), the Postdoctoral Innovation Program, and other postdoctoral projects. In 2022, he was selected as a Category D High-Level Talent in Hefei, Anhui Province. Dr. Wang serves as a reviewer for a number of journals and conferences, such as IEEE TCSVT, TNNLS, TIP, IJCV, CVIU, CVPR, ICCV, ECCV, AAAI, ACCV, ACM-MM, and WACV.


Homepage:

https://wangxiao5791509.github.io/


Abstract:

The event camera represents an innovative, bio-inspired sensory technology that differs significantly from conventional synchronous visible light cameras. Its unique output consists of spatially sparse yet temporally dense asynchronous pulse streams. This sensor excels in high-speed motion capture, complex environments, and low-light settings, making it increasingly applicable to a wide range of visual tasks, including object tracking tailored for event cameras. My presentation will begin with an overview of the event camera's working principles and its key application areas. I will then delve into the rationale behind incorporating event cameras into visual tracking systems and illustrate their integration through several of our projects. Lastly, I will offer insights into the future trends and promising avenues for research in the field of event camera-based tracking.
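(For readers new to the modality, here is a minimal sketch, not taken from the speaker's work, of how such an asynchronous stream of (x, y, t, polarity) events might be binned into dense event frames that a conventional tracking network can consume; the sensor resolution and bin count below are illustrative assumptions.)

import numpy as np

def events_to_frames(events, height, width, num_frames):
    # events: (N, 4) array of (x, y, t, p) with pixel coordinates, timestamp,
    # and polarity p in {-1, +1}; the time span is split into num_frames
    # equal windows and polarities are summed per pixel.
    frames = np.zeros((num_frames, height, width), dtype=np.float32)
    if len(events) == 0:
        return frames
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    p = events[:, 3]
    t_norm = (t - t.min()) / max(t.max() - t.min(), 1e-9)            # normalize time to [0, 1]
    bins = np.clip((t_norm * num_frames).astype(int), 0, num_frames - 1)
    np.add.at(frames, (bins, y, x), p)                               # scatter-add polarities
    return frames

# Toy usage: 1000 random events on a 346x260 sensor (DAVIS346-like resolution).
rng = np.random.default_rng(0)
events = np.stack([rng.integers(0, 346, 1000).astype(float),        # x
                   rng.integers(0, 260, 1000).astype(float),        # y
                   np.sort(rng.uniform(0.0, 0.05, 1000)),           # t (seconds)
                   rng.choice([-1.0, 1.0], 1000)], axis=1)
frames = events_to_frames(events, height=260, width=346, num_frames=5)
print(frames.shape)                                                  # (5, 260, 346)

Trackers in the literature use richer representations (voxel grids, time surfaces, spike streams), but the basic idea of turning a sparse asynchronous stream into tensors a network can process is the same.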


References:

[1] Wang, Xiao, et al. "Event stream-based visual object tracking: A high-resolution benchmark dataset and a novel baseline." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2024.

[2] Tang, Chuanming, et al. "Revisiting color-event based tracking: A unified network, dataset, and metric." arXiv preprint arXiv:2211.11010 (2022).

[3] Wang, Xiao, et al. "VisEvent: Reliable object tracking via collaboration of frame and event flows." IEEE Transactions on Cybernetics (2023).

[4] Wang, Xiao, et al. "Long-term Frame-Event Visual Tracking: Benchmark Dataset and Baseline." arXiv preprint arXiv:2403.05839 (2024).


Speaker: Jianing Li (Qiyuan Lab)

Time: July 17, 2024 (Wednesday), 20:30 (Beijing time)

Title: Object Detection with Neuromorphic Cameras


Speaker bio:

Jianing Li received the Ph.D. degree from the National Engineering Research Center for Visual Technology, School of Computer Science, Peking University, Beijing, China, in 2022. His research interests include event-based vision, neuromorphic engineering, and robot learning. He has authored or coauthored about 30 papers in technical conferences and journals on neuromorphic sensing and computing. He also serves as a reviewer for international conferences and journals (e.g., TPAMI, TIP, TNNLS, CVPR, ICCV, ECCV, NeurIPS, ICML, and ICLR).


Homepage:

https://scholar.google.com/citations?user=xrYnfwcAAAAJ&hl=zh-CN&oi=ao


Abstract:

Conventional cameras encounter challenges in detecting objects with high-speed motion blur or in low-light conditions. Neuromorphic cameras offer asynchronous visual streams with high temporal resolution and dynamic range, making them promising for computer vision applications. This talk will present neuromorphic object detection algorithms from various perspectives: event representation, temporal modeling, multimodal fusion, asynchronous processing, low-latency processing, and energy-efficient computing. We will also discuss open challenges in neuromorphic object detection and suggest promising future directions.
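(As a rough illustration of the "event representation" and "temporal modeling" themes above, and not code from the speaker, the sketch below builds a voxel-grid tensor in which each event spreads its polarity over the two nearest temporal bins; the number of bins is an arbitrary assumption.)

import numpy as np

def events_to_voxel_grid(events, num_bins, height, width):
    # events: (N, 4) array of (x, y, t, p). Each event spreads its polarity
    # over the two nearest temporal bins with linear weights, so sub-window
    # timing is kept instead of being flattened into a single histogram.
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(events) == 0:
        return voxel
    x = events[:, 0].astype(int)
    y = events[:, 1].astype(int)
    t = events[:, 2]
    p = events[:, 3]

    # Scale timestamps to [0, num_bins - 1] so each event gets a fractional bin index.
    t_scaled = (t - t.min()) / max(t.max() - t.min(), 1e-9) * (num_bins - 1)
    left = np.floor(t_scaled).astype(int)
    right = np.clip(left + 1, 0, num_bins - 1)
    w_right = t_scaled - left          # weight toward the later bin
    w_left = 1.0 - w_right

    np.add.at(voxel, (left, y, x), p * w_left)
    np.add.at(voxel, (right, y, x), p * w_right)
    return voxel

A dense tensor like this can be fed to a standard detection backbone, whereas the asynchronous and spiking approaches in references [1] and [4] below process events without this densification step.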


References:

[1] Jianing Li, Jia Li, Lin Zhu, Xijie Xiang, Tiejun Huang, and Yonghong Tian. “Asynchronous spatiotemporal memory network for continuous event-based object detection,” IEEE Transactions on Image Processing, 2022, 31: 2975-2987.

[2] Dianze Li, Jianing Li*, and Yonghong Tian*. “SODFormer: Streaming object detection with transformer using events and frames,” IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023.

[3] Jianing Li#, Xiao Wang#, Lin Zhu, Jia Li, Tiejun Huang, and Yonghong Tian. “Retinomorphic object detection in asynchronous visual streams,” AAAI 2022. (Oral)

[4] Qiaoyi Su, Yuhong Chou, Yifan Hu, Jianing Li, Shijie Mei, Ziyang Zhang, and Guoqi Li. “Deep directly-trained spiking neural networks for object detection,” CVPR 2023.

[5] Haiqian Han, Jiacheng Lv, Jianing Li, Henglu Wei, Cheng Li, Yajing Wei, Chen Shu, and Xiangyang Ji. “Physical-based event camera simulator,” ECCV 2024.

[6] Jianing Li and Yonghong Tian. "神经形态视觉传感器的研究进展及应用综述" (A survey of the progress and applications of neuromorphic vision sensors). Chinese Journal of Computers, 2021, 44(6): 1258-1286.


Host: Bo Jiang (Anhui University)


Host bio:

Bo Jiang is a Professor and doctoral supervisor at Anhui University, Vice Dean of the School of Computer Science and Technology at Anhui University, a council member of the Anhui Provincial Computer Society, and Deputy Director of its Youth Working Committee. His research focuses on structural pattern recognition, computer vision, and medical image analysis. As first/corresponding author he has published more than 50 papers in CAS Zone-1 journals, IEEE Transactions, and CCF-A journals and conferences, including 8 first/corresponding-author papers in IEEE T-PAMI and IJCV; his most-cited single paper has over 350 citations on Google Scholar. He has led projects including a General Program of the National Natural Science Foundation of China, an Anhui Provincial Outstanding Youth project, and an Anhui Provincial Key R&D Program project. He has received the First Prize for Natural Science of the Anhui Provincial Computer Society and the Rising Star Award of the ACM Hefei Chapter.


Homepage:

https://jiangboahu.github.io/



Special thanks to the main organizers of this Webinar:

Organizing AC: Bo Jiang (Anhui University)

Co-organizing AC: Qian Zheng (Zhejiang University)


How to participate

1. The weekly VALSE Webinar is streamed live on Bilibili; search for VALSE_Webinar on Bilibili and follow us!

Live stream:

https://live.bilibili.com/22300737

Past recordings:

https://space.bilibili.com/562085182/ 


2. VALSE Webinars usually take place on Wednesdays at 20:00 (Beijing time), though the time is occasionally adjusted for speakers in other time zones. To make it easier to join, please follow the VALSE WeChat official account (valse_wechat) or join the VALSE QQ T group (group number: 863867505).


*Note: when applying to join the VALSE QQ group, you must provide your name, affiliation, and role; all three are required. After joining, please set your group nickname to your real name, role, and affiliation. Roles: T for staff at universities and research institutes; I for industry R&D; D for Ph.D. students; M for master's students.


3. The VALSE WeChat official account usually announces the next week's Webinar talks on Thursdays.


4. You can also check Webinar information directly on the VALSE homepage: http://valser.org/. Slides of each talk (with the speaker's permission) are posted at the bottom of the corresponding announcement on the VALSE website.


