
VALSE


20170412-06 Guo-Jun Qi: Loss-Sensitive Generative Adversarial Networks on Lipschitz Densities

2017-4-5 10:17 | Posted by: Cheng Yi (Institute of Computing Technology) | Views: 2831 | Comments: 0


Speaker: Guo-Jun Qi (University of Central Florida)

Time: 20:00, April 12, 2017 (Beijing time)

Title: Loss-Sensitive Generative Adversarial Networks on Lipschitz Densities

Host: Chuan-Xian Ren (Sun Yat-sen University)


Abstract: In this talk, I will present a novel Loss-Sensitive GAN (LS-GAN) that learns a loss function to separate generated samples from real examples. An important property of the LS-GAN is that it allows the generator to focus on improving poor data points that are far from real examples, rather than wasting effort on samples that have already been well generated, thereby improving the overall quality of generated samples. Theoretical analysis also shows that the LS-GAN can generate samples following the true data density. In particular, we present a regularity condition on the underlying data density, which allows us to use a class of Lipschitz losses and generators to model the LS-GAN. This relaxes the assumption that the classic GAN must have infinite modeling capacity to obtain a similar theoretical guarantee. Furthermore, we show the generalization ability of the LS-GAN by bounding the difference between the model's performance over the empirical and real distributions, and derive a tractable sample complexity for training the LS-GAN model in terms of its generalization ability. We also derive a non-parametric solution that characterizes the upper and lower bounds of the losses learned by the LS-GAN, both of which are cone-shaped and have non-vanishing gradient almost everywhere. This shows there will be sufficient gradient to update the generator of the LS-GAN even if the loss function is over-trained, relieving the vanishing gradient problem in the classic GAN.
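To make the margin-based objective above concrete, here is a minimal NumPy sketch of the two LS-GAN objectives: the learned loss function is trained so that a real sample's loss is smaller than a generated sample's loss by at least a data-dependent margin, and the hinge term vanishes for samples that are already well generated. The function names and the `lam` weight are illustrative; networks, margins, and the optimization loop from the paper are omitted, so treat this as a sketch rather than a reference implementation.

```python
import numpy as np

def lsgan_loss_objective(loss_real, loss_fake, margin, lam=1.0):
    """Objective for the learned loss function L of the LS-GAN.

    loss_real: L(x) on real samples, shape (batch,)
    loss_fake: L(G(z)) on generated samples, shape (batch,)
    margin:    data-dependent margin Delta(x, G(z)), shape (batch,)

    Minimizes E[L(x)] + lam * E[(margin + L(x) - L(G(z)))_+], i.e.
    generated samples should incur a loss larger than real ones by
    at least the margin; once they do, the hinge term is zero.
    """
    hinge = np.maximum(0.0, margin + loss_real - loss_fake)
    return np.mean(loss_real) + lam * np.mean(hinge)

def lsgan_generator_objective(loss_fake):
    """Generator objective: minimize E[L(G(z))]."""
    return np.mean(loss_fake)

# Generated samples far beyond the margin contribute no hinge penalty,
# so training effort concentrates on poorly generated points:
real = np.array([0.2, 0.1])
fake = np.array([2.0, 1.8])
delta = np.array([1.0, 1.0])
print(lsgan_loss_objective(real, fake, delta))  # hinge term is zero here
```

This hinge structure is what gives the non-vanishing gradients discussed above: the loss stays cone-shaped around real data instead of saturating.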

We also extend the unsupervised LS-GAN to a conditional model generating samples based on given conditions, and show its applications in both supervised and semi-supervised learning problems.

We conduct experiments comparing different models on both generation and classification tasks, and show that the LS-GAN is resilient against vanishing gradients and mode collapse even with an overtrained loss function or mildly changed network architectures.


Paper: Loss-Sensitive Generative Adversarial Networks on Lipschitz Densities, Guo-Jun Qi, arXiv:1701.06264, 2017


Speaker bio: Dr. Qi is an assistant professor of Computer Science at the University of Central Florida. Prior to joining UCF, he was a Research Staff Member at the IBM T.J. Watson Research Center (Yorktown Heights, NY). He worked with Professor Thomas Huang in the Image Formation and Processing Group at the Beckman Institute at the University of Illinois at Urbana-Champaign, and received his Ph.D. in Electrical and Computer Engineering in December 2013. His main research interests include computer vision, pattern recognition, data mining, and multimedia computing. In particular, he is interested in information and knowledge discovery, analysis, and aggregation from multiple data sources of diverse modalities (e.g., images, audio, sensors, and text). His research also aims at effectively leveraging and aggregating data shared in open connected environments (e.g., social, sensor, and mobile networks), as well as developing computational models and theory for general-purpose knowledge and information systems.

Dr. Qi's research has been published in several venues, including CVPR, ICCV, ACM Multimedia, KDD, ICML, IEEE T. PAMI, IEEE T. KDE, and Proceedings of the IEEE. He has served or will serve as Area Chair (Senior Program Committee Member) for ICCV, ACM Multimedia, KDD, CIKM, and ICME. He was also a Program Committee Chair for MMM 2016. In addition, he has co-edited two special issues, "Deep Learning for Multimedia Computing" and "Big Media Data: Understanding, Search and Mining", for IEEE T. Multimedia and IEEE T. Big Data, respectively.


Special thanks to the main organizers of this Webinar
VOOC committee member: Yulan Guo (National University of Defense Technology)
VODB coordinating directors: Wei Jia (Hefei University of Technology), Jiwen Lu (Tsinghua University)


