
20181017-32 Quanming Yao: Efficient Learning of Nonconvex Sparse and Low-rank Models

2018-10-11 17:09 | Posted by: 程一 (Institute of Computing Technology)


Speaker: Quanming Yao (Researcher, 4Paradigm)

Time: Wednesday, October 17, 2018, 20:00 (Beijing time)

Title: Efficient Learning of Nonconvex Sparse and Low-rank Models

Host: Nannan Wang (Xidian University)


Speaker bio:

Dr. Quanming Yao is a machine learning researcher at 4Paradigm, where he works on automated machine learning. He received his Ph.D. from the Department of Computer Science and Engineering at the Hong Kong University of Science and Technology (HKUST) in 2018, and his bachelor's degree from the Department of Electronics and Information Engineering at Huazhong University of Science and Technology (HUST) in 2013. His honors include the 2016 Google PhD Fellowship (one of only 13 recipients worldwide), the 2014 Tse-Cheuk-Ng-Tai Research Excellence Award at HKUST (sole recipient university-wide), and HUST's 2013 Qiming Star award (one of 5 university-wide). He has authored 21 papers in top international journals and conferences, including JMLR, TPAMI, TKDE, TIP, and MLJ, as well as ICML, NIPS, KDD, ICLR, AAAI, and IJCAI.


Homepage:

http://www.cse.ust.hk/~qyaoaa/


Abstract:

The use of convex regularizers allows for easy optimization, though they often produce biased estimates and inferior prediction performance. Recently, nonconvex regularizers have attracted a lot of attention and outperformed convex ones. However, the resulting optimization problem is much harder. In this talk, I will introduce my work on efficient learning of nonconvex sparse and low-rank models.

In the first part, I will talk about a general transformation scheme for nonconvex sparse regularization. It helps move the nonconvexity from the regularizer to the loss. The nonconvex regularizer is then transformed into a familiar convex regularizer, while the resulting loss function can still be guaranteed to be smooth. Learning with the convexified regularizer can be performed by existing efficient algorithms originally designed for convex regularizers (such as the standard proximal algorithm and the Frank-Wolfe algorithm). In the second part, I will introduce an efficient algorithm for nonconvex low-rank regularization. We show that for many commonly used regularizers, a cutoff can be derived to automatically threshold the singular values obtained from the proximal operator. This allows the operator to be efficiently approximated by the power method. Based on this, we develop a proximal gradient algorithm (and its accelerated variant) with inexact proximal splitting, and prove a convergence rate of O(1/T), where T is the number of iterations.
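To make the first idea concrete, below is a minimal NumPy sketch of the nonconvexity-redistribution scheme of [1, 3], under our own assumptions: MCP (the minimax concave penalty) is chosen as the example nonconvex regularizer, the loss is least squares, and all function names and hyper-parameter values are illustrative rather than the speaker's reference implementation. MCP is split as lam * |x| plus a smooth concave remainder kappa(x); kappa is absorbed into the loss, which stays smooth, and standard proximal gradient with soft-thresholding then handles the remaining convex L1 term.

```python
import numpy as np

def mcp(x, lam, theta):
    """Minimax concave penalty (MCP), applied elementwise."""
    ax = np.abs(x)
    return np.where(ax <= theta * lam,
                    lam * ax - x ** 2 / (2 * theta),
                    0.5 * theta * lam ** 2)

def grad_kappa(x, lam, theta):
    """Gradient of kappa(x) = MCP(x) - lam * |x|: the smooth, concave part
    of the penalty that is redistributed into the loss."""
    return np.where(np.abs(x) <= theta * lam, -x / theta, -lam * np.sign(x))

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1 (the convexified regularizer)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def redistributed_prox_gradient(A, b, lam=0.1, theta=5.0, n_iter=500):
    """Proximal gradient on the augmented smooth loss
    0.5 * ||Ax - b||^2 + sum_i kappa(x_i), plus the convex term lam * ||x||_1."""
    x = np.zeros(A.shape[1])
    # Step size from a Lipschitz bound: ||A||_2^2 for the quadratic loss,
    # plus the 1/theta contributed by kappa.
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + 1.0 / theta)
    for _ in range(n_iter):
        g = A.T @ (A @ x - b) + grad_kappa(x, lam, theta)
        x = soft_threshold(x - step * g, step * lam)
    return x

# Illustrative usage on a synthetic sparse-recovery problem.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 200))
x_true = np.zeros(200)
x_true[:5] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = redistributed_prox_gradient(A, b)
obj = 0.5 * np.sum((A @ x_hat - b) ** 2) + mcp(x_hat, 0.1, 5.0).sum()
```

For the second idea [2, 4], the sketch below assumes only what the abstract states: the exact proximal operator of many nonconvex low-rank regularizers zeroes every singular value at or below a regularizer-specific cutoff, so only the leading singular triplets are needed, and these can be approximated by block power iteration. The cutoff, the scalar shrinkage `prox_scalar`, and the rank guess `k` all depend on the particular regularizer and are taken here as assumed inputs; the closing example uses nuclear-norm shrinkage purely to show the mechanics.

```python
def topk_svd_power(Z, k, n_iter=20, seed=0):
    """Approximate rank-k SVD of Z by block power iteration."""
    rng = np.random.default_rng(seed)
    Q = rng.standard_normal((Z.shape[1], k))
    for _ in range(n_iter):
        # One step of power iteration on Z^T Z, re-orthonormalized.
        Q, _ = np.linalg.qr(Z.T @ (Z @ Q))
    B = Z @ Q                          # m x k projection of Z
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    return U, s, Vt @ Q.T              # Z ~= U @ diag(s) @ (Vt @ Q.T)

def approx_prox_lowrank(Z, prox_scalar, cutoff, k):
    """Approximate proximal step for a low-rank penalty: singular values
    at or below `cutoff` are dropped; the survivors are shrunk by
    `prox_scalar`, which encodes the particular regularizer."""
    U, s, Vt = topk_svd_power(Z, k)
    keep = s > cutoff
    if not np.any(keep):
        return np.zeros_like(Z)
    return (U[:, keep] * prox_scalar(s[keep])) @ Vt[keep]

# Mechanics check with nuclear-norm shrinkage (cutoff lam, shrink by lam);
# a nonconvex penalty would only change `prox_scalar` and `cutoff`.
lam = 0.5
Z = np.random.default_rng(1).standard_normal((50, 40))
X = approx_prox_lowrank(Z, prox_scalar=lambda sv: sv - lam, cutoff=lam, k=10)
```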


References:

[1] Quanming Yao, James T. Kwok. Efficient Learning with Nonconvex Regularizers by Nonconvexity Redistribution. Journal of Machine Learning Research (JMLR), 2018.

[2] Quanming Yao, James T. Kwok, Taifeng Wang, Tie-Yan Liu. Large-Scale Low-Rank Matrix Learning with Nonconvex Regularizers. IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2018.

[3] Quanming Yao, James T. Kwok. Efficient Learning with a Family of Nonconvex Regularizers by Redistributing Nonconvexity. International Conference on Machine Learning (ICML), 2016.

[4] Quanming Yao, James T. Kwok, Wenliang Zhong. Fast Low-Rank Matrix Learning with Nonconvex Regularization. IEEE International Conference on Data Mining (ICDM), 2015.


How to join VALSE online Webinar 18-32:


Long-press or scan the QR code below to follow the "VALSE" WeChat official account (valse_wechat), then reply "32期" (session 32) to the account to receive the live-stream link.



Special thanks to the main organizer of this Webinar:

Nannan Wang (Xidian University)



How to participate:

1. VALSE Webinars run on a live-streaming platform. During the session the speaker uploads slides or shares the screen; the audience can view the slides, hear the speaker's voice, and interact with the speaker through the chat function.

2. To participate, follow the VALSE WeChat official account (valse_wechat) or join a VALSE QQ group (groups A through G are currently full; except for speakers and other invited guests, new members may only apply to join VALSE group H, group number 701662399).

*Note: Applications to join a VALSE QQ group must include your name, affiliation, and status; all three are required. After joining, please set your real name in the format name-status-affiliation. Status codes: T for faculty and researchers at universities or research institutes, I for industry R&D staff, D for Ph.D. students, M for master's students.

3. About 5 minutes before the session starts, the speaker opens the live stream; the audience joins by clicking the stream link. Windows PCs, Macs, mobile phones, and other devices are supported.

4. During the session, please avoid off-topic messages so as not to disrupt the event.

5. If you cannot hear the audio or see the video during the session, leaving and rejoining usually fixes the problem.

6. A fast network connection is strongly recommended; prefer a wired connection.

7. Each Monday the VALSE WeChat official account posts a summary and (with the speaker's permission) the video of the previous week's Webinar; each Thursday it announces the next week's Webinar and its live-stream link.


[slides]
