Paper title: PreNAS: Preferred One-Shot Learning Towards Efficient Neural Architecture Search

Authors: Haibin Wang (Alibaba), Ce Ge (Alibaba, co-first author), Hesen Chen (Alibaba), Xiuyu Sun (Alibaba)

Bilibili video link:

Abstract: The wide application of pre-trained models is driving the trend of once-for-all training in one-shot neural architecture search (NAS). However, training within a huge sample space damages the performance of individual subnets and requires much computation to search for an optimal model. In this paper, we present PreNAS, a search-free NAS approach that accentuates target models in one-shot training. Specifically, the sample space is dramatically reduced in advance by a zero-cost selector, and weight-sharing one-shot training is performed on the preferred architectures to alleviate update conflicts. Extensive experiments have demonstrated that PreNAS consistently outperforms state-of-the-art one-shot NAS competitors for both Vision Transformer and convolutional architectures, and importantly, enables instant specialization with zero search cost.

Paper information:
[1] Haibin Wang, Ce Ge, Hesen Chen, Xiuyu Sun. PreNAS: Preferred One-Shot Learning Towards Efficient Neural Architecture Search. ICML 2023.

Paper link: [https://arxiv.org/pdf/2304.14636v1.pdf]

Code link: [https://github.com/tinyvision/PreNAS]

Speaker bio: Haibin Wang is an algorithm engineer at Alibaba with a master's degree from Peking University. He has published multiple papers at top international conferences and journals such as ICML, AIJ, and ACL; his research interests include social computing, NL2SQL, and neural architecture search.

Special thanks to the main organizers of this paper-review session:
Monthly rotating AC: Shuai Yang (Nanyang Technological University)
Quarterly rotating AC: Lei Zhang (Chongqing University)
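For readers who want a concrete feel for the pipeline the abstract describes, here is a minimal Python sketch of the preferred one-shot idea: score many candidate architectures with a cheap zero-cost proxy, keep only the top-scoring ones, and run weight-sharing training restricted to that preferred set. The toy search space, the proxy, and the training loop below are illustrative placeholders of my own, not the paper's actual implementation; PreNAS's real selector and supernet are in the linked repository.

```python
import random

def sample_architecture():
    """Draw a random encoding from a toy ViT-like space (placeholder)."""
    return {
        "depth": random.choice([12, 14, 16]),
        "embed_dim": random.choice([192, 256, 320]),
        "mlp_ratio": random.choice([3.0, 3.5, 4.0]),
    }

def zero_cost_score(arch):
    """Stand-in for a zero-cost proxy. Real proxies use training-free
    signals (e.g., gradient saliency); here we just use a size heuristic
    so the sketch stays self-contained."""
    return arch["depth"] * arch["embed_dim"] * arch["mlp_ratio"]

def select_preferred(num_candidates=1000, keep=16):
    """Reduce the sample space in advance: score many candidates for
    free and keep only the top-scoring "preferred" architectures."""
    candidates = [sample_architecture() for _ in range(num_candidates)]
    candidates.sort(key=zero_cost_score, reverse=True)
    return candidates[:keep]

def one_shot_train(preferred, steps=100):
    """Sketch of weight-sharing one-shot training restricted to the
    preferred subnets: each step activates one kept architecture, so
    shared-weight updates no longer conflict across the huge original
    space. The forward/backward pass is elided."""
    for _ in range(steps):
        arch = random.choice(preferred)  # sample only from the preferred set
        _ = arch  # supernet forward/backward on this subnet would go here

if __name__ == "__main__":
    preferred = select_preferred()
    one_shot_train(preferred)
    print(f"trained on {len(preferred)} preferred architectures; "
          "each can be deployed directly, i.e., zero search cost")
```

Because selection happens before training, any architecture in the preferred set can be specialized instantly after the single training run, which is the "search-free" property the abstract highlights.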