CFP: WORKSHOP @ ICDM 2015
CALL FOR PAPERS: PRACTICAL TRANSFER LEARNING WORKSHOP @ ICDM 2015 (Nov 14, 2015 in Atlantic City, NJ, USA)
IEEE International Conference on Data Mining (ICDM) Workshops on Practical Transfer Learning 2015: https://sites.google.com/site/icdmwptl2015 or http://lxduan.info/cfp_icdm_workshop.html
Submission deadline: Jul 20, 2015 (23:59 PST)
Submission site: https://wi-lab.com/cyberchair/2015/icdm15/scripts/submit.php?subarea=S26
Acceptance notification: Sep 1, 2015

SCOPE
Transfer Learning (TL) has attracted growing attention over the last two decades. It has been successfully applied to many applications, including text mining, video event recognition, sensor-based prediction problems, software engineering, image categorization, and so forth. With advances in data storage and Internet technology, data have become more massive, noisier, and more complex, which brings both opportunities and challenges for TL: how to make practical use of massive data, and how to effectively deal with data from different domains, must be addressed in this era of big data. The main purpose of this workshop is to document recent progress of transfer learning in different real-world applications and to stimulate discussion about potential challenges that may open new directions for TL. We welcome not only manuscripts dedicated to solving traditional transfer learning problems, but also those that discuss approaches and/or theories for handling new TL issues that arise when exploiting massive data of different formats or structures.

TOPICS
Manuscripts are solicited on a wide range of topics, including traditional transfer learning topics as well as topics related to practical transfer learning. The topics listed below are provided for authors' reference; submissions are not restricted to them:
- Transfer learning for big data: the source/target data is large-scale
- Transfer learning with noisy data: the source/target labels contain noise
- Transfer learning with multi-view data: the source/target data can be represented with multiple views of features
- Transfer learning with additional source features: the source data contains one or more features that are not available in the target domain
- Transfer learning for social networks/bioinformatics networks/sensor networks: the source/target data is structured social network/bioinformatics network/sensor network data
- Transfer learning for non-i.i.d. and/or heterogeneous data: the source/target data is non-i.i.d. or represented with different features
- Zero-shot learning: no examples available in the target task
- One-shot learning: only a very few examples available in the target task
- Transfer learning with multiple source domains
- Transfer learning from multiple structured data sources
- Theoretical analysis of transfer learning algorithms for the above problems
IMPORTANT DATES
Paper submission deadline: Jul 20, 2015
Acceptance notification: Sep 1, 2015

SUBMISSION
Paper submissions are limited to a maximum of 10 pages in the IEEE 2-column format and will follow the guidelines for ICDM'15 submission. All papers will be reviewed by the Program Committee based on technical quality, relevance to the workshop theme, originality, significance, and clarity. A double-blind reviewing process will be adopted; authors should therefore avoid including identifying information in the text of the paper.

ORGANIZERS
- Lixin Duan, Machine Learning Scientist, Amazon
- Wen Li, Research Associate, Nanyang Technological University
- Sinno Jialin Pan, Nanyang Assistant Professor, Nanyang Technological University
- Lin Chen, Research Scientist, Amazon
- Ivor Wai-Hung Tsang, Associate Professor, University of Technology, Sydney
- Dong Xu, Associate Professor, Nanyang Technological University