While deep learning (DL) has achieved great success in big-data applications, transfer learning (TL) is an important paradigm for applications with small or insufficient data: it utilizes the data/knowledge from one task to facilitate learning in another, related task. How to integrate DL and TL to combine their advantages is an interesting and important research topic. Deep-Transfer Learning (DTL) has been proposed to address this issue: DL extracts knowledge from big data, which TL can then reuse in a new task/domain with only small or insufficient data.
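A minimal sketch of this idea, assuming PyTorch and torchvision are available and using stand-in random tensors for the small target dataset: an ImageNet-pretrained network supplies the knowledge extracted from big data (the DL side), and only a small task-specific head is fine-tuned for a hypothetical 10-class target task (the TL side).

# Minimal deep-transfer-learning sketch (assumes PyTorch and torchvision are installed).
import torch
import torch.nn as nn
from torchvision import models

# DL side: knowledge extracted from big data, here ImageNet-pretrained weights.
# (Newer torchvision versions prefer the weights=... argument instead of pretrained=True.)
model = models.resnet18(pretrained=True)

# TL side: freeze the transferred feature extractor ...
for param in model.parameters():
    param.requires_grad = False

# ... and attach a new head for the small-data target task (10 classes assumed).
model.fc = nn.Linear(model.fc.in_features, 10)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Stand-in batch for the (hypothetical) small target-domain dataset.
inputs = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 10, (8,))

# Fine-tune only the new head on the small target data.
for _ in range(3):
    optimizer.zero_grad()
    loss = criterion(model(inputs), labels)
    loss.backward()
    optimizer.step()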
Computational intelligence (CI) techniques, most notably neural networks, fuzzy logic, and evolutionary computation, can be valuable in DTL. For example:
- Neural networks (NNs) are the cornerstone of DL.
- Hierarchical/cascaded fuzzy logic systems (FLSs) and fuzzy NNs may be viewed as fuzzy-rule-based DL models. FLSs can also capture interpretable knowledge, which may be easily transferable to a new domain/task. Therefore, fuzzy logic is expected to play an important role in integrating DL and TL.
- Evolutionary computation (EC) has been widely used to optimize shallow NNs and FLSs. TL can also be viewed as an evolutionary learning strategy, because it adapts a model to a changing environment. It will be interesting to see novel applications of EC in DTL.
- Other emerging forms of CI, such as (but not limited to) probabilistic computation, swarm intelligence, and artificial immune systems, can also contribute to DTL from different perspectives.
The aims of this special issue are to (1) present state-of-the-art research on novel CI-based DTL methods and their applications, and (2) provide a forum for researchers to disseminate their views on future perspectives of the field.
II. TOPICS
Topics of interest for this special issue include, but are not limited to:
Theory and Methods:
- DTL theory and algorithms
- Fuzzy-logic- and fuzzy-set-based DTL
- Neural-network-based DTL
- Evolutionary computation for DTL
- Novel/emerging forms of CI (in addition to NN/FLS/EC) in DTL
- Uncertainty-theory-based DTL
- DTL for feature learning, classification, regression, and clustering
- DTL for multi-task modeling, multi-view modeling, and co-learning
Applications:
- CI-based DTL for video analysis, text processing, and natural language processing
- CI-based DTL for brain-machine interfaces and medical signal analysis
III. SUBMISSIONS
Manuscripts should be prepared according to the “Information for Authors” section of the journal (http://cis.ieee.org/ieee-transactions-on-emerging-topics-in-computational-intelligence.html), and submissions should be made through the journal submission website, https://mc.manuscriptcentral.com/tetci-ieee, by selecting the Manuscript Type “New Advances in Deep-Transfer Learning” and clearly marking “New Advances in Deep Transfer Learning Special Issue Paper” as a comment to the Editor-in-Chief. Submitted papers will be reviewed by at least three expert reviewers. Submission of a manuscript implies that it is the authors’ original, unpublished work and is not under consideration for publication elsewhere.
IV. IMPORTANT DATES
Paper submission deadline: June 30, 2018
Notice of the first round review results: September 15, 2018
Revision due: November 15, 2018
Final notice of acceptance/rejection: December 15, 2018
V. GUEST EDITORS
Zhaohong Deng, Jiangnan University, China; dengzhaohong@jiangnan.edu.cn
Jie Lu, University of Technology Sydney, Australia; jie.lu@uts.edu.au
Dongrui Wu, Huazhong University of Science and Technology, China; drwu@hust.edu.cn
Kup-Sze Choi, Hong Kong Polytechnic University, Hong Kong, China; kschoi@ieee.org
Shiliang Sun, East China Normal University, China; slsun@cs.ecnu.edu.cn
Yusuke Nojima, Osaka Prefecture University, Japan; nojima@cs.osakafu-u.ac.jp