陈霸东 (Badong Chen)

Professor, Doctoral Supervisor

Entropy Special Issue "Information Theoretic Learning", Deadline: 15 December 2015

Posted: 2015-03-18

http://www.mdpi.com/journal/entropy/special_issues/TheoreticLearning

Dear Colleague,

In the past decades, and especially in recent years, entropy and related information theoretic measures (e.g., mutual information) have been successfully applied in machine learning (supervised or unsupervised) and signal processing. Information theoretic quantities can capture higher-order statistics and offer potentially significant performance improvements in machine learning applications. In information theoretic learning (ITL), measures from information theory (entropy, mutual information, divergences, etc.) are used as optimization costs in place of conventional second-order statistical measures such as variance and covariance. For example, a supervised learning problem such as regression can be formulated as minimizing the entropy of the error between the model output and the desired response; in ITL this optimization criterion is called the minimum error entropy (MEE) criterion. Information theoretic learning also links information theory, nonparametric estimators, and reproducing kernel Hilbert spaces (RKHS) in a simple and unconventional way. In particular, correntropy, a nonlinear similarity measure in kernel space, has its roots in Rényi's entropy. Since correntropy (especially with a small kernel bandwidth) is insensitive to outliers, it is naturally a robust cost for machine learning. The correntropy-induced metric (CIM), as an approximation of the l0 norm, can also be used as a sparsity penalty in sparse learning.
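Purely as illustration, the following is a minimal Python sketch of the quantities mentioned above, using the standard Gaussian-kernel (Parzen) estimators from the ITL literature; the function names, default bandwidth, and toy data are assumptions of this sketch, not anything prescribed by the special issue.

import numpy as np

def gaussian_kernel(u, sigma):
    """Gaussian kernel G_sigma(u) = exp(-u^2 / (2*sigma^2)) / (sqrt(2*pi)*sigma)."""
    return np.exp(-(u ** 2) / (2 * sigma ** 2)) / (np.sqrt(2 * np.pi) * sigma)

def correntropy(x, y, sigma=1.0):
    """Sample correntropy V(X, Y) = E[G_sigma(X - Y)], estimated as the
    mean kernel similarity over paired samples."""
    return np.mean(gaussian_kernel(x - y, sigma))

def cim(x, y, sigma=1.0):
    """Correntropy-induced metric CIM(X, Y) = sqrt(G_sigma(0) - V(X, Y)).
    It saturates for large differences, which is why it can approximate
    the l0 norm and serve as a sparsity penalty."""
    return np.sqrt(gaussian_kernel(0.0, sigma) - correntropy(x, y, sigma))

def error_entropy(e, sigma=1.0):
    """Renyi's quadratic entropy of the error, H2(e) = -log V(e), with the
    information potential V(e) = (1/N^2) * sum_ij G_{sqrt(2)*sigma}(e_i - e_j)
    obtained from a Gaussian Parzen estimate of the error density.
    The MEE criterion minimizes H2, i.e., maximizes V."""
    pairwise = e[:, None] - e[None, :]  # N x N matrix of error differences
    return -np.log(np.mean(gaussian_kernel(pairwise, np.sqrt(2.0) * sigma)))

# Toy check of robustness: one gross outlier barely moves the error entropy,
# while it would dominate a mean-squared-error cost.
rng = np.random.default_rng(0)
e = 0.1 * rng.standard_normal(200)
print(error_entropy(e), error_entropy(np.append(e, 50.0)))

In the sketch, shrinking sigma makes correntropy (and hence the CIM) discount large deviations more aggressively, which is the outlier insensitivity noted above.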

In this Special Issue, we seek contributions that apply information theoretic quantities (entropy, mutual information, divergences, etc.) and related measures, such as correntropy, to machine learning problems. The scope is broad, covering both theoretical research and practical applications in regression, classification, clustering, graph and kernel learning, deep learning, and beyond.

Prof. Dr. Badong Chen
Prof. Dr. Jose C. Principe
Guest Editors

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website; once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for the submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss Francs).