A family of Chisini mean based Jensen-Shannon divergence kernels

Document Type

Conference Paper

Abstract

Jensen-Shannon divergence is an effective method for measuring the distance between two probability distributions. When the difference between two distributions is subtle, however, Jensen-Shannon divergence does not provide adequate separation to distinguish them. We extend Jensen-Shannon divergence by reformulating it with alternate operators that yield different robustness properties. Furthermore, we prove a number of important properties of this extension: the lower limits of its range and its relationship to Shannon entropy and Kullback-Leibler divergence. Finally, we propose a family of new kernels based on Chisini mean Jensen-Shannon divergence and demonstrate its utility in providing better SVM classification accuracy than RBF kernels on amino acid spectra. Because spectral methods capture phenomena at subatomic levels, differences between complex compounds can often be subtle. While the impetus behind this work began with spectral data, the methods are generally applicable to domains where subtle differences are important.
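
Illustrative sketch (not from the paper): the classical Jensen-Shannon divergence compares each distribution against the arithmetic-mean mixture M = (P + Q)/2. One plausible reading of the abstract is that the Chisini (generalized) mean family — arithmetic, geometric, harmonic, and others — replaces this arithmetic mixture, and the resulting divergence is wrapped in a kernel for SVM classification. The sketch below assumes exactly that; the function names (chisini_mixture, jsd_gram_matrix), the exponential kernel form exp(-gamma * JSD), and the gamma parameter are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def chisini_mixture(p, q, kind="arithmetic"):
    # Mixture distribution built from a Chisini-type mean. "arithmetic"
    # recovers the classical JSD mixture (p + q) / 2; geometric and
    # harmonic are assumed examples of the paper's alternate operators.
    if kind == "arithmetic":
        m = 0.5 * (p + q)
    elif kind == "geometric":
        m = np.sqrt(p * q)
    elif kind == "harmonic":
        m = 2.0 * p * q / np.maximum(p + q, 1e-300)
    else:
        raise ValueError(f"unknown mean: {kind}")
    return m / m.sum()  # renormalize so m is a proper distribution

def kl_divergence(p, q):
    # Kullback-Leibler divergence in bits, with the convention 0*log(0) = 0.
    # Assumes q > 0 wherever p > 0 (true for smoothed, strictly positive spectra).
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q, kind="arithmetic"):
    # Chisini-mean variant of Jensen-Shannon divergence;
    # kind="arithmetic" is the classical definition.
    m = chisini_mixture(p, q, kind=kind)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

def jsd_gram_matrix(X, Y=None, kind="arithmetic", gamma=1.0):
    # Gram matrix for a precomputed SVM kernel K(p, q) = exp(-gamma * JSD).
    # Rows of X and Y are probability distributions (e.g., normalized spectra).
    Y = X if Y is None else Y
    K = np.empty((len(X), len(Y)))
    for i, p in enumerate(X):
        for j, q in enumerate(Y):
            K[i, j] = np.exp(-gamma * js_divergence(p, q, kind=kind))
    return K

# Usage with scikit-learn's precomputed-kernel SVM (X_train/X_test are
# row-normalized, strictly positive spectra; y_train are class labels):
#   from sklearn.svm import SVC
#   clf = SVC(kernel="precomputed")
#   clf.fit(jsd_gram_matrix(X_train, kind="geometric"), y_train)
#   labels = clf.predict(jsd_gram_matrix(X_test, X_train, kind="geometric"))
```

For the arithmetic-mean case, classical JSD is a negative definite function, so exp(-gamma * JSD) is a valid positive-definite kernel; this sketch does not establish positive definiteness for the other means, which is part of what makes a formal treatment like the paper's necessary.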

Publication Title

Proceedings - 2015 IEEE 14th International Conference on Machine Learning and Applications, ICMLA 2015

Publication Date

2016

First Page

109

Last Page

115

ISBN

9781509002870

DOI

10.1109/ICMLA.2015.86

Keywords

Jensen-Shannon divergence, Kernel, LIBS

APA Citation

Sharma, P. K., Holness, G., Markushin, Y., & Melikechi, N. (2015, December). A family of Chisini mean based Jensen-Shannon divergence kernels. In 2015 IEEE 14th International Conference on Machine Learning and Applications (ICMLA) (pp. 109-115). IEEE.
