Computer Science

Dilation of Chisini-Jensen-Shannon divergence

Document Type

Conference Paper

Abstract

Jensen-Shannon divergence (JSD) does not provide adequate separation when the difference between input distributions is subtle. A recently introduced technique, the Chisini-Jensen-Shannon divergence (CJSD), increases JSD's ability to discriminate between probability distributions by reformulating JSD with Chisini mean operators. As a consequence, CJSDs also carry additional robustness properties. The utility of this approach was validated in the form of two SVM kernels that give superior classification performance. Our work explores why this reformulation improves upon JSD. We characterize the nature of the improvement through the idea of relative dilation, that is, how the Chisini mean transforms JSD's range, and prove a number of propositions that establish the degree of this separation. Finally, we provide empirical validation on a synthetic dataset that confirms our theoretical results on relative dilation.
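For readers unfamiliar with the construction, the sketch below illustrates one plausible reading of the reformulation described in the abstract: the arithmetic-mean mixture in the standard JSD is replaced by a Chisini mean, here the pointwise geometric mean. The exact operator and normalization used by Sharma and Holness should be taken from the paper itself; the function names, the choice of the geometric mean, and the toy distributions are illustrative assumptions only.

```python
import numpy as np


def jsd(p, q, eps=1e-12):
    """Standard Jensen-Shannon divergence: KL terms against the arithmetic-mean mixture."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)  # arithmetic mean of the two distributions
    return 0.5 * np.sum(p * np.log2(p / m)) + 0.5 * np.sum(q * np.log2(q / m))


def cjsd_geometric(p, q, eps=1e-12):
    """Illustrative CJSD variant (assumption): the mixture is the pointwise geometric mean.

    The geometric mean is one member of the Chisini family; because it never exceeds
    the arithmetic mean, the resulting divergence values are stretched (dilated)
    relative to standard JSD.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    g = np.sqrt(p * q)  # Chisini (geometric) mean, used here without renormalization
    return 0.5 * np.sum(p * np.log2(p / g)) + 0.5 * np.sum(q * np.log2(q / g))


# Two nearly identical distributions: the geometric-mean variant reports a
# noticeably larger value, illustrating the dilation of JSD's range.
p = [0.30, 0.30, 0.40]
q = [0.28, 0.33, 0.39]
print(f"JSD  = {jsd(p, q):.6f}")             # ~0.0008
print(f"CJSD = {cjsd_geometric(p, q):.6f}")  # ~0.0016
```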

Publication Title

Proceedings - 3rd IEEE International Conference on Data Science and Advanced Analytics, DSAA 2016

Publication Date

2016

First Page

174

Last Page

183

ISBN

9781509052066

DOI

10.1109/DSAA.2016.25

Keywords

Chisini-Jensen-Shannon divergence, CJSD Kernel, Dilation

APA Citation

Sharma, P. K., & Holness, G. (2016, October). Dilation of Chisini-Jensen-Shannon divergence. In 2016 IEEE International Conference on Data Science and Advanced Analytics (DSAA) (pp. 174-183). IEEE.
