TY  - JOUR
TI  - Delta Divergence: A Novel Decision Cognizant Measure of Classifier Incongruence
EP  - 2343
AV  - public
Y1  - 2019/06//
KW  - Classifier incongruence
KW  - divergence clutter
KW  - f-divergences
KW  - total variation distance
ID  - discovery10067254
N2  - In pattern recognition, disagreement between two classifiers regarding the predicted class membership of an observation can be indicative of an anomaly and its nuance. Since, in general, classifiers base their decisions on class a posteriori probabilities, the most natural approach to detecting classifier incongruence is to use a divergence measure. However, existing divergences are not particularly suitable for gauging classifier incongruence. In this paper, we postulate the properties that such a divergence measure should satisfy and propose a novel divergence measure, referred to as delta divergence. In contrast to existing measures, it focuses on the dominant (most probable) hypotheses and thus reduces the effect of the probability mass distributed over the non-dominant hypotheses (clutter). The proposed measure satisfies other important properties, such as symmetry and independence of classifier confidence. The relationship of the proposed divergence to several baseline measures, and its superiority over them, is demonstrated experimentally.
N1  - This work is licensed under a Creative Commons Attribution 3.0 License. For more information, see http://creativecommons.org/licenses/by/3.0/
IS  - 6
VL  - 49
SP  - 2331
JF  - IEEE Transactions on Cybernetics
A1  - Kittler, J
A1  - Zor, C
SN  - 2168-2275
UR  - https://doi.org/10.1109/TCYB.2018.2825353
ER  -