TY - JOUR
T1 - Towards explaining anomalies
T2 - A deep Taylor decomposition of one-class models
AU - Kauffmann, Jacob
AU - Müller, Klaus Robert
AU - Montavon, Grégoire
N1 - Funding Information:
This research was supported by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00451, No. 2017-0-01779); the Deutsche Forschungsgemeinschaft (DFG) [grant MU 987/17-1]; the German Ministry for Education and Research as Berlin Big Data Center (BBDC) [01IS14013A] and Berlin Center for Machine Learning (BZML) [01IS18037A]. We are grateful to Guido Schwenk for the valuable discussion.
Publisher Copyright:
© 2020
PY - 2020/5
Y1 - 2020/5
N2 - Detecting anomalies in data is a common machine learning task, with numerous applications in the sciences and industry. In practice, it is not always sufficient to reach high detection accuracy; one would also like to understand why a given data point has been predicted to be anomalous. We propose a principled approach for one-class SVMs (OC-SVMs) that draws on the novel insight that these models can be rewritten as distance/pooling neural networks. This ‘neuralization’ step lets us apply deep Taylor decomposition (DTD), a methodology that leverages the model structure to quickly and reliably explain decisions in terms of input features. The proposed method (called ‘OC-DTD’) is applicable to a number of common distance-based kernel functions, and it outperforms baselines such as sensitivity analysis, distance to nearest neighbor, and edge detection.
AB - Detecting anomalies in data is a common machine learning task, with numerous applications in the sciences and industry. In practice, it is not always sufficient to reach high detection accuracy; one would also like to understand why a given data point has been predicted to be anomalous. We propose a principled approach for one-class SVMs (OC-SVMs) that draws on the novel insight that these models can be rewritten as distance/pooling neural networks. This ‘neuralization’ step lets us apply deep Taylor decomposition (DTD), a methodology that leverages the model structure to quickly and reliably explain decisions in terms of input features. The proposed method (called ‘OC-DTD’) is applicable to a number of common distance-based kernel functions, and it outperforms baselines such as sensitivity analysis, distance to nearest neighbor, and edge detection.
KW - Deep Taylor decomposition
KW - Explainable machine learning
KW - Kernel machines
KW - Outlier detection
KW - Unsupervised learning
UR - http://www.scopus.com/inward/record.url?scp=85078086602&partnerID=8YFLogxK
U2 - 10.1016/j.patcog.2020.107198
DO - 10.1016/j.patcog.2020.107198
M3 - Article
AN - SCOPUS:85078086602
SN - 0031-3203
VL - 101
JO - Pattern Recognition
JF - Pattern Recognition
M1 - 107198
ER -