
An investigation into creating counterfactual examples for non-linear Support Vector Machines

Bergamin, Luca; Aiolli, Fabio
2025

Abstract

Counterfactual explanations provide interpretable insights into model decisions by identifying minimal changes to an instance that would lead to a different classification. This paper proposes a method for generating counterfactual explanations for non-linear Support Vector Machines (SVMs). Unlike prior approaches that rely on heuristic optimization or gradient-based methods, our approach leverages high-confidence examples as reference points, ensuring that counterfactuals are both realistic and reliably classified by the model. Our method guarantees the generation of a valid counterfactual for any given instance under mild conditions. We demonstrate its effectiveness through experiments on real-world tabular and image datasets, showing that it produces meaningful and interpretable counterfactuals across different domains under proximity and plausibility metrics.
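The abstract's core idea (use a high-confidence example of the target class as a reference point, then find a minimal change that flips the SVM's decision) can be illustrated with a minimal sketch. This is not the paper's algorithm, only a hypothetical illustration under simple assumptions: we pick the most confidently classified opposite-class training point as the reference and bisect along the straight segment toward it until the non-linear SVM's prediction flips.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Toy data and a non-linear (RBF-kernel) SVM.
X, y = make_moons(n_samples=200, noise=0.15, random_state=0)
clf = SVC(kernel="rbf", gamma=2.0).fit(X, y)

def counterfactual(x, clf, X, steps=50):
    """Hypothetical sketch: bisect between x and a high-confidence
    opposite-class reference until the SVM's decision flips."""
    pred = clf.predict(x.reshape(1, -1))[0]
    # Candidates the model assigns to the other class.
    mask = clf.predict(X) != pred
    # Rank candidates by |decision_function| as a confidence proxy.
    scores = np.abs(clf.decision_function(X[mask]))
    ref = X[mask][np.argmax(scores)]
    lo, hi = x, ref  # lo keeps the original label, hi the flipped one
    for _ in range(steps):
        mid = (lo + hi) / 2
        if clf.predict(mid.reshape(1, -1))[0] == pred:
            lo = mid
        else:
            hi = mid
    return hi  # closest point on the segment with the flipped label

x_query = X[0]
x_cf = counterfactual(x_query, clf, X)
```

Because the reference is classified with high confidence, a point on the segment where the prediction flips always exists, which loosely mirrors the validity guarantee the abstract claims; the paper's actual construction and conditions are in the full text.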
Files for this product:
No files are associated with this product.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3569497
Citations
  • PMC: ND
  • Scopus: 0
  • Web of Science (ISI): 0
  • OpenAlex: ND