Brussels, Belgium
Dec. 10, 2012
Social discrimination against certain sensitive groups within society (e.g., women, ethnic minorities) is prohibited by law in many countries. To prevent discrimination arising from the use of discriminatory data, recent data mining research has focused on methods for making classifiers learned over such data discrimination-aware. Most of these methods, however, have been tested on standard classification datasets tweaked for discrimination analysis rather than on actual discriminatory data. In this paper, we study discrimination-aware classification applied to a real-world dataset from Statistics Netherlands, the national statistical office of the Netherlands. Specifically, we consider the use of classifiers for predicting whether an individual is a crime suspect or not, to support the decision making of law enforcement and security agencies. Our results show that discrimination does exist in real-world datasets and that blind use of classifiers learned over such datasets can exacerbate the discrimination problem. We demonstrate that discrimination-aware classification methods can mitigate these discriminatory effects and that they lead to rational and legally acceptable decisions.
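To make the notion of "discrimination in a dataset" concrete, a common measure in the discrimination-aware classification literature is the discrimination score: the difference between the positive-outcome rate for the favoured group and that for the deprived group. The sketch below is illustrative only (not the authors' code); the `records`, group labels, and toy data are hypothetical.

```python
# Illustrative sketch of the discrimination score used in the
# discrimination-aware classification literature:
#   P(positive outcome | favoured group) - P(positive outcome | deprived group)
# A score of 0 means demographic parity; larger values mean more discrimination.

def discrimination_score(records, group_key, positive_key):
    """Difference in positive-outcome rates between the two groups.

    `records` is a list of dicts; records[i][group_key] is "favoured" or
    "deprived", and records[i][positive_key] is True/False for the outcome
    (e.g. "not labelled a suspect" as the positive class).
    """
    favoured = [r for r in records if r[group_key] == "favoured"]
    deprived = [r for r in records if r[group_key] == "deprived"]
    p_favoured = sum(r[positive_key] for r in favoured) / len(favoured)
    p_deprived = sum(r[positive_key] for r in deprived) / len(deprived)
    return p_favoured - p_deprived

# Hypothetical toy data: 3 of 4 favoured individuals receive the positive
# outcome versus 1 of 4 deprived individuals.
data = (
    [{"group": "favoured", "positive": True}] * 3
    + [{"group": "favoured", "positive": False}] * 1
    + [{"group": "deprived", "positive": True}] * 1
    + [{"group": "deprived", "positive": False}] * 3
)
print(discrimination_score(data, "group", "positive"))  # 0.75 - 0.25 = 0.5
```

A classifier trained naively on labels that already encode such a gap tends to reproduce it; discrimination-aware methods constrain or post-process the learned model so this score is driven toward zero.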
Communities, Data mining, Law, Sociology, Statistics, Standards, Classification, Discrimination
Faisal Kamiran, Asim Karim, Sicco Verwer, Heike Goudriaan, "Classifying Socially Sensitive Data Without Discrimination: An Analysis of a Crime Suspect Dataset", 2012 IEEE International Conference on Data Mining Workshops (ICDMW 2012), pp. 370-377, doi:10.1109/ICDMW.2012.117