Brussels, Belgium
Dec. 10, 2012
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/ICDMW.2012.51
Data mining is gaining societal momentum due to the ever-increasing availability of large amounts of human data, easily collected by a variety of sensing technologies. Data mining brings unprecedented opportunities and risks: a deeper understanding of human behavior and of how our society works is shadowed by a greater chance of privacy intrusion and of unfair discrimination based on the extracted patterns and profiles. Although methods independently addressing privacy or discrimination in data mining have been proposed in the literature, we argue that the two risks should be tackled together, and we present a methodology for doing so when publishing frequent pattern mining results. We describe a combined pattern sanitization framework that yields patterns protected against both privacy and discrimination threats, while introducing reasonable (controlled) pattern distortion.
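To make the frequent pattern mining task mentioned in the abstract concrete, here is a minimal, naive Apriori-style sketch on toy transaction data; the item names and support threshold are illustrative only and do not come from the paper, whose contribution is the sanitization of such patterns, not the mining itself:

```python
from itertools import combinations

# Toy transactions; in the paper's setting these would be individual
# records whose mined patterns may leak privacy or encode discrimination.
transactions = [
    {"bread", "milk"},
    {"bread", "beer", "eggs"},
    {"milk", "beer", "bread"},
    {"milk", "beer"},
]

def frequent_itemsets(transactions, min_support):
    """Return every itemset with support count >= min_support (naive Apriori)."""
    items = sorted({i for t in transactions for i in t})
    result = {}
    k = 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        # Count support of each candidate k-itemset.
        counts = {c: sum(1 for t in transactions if c <= t) for c in candidates}
        frequent = {c: n for c, n in counts.items() if n >= min_support}
        result.update(frequent)
        # Generate candidate (k+1)-itemsets by joining frequent k-itemsets.
        candidates = sorted(
            {a | b for a, b in combinations(list(frequent), 2) if len(a | b) == k + 1}
        )
        k += 1
    return result

patterns = frequent_itemsets(transactions, min_support=2)
```

On this toy data the result contains, e.g., {bread, milk} with support 2, while {eggs} (support 1) is pruned; a sanitization framework like the one proposed would then distort or suppress those patterns that enable privacy attacks or discriminatory decisions.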
Data privacy, Privacy, Additives, Context, Itemsets, Data models, Frequent pattern mining, Data mining, Anti-discrimination
Sara Hajian, Anna Monreale, Dino Pedreschi, Josep Domingo-Ferrer, Fosca Giannotti, "Injecting Discrimination and Privacy Awareness Into Pattern Discovery", 2013 IEEE 13th International Conference on Data Mining Workshops (ICDMW 2012), pp. 360-369, doi:10.1109/ICDMW.2012.51