Algoritmi decisionali e discriminazione percepita: uno studio sugli audits di sicurezza nei sistemi di self-checkout della Gdo

Hasanaj, Shkelzen; Chiuppesi, M.
2021-01-01

Abstract

The increasing diffusion, in many important sectors, of algorithmic decision systems, often generated by machine learning, has prompted the study of the biases that may be introduced in the various phases of creation, implementation, and use of these algorithms. The possibility of discriminatory biases appears especially relevant, since the often opaque nature of algorithms prevents direct analysis of their internal logic and makes such biases difficult to identify. After an introductory overview of predictive and decisional algorithms and of the different types of algorithmic bias, some recent cases of algorithmic discrimination are described. We then present a field study, conducted in the city of Pisa in November 2019, on foreign customers' perception of discrimination in self-scan and self-checkout security audits in supermarkets. The study showed that foreign respondents do not perceive algorithmic discrimination as a pressing issue, even when confronted with the possibility of discriminatory profiling and despite low satisfaction with self-checkout systems.
Hasanaj, S., Chiuppesi, M. (2021). Algoritmi decisionali e discriminazione percepita: uno studio sugli audits di sicurezza nei sistemi di self-checkout della Gdo. SOCIOLOGIA E RICERCA SOCIALE, (125), 118-137 [10.3280/SR2021-125006].
Files in this product:
Hasanaj&Chiuppesi_I.pdf (not available)
Type: Publisher's PDF
License: NOT PUBLIC - Private/restricted access
Size: 2.89 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11365/1235115