A survey of issues
Ana Farič (Author), Ivan Bratko (Author)

Abstract

Some recent applications of Artificial Intelligence, particularly machine learning, have been strongly criticised in the general media and in the professional literature. Applications in the domains of justice, employment and banking are often mentioned in this respect. The main criticism is that these applications are biased with respect to so-called protected attributes, such as race, gender and age. The most notorious example is the system COMPAS, which is still in use in the American justice system despite severe criticism. The aim of our paper is to analyse the trends in the discussion about bias in machine learning algorithms, using COMPAS as an example. The main problem we observed is that, even in the field of AI, there is no generally agreed definition of bias that would enable its operational use in preventing bias. Our conclusions are that (1) improved general education concerning AI is needed to enable a better understanding of AI methods in everyday applications, and (2) better technical methods must be developed to reliably implement generally accepted societal values, such as equality and fairness, in AI systems.

Keywords

machine learning;artificial intelligence;bias;fairness;discrimination;COMPAS;

Data

Language: English
Year of publishing: 2024
Typology: 1.01 - Original Scientific Article
Organization: UL FRI - Faculty of Computer and Information Science
UDC: 004.85
COBISS: 229142787
ISSN: 0350-5596

Other data

Secondary language: Slovenian
Secondary keywords: strojno učenje;umetna inteligenca;pristranskost;pravičnost;diskriminacija;COMPAS;
Type (COBISS): Article
Pages: pp. 205-212
Volume: Vol. 48
Issue: No. 2
Chronology: Jun. 2024
DOI: 10.31449/inf.v48i2.5971
ID: 26070791