When AI Is Gender-biased

Humana Mente 13 (37) (2020)

Abstract

AI algorithms can be gender-biased, as evidenced by translation programs, credit calculators and autocomplete features, to name a few. This article maps gender biases in technologies according to the postphenomenological formula of I-technology-world. This mapping is the basis for identifying the gender biases in AI algorithms, and for proposing updates to the postphenomenological formula. The updates include references to I-algorithm-dataset, and the reversal of the intentionality arrow to reflect the lower position of the human user. The last section reviews three ethical analyses of AI algorithms: distributive justice, ethics of care and mediation theory's ethics.


Analytics

Added to PP
2020-07-19


Author's Profile

Galit Wellner
Tel Aviv University
