Abstract
AI algorithms can be gender biased, as evidenced by translation programs, credit calculators, and autocomplete features, to name a few. This article maps gender biases in technologies according to the postphenomenological formula of I-technology-world. This mapping serves as the basis for identifying gender biases in AI algorithms and for proposing updates to the postphenomenological formula. The updates include references to I-algorithm-dataset and the reversal of the intentionality arrow to reflect the lower position of the human user. The last section reviews three ethical analyses of AI algorithms: distributive justice, the ethics of care, and mediation theory's ethics.