Gender bias in AI-based decision-making systems: a systematic literature review

Publisher:
Australasian Journal of Information Systems
Publication Type:
Journal Article
Citation:
Australasian Journal of Information Systems, 2022, 26
Issue Date:
2022-01-01
The related literature and industry press suggest that artificial intelligence (AI)-based decision-making systems may be gender biased, which in turn affects individuals and societies. The information systems (IS) field has recognised the rich contribution of AI-based outcomes and their effects; however, IS research on the management of gender bias in AI-based decision-making systems and its adverse effects remains scarce. Concern about gender bias in these systems is therefore growing, and in particular there is a need for a better understanding of the factors that contribute to it and of effective approaches to mitigating it. This study contributes to the existing literature by conducting a Systematic Literature Review (SLR) and presenting a theoretical framework for the management of gender bias in AI-based decision-making systems. The SLR results indicate that research on gender bias in AI-based decision-making systems is not yet well established, highlighting considerable potential for future IS research in this area, as articulated in the paper. Based on this review, we conceptualise gender bias in AI-based decision-making systems as a socio-technical problem and propose a theoretical framework that combines technological, organisational, and societal approaches, together with four propositions, to mitigate its biased effects. Lastly, the paper considers future research on the management of gender bias in AI-based decision-making systems in the organisational context.