Human Value Requirements in AI Systems: Empirical Analysis of Amazon Alexa

Publisher:
IEEE
Publication Type:
Conference Proceeding
Citation:
2023 IEEE 31st International Requirements Engineering Conference Workshops (REW), 2023, pp. 138-145
Issue Date:
2023-09-28
Filename:
Human_Value_Requirements_in_AI_Systems_Empirical_Analysis_of_Amazon_Alexa.pdf (Published version, Adobe PDF, 933.69 kB)
Abstract:
The importance of incorporating human values (e.g., transparency, privacy, social recognition, tradition) in the Requirements Engineering (RE) process is well acknowledged, but there is a paucity of empirical research on integrating human values in RE. This shortfall becomes more pronounced when designing Artificial Intelligence (AI) systems due to their significant societal impact. Ignoring or violating human values in AI systems can lead to user dissatisfaction, negative socio-economic repercussions, and, in some instances, societal harm. However, there is a lack of guidance on addressing human values within the RE process for specific contexts of AI system development. In this paper, we explore human value requirements from end users' feedback for an AI system. We conduct an empirical analysis of the Amazon Alexa app as a case study, examining 1,003 user reviews to identify relevant human values and assess the extent to which these values are addressed or ignored in the app. We identified 34 values of the end users of Amazon Alexa. Among them, only one value is addressed (self-discipline) and 23 of them are ignored (e.g., freedom, equality, obedience) in the app. The feedback reported mixed experiences (both addressed and ignored) on the remaining ten values. Through this analysis, we have tailored an approach for identifying human values for a specific type of AI system. We posit that this approach has the potential for utility across different AI systems and a broad range of contexts, providing guidance for developing human value requirements for values-based AI systems.