Privacy by design in machine learning data collection: A user experience experimentation
- 2017, pp. 439 - 442
This item is closed access and not available.
© Copyright 2017, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

Designing successful user experiences for systems that use machine learning is an area of increasing importance. In supervised machine learning for biometric systems, such as face recognition, there is room to improve the user experience. To use biometric authentication systems, users are asked to provide their biometric data together with their personal information. In contexts where a large number of users must be enrolled frequently, the human expert assisting the data collection process is often replaced by software with a step-by-step user interface. However, this may limit the overall user experience of the system. User experience should therefore be addressed from the very beginning, during the design process. Furthermore, data collection may also raise privacy concerns among users and potentially lead them not to use the system. For these reasons, we propose a privacy by design approach to maximize the user experience of the system while reducing users' privacy concerns. To do so, we propose a novel experiment in a human-robot interaction setting, investigating the effects of embodiment and transparency on privacy and user experience. We expect that embodiment will enhance the overall user experience of the system independently of transparency, whereas transparency will reduce participants' privacy concerns. In particular, we anticipate that transparency combined with embodiment will significantly reduce participants' privacy concerns, thus maximizing the amount of personal information a user is willing to provide.