IMPORTANT INFORMATION
Your voice assistant may reproduce gender biases, which can be harmful. For example, Amazon's Alexa and Apple's Siri are programmed to use female default voices. This can reinforce gender stereotypes and perpetuate societal expectations about gender roles.
It is important to be mindful of the potential for AI systems to reproduce such biases and to take steps to prevent this from happening.
Download the guide here to learn more.
You can get the full-text version by clicking here.
BA-Thesis: Weaponised Assistant (2023)
Hannah Charlotte Krause