<https://theconversation.com/most-ai-assistants-are-feminine-and-its-fuelling-dangerous-stereotypes-and-abuse-272335>
"In 2024, artificial intelligence (AI) voice assistants worldwide surpassed 8
billion, more than one per person on the planet. These assistants are helpful,
polite – and almost always default to female.
Their names also carry gendered connotations. For example, Apple’s Siri – a
Scandinavian feminine name – means “beautiful woman who leads you to victory”.
Meanwhile, when IBM’s Watson for Oncology launched in 2015 to help doctors
process medical data, it was given a male voice. The message is clear: women
serve and men instruct.
This is not harmless branding – it’s a design choice that reinforces existing
stereotypes about the roles women and men play in society.
Nor is this merely symbolic. These choices have real-world consequences,
normalising gendered subordination and risking abuse."
Cheers,
*** Xanni ***
--
mailto:xanni@xanadu.net       Andrew Pam
http://xanadu.com.au/         Chief Scientist, Xanadu
https://glasswings.com.au/    Partner, Glass Wings
https://sericyb.com.au/       Manager, Serious Cybernetics