On Gender Equity, Imagination And GenAI
With CRN’s AI Week at a close, I am left with several questions about the biases inherent in various technologies. Is there a way to use artificial intelligence to build a world free from bias and discrimination? Or are emerging algorithms destined to reinforce past and present stereotypes?
Much of what we understand about gender is born within the imagination. What our cultures believe to be true about who a woman or man should be, and how they should act or behave, shapes how many of us present ourselves in our work lives. The binaries of gender impact not only how we do business but also how we perceive others as effective leaders. One need only reference CRN’s research presented at Women in the Channel - West. According to the study, the desires of women leaders differ from those of their male counterparts. Those polled saw value in effective communication, authenticity, and the ability to achieve desired results. Yet, one could argue that these top characteristics of effective leadership are not exclusive to women or men. They are universal; the real question is which subgroups are given access to them under the imaginative principles of gender and performance.
Technology is also based on our imagination. We dream of a world where certain processes are made easier by a tool or software. These imaginative principles become reality because, at some point, someone says, “Oh yes, we can do that!” or “Yes, we can make that better.”
With CRN’s AI Week behind us, I questioned how we are making the world better through AI and what, if anything, we are leaving out. Are we exploring every chance to create the world we want at the intersection of our gender and technology imaginations?
According to Crystal Washington, a futurist and expert on GenAI, we are at a crossroads. Specifically referencing AI assistants, Washington told me in an interview that it is “the voices we are giving [them] and the information we are training it on” that require a curious use of imagination. “We may want to train [GenAI models] on the principles we want to bring forward in the future,” which includes a reevaluation of gender roles.
As a futurist, Washington thinks about technology and the ways it can impact us up to 30-plus years from now; for her, GenAI is not just about the immediate ROI. Washington encourages us to think about things that are not at the forefront of AI conversations. “I noticed this a few years ago,” Washington continues. “I was like, why are all of these assistants’ default voices feminine?”
It is a great question. Many AI assistants, such as Alexa, Siri, and Google Assistant, use female-sounding voices as their default settings. While some studies have shown that users respond more positively to female-sounding voice assistants, much of this research is anecdotal and may stem from the social conditioning of the participants. The same body of research claims that feminine-sounding voices are “easier to understand, particularly in workplace settings.”
But I would suggest both analyses rest on a bias towards the softness of feminine-sounding voices. In our desire to be collectively nurtured, developers make AI assistants feminine-sounding, and thus more palatable. It is a question Washington has grappled with in their work. “If we think that men and women are equal, then what does it mean if all of our assistants sound like women?”
Indeed. But I would add another layer: if we are focused on building an equitable world within our imagination, why are we committed to maintaining women as the primary nurturers within technological advances in AI?
The Inclusive Leadership Newsletter is a must-read for news, tips, and strategies focused on advancing successful diversity, equity, and inclusion initiatives in technology and across the IT channel. Subscribe today!
Photo by Solen Feyissa on Unsplash