
Could your drive for diversity in the real world be undone by AI?

Digitas

Nic Howell


As Corporate Britain inches towards gender equality in the boardroom, it now has to watch out for bias in its virtual workforce. Digitas' Head of Creative Strategy, Nic Howell, explores whether digital assistants need a gender at all.

In 2011 only one in eight FTSE 100 board members was a woman, and 152 boards in the FTSE 350 had no women at all. By 2017, 27.7% of FTSE 100 board members were female and only eight all-male boards remained in the FTSE 350. It's progress, but last month's publication of gender pay gap figures will remind CEOs that there's still much to be done.

Yet as Corporate Britain inches towards gender equality in the boardroom, it now has to watch out for bias in its virtual workforce. Over 80% of businesses expect to introduce some form of chatbot automation within the next two years, adding to an estimated 30,000 already on the market. Such assistants can subtly or overtly reinforce stereotypes about male and female roles and behaviours. How do businesses make sure that their commitment to empowering women in their workforce and the wider world isn't undermined by a servile female chatbot?

A 2016 survey by Integrate.ai analysed more than 300 chatbots, assistants and artificial intelligence movie characters, inferring gender from names, avatars and pronouns. It found that chatbots and assistants most commonly have female or neutral names. Often this is blamed on social science: a 2017 Statista report found that US consumers say they prefer virtual assistants with a female voice, adding to research suggesting that men and women respond more positively to female voices. Amazon cites consumer research as the reason it made Alexa female.

The problem with giving your virtual assistant a female identity is that it can reinforce the deep-rooted idea that this is the gender you give orders to, rather than take them from. A study of public address announcements on the New York subway found that commands and warnings were given in a male voice, while information was delivered by a woman.

Online, things can get pretty ugly pretty quickly, as users decide they will first flirt with, then obstruct, then bully a chatbot they treat as a girl. Sometimes it's worse: on one bot platform designed to assist lorry drivers with routes and logistics, one in 20 interactions is explicit.

You can’t eliminate such behaviour through interaction design, but you can make it less likely. As a start, ask whether your brand’s virtual assistant even needs a gendered identity. Our own quick survey of chatbots in the financial services sector found that 13 out of 22 had gender-neutral names. We think this trend will continue: as the novelty of talking to a machine wears off, consumers will rely less on gender to humanise the experience.

Already it’s possible to focus on a chatbot’s function rather than its personality. The banking bot Kasisto, used by financial services brands to handle online customer conversations, deliberately has a robot-specific identity and priorities, politely steering the conversation away when users try to “chat it up” rather than simply chat.

Another problem is that the training datasets used to create chatbot AI can be dated and flawed. AI isn’t inherently biased, but if it hasn’t been trained on a dataset that represents a wide range of human behaviour, it will take a narrow view. Make sure you’re training your chatbot on the widest possible range of interactions, and make sure your testing, whether formal or informal, respects and involves the viewpoint of female users at every stage.
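As a rough illustration of the kind of informal audit that can surface this sort of skew, the sketch below counts how often flagged language appears in a training corpus, broken down by the persona gender recorded for each dialogue. The file name, column names and keyword list are illustrative assumptions, not a real tool or dataset.

from collections import Counter
import csv

# Illustrative keyword list only; a real audit would use a far richer classifier.
FLAGGED_MARKERS = {"sweetheart", "babe", "stupid", "shut up"}

def audit(path):
    """Report the share of user turns containing flagged language,
    split by the persona gender recorded for each dialogue."""
    flagged, totals = Counter(), Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):  # assumed columns: persona_gender, user_turn
            gender = row["persona_gender"]
            totals[gender] += 1
            if any(marker in row["user_turn"].lower() for marker in FLAGGED_MARKERS):
                flagged[gender] += 1
    for gender, total in totals.items():
        print(f"{gender}: {flagged[gender] / total:.1%} of turns contain flagged language")

if __name__ == "__main__":
    audit("training_dialogues.csv")  # hypothetical corpus export

Even a crude count like this makes it visible whether a female-presented bot is attracting a different tone of interaction than a neutral one, before the data is used for training.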

This still leaves the problem of voice. Navigation app Waze lets users choose a persona, and Alexa can be assigned a male voice. But in a world where human identity is increasingly gender fluid and non-binary, perhaps we should be asking ourselves why an AI chatbot should have a conventionally female or male voice at all.

Hollywood defaults to the male voice

In a famous sequence in the classic sci-fi movie 2001: A Space Odyssey, a spaceship’s speaking computer HAL has to be shut down after killing one of the crew. As it “dies” it regresses until it can barely recite the very first song it was taught.   

Notoriously, film director Stanley Kubrick made Canadian actor Douglas Rain re-record this sequence 50 times before settling on the first take. Today, AI experts refer to HAL as the default for what a sentient machine could sound like. Certainly, that’s the case in the movies: Integrate.ai’s 2016 report found that, nearly 50 years on from Kubrick, 77% of AI characters were male.

 

Nic Howell

Head of Creative Strategy at Digitas

As Head of Creative Strategy at global marketing and technology agency Digitas, Nic leads a diverse creative strategy team, ensuring they continue to deliver provocative, insight-driven strategies in order to help clients transform their brands for the digital age.  Nic began his career as a copywriter, and then went on to hold several client-side marketing roles at companies in the FT Group and the RELX Group, before becoming a journalist. He joined Digitas in 2010 from the Centaur-owned digital marketing title New Media Age, where he was Deputy Editor.
