
The Tech Take: The genderless face of accounting bots

Last year, Sage debuted Pegg, a simple accounting bot that integrates with Facebook Messenger and Slack. The bot’s icon looks like the Nerd emoji with a round yellow head and big, owlish glasses. No nose, no mouth — and no gender.

This was deliberate on the part of software engineer Kriti Sharma, who since the release and success of Pegg has been named vice president of bots and AI at Sage, and made the Forbes 30 Under 30 list this year. She wanted to make a statement about the gender metaphor in computing — specifically, that it’s unnecessary — and chose to create something new with Pegg in a world of Siris, Alexas and Cortanas.

And since Capital One debuted the genderless Eno bot this year, Pegg is no longer alone.

Computer engineers have used metaphors to help us relate to machines since the advent of computing. For instance, the metaphor of a “desktop” helps us organize our “documents” and “files” on our computer as if they are physical papers on a physical desk; and Apple’s Siri speaks to us in the guise of a human woman so we can feel like we’re interacting with a person and not a series of bits and bytes.

These metaphors, though seemingly simple, have worked ingeniously to make extremely complex computing functions accessible and useful to laypeople. But when we move to human metaphors, we start having to grapple with what those signifiers mean about how we interact with gender and race.

There has crept into our zeitgeist the idea that people “like” interacting with female voices more. Stanford communications professor Clifford Nass told CNN in 2011 that more people simply prefer female voices over male ones. And he’s right that in some AI interaction studies, people describe female voices as “warmer.” Other data, however, suggest the opposite: female politicians, radio personalities and celebrities face far more backlash for their vocal pitch and tics, ranging from Hillary Clinton being called “shrill” (a criticism rarely, if ever, made of male politicians) to Kim Kardashian being criticized for her vocal fry. Margaret Thatcher famously underwent vocal training to lower the pitch of her voice so she would be taken more seriously.

[Photo: Attendees watch SoftBank Group Corp.’s Pepper humanoid robot at Mobile World Congress in Barcelona, Spain, Feb. 28, 2017. Photographer: Chris Ratcliffe/Bloomberg]

More to the point, people — both men and women — prefer giving instruction to machines coded as female, because assistant and helper roles have for years now, due to myriad social constructs, been filled by women. Conversely and relatedly, people don’t like taking instruction from female voices, which is why initial warning announcements during emergencies are often given in a female voice (meant to be comforting), and evacuation orders are typically broadcast in a male voice (more likely to be obeyed).

These ways in which gender informs human interaction with machines have a cyclical effect: mimicking a bias metaphorically only serves to reinforce that bias in real life. An April 2017 Science study described how AI is only as intelligent as the data it is fed; if that data is biased, the output will display the same biases. Sage’s Sharma gave the example of AI used in the HR and payroll industry, where machine-embedded bots provide recommendations on hiring and salary decisions. If a bot learns a bias, such as a preference for male or white applicants, that bias is replicated in real-world decisions, subsequently re-fed into the AI as new training data, and the cycle of entrenched inequality only continues.
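To make that feedback loop concrete, here is a toy sketch (not from the article, with invented numbers and a deliberately simplistic “model”): a bot trained on biased historical hiring data learns a lower hire rate for one group, and when its own recommendations are fed back in as new training data, the gap never corrects itself.

```python
# Toy illustration only (not from the article; numbers are invented).
# The "model" here is just a per-group hire rate learned from history.
import random

random.seed(0)

def make_history(n, hire_rate_a, hire_rate_b):
    """Historical records of (group, qualified, hired). Both groups are
    equally qualified, but group B was hired less often -- the embedded bias."""
    rows = []
    for _ in range(n):
        group = random.choice(["A", "B"])
        qualified = random.random() < 0.5
        rate = hire_rate_a if group == "A" else hire_rate_b
        rows.append((group, qualified, qualified and random.random() < rate))
    return rows

def train(data):
    """Learn the observed hire rate for qualified applicants in each group."""
    model = {}
    for g in ("A", "B"):
        outcomes = [hired for grp, qual, hired in data if grp == g and qual]
        model[g] = sum(outcomes) / len(outcomes) if outcomes else 0.0
    return model

def recommend(model, group, qualified):
    """The bot recommends hiring at the rate it learned for that group."""
    return qualified and random.random() < model[group]

# Seed history: equally qualified groups, but B hired at 0.6 vs. A at 0.9.
history = make_history(5000, hire_rate_a=0.9, hire_rate_b=0.6)
for generation in range(3):
    model = train(history)
    print(f"generation {generation}: learned hire rates {model}")
    # The bot's own recommendations become the next round of "real world"
    # outcomes and are re-fed into the training data -- the cycle continues.
    for _ in range(5000):
        group = random.choice(["A", "B"])
        qualified = random.random() < 0.5
        history.append((group, qualified, recommend(model, group, qualified)))
```

Run it and the learned rates stay at roughly 0.9 for group A and 0.6 for group B in every generation: nothing in the loop ever questions the original disparity, so the bot simply perpetuates it.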

But it’s not all negative. Beerud Sheth, founder of bot development platform Gupshup, which Sage partnered with to create Pegg, noted that there is also data suggesting people are more forgiving of female-gendered bots. This more easygoing relationship between human and bot makes the interaction more pleasant and productive, especially since we are not yet at a point where AI interfaces are convincingly human. In fact, they’re not even close.

And that’s why Sharma says it’s almost disingenuous to give AI a human metaphor at all.

“I really believe from our research, at least for financial and accounting services, it’s important to clarify you’re talking to a non-human,” she said. “By giving it a human name, you’re over-promising, because AI today is not at human-level intelligence.”

In finance and accounting, AI performs tasks ranging from the automation of data collection and entry to making sampling decisions in an audit. Not all AI needs to have a “face,” but studies have shown that in the finance and accounting world, clients prefer a male-coded bot. Why? Because, as noted above, people tend to prefer taking instruction and advice from men over women.

A 2016 survey from SpectrumGroup suggested that affluent clients tend to trust financial advice from older white males first, with younger white males as a second preference. Young white females take third place as trustworthy advisors. As accounting professionals work to position themselves as analysts and strategic advisors as well, women and racial minorities find themselves having to overcome knee-jerk biases against their judgment. If software companies took this data and applied it to their bots, coding them as “male” because clients would trust them more, it might drive more success for the bot, but at the expense of replicating real-world biases.

Gupshup’s Sheth predicts that, with the advent and improvement of artificial intelligence, the world is at the threshold of the next technological revolution. Both he and Sharma agree that if engineers ever have a chance to get off on the right foot, it’s now.

“We have to be careful,” Sheth said. “Bots can reinforce systemic racism and sexism. Even if there are reasons for the ways [gender and race have been represented] for the past 100 years, there’s no reason to perpetuate that for the next 100.”
