The prevalence of algorithmic bias is more lamentable than it is surprising considering the dearth of diversity in the computer science field specifically and in STEM professions more broadly.
“The impact of algorithms is something that’s hiding in plain sight, but many of us are only just becoming aware of the effect it has on our lives,” said Stephanie Headley, VP of OLAY Skin Care at Procter & Gamble.
Just last week, Facebook issued a public apology after users watching a video featuring Black men were shown an automated prompt asking them whether they would like to “keep seeing videos about Primates.”
Facebook blamed the error on imperfect artificial intelligence powering its recommendation function.
In May, Twitter was called out (on Twitter, of course) for using a photo preview algorithm that would frequently crop Black faces from photos in favor of white ones.
In response, Twitter announced an open bug bounty contest to surface and root out other examples of bias in its image-cropping algorithm. Researchers delivered. They found that the algorithm favors images of young, thin white females by discriminating on the basis of age, weight, race and gender.
The challenge is that bias is often coded into algorithms unintentionally, but the effect is the same as if it were deliberate, particularly when it comes to perpetuating monolithic beauty standards.
To combat hidden bias in beauty algorithms, Olay launched a campaign on Monday to coincide with National Coding Week, and it’s got an ambitious KPI: to help send at least 1,000 young women of color to code camp next summer through a partnership with Black Girls CODE.
“We need to be more intentionally inclusive, because it’s the people who create the code who will be the ones to help spark systemic change,” said Headley, a college math major herself who went on to teach high school math before starting as an assistant brand manager at P&G in 2003.
“Beauty,” Headley said, “is in the eye of the coder.”
As part of its #DecodetheBias campaign, which includes national TV spots and print ads featuring Joy Buolamwini, founder of the Algorithmic Justice League, Olay is also encouraging other brands and companies to examine the biased assumptions that might be hiding within the data sets they use to program their own algorithms.
Olay itself partnered with ORCAA, a consultancy that helps organizations manage and audit algorithmic risk, to analyze the Olay Skin Advisor, a web-based tool that relies on hundreds of thousands of uploaded selfies to recommend skin care products.
In collaboration with Buolamwini, ORCAA uncovered several instances of subtle bias in the Skin Advisor algorithm that Olay is now working to rectify. For example, the algorithm was less precise for people at the extreme ends of the age spectrum and also slightly less accurate for people with darker skin tones.
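The kind of disparity ORCAA surfaced is typically found through disaggregated evaluation: measuring a model's accuracy within each demographic subgroup rather than in aggregate, where gaps can hide. A minimal sketch of that technique follows; the function, group labels, and data are hypothetical illustrations, not ORCAA's actual audit methodology or findings.

```python
# Sketch of a disaggregated accuracy audit (hypothetical data):
# compute per-group accuracy instead of one overall number.

def group_accuracy(records):
    """records: iterable of (group, correct) pairs.
    Returns a dict mapping each group to its accuracy."""
    totals, hits = {}, {}
    for group, correct in records:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + int(correct)
    return {g: hits[g] / totals[g] for g in totals}

# Hypothetical audit log: (skin-tone group, was the recommendation correct?)
records = [
    ("lighter", True), ("lighter", True), ("lighter", True), ("lighter", False),
    ("darker", True), ("darker", False), ("darker", True), ("darker", False),
]
print(group_accuracy(records))  # {'lighter': 0.75, 'darker': 0.5}
```

An aggregate accuracy of 62.5% would mask the 25-point gap between the two hypothetical groups, which is exactly why auditors break results out by subgroup.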
“Honestly, we’re excited about this,” Headley said. “And that’s because we didn’t know this was happening before we asked the question and did the assessment.”
That’s where it all starts, said Headley – simply being aware enough to “ask the questions that hold people accountable.”
But companies also need to tap into more diverse talent pools, she said. The 1,000-plus young women graduating from next summer's code camps will join the next generation of professional computer engineers.
“As more women and women of color become coders,” Headley said, “we’ll get better code and more inclusive code.”