IPG’s Data Companies Tap DEI Expert To Remove Bias In Data-Driven Marketing

IPG’s marketing intelligence firm Kinesso has hired Dr. Femi Olu-Lafe as SVP of culture and inclusion. Olu-Lafe was brought on to move the needle on diversity, equity and inclusion (DEI) initiatives across Kinesso and its sister IPG companies, Acxiom and Matterkind. 

Olu-Lafe – previously a senior leader and DEI consultant at YSC Consulting and former diversity and inclusion research analyst at Catalyst – holds a doctorate in psychology from Boston University and has worked with Fortune 500 companies designing and implementing DEI initiatives. 

The move comes at a time when brands and agencies are rethinking their DEI strategies in the wake of last summer’s Black Lives Matter protests, and as recent anti-Asian violence has mobilized the Asian American community to combat marginalization. 

Olu-Lafe, an expert in cognitive psychology and applied data analysis, started with Kinesso on Monday. She has been tasked with creating a comprehensive strategy to ensure the data and technology products developed by Acxiom, Kinesso and Matterkind (AKM) are inclusive and respectful. Her newly created role is part of IPG’s ongoing DEI strategy, aimed at reducing the bias that people can introduce into the data used to train algorithms in data-driven marketing. 

Olu-Lafe will report to Renu Hooda, chief talent officer at Kinesso. AdExchanger spoke to Olu-Lafe. 

What is the biggest issue around diversity and inclusion when it comes to consumer data?

FEMI OLU-LAFE: When it comes to data, people think, “Machines are doing this, why do we have to be concerned about it?” The approach that the organization is taking is that there are humans behind a lot of what the machines are learning. 

Because data is about people and collected by people, it’s not immune to bias – even if the machines are doing the work. It’s really important for us to look at the data with a critical eye and make sure we are asking those tough questions: When, how and what was collected? Have we been fair? Have we been just? 

What does biased data look like? 

Biases arise from the judgments our brains make as we process and retain the information that’s given to us. Those processes can be transposed into the way our computers make judgments. 

For example, the iPhone didn’t have lots of experience with people of color. Sometimes Face ID ended up having challenges identifying faces [of people of color]. 

How can you actually operationalize adding more diversity and inclusion within the data used by AKM?

We have goals when it comes to removing bias in the data.

Some of that comes from how [data is] collected and setting metrics around doing thorough diagnostics to figure out what’s going on. Then we want to set clear goals around what we want to do and what success looks like for us. 

People in the field are looking for organizations to do this work, to show them the approach we’ve taken and how we benefit from it.

What will success look like? Do you have benchmarks?

The starting point involves partnering with people to understand the current state of bias in the system. The [AKM] organization has already been on a journey with this, so some of this is going to be asking ourselves: What have we done already to remove a lot of bias in the data? What have we found beneficial? What do we want to grow into to take this a step further? 

Acxiom, Kinesso and Matterkind are all different data companies — how does your strategy change for each one?

My role goes across AKM, so I have people within each of those companies to partner with. It’s about building a strategy across all of the companies and ensuring that we’re launching a cohesive program rather than single initiatives. That’s a key part of bringing me on board. I’m really excited to learn about the different companies, different cultures, what success looks like – and creating a cohesive programming strategy that works for all of them.

What will be different this time next year?

This time next year, I would love to see more diverse perspectives and to see metrics around that. 

Additionally, I would love to see us partnering externally with businesses in a meaningful way when it comes to removing bias and cultivating thought leadership with some of our clients as well. 

I would love to see the industry moved along a little bit further in the inclusion journey.

This interview has been edited and condensed.
