Affectiva, a Waltham, Massachusetts-based company that “provides tools for measuring emotion with electrodermal testing & face expression recognition,” recently announced a new round of funding with investors including WPP Group’s Kantar unit and Myrian Capital. Read the release. Affectiva’s tech grew out of research from the MIT Media Lab.
Dave Berman, CEO of Affectiva, discussed the opportunity he sees for his company in digital advertising.
AdExchanger.com: What is the problem that Affectiva solves?
DB: What we provide are unique, new types of data that haven’t been relevant before. For example, in an experiment done at MIT, researchers had a bunch of people take sips of new and different Pepsi flavors. After they took a sip, they’d answer a survey question.
Well, one guy on his fourth sip makes a really funny face like he has heartburn. But, he still marked it 7 out of 10 – that he was pleased with it. Except, when he got about 20 sips in, he started changing his mind.
So, what Affectiva can provide is physiological data faster than you can get through normal surveys. Over time, we think that we can mine this subconscious and physiological information and help advertisers make better, faster decisions.
AdExchanger.com: What does the use case look like on data collection?
What happens is that at the beginning of a survey, you hit “allow” to turn on your web camera and then you start watching a commercial, for example. In real time, your results are aggregated in the cloud. We can give a real-time look at how people were feeling while they watched the commercial.
For one survey, we were looking at three different metrics: “smile” – are people enjoying it?; the lower eyebrow – a negative look or confusion; and we measured attention. And then we asked a few survey questions at the end. Literally, in three seconds, we can tell you whether people enjoyed an ad and aggregate that survey across thousands of people. We’ve got other emotional states that we’re in the process of rolling out, too.
Basically, we can tell if somebody’s engaged. Do they like it? Do they dislike it? Are they confused? And we can correlate that to whether that ad is going to be successful in the market.
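The aggregation step described above can be sketched in a few lines. This is a hypothetical illustration, not Affectiva’s actual pipeline; the metric names, the 0.0–1.0 score scale, and the per-second timestamps are all assumptions for the example:

```python
from collections import defaultdict
from statistics import mean

def aggregate_sessions(sessions):
    """Average each facial metric across viewers at every timestamp.

    Each session maps timestamp -> {metric: score}, e.g. per-second
    "smile", "brow_lower", and "attention" scores from one webcam feed.
    """
    buckets = defaultdict(lambda: defaultdict(list))
    for session in sessions:
        for t, scores in session.items():
            for metric, value in scores.items():
                buckets[t][metric].append(value)
    # Collapse each timestamp's per-viewer lists into a mean curve.
    return {
        t: {metric: mean(values) for metric, values in metrics.items()}
        for t, metrics in buckets.items()
    }

# Two simulated viewers watching the same two seconds of an ad.
sessions = [
    {0: {"smile": 0.1, "brow_lower": 0.0, "attention": 0.9},
     1: {"smile": 0.6, "brow_lower": 0.1, "attention": 0.8}},
    {0: {"smile": 0.3, "brow_lower": 0.2, "attention": 0.7},
     1: {"smile": 0.8, "brow_lower": 0.0, "attention": 0.9}},
]
curve = aggregate_sessions(sessions)
print(curve[1]["smile"])  # ≈ 0.7
```

Because each session only contributes independent per-timestamp lists, the same shape scales out naturally to thousands of viewers aggregated in the cloud.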
Our go‑to‑market is going to be dual. We’re working with large organizations that want to test their own ads. We’re also working with large agencies that have existing testing methods. We open up our APIs to allow them to import this data and use it with their current methods. So you can kind of imagine the integration we’re looking at with a Millward Brown and their existing customer stream.
So this is for companies that have focus groups already in place. That’s the ideal, right?
Yes. But, it’s more than just focus groups because you can now get this data in the real world. So instead of having to bring people in and ask them questions or catch them while they’re in the mall, you can get them in their natural environment. That’s powerful because it’s a more realistic scenario.
There’s a system out there, it’s called the Facial Action Coding System (FACS). It was developed back in the ’70s by Paul Ekman and some other top researchers.
Basically, they drew out a code for every human expression and mapped out each human expression by movements on the face and the different points. What we’ve done is automated that.
So if you’ve seen that show “Lie to Me,” that’s what the guy does – he reads faces. We use the same sort of system.
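FACS describes every expression as a combination of numbered facial “action units” (AUs), so automating it amounts to detecting which AUs are active and matching them against known combinations. A toy illustration follows: the AU numbers and names are standard Ekman/Friesen codes, but the tiny expression table and matching rule are deliberately simplified assumptions (real FACS coding also scores intensity):

```python
# A few well-known FACS action units (Ekman & Friesen numbering):
AU_NAMES = {
    4: "brow lowerer",
    6: "cheek raiser",
    12: "lip corner puller",
}

# Simplified expression codes: an expression is the set of AUs that
# must all be active at once.
EXPRESSIONS = {
    frozenset({6, 12}): "enjoyment smile",
    frozenset({12}): "social smile",
    frozenset({4}): "brow lower (negative/confused)",
}

def classify(active_aus):
    """Return the most specific known expression for the active AUs."""
    matches = [(aus, label) for aus, label in EXPRESSIONS.items()
               if aus <= active_aus]
    if not matches:
        return "neutral/unknown"
    # Prefer the match that accounts for the most action units.
    return max(matches, key=lambda m: len(m[0]))[1]

print(classify({6, 12}))  # enjoyment smile
print(classify({4}))      # brow lower (negative/confused)
```

The "most specific match" rule is why AU6 + AU12 reads as a genuine (Duchenne) smile rather than falling back to the plain AU12 social smile.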
So you also have a wearable device, too? What’s its purpose?
Yes. We’ve got a bio‑sensor (Q Sensor) that gives a very accurate read of the internal state of someone. So if you’re wearing a sensor… It looks like a watch. You put it on and we can stream, in real time, your attention, engagement level, or interest level. It’s a sympathetic nervous system response, sometimes known as the fight or flight response. It goes up when you’re aroused or engaged and it goes down when you’re not.
Prior to our sensor, you had to go into labs and they’d wire you up, etc. It’s another data set, and we’re using that with some of our other customers who are doing in‑house copy testing as well. They want that other signal, and it’s a very accurate read also – galvanic skin response is what it’s called. It’s been around for 100 years.
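Galvanic skin response (skin conductance) is a slowly varying signal in which arousal shows up as a rise above the recent baseline. Here is a hypothetical sketch of flagging arousal in a streamed trace; the sample values, window size, and threshold are invented for illustration and have no relation to the Q Sensor’s actual processing:

```python
def arousal_events(samples, baseline_window=5, threshold=0.15):
    """Flag sample indices where skin conductance (microsiemens)
    rises more than `threshold` above a trailing-mean baseline."""
    events = []
    for i, value in enumerate(samples):
        window = samples[max(0, i - baseline_window):i]
        if not window:
            continue  # no baseline yet for the first sample
        baseline = sum(window) / len(window)
        if value - baseline > threshold:
            events.append(i)
    return events

# Simulated conductance trace: flat, a sharp rise (arousal), then decay.
trace = [2.0, 2.0, 2.1, 2.0, 2.0, 2.6, 2.5, 2.3, 2.1, 2.0]
print(arousal_events(trace))  # [5, 6]
```

Comparing each sample to a trailing baseline rather than a fixed level matters because resting conductance drifts from person to person and over the course of a session.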
Looking at digital advertising today, where do you see specific opportunity for Affectiva?
We’re pretty new to advertising. In fact, we’re going to be hiring a whole bunch of people with this DNA.
But, what we’re hearing from the customers is that they want a new passive metric from customers, so they don’t have to answer a lot of questions. Getting that in real time, at scale, is a very compelling value proposition. So literally being able to send out a commercial and get a thousand people to respond to it and get accurate information in seconds is what we bring. It’s speed, accuracy, science and that passive metric.
Why was WPP Group the right investor for you? It seems like you could have gone straight to brands, for example.
They’ve got a ton of expertise in this space. They’re very excited about the technologies at the Kantar Group – this is what they do. We think that they’re going to add a lot of value for us on product creation and on helping us understand the space. We thought it was a great opportunity and Millward Brown, as you know, is the gold standard of copy testing and media measurement. So it’s just a great fit.
Looking back at your WebEx experience, is there anything coming in handy right now that you might not have expected with your new role at Affectiva?
I’ve been here over a year and a half now. I think where we are now, it’s more about scaling and execution. At WebEx, I took a team from two people in sales to about a thousand, so I’m familiar with scaling and bringing in top talent, and building out and executing against goals and delivering on expectations of customers – in addition to building and scaling this platform in multiple markets.
Figuring out how to go to market was a little bit harder than we thought, but the strategy was, “Let’s get out there and talk to customers, and see where we can add unique value.” We’re very fortunate that we came across WPP and that they were able to help us articulate the synergies.
To be clear, WPP is a minority investor. And, we’re going to be doing a great deal of business with them, but we’re also going to be doing business with the other agencies that want to use our technology.
What other ways do you think the Affectiva technology can be leveraged?
I think there’s a bunch of ways to go with this and now that we’re funded, we’re going to keep building out the product road map.
We have been toying around with a debate meter, so that people could watch the debates live during the election, record their faces, and get real‑time audience feedback at scale. We think that would be pretty interesting. I don’t know if it’s going to make the final cut on the product road map, but there are a whole bunch of great ideas for getting this out and starting to leverage the power of emotion to help people make better decisions.
The tip of the iceberg is media measurement. As we move into multiple markets, we’re going to have a scalable platform that people can subscribe to, to use emotion for all these other markets. That’s the great thing about this round of funding: it allows us to build that platform. As we keep building out these emotions, we’re going to be able to take those to other markets.