Advertising giant M&C Saatchi is currently testing billboards with hidden Microsoft Kinect cameras that read viewers’ emotions and respond to whether a person’s facial expression is content, melancholy or neutral.
These test ads – featuring a fictional coffee brand called Bahia – have already appeared on Oxford Street and Clapham Common in London. So now we have ads that can read the reactions of those who view them and adapt accordingly, cycling through different images, patterns, fonts and colours. With its partners Clear Channel and Posterscope, Saatchi has made advertising history. When future media historians look back, they will see 2015 as a watershed year.
There are three key things to be aware of: these ads can read our behavior, they respond to our emotions rather than our browsing history, and they use that data to improve themselves.
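To make the mechanics concrete, here is a minimal sketch in Python of the kind of read-and-respond loop such a poster might run. Everything in it – the three-way emotion split, the variant descriptions and the classify_expression stand-in – is an assumption for illustration, not M&C Saatchi's actual system.

```python
import random

# Hypothetical creative variants, keyed by the viewer's detected mood.
VARIANTS = {
    "content":    "bright palette, upbeat copy",
    "melancholy": "soft palette, comforting copy",
    "neutral":    "bold palette, attention-grabbing copy",
}

def classify_expression(frame):
    """Stand-in for the camera-based classifier described above.
    A real system would run face detection on a camera frame; here we
    simply simulate one of the three reported states."""
    return random.choice(["content", "melancholy", "neutral"])

def next_creative(frame):
    """Read the viewer's expression and pick a matching creative variant."""
    mood = classify_expression(frame)
    return mood, VARIANTS[mood]

if __name__ == "__main__":
    for _ in range(5):                                 # five simulated passers-by
        mood, creative = next_creative(frame=None)     # no real camera in this sketch
        print(f"viewer looks {mood!r} -> show {creative!r}")
```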
What are we to make of this? Is it a little scary? The answer is yes and no. The campaign is an attempt to get closer to us, which is a hallmark of advertising and audience research. They want to get to know us better so they can create messages that will resonate with and influence us. This is an example of what I call “empathetic media,” because by reading facial expressions, ads are able to bypass the guesswork and directly tap into our emotions.
The Evolution of Advertising
While amazing and artistic, the Saatchi ads are not a threat to privacy. After all, unlike our computers, phones and tablets, these posters don’t know or care who we are. The ad makers claim they don’t store images or data, and there’s no reason to disagree. All their ads do is respond to the shape of a face – the really scary stuff is on the internet and in mobile apps that track our habits. For example, one study notes that the Eurosport Player application has 810 data trackers collecting information about hardware and software, as well as navigation (which pages a person visits), behavior, time of visit, visitor actions and geolocation (where a person is in physical space).
The real genius of these novel ads is that they use our facial expressions to learn and redesign themselves. By responding to how we react, the ads gain a purpose of sorts – an evolutionary drive to improve and become more effective.
The idea of malleable advertising was envisaged about 100 years ago by advertising authorities such as Daniel Starch and Claude Hopkins. They insisted that advertising be viewed as a science based on gathering information, analyzing it, and using that insight to improve campaigns. Both Starch and Hopkins sought to understand which techniques worked and which didn’t, to make the advertising business subject to the laws of cause and effect. The grandfathers of advertising would be very pleased with today’s offspring.
While the logic is old, processing feedback to self-correct in real time is novel. For years, Google has led the way in automatically serving ads based on our interests; self-correcting ads in the physical world are the next step forward.
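As a rough illustration of what “processing feedback to self-correct” could look like in code, here is a small Python sketch of an epsilon-greedy selection loop: the poster mostly shows whichever variant has drawn the most smiles so far, but keeps experimenting. The variant names, smile probabilities and feedback function are invented for the example and are not drawn from the campaign itself.

```python
import random

# Assumed candidate executions of the ad.
VARIANTS = ["variant-a", "variant-b", "variant-c"]
stats = {v: {"shown": 0, "positive": 0} for v in VARIANTS}

EPSILON = 0.1  # fraction of the time the ad still explores at random

def positive_rate(v):
    s = stats[v]
    return s["positive"] / s["shown"] if s["shown"] else 0.0

def choose_variant():
    """Mostly exploit the best-performing variant, occasionally explore."""
    if random.random() < EPSILON:
        return random.choice(VARIANTS)
    return max(VARIANTS, key=positive_rate)

def record_feedback(variant, viewer_smiled):
    """Feed the observed reaction straight back into the selection policy."""
    stats[variant]["shown"] += 1
    if viewer_smiled:
        stats[variant]["positive"] += 1

# Simulated run: variant-b secretly pleases viewers more often,
# so the loop should gradually converge on showing it.
for _ in range(2000):
    v = choose_variant()
    smiled = random.random() < {"variant-a": 0.2, "variant-b": 0.5, "variant-c": 0.3}[v]
    record_feedback(v, smiled)

print({v: round(positive_rate(v), 2) for v in VARIANTS})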
Connecting with the topic
Most media coverage of M&C Saatchi’s ads has praised them as an artificial intelligence campaign. While this is true to some extent, the ads are actually quite mechanical: they do not understand why we smile, frown or look grave, or what these facial expressions mean. They simply match the shapes and respond.
So what would knowledgeable advertising look like? It would have to be able to engage with the context of our lives, in real time. What that consists of is a somewhat philosophical question, but it could include our individual life histories, our natural spoken language, human values, politics, current events, popular culture, and aesthetic trends—all of the topics that human advertisers consider when creating campaigns.
Of course, these ads don’t do that – but others in the ad industry may have the technological muscle to do so. For insight into the artificially knowledgeable advertising of tomorrow, look at Google’s DeepMind, which promises to “combine the best techniques from machine learning and systems neuroscience to build powerful, general-purpose learning algorithms.” When we remember that Google is first and foremost an advertising company, DeepMind is a company to watch.
Then there are sensors. Soon we will wear and carry more of them, and be surrounded by more of them. Empathetic media will give advertisers even greater insight into our emotions: through the way we speak to our mobile devices, through more detailed facial recognition, and through signals from our heart rate, breathing patterns and how our skin reacts to stimuli. And if that sounds far-fetched, remember that you just read a true story about ads that recognize your emotions.