Last week, a pair of shoes followed me around the internet. I had been looking at them online as a gift for my father-in-law, but he didn’t like them and neither did I. Yet no matter what site I visited, there they were, staring back at me in all their moccasin glory.
Digital advertising can be really annoying, but it can also be harmful. It has recently been accused of perpetuating fake news, funding child abuse and disrupting democracy.
Facebook, which derives most of its $40 billion in annual revenue from digital advertising, is in the line of fire, but many other companies across the digital advertising supply chain are now feeling the sting.
It is no surprise, then, that the Advertising Standards Authority (ASA) has taken steps to tighten regulations, or that new government legislation is under consideration.
We hope these changes will provide a much-needed safety net for the digital advertising industry, often referred to as the “Wild West” – a dark, lawless place where anything goes. Yet digital advertising is probably a safer place to focus your marketing spend now than ever before. The burgeoning “ad verification” industry is restoring trust in the sophisticated digital supply chain by ensuring ads are placed and targeted correctly.
Self-regulation is also gaining momentum. The Internet Advertising Bureau now offers a gold-standard certification for companies striving to provide positive and safe digital advertising experiences, while the launch of a media industry coalition is an example of collective efforts to raise transparency and accountability.
These changes are welcome, and they are working. However, their focus on brand safety overlooks several more fundamental issues.
Be careful what you click
First, there’s the politicization of ad placement. For the moccasin brand that stalked me online, it probably makes sense for its ads not to appear alongside toxic hate speech on YouTube. But what about a news article about alleged animal abuse in the leather supply chain? Where would you draw the line?
Brands make these decisions constantly, through a complex process in which ads are placed (or not) based on their association with “good” or “bad” keywords used in websites and articles.
In the responsive digital world we live in, each user experience is truly unique. What I see online is different from what you see. We can never be sure what strategy lies behind the ads we are served.
Second, what is the social cost of brands shying away from vital topics, such as race and religion, through their ad-placement decisions?
For example, a fashion article in 2018 is a much safer place to advertise a brand of loafers than an article about animal cruelty. The message for digital platforms? Content that is casual, not polarizing, pays off. This has powerful implications for democracy and free speech. If content that doesn’t pass a brand’s tolerance tests isn’t commercially appealing, are certain narratives being suppressed?
It is this point that makes many question the durability of digital advertising. Indeed, Facebook is experimenting with an ad-free subscription. Perhaps content that brands deem “hazardous” will be pushed even further behind paywalls.
Finally, in the increasingly automated world of targeted marketing, algorithms are not very good at detecting context.
The shoe brand in question may therefore choose not to be associated with content it deems inappropriate, such as mentions of “animal cruelty.” However, algorithms cannot always differentiate content in a meaningful way. Is all content about “animals” problematic?
When the results are unknown, brands will choose the safest and potentially most sanitized option. This could mean that algorithms cease to be neutral tools and become value-laden ones.
The worrying thing is that the only real way to overcome this algorithmic bias is for content to be verified by real people. We are currently seeing an increasing number of “commercial content moderators” doing our dirty work online by policing social media sites and removing harmful and disturbing images.
Often underpaid – and deciding between “safe” and “unsafe” in a matter of seconds – these people can suffer grave psychological repercussions from what they see. The human toll of the brand-safety movement should not be underestimated.
Big tech, big responsibility
All of this raises serious questions about the role of marketing in society and the ethics of big tech. For many, self-regulation is not enough. Politicians have called on brands to limit commercial relations with tech giants until safety issues on their platforms are addressed.
I agree. We need to push harder on every organization in the digital supply chain. The DARE (Digital Advertising Responsibility and Ethics) approach that I support focuses less on demonizing business and more on humanizing it. It involves two key actions.
First, promoting the work of ethical industry players who are changing the rules of the game and encouraging a new definition of responsibility – for example, the Financial Times, which left Facebook after controversial identity checks on advertisers (a move Facebook is currently reconsidering), or Nestlé, which is considering funding sustainable cocoa sourcing through ethical advertising buying.
Other brands, such as Vodafone, are also bringing digital advertising in-house to gain more control. Such examples demonstrate the trust deficit that currently exists in the digital advertising industry.
Second, we need a greater role for ethics in the digital supply chain. Ethics begins where the law ends, considering what is right and wrong in every decision. Ethical thinking requires ongoing reflection on the changing digital landscape, not rules-based compliance. Yet while many advocate a code of ethics for the technology industry, I believe that education, open discussion, and individual reflection will be key to the development of this field.
Only by making progress in these directions will we be able to move away from the opaque culture of filtering that currently dominates the online world. It is not about putting the boot into big tech. More like a loafer.