Why Google’s Ad Problem Won’t Go Away
A few years ago, right when I was starting out, I built keyword lists and wrote ad copy for a big agency. During this time, I learned about “brand protection negatives”: the phrases that the agency’s client did not want the brand associated with, hence the “brand protection” name. That list of negative keywords was outrageous and would make many people blush. Whenever I needed a good laugh, I took a look at it and wondered about the person who had to sit down and think up these completely inappropriate, NSFW phrases.
I thought about those brand protection negatives earlier this year when Google found itself in hot water as businesses discovered that their advertisements were appearing alongside inappropriate content in the Google Display Network, most notably on YouTube. Big brands such as Starbucks and Walmart pulled their advertising. Reportedly the boycott has cooled off. But the problem of ads appearing alongside inappropriate content on YouTube is not going away. The risk remains real: YouTube is vulnerable.
For context, let’s look at a few revealing statistics:
- YouTube reaches over 1 billion users (one-third of all people on the internet)
- YouTube can be navigated in more than 76 languages (covering 95 percent of the internet population)
- 300 hours of video are uploaded every minute
Those 300 hours of video uploaded every minute add up to a staggering amount of content flooding YouTube: 432,000 hours per day, or 157,680,000 hours per year. When one of these videos is uploaded to YouTube, it is put through an editorial process that labels it as G, PG, Teen, or Mature, as well as a variety of other groupings (Police/Crime, Acts of Warfare or Violence, Social Issues, Religion, etc.). But it can take some time for Google’s reviewers to complete that process.
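The arithmetic behind those volume figures is easy to verify (a quick sketch; the only inputs are the 300-hours-per-minute rate and a 365-day year):

```python
# Back-of-the-envelope check of the upload-volume figures cited above.
HOURS_PER_MINUTE = 300  # hours of video uploaded to YouTube per minute

hours_per_day = HOURS_PER_MINUTE * 60 * 24   # 60 minutes/hour * 24 hours/day
hours_per_year = hours_per_day * 365

print(hours_per_day)   # 432000
print(hours_per_year)  # 157680000
```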
Google Has a Problem
The sheer volume of videos posted on YouTube is reason alone why Google’s problems are far from over. Google’s reviewers can’t keep up with the number of hours of video uploaded. As a result, the review process is, to a degree, automated, which results in videos being mislabeled or missing a label. In addition, reviewing and approving a video also makes it possible for the video to qualify for monetization (via the YouTube Partner Program), meaning that the video may accept advertising. Currently, YouTube requires a YouTuber to have 10,000 lifetime views to monetize a channel. That may seem like a lot of views, but it’s a cumulative count, which means I can create 10 videos that each get 1,000 views, 20 videos that get 500 views apiece, 50 videos that get 200 views apiece, and so on. Once that 10,000-view count is hit, all of the channel’s videos begin to be monetized.
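Because the threshold is cumulative, very different channel shapes clear it equally well (a minimal sketch; the video/view mixes are the ones named above):

```python
# The 10,000-view monetization threshold is a lifetime total for the channel,
# so many modest videos reach it just as easily as a few popular ones.
THRESHOLD = 10_000

channel_mixes = [
    (10, 1_000),  # 10 videos x 1,000 views each
    (20, 500),    # 20 videos x 500 views each
    (50, 200),    # 50 videos x 200 views each
]

for n_videos, views_each in channel_mixes:
    total = n_videos * views_each
    status = "eligible" if total >= THRESHOLD else "not yet"
    print(f"{n_videos} videos x {views_each} views = {total}: {status}")
```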
The lax reviewing standards coupled with a fairly easy monetization process can lead to some unfortunate situations, as the following example shows. In March, James Dean of The Times tweeted a troubling image:
In this example, an Oracle image ad was placed over a video for an extremist group. Obviously, as part of the brand safety process I mentioned at the beginning, Oracle would want this type of video excluded. But why did this video qualify for monetization in the first place? The answer: tough to say. In some instances, videos are uploaded and disapproved because of a single word in the title (e.g., “dead” or “death”), but in other cases, as reported by The Wall Street Journal, a video may have a racial slur in the title or description and still get approved. What’s ironic, and probably should have been expected, is that once these stories began to pop up back in March, YouTube went to the extreme and began demonetizing large amounts of content without any warning, and in some cases prematurely.
A Flawed Process
Clearly, if YouTube is going to monetize a video, it needs to be more vigilant about which videos those advertising dollars support. Essentially, in the example from James Dean, YouTube made money off a video that supports terrorism. How did that video get monetized? How did the reviewers not catch it? When there are so many hours of video and so much money involved, not to mention YouTube’s belief in free speech, it’s easy to understand how videos such as these slip through the cracks.
Google Goes to Extremes
YouTube went to the extreme when it came to demonetizing videos. Consider the case of Real Women, Real Stories, a channel created by Matan Uziel. Its goal is to give women the opportunity to voice their stories of surviving trauma, ranging from physical abuse to sex trafficking. The channel is a noble endeavor, if ever there was one, and Uziel uses the funds from ads on it to direct and produce future videos. But one day, out of nowhere, that funding ceased because his videos got caught up in the demonetization sweep that YouTube began. His videos don’t support hate speech (just the opposite, in fact), but the content addressed a subject that Google didn’t want on YouTube. Uziel has since seen ad revenue slowly come back as the YouTube algorithm “learns where they should show ads, and where they should not,” says Jamie Byrne, a director of enterprise at YouTube.
The examples I have cited represent just two instances out of thousands, maybe even millions, that occur daily. We have given YouTube (and Display networks in general) the benefit of the doubt over the years because “it’s a new product,” “it’s not a science,” or “it’s difficult to monitor.” But if Google can roll out a product that tracks a brick-and-mortar purchase at your nearest Wendy’s back to your double-bacon-cheeseburger search, then Google needs to find a consistent and responsible way to protect brands from advertising on videos that push violence, hate speech, or any other topic that goes against a company’s corporate beliefs.
But, we need to remember that YouTube would have to hire more than 75,000 employees to watch video for 40 hours a week to manually review every minute of every video uploaded. That scenario is unrealistic. So, advertisers, as well as consumers, need to be aware that Google’s ad problem will never go away.
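The 75,000-employee figure follows directly from the upload volume cited earlier (a back-of-the-envelope sketch; it assumes reviewers watch in real time for a 40-hour week):

```python
# Estimate how many full-time reviewers it would take to watch
# every minute of video uploaded to YouTube in real time.
UPLOAD_HOURS_PER_MINUTE = 300   # hours of video uploaded per minute
REVIEW_HOURS_PER_WEEK = 40      # one full-time reviewer's weekly capacity

upload_hours_per_week = UPLOAD_HOURS_PER_MINUTE * 60 * 24 * 7  # 3,024,000 hours
reviewers_needed = upload_hours_per_week / REVIEW_HOURS_PER_WEEK

print(int(reviewers_needed))  # 75600, i.e. "more than 75,000 employees"
```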
Image source: videoadvertisingnews.com