
Why Disney, Nestle and Fortnite Have Withdrawn Their YouTube Advertising

According to Bloomberg, several major brands, including Disney, Nestle and Epic Games (the maker of Fortnite), have withdrawn their YouTube advertising after the controversy that erupted in recent days over an online community of pedophiles on the platform.


Walt Disney Co. and Nestle have withdrawn their YouTube ads after a user uncovered evidence that an international pedophile community is using YouTube's recommendation algorithm to find and share videos of minors in compromising situations. Most of these videos are monetized by Google, YouTube's parent company, meaning that ads from large multinationals such as Disney and Nestle play before they can be viewed.

As an immediate consequence, Nestle has since Wednesday withdrawn its YouTube advertising for all the brands it sells in the United States, a spokesperson confirmed by email, according to Bloomberg.

German food giant Dr. Oetker has also paused its advertising on the platform, as has video game maker Epic Games Inc., a company spokesman confirmed to The Verge. Epic has suspended all of its pre-roll ads – those shown before a video plays – for Fortnite on YouTube because its advertising was appearing on videos of minors in compromising situations that were being shared by pedophiles around the world.

According to Bloomberg, Disney has also joined the companies pulling their advertising from YouTube in the wake of the scandal. The outlet attributes the information to company employees, although for the moment Disney has not made the decision public.

Last Sunday the YouTuber MattsWhatItIs demonstrated once again YouTube's problem with the pedophile community, showing how in just five clicks these users could reach videos of minors in compromising situations, entering an infinite spiral as the platform's algorithm keeps recommending similar videos.

The user who reported this situation also explained that pedophiles around the world identify in the comments the exact timestamps at which children appear in compromising situations, often in videos of girls in bikinis or swimsuits.

YouTube will return the money to advertisers

According to Bloomberg, advertising spend on these videos amounted to less than $8,000 over the last 60 days, and the platform plans to refund that money to advertisers, a spokeswoman explained.

Following the publication of Matt’s video, which has already surpassed 1.7 million views, the Google-owned video platform has once again emphasized its efforts to combat this type of situation.

“YouTube has a zero-tolerance policy on issues of child sexual abuse. We have invested significantly in technology, in human resources and in establishing relationships with NGOs to combat this problem. If we identify links, images or material that promotes this type of content, we report it to the competent authorities, remove it from our platform and terminate the account,” a YouTube spokesperson told Business Insider Spain regarding this week’s scandal.

In line with this, the company also announced on its blog its intention to strengthen its content protection policies to “help creators and artists understand when they cross the line” by uploading content that undermines the goal of making YouTube “the best place to listen, share and create a community”.

Even though the company uses artificial intelligence and human reviewers to remove malicious content, these scandals keep recurring. In 2017, several major companies withdrew their advertising from YouTube after finding that their ads were appearing alongside violent and extremist content.
