Editor’s Note: In the article below you’ll find the video “YouTube is Facilitating the Sexual Exploitation of Children, and it’s Being Monetized.” A WARNING: The video is disturbing in nature. Nothing is blurred out, nor is it audio-only. Viewers should use discretion.
[Natasha Lomas | TechCrunch] More than a year on from a child safety content moderation scandal on YouTube, it takes just a few clicks for the platform’s recommendation algorithms to redirect a search for “bikini haul” videos of adult women towards clips of scantily clad minors engaged in body-contorting gymnastics or taking part in ice bath or ice lolly sucking “challenges.”
A YouTube creator called Matt Watson flagged the issue in a critical Reddit post, saying he found scores of videos of kids where YouTube users were trading inappropriate comments and timestamps below the fold, and denouncing the company for failing to prevent what he describes as a “soft-core pedophilia ring” from operating in plain sight on its platform.
He has also posted a YouTube video demonstrating how the platform’s recommendation algorithm pushes users into what he dubs a pedophilia “wormhole,” accusing the company of facilitating and monetizing the sexual exploitation of children.
We were easily able to replicate the behavior Watson describes in a history-cleared private browser session: after we clicked on two videos of adult women in bikinis, YouTube’s algorithm suggested we watch a video called “sweet sixteen pool party.”
Clicking on that led YouTube’s sidebar to serve up multiple videos of prepubescent girls in its “up next” section, where the algorithm tees up related content to encourage users to keep clicking.
Videos recommended to us in this sidebar had thumbnails showing young girls demonstrating gymnastics poses, showing off their “morning routines,” or licking popsicles or ice lollies.
Watson said it was easy for him to find videos containing inappropriate/predatory comments, including sexually suggestive emoji and timestamps that appear intended to highlight, shortcut and share the most compromising positions and/or moments in the videos of the minors.
We also found multiple examples of timestamps and inappropriate comments on videos of children that YouTube’s algorithm recommended we watch.
Some comments by other YouTube users denounced those making sexually suggestive remarks about the children in the videos.
Back in November 2017, several major advertisers froze spending on YouTube’s platform after an investigation by the BBC and the Times discovered similarly obscene comments on videos of children.
Earlier the same month YouTube was also criticized over low-quality content targeting kids as viewers on its platform.
The company went on to announce a number of policy changes related to kid-focused video, including saying it would aggressively police comments on videos of kids and that videos found to have inappropriate comments about the kids in them would have comments turned off altogether.
Some of the videos of young girls that YouTube recommended we watch already had comments disabled, which suggests its AI had previously identified a large number of inappropriate comments being shared, in line with its policy of switching off comments on clips containing kids when comments are deemed “inappropriate.” Yet the videos themselves were still being suggested for viewing in a test search that originated with the phrase “bikini haul.”
Watson also says he found ads being displayed on some videos of kids containing inappropriate comments, and claims that he found links to child pornography being shared in YouTube comments too.
We were unable to verify those findings in our brief tests.
We asked YouTube why its algorithms skew toward recommending videos of minors, even when the viewer starts by watching videos of adult women, and why inappropriate comments remain a problem on videos of minors more than a year after the same issue was highlighted via investigative journalism.
The company sent us the following statement in response to our questions:
Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts. We continue to invest heavily in technology, teams and partnerships with charities to tackle this issue. We have strict policies that govern where we allow ads to appear and we enforce these policies vigorously. When we find content that is in violation of our policies, we immediately stop serving ads or remove it altogether.
A spokesman for YouTube also told us it’s reviewing its policies in light of what Watson has highlighted, adding that it’s in the process of reviewing the specific videos and comments featured in his video, and that some content has already been taken down as a result of the review.
However, the spokesman emphasized that the majority of the videos flagged by Watson are innocent recordings of children doing everyday things. (Though of course the problem is that innocent content is being repurposed and time-sliced for abusive gratification and exploitation.)
[Editor’s Note: This article was written by Natasha Lomas and originally published at TechCrunch. Title changed by P&P.]