Christiane C. A few days later, her daughter shared exciting news: The video had thousands of views. Before long, it had ticked up to a staggering number for a video of a child in a two-piece bathing suit with her friend. YouTube had curated the videos from across its archives, at times plucking out the otherwise innocuous home movies of unwitting families, the researchers say. In many cases, its algorithm referred users to the videos after they watched sexually themed content.
The result was a catalog of videos that experts say sexualizes children. In February, Wired and other news outlets reported that predators were using the comment sections of YouTube videos featuring children to guide other pedophiles.
But the recommendation system, which remains in place, has gathered dozens of such videos into an easily viewable repository and pushed them out to a vast audience.
YouTube never set out to serve users with a sexual interest in children, but in the end, the researchers say, its system did. Users do not need to look for videos of children to end up watching them. The platform can lead them there through a progression of recommendations. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable. When The Times alerted YouTube that its system was circulating family videos to people seemingly motivated by sexual interest in children, the company removed several but left up many others, including some apparently uploaded by fake accounts.
The recommendation system itself also changed immediately, no longer linking some of the revealing videos together. YouTube said this was probably a result of routine tweaks to its algorithms rather than a deliberate policy change. But YouTube has not put in place the one change that researchers say would prevent this from happening again. It did say it would limit recommendations on videos it deems as putting children at risk. YouTube has described its recommendation system as artificial intelligence that is constantly learning which suggestions will keep users watching.
These recommendations, it says, drive 70 percent of views, but the company does not reveal details of how the system makes its choices.
The system, they say, leads viewers to incrementally more extreme videos or topics, which are thought to hook them in. Watch a few videos about makeup, for example, and you might get a recommendation for a viral makeover video. Watch clips about bicycling and YouTube might suggest shocking bike race crashes. Running this experiment thousands of times allowed them to trace something like a subway map for how the platform directs its users.
Though YouTube says these are rarely clicked, they offered a way to control for any statistical noise generated by how the platform suggests videos. When they followed recommendations on sexually themed videos, they noticed something that disturbed them: In many cases, the videos became more bizarre or extreme, and placed greater emphasis on youth.
Videos of women discussing sex, for example, sometimes led to videos of women in underwear or breast-feeding, sometimes mentioning their age. From there, YouTube would suddenly begin recommending videos of young, partially clothed children, then a near-endless stream of them drawn primarily from Latin America and Eastern Europe. Any individual video might be intended as nonsexual, perhaps uploaded by parents who wanted to share home movies among family.
And the extraordinary view counts — sometimes in the millions — indicated that the system had found an audience for the videos and was keeping that audience engaged. Some researchers believe that when it comes to some material, engaging certain interests risks encouraging them. Most people who view sexualized imagery leave it at that, researchers say. YouTube does not allow children under 13 to have channels. The company says it enforces the policy aggressively.
For parents, there are no easy solutions, said Jenny Coleman, the director of Stop It Now, an organization that combats sexual exploitation of children.
In reporting this article, when The Times could find contact information for parents of children in the videos, it contacted local organizations that could help them. After one such organization contacted Christiane, the mother from Brazil, she offered to discuss her experience. Furious, she struggled to absorb what had happened.
She agonized over what to tell her husband. And she worried over how to keep her daughter, now on display to a city-size audience, safe.
She had reason to be.
A version of this article appears in print on Section A, Page 8 of the New York edition.