Christiane thought nothing of it when she posted a video of her 10-year-old daughter and her friends playing in the backyard swimming pool.
"It's a completely innocent video, nothing to worry about," said the woman, who lives on the outskirts of Rio de Janeiro, Brazil.
A few days later, the video had received thousands of views. Soon it passed the 400,000 mark, a staggering number for a clip of children playing in two-piece bathing suits. Watching the view count climb, Christiane began to panic. She had reason to worry.
A team of researchers has discovered that YouTube's automated recommendation system, which drives a large share of the platform's billions of views by suggesting what users should watch next, was surfacing videos of children to users who had watched other videos of children in skimpy outfits.
Sometimes the system recommends innocuous home videos. But in many cases, its algorithm has led users to these videos after they viewed sexually themed content. And the video of Christiane's daughter was recommended by YouTube's system months after warnings about pedophile content on the platform had been made public.
"I'm really scared," Christiane said. "Scared that my child's video has been swept into a category as disturbing as this."
Christiane said she was terrified when she saw the number of views on the video of her daughter.
According to Jonas Kaiser, a researcher at the Berkman Klein Center for Internet & Society at Harvard University, YouTube's algorithm was still recommending videos of children to users whose viewing history matched the profile of a pedophile.
The expert said that, in practice, pedophiles do not even need to search for videos of children, because YouTube can lead them there through a chain of recommendations. A user might start with videos of provocatively dressed women. The system then suggests videos of women posing in childlike outfits. Eventually, the user is shown videos of girls as young as 5 or 6 wearing bathing suits or partially undressed.
Taken individually, each of these videos is harmless; it may be nothing more than a family home movie, and any frame of a child in a state of undress may be purely incidental. But when YouTube groups them together, their meaning and purpose become unmistakable.
In fact, after being warned, YouTube deleted some videos but left up plenty of similar content, including material uploaded by fake accounts. The company said it had changed the recommendation system so that it no longer linked these videos together. In short, the platform treats the problem as a matter of a few algorithms that can easily be fixed, not as a question of its moderation policy.
Jennifer O'Connor, YouTube's product manager, stated firmly: "Protecting children is at the top of our list of priorities."
But to researchers, this is mere institutional self-defense. What YouTube could do, and didn't, is turn off recommendations on videos of children altogether. YouTube's AI tools are capable of identifying such videos and doing exactly that. The reason it hasn't is simple and plain for everyone to see: these videos draw enormous traffic, some of the largest on the platform. Turning off recommendations would "hurt content creators" who depend on viewers' clicks. But behind those creators stand advertisers, or put simply, YouTube's own revenue. So company representatives say only that they will limit recommendations on videos they judge to put children at risk. Nothing more.
Content tolerated this way on YouTube can both serve existing pedophiles and help create new ones.
YouTube describes its recommendation system as a form of artificial intelligence that learns continuously in order to propose content suited to each viewer. The company says it drives up to 70% of views on the platform, but does not disclose details of how it works.
Researchers call it the "rabbit hole effect," after the hole Alice fell down into Wonderland. Here it means that the platform leads viewers toward ever more extreme videos or topics by chaining discrete videos into a single progression. Watch a cycling video, for example, and YouTube may suggest a high-view, shock-value video of a bicycle accident.
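The drift the researchers describe can be sketched with a toy model. YouTube's actual system is proprietary and far more complex; the code below is purely illustrative, with made-up videos, tags, and view counts. It shows how a chain of locally reasonable "watch next" picks, each choosing the most similar unwatched video and breaking ties toward higher view counts, can end somewhere very different from where it started.

```python
# Toy illustration only -- NOT YouTube's actual algorithm.
# A greedy "watch next" chain over a tag-similarity graph: each step
# picks the most similar unwatched video, preferring higher view counts.
# The chain drifts from a benign start toward shock content.

videos = {
    "bike_tour":     {"tags": {"bike", "outdoors"},    "views": 10_000},
    "bike_stunts":   {"tags": {"bike", "stunts"},      "views": 50_000},
    "stunt_fails":   {"tags": {"stunts", "accidents"}, "views": 200_000},
    "crash_footage": {"tags": {"accidents", "shock"},  "views": 900_000},
}

def similarity(a, b):
    """Jaccard similarity between the two videos' tag sets."""
    ta, tb = videos[a]["tags"], videos[b]["tags"]
    return len(ta & tb) / len(ta | tb)

def watch_next_chain(start, steps):
    """Follow the top recommendation for `steps` clicks from `start`."""
    chain, seen = [start], {start}
    current = start
    for _ in range(steps):
        candidates = [v for v in videos if v not in seen]
        if not candidates:
            break
        # Rank by similarity to the current video; ties go to more views.
        current = max(candidates,
                      key=lambda v: (similarity(current, v),
                                     videos[v]["views"]))
        chain.append(current)
        seen.add(current)
    return chain

print(watch_next_chain("bike_tour", 3))
# -> ['bike_tour', 'bike_stunts', 'stunt_fails', 'crash_footage']
```

Note that no single step looks alarming: each hop shares a tag with the previous video. The problem the researchers identified is the cumulative direction of the chain, which is exactly what per-video moderation misses.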
Researcher Jonas Kaiser and his colleagues tested YouTube's recommendation feature thousands of times and gradually grasped how frightening it is. Suggested content grows more and more extreme, especially around sexual topics. A video of women simply discussing sexuality can lead to videos of girls in underwear or breastfeeding. The subjects are very young, sometimes 18 or 19, even 16. The stream of suggestions seems endless, with many videos coming from Latin America and Eastern Europe.
Outside YouTube headquarters in California, USA.
Córdova, who has spent a long time studying how pornography is distributed online, said she recognized what was happening. At first the videos are non-sexual, perhaps uploaded by parents wanting to share memorable family moments. But YouTube's algorithm, learning from the images, treated these videos as destinations at the end of someone else's journey. Unusually high view counts, sometimes in the millions, signal that the system has found the right audience and is drawing it in ever deeper.
"YouTube has normalized it," said Marcus Rogers, a psychologist at Purdue University, who has done a lot of research on child pornography.
Even so, he says, the risk doesn't stop there, because many videos on YouTube display links to the young subjects' social networking accounts. And pedophiles, who are very skillful at talking to and approaching these children for malicious purposes, follow those links, according to Rogers.
YouTube representative Jennifer O'Connor said there is no such thing as a "rabbit hole effect" on the platform.
Christiane, the mother from Brazil, is still angry and struggling to absorb what happened. She wondered what she would say to her husband. She is baffled by how the YouTube platform works and worried about how to keep her daughter, the little girl in the swimsuit whose video was watched by hundreds of thousands of people, safe.
"The only thing I can do," she said, "is forbid her from uploading anything to YouTube."
Source: The New York Times