YouTube uses a recommendation algorithm to help people find videos they’d like to see. Recently, however, the algorithm seemingly encouraged pedophiles, without YouTube knowing it, to watch videos of children playing at home that family members had uploaded.
A report from the New York Times detailed how YouTube’s automated recommendation system had been exploiting minors. According to the report, researchers at the Berkman Klein Center for Internet and Society at Harvard noticed the alarming issue while studying YouTube’s influence in Brazil. Their experiment used a server that followed YouTube recommendations a thousand or more times, building a map of sorts in the process. The map is designed to show how YouTube guides users toward what they may want to watch.
During the experiment, the researchers followed recommendations stemming from sexually themed videos and noticed that the system surfaced content that was increasingly extreme or bizarre, with a growing emphasis on youth. In some cases, a video of women discussing sex led to videos of women breastfeeding or wearing only underwear. Many times, the women mentioned their ages, which ranged from 19 down to 16 years old.
Deeper into the experiment, YouTube started recommending videos where adults wore children’s clothing or solicited payment from ‘sugar daddies.’
With such softcore fetish content already being shown, YouTube then recommended videos of children who weren’t fully clothed, many of them in Latin America or Eastern Europe.
These were usually home videos uploaded by parents, who often simply want an easy way to share videos and pictures of their children with family and friends. However, YouTube’s algorithm can learn that viewers of sexualized content involving children also watch these family videos, and it may recommend them to those viewers without the families’ knowledge.
One mother, Christine C., was interviewed by the Times about her 10-year-old daughter, who had uploaded a harmless video of herself and a friend playing in the pool. The video was viewed over 400,000 times in just a few days. The daughter was excited about the view count, which alerted Christine that something was amiss.
This is just one of many incidents that unfolded after YouTube publicly confronted its issues with pedophilia earlier in 2019. Back in February, YouTube had to disable comments on videos of minors because pedophiles were reportedly using the comments to signal other predators.
Studies have shown that YouTube’s recommendation system can create a rabbit-hole effect, in which the algorithm recommends more and more extreme content over time. The company has denied or skirted this claim; in May, Neal Mohan, YouTube’s chief product officer, said that extreme content doesn’t drive more engagement or watch time than other content.
YouTube hasn’t said much about the recommendation system or the rabbit-hole effect it can create. Instead, journalists and reporters are referred to a blog post explaining that the company focuses on protecting minors and that such videos don’t violate any policies and are posted innocently.
The announcement also highlights recent steps YouTube has taken to disable comments on videos that feature or are uploaded by minors. Minors will also be restricted from live-streaming unless a parent appears on the video. In addition, the company plans to stop recommending videos that depict minors in risky situations.
Researchers believe it would be best to block children’s videos, or videos depicting children, from the recommendation system entirely. However, YouTube told the Times that it doesn’t plan to do so because the automated system is one of its largest traffic drivers, and such a change could harm creators.
ROBERT SICILIANO, CSP, is a #1 best-selling Amazon author, CEO of CreditParent.com, and the architect of the CSI Protection certification, a cyber, social, and identity protection security awareness training program.