YouTube hosts a lot of useful content, but because it has also become the world's video repository, harmful content is mixed in. Worse, YouTube does not monitor and manage these videos well, so many people, especially children, still watch them every day. Normally you would never stumble across videos like these on your own, but YouTube's recommendation feature actively suggests them to you, and that is where the danger lies.
The recommendation feature is built to make you spend time
This feature is the "Up next" list you see on the right side of the screen when watching on a computer; if you turn on autoplay, these videos will play automatically without any action on your part.
Guillaume Chaslot previously worked at Google on YouTube's recommendation algorithm and later founded the AlgoTransparency project, which pushes for transparency on online platforms. According to him, the motivation behind this suggestion engine has a serious flaw: it is not based on what you actually want to watch.
YouTube's "Up next" suggestion area
"If YouTube's AI is fine-tuned enough, it will help you see what you like, this is great. But the problem is that the AI is not built to show you what you want, it's done. So you're addicted to YouTube, this suggestion system is made so you spend your time (for YouTube) ".
Chaslot further explained that the metric used to measure whether the algorithm's suggestions are "good" is watch time. Watch time matters a great deal to an advertising business, but it does not really reflect user preferences, and it has many side effects.
Divisive content gets recommended a lot, and that is not good
In one of his talks, Chaslot pointed out that controversial content, such as conspiracy theories, fake news, flat-earth videos and the like, thrives near the edges of the content policy: the closer it gets to the line, the more easily it provokes arguments and comment wars. In YouTube's recommendation algorithm, this engagement is one of the factors used to decide whether a video should be suggested to a user.
This setup is fine for benign videos, such as game streams, cat videos, or in-depth phone reviews. But YouTube does not only host that kind of content; it has become the place where the world consumes video, and Chaslot worries that the suggestion engine will push users toward the extremes, so that you end up watching bad content whether you want to or not, for a simple reason: YouTube wants you to watch videos for as long as possible.
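To make the dynamic Chaslot describes concrete, here is a minimal toy sketch of ranking candidate videos by a score that rewards predicted watch time plus engagement. All names, fields, weights, and numbers are hypothetical illustrations; this is not YouTube's actual algorithm, only a sketch of why such a metric can favor divisive content.

```python
# Toy sketch: rank videos by predicted watch time plus an engagement bonus.
# Everything here (field names, weights, numbers) is hypothetical; it only
# illustrates how optimizing for time-plus-engagement can favor divisive content.

def rank_videos(candidates, w_watch=1.0, w_engage=0.5):
    """Sort candidates by a watch-time-plus-engagement score, descending."""
    def score(video):
        # Engagement is scaled up so that a high comment rate can outweigh
        # a modest difference in predicted watch time.
        return (w_watch * video["predicted_watch_minutes"]
                + w_engage * video["comments_per_view"] * 100)
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"title": "cat video",         "predicted_watch_minutes": 3.0,  "comments_per_view": 0.002},
    {"title": "phone review",      "predicted_watch_minutes": 8.0,  "comments_per_view": 0.004},
    {"title": "conspiracy theory", "predicted_watch_minutes": 12.0, "comments_per_view": 0.030},
]

ranking = rank_videos(candidates)
print([v["title"] for v in ranking])
# The divisive video wins not because anyone prefers it, but because the
# score only measures time spent and arguments started.
```

Note that nothing in the score asks whether the viewer is better off afterwards, which is exactly the flaw Chaslot describes: the metric and the user's interest are two different things.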
Google, of course, does not agree with Chaslot, but more on that later.
Mark Zuckerberg admitted last year that controversial content gets more engagement. Google did not respond to this point, saying only that quality content gets more engagement. In Chaslot's experience, the tech companies are not consistent on this issue. In practice, divisive content, such as Android vs iPhone or Mac vs Windows debates, really does draw more engagement, and that engagement does not necessarily come from quality content.
On YouTube, controversial content goes even further: it is provocative enough that everyone feels compelled to weigh in with their own opinion, yet it stays just inside YouTube's rules, so it is not taken down. Creators who make this kind of video earn more money from YouTube, and YouTube in turn earns more from the ads shown on it.
When the suggestions go wrong
According to Chaslot, the worrying point is that the metric used to evaluate the effectiveness of suggestions was wrong from the start: watch time says nothing about whether the content is any good. He believes this has had a negative impact on society, and it is why he built the AlgoTransparency tool after leaving Google.
The tool scans YouTube every day to see which videos are being suggested. Most of the top videos are good content from popular channels, but bad content sometimes surfaces too. For example, when an investigation into possible links between President Trump and Russia in the 2016 election was in the news, Chaslot noticed that YouTube was suggesting, across most channels, a video from RT, the Russian state media agency.
If Chaslot is right, YouTube's algorithm was driving more viewers to a video about the investigation into the relationship between Trump and Russia that was produced by … Russia. The video sided with the Kremlin and claimed that media outlets had pushed too many conspiracy theories. The important point is that nobody understands why this happened.
Google, for its part, completely rejects AlgoTransparency's interpretation and says the tool does not reflect how its suggestion system works. The system runs on signals such as surveys, likes, dislikes, and shares. Google says:
We could not reproduce the results that AlgoTransparency reported. We design our systems to ensure that content from authoritative sources surfaces first, both in search results and in next-video suggestions. This applies even when users are watching news content on YouTube.
Chaslot counters that if there were anomalies in his results, Google could point them out, and that Google could simply publish which videos it suggests to everyone. But Google has said nothing more on the issue, perhaps out of concern that details of the algorithm could be inferred and abused. Chaslot adds that it was Google itself that talked about removing "bias" from the algorithm, citing as an example a video that AlgoTransparency had previously recorded.
What can you do?
If this recommendation feature makes you uncomfortable, you can use a Chrome extension called Nudge, which removes addictive features such as Facebook's News Feed and YouTube's suggestion system. "I still click on stupid things when I see them, so it's better to just remove them altogether."
All of this only treats the symptoms, not the root cause. Chaslot believes that Google needs a long-term solution: the company must be transparent about the algorithm and give users the right to make their own decisions. "In the long run, we need to control the AI, instead of letting the AI control the user."