YouTube has serious problems with the content on its platform. While the company apologizes and takes action against the wave of disturbing videos aimed at children, the controversy around Logan Paul is still fresh: the famous YouTuber published a video showing a suicide victim while joking about it, a video that was promoted in YouTube's trending section and watched six million times.
It is clear that something needs to change within YouTube. According to a former company engineer who worked on the recommendation system, YouTube's algorithm does not optimize for what is truthful, balanced or healthy for democracy, and it is full of disturbing biases.
Guillaume Chaslot, a 36-year-old French programmer with a Ph.D. in artificial intelligence, worked at Google for three years and spoke with The Guardian about the part of that time he spent with the team in charge of the system YouTube uses to recommend videos to users.
Chaslot believes that the priorities YouTube gives its algorithm carry dangerous biases:
"YouTube is something that looks like reality, but it is distorted to make you spend more time online. The recommendation algorithm is not optimizing for what is true, or balanced, or healthy for democracy."
Suggesting videos that keep people hooked to increase profits
He explains that the engineers he worked with were responsible for constantly experimenting with new formulas that would increase advertising revenue by extending the time people spent watching videos.
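To make the criticism concrete: a recommender tuned only for watch time will surface whatever holds attention, regardless of quality. Below is a minimal, hypothetical sketch of such an objective; the class and field names are invented for illustration and have nothing to do with YouTube's actual code.

```python
# Hypothetical sketch of a watch-time-only ranking objective.
# Names and numbers are invented; this is not YouTube's system.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    predicted_watch_seconds: float  # assumed output of an engagement model
    is_inflammatory: bool = False   # a signal this objective ignores

def rank_by_watch_time(candidates: list[Video], k: int = 10) -> list[Video]:
    """Return the k candidates expected to be watched the longest."""
    return sorted(candidates,
                  key=lambda v: v.predicted_watch_seconds,
                  reverse=True)[:k]

# Divisive content that holds attention wins, because nothing in the
# score penalizes it:
candidates = [
    Video("calm_explainer", 120.0),
    Video("outrage_clip", 300.0, is_inflammatory=True),
]
print([v.video_id for v in rank_by_watch_time(candidates)])
# -> ['outrage_clip', 'calm_explainer']
```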
Chaslot is especially concerned about the distortion that results from focusing solely on showing people videos they find irresistible, locking users into a bubble of content that only reinforces their view of the world, something Google has already been accused of doing with its search engine.
The developer said he had proposed changes to fix these problems on YouTube, but they were not taken into account. According to Chaslot, there are many ways YouTube could change its algorithms to suppress false news and improve the quality and diversity of the videos people see, but the company does not.
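The article does not detail the changes Chaslot proposed, but one standard way to trade raw engagement against diversity is an MMR-style re-rank that penalizes candidates too similar to what has already been picked. The sketch below is purely illustrative of that general idea, not of anything Chaslot proposed:

```python
# Illustrative MMR-style re-rank: balance an engagement score against
# similarity to already-selected items. Not Chaslot's proposal; just one
# generic way to inject diversity into a ranking.

def diverse_rerank(candidates, score, similarity, k=10, trade_off=0.7):
    """Greedily select k items, trading raw score (weight `trade_off`)
    against similarity to items already selected."""
    picked, pool = [], list(candidates)
    while pool and len(picked) < k:
        def mmr(item):
            max_sim = max((similarity(item, p) for p in picked), default=0.0)
            return trade_off * score(item) - (1 - trade_off) * max_sim
        best = max(pool, key=mmr)
        pool.remove(best)
        picked.append(best)
    return picked

# Toy demo: (video_id, topic, engagement). A pure engagement sort would
# return a, b, c; the re-rank swaps in other topics.
items = [("a", "politics", 0.90), ("b", "politics", 0.85),
         ("c", "science", 0.60), ("d", "music", 0.50)]
top3 = diverse_rerank(items,
                      score=lambda v: v[2],
                      similarity=lambda x, y: 1.0 if x[1] == y[1] else 0.0,
                      k=3)
print([v[0] for v in top3])  # -> ['a', 'c', 'd']
```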
YouTube now says that the recommendation system has evolved since Chaslot left the company and no longer optimizes for watch time alone, with several changes in 2016 and 2017 that focus more on user "satisfaction" and discourage the promotion of videos with inflammatory content.
These changes arrive more than a decade after Google bought YouTube in 2006, and they coincide with heavy losses from advertisers who saw their brands associated with extremist content.
YouTube and its influence on people’s decisions
Over the past 18 months, Chaslot has been using a program he designed to explore the bias in the content promoted by YouTube during the French, British and German elections, and around topics such as global warming and mass shootings.
The results are published on Algotransparency.com, and although each study finds something different, the research suggests that YouTube systematically amplifies videos that are divisive, sensational or conspiratorial.
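Chaslot's program is described only at a high level here; a natural way to explore what the recommendation system amplifies is to repeatedly follow "up next" suggestions from a seed video and count what keeps resurfacing. The sketch below illustrates that idea with a toy recommendation graph; everything in it is invented, and none of it is Chaslot's actual tool.

```python
# Toy sketch of a recommendation-chain crawl. The FAKE_UP_NEXT graph
# stands in for scraped "up next" data; all IDs are invented.

from collections import Counter
import random

FAKE_UP_NEXT = {
    "seed":         ["news_a", "conspiracy_x", "news_b"],
    "news_a":       ["news_b", "conspiracy_x"],
    "news_b":       ["news_a", "conspiracy_x"],
    "conspiracy_x": ["conspiracy_y", "conspiracy_z"],
    "conspiracy_y": ["conspiracy_x", "conspiracy_z"],
    "conspiracy_z": ["conspiracy_x", "conspiracy_y"],
}

def fetch_up_next(video_id: str) -> list[str]:
    """Stand-in for scraping the recommendations shown next to a video."""
    return FAKE_UP_NEXT.get(video_id, [])

def crawl(seed: str, depth: int = 5, runs: int = 200) -> Counter:
    """Repeatedly follow a random 'up next' pick from each video and
    count how often each video is suggested along the way."""
    counts: Counter = Counter()
    for _ in range(runs):
        current = seed
        for _ in range(depth):
            recs = fetch_up_next(current)
            if not recs:
                break
            counts.update(recs)            # every suggestion counts
            current = random.choice(recs)  # simulate one more click
    return counts

print(crawl("seed").most_common(3))
# In this toy graph the conspiracy cluster dominates the counts, because
# it has inbound links from everywhere and no way back out, mirroring
# the amplification pattern the research describes.
```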
He believes the YouTube algorithm also played an important role in the 2016 US election, pushing people in a pro-Trump direction. Algorithms that shape the content we see have a great impact on the decisions people make, especially on those who are undecided.
YouTube disputes Chaslot's findings and insists that its recommendation system reflects what people search for and is not biased in favor of anyone.
But we do not know how the algorithm really works, because it is proprietary technology. So how does YouTube determine what people are interested in, and how can it claim not to influence the videos people choose to watch when it is the one offering the recommendations?
YouTube is the second largest social network on the planet, with more than 1.5 billion active users, behind only Facebook, another platform that has been widely criticized as harmful to users and society, precisely for promoting chaos and divisive content with algorithms that seek to increase profits by keeping users glued to their screens in order to sell advertising.