
YouTube will put disclaimers on state-funded broadcasts to fight propaganda

Valentina Palladino

YouTube's latest strategy to combat the spread of misinformation is to warn viewers about certain sources of news. The video website announced that it will begin labeling videos posted by state-funded broadcasters to inform viewers that the content is, at least in part, funded by a government source. YouTube will start labeling videos today, and the policy extends to outlets including the US Public Broadcasting Service (PBS) and the Russian state-funded broadcaster RT.

According to a report published by the Wall Street Journal, PBS videos will now carry the label "publicly funded American broadcaster," while RT videos will carry this disclaimer: "RT is funded in whole or in part by the Russian government."

The new policy is YouTube's way of informing viewers about the origin of the content they watch, information that is often hidden from or unknown to viewers. "The principle here is to provide more information to our users, and let our users make the judgment themselves, as opposed to us providing an editorial judgment on any of these things ourselves," YouTube's Neal Mohan told the WSJ.

Although giving users more information about the sources of news on YouTube is useful, Mohan's sentiment is at odds with another strategy reportedly under development: YouTube is considering surfacing relevant videos from credible sources on a given topic. For now, YouTube is withholding its editorial judgment, but it will have to exercise it once it starts deciding which news sources on its site are credible. It's unknown whether that strategy will become a reality anytime soon, though, as it is reportedly still in the early stages of development.

YouTube's decision to label all state-funded news videos follows strong criticism from the US government and other countries over big technology companies' involvement in the spread of misinformation. Facebook, Google, and others have had to answer questions about how Russian actors were able to easily spread misinformation about the 2016 election to millions of Americans.

The new policy also comes after YouTube has dealt with a number of controversies surrounding inappropriate content on its website. Over the past year, YouTube has weathered an "ad-pocalypse" after advertisers discovered that their ads were running alongside extremist videos; it had to address the outcry over disturbing and inappropriate children's content on the site (some of which misused popular children's characters or involved the potential abuse of children themselves); and it had to establish new rules to oversee its biggest creators after Logan Paul posted a video showing the body of a suicide victim.

Conspiracy theories abound

In short, it was only a matter of time before news outlets on YouTube faced new rules designed specifically for them. The new labeling policy will be useful for some YouTube viewers because it will help them better understand their preferred news sources. It will also show Congress that YouTube is trying, at the very least, to inform its audience about possible misinformation and propaganda from government-backed sources.

But general conspiracy theory videos are just as much of a problem on YouTube as government propaganda videos. The company has tweaked its algorithm since conspiracy theory videos about last year's Las Vegas shooting populated search results immediately after the incident. However, most of the reported algorithm changes focus on promoting more reliable sources rather than demoting or hiding deceptive ones.

YouTube is still in the process of adjusting its algorithm to surface more mainstream news in news-related searches. But it's unlikely that algorithm tweaks alone can keep conspiracy theory videos from racking up millions of views when these misleading videos continue to appear in viewers' "recommended" sections.

Until now, YouTube's content delivery algorithm has never been focused on truthfulness: it has always been focused on delivering the videos people are most likely to click on. It's unclear (and likely will be for a while) whether the new changes will succeed in steering users away from sensationalized and inaccurate conspiracy videos.
