In a surprising move, YouTube announced on Friday that it will no longer remove videos containing false claims of fraud in the 2020 presidential election. The reversal comes ahead of the 2024 elections and marks a significant change from the platform’s previous stance.
YouTube stated that it has already deleted tens of thousands of videos that questioned the integrity of past elections. However, the company now believes that it is necessary to re-evaluate its approach. The new policy went into effect on June 2.
Social media platforms, including YouTube, have faced intense pressure since the 2016 elections to combat political misinformation. YouTube says the decision to stop removing videos with false election claims reflects how the digital environment has changed since those policies were introduced.
According to YouTube, while the removal of such content does curb some misinformation, it may also inadvertently restrict political speech without effectively reducing the risk of real-world harm or violence. The company aims to strike a balance between addressing misinformation and preserving freedom of expression.
YouTube has mentioned that it will continue refining its policies leading up to the 2024 election, but specific details about the reasons behind the policy change have not been provided. The BBC has reached out to YouTube for further comment.
The platform clarified that it will still enforce other election misinformation policies, such as removing videos that provide misleading instructions on voting locations or methods.
The election fraud policy was initially implemented in December 2020 and resulted in the deletion of a video posted by Donald Trump on January 6, 2021. In that video, Trump repeated his false claims of widespread fraud but also called for peace. The inclusion of false claims led to the removal of the video.
In 2022, a video posted by a US congressional committee investigating the Capitol riot was also deleted by YouTube because it contained a clip of Trump repeating election falsehoods.
However, in March of this year, YouTube lifted the restrictions on Donald Trump’s channel, which has over 2.7 million subscribers. Since then, Trump has posted approximately 20 short clips supporting his campaign.
YouTube’s decision to stop deleting videos with false claims about the 2020 election has sparked debate about the platform’s role in addressing misinformation. While the company says it aims to balance curbing false information with preserving political speech, the impact of the policy change remains to be seen. As the 2024 elections approach, YouTube is likely to face further scrutiny over its handling of potential misinformation.