DIGITAL LIFE
YouTube quietly loosens content moderation and allows rule-violating videos to stay up, 'NYT' reveals
For years, YouTube removed videos containing derogatory insults, misinformation about Covid-19 vaccines and electoral lies, on the grounds that such content violated the platform's rules. But since President Donald Trump's return to the White House, the platform has begun encouraging its content moderators to leave up videos that may violate its rules, as long as they are considered to be of public interest, covering political, social and cultural topics.
The change, not yet publicly disclosed, brings YouTube closer to other platforms that have relaxed moderation after pressure from the Republican Party. In January, Meta made a similar move when it ended its fact-checking program on social media posts. The company, which owns Facebook and Instagram, followed the example of X (formerly Twitter), Elon Musk's platform, and transferred the responsibility for moderating content to users.
Unlike Meta and X, YouTube has not made public statements about the loosening of moderation. The new policy was outlined in mid-December in training materials reviewed by The New York Times.
Which videos count as being in the ‘public interest’...YouTube now allows videos of “public interest” to remain online even if up to half of their content fails to comply with its rules, up from just a quarter previously. Moderators have also been instructed to keep town hall meetings, rallies and political debates online. The new guidelines mark a departure from practices during the pandemic, when such videos were removed for medical misinformation. The change could benefit political commentators, especially as YouTube takes on a prominent role in distributing podcasts.
The policy also helps the platform avoid criticism from politicians and activists unhappy with the removal of content related to the origins of Covid-19, the 2020 election and Hunter Biden, the son of former President Joe Biden.
What YouTube says...Nicole Bell, a YouTube spokeswoman, said the company continually updates its guidelines for moderators based on topics of public discussion. She said policies that no longer make sense are scrapped — as was the case in 2023 with some of the COVID guidelines — and others are tightened when necessary, such as the recent ban on content that directs users to gambling sites.
In the first three months of 2025, YouTube removed 192,586 videos for abusive and hateful content — a 22% increase over the same period last year.
“Recognizing that the definition of ‘public interest’ is always evolving, we have updated our guidelines to reflect the new types of discussions we see on the platform today,” Bell said in a statement. She added: “Our goal remains the same: to protect free expression on YouTube while mitigating serious harm.”
Platforms are in a ‘race to the bottom,’ analysts say...Critics say the changes to social media platforms have contributed to the rapid spread of misinformation and the rise of digital hate speech. Last year, a post on X falsely claimed that “social welfare offices in 49 states are distributing voter registration forms to illegal immigrants,” according to the Center for Countering Digital Hate, which studies misinformation and hate speech. The post, which was reportedly removed before the changes, was viewed 74.8 million times.
For years, Meta has removed about 277 million pieces of content annually. The new policies mean that much of that material could remain online — including comments like “black people are more violent than white people,” said Imran Ahmed, the center’s director. “What we’re seeing is a race to the bottom,” Ahmed said. He said the changes benefit companies by reducing content moderation costs and keeping more content online for users to engage with. “This isn’t about free speech. It’s about advertising, amplification, and ultimately profit.”
Looser guidelines for sensitive topics...YouTube has traditionally moderated heavily to keep the environment safe for advertisers, banning nudity, graphic violence, and hate speech — though it has always reserved the right to interpret its own rules. The policies allow videos that violate the guidelines to remain online if they have sufficient educational, documentary, scientific, or artistic merit. Now, the new guidelines widen those loopholes. They expand on changes already adopted before the 2024 election, according to training documents, when the company began allowing clips of candidates even if they violated its rules.
Other content that addresses political, social and cultural issues is also now exempt from the standard guidelines. YouTube considers videos to be in the public interest if they discuss or debate elections, ideologies, movements, race, gender, sexuality, abortion, immigration, censorship and other topics.
mundophone