Meta announced changes to its content moderation on Tuesday, effectively ending its fact-checking program and moving towards an X-like “community notes” approach.
“We want to… return to that fundamental commitment to free expression,” Meta said in a statement on its website. “Today, we’re making some changes to stay true to that ideal.”
Several media watchdog groups are skeptical of Meta’s stated reasoning for the change, cautioning against the decision and warning of a surge in lies and misinformation on the platform.
“While Zuckerberg characterized the platform giant’s new approach as a defense of free speech, its real intentions are twofold,” Nora Benavidez, senior counsel at the advocacy group Free Press, said. “Ditch the technology company’s responsibility to protect the health and safety of its users, and align the company more closely with an incoming president who’s a known enemy of accountability.”
Attributing the change to the incoming presidential administration has become a common view, as Meta has shown support for President-elect Donald Trump and his administration in many areas. The company recently donated $1 million to Trump’s inaugural fund and has added, among others, UFC chief executive Dana White, a longtime Trump friend and supporter, to its board; Francis Brennan, former House GOP Deputy Communication Director, to its strategic response team; and Joel Kaplan, a Republican lobbyist and White House Deputy Chief of Staff under George W. Bush, as its President of Global Policy.
“Meta clearly perceives a great deal of political risk of being targeted,” Brendan Nyhan, a political scientist at Dartmouth College, said. “And the way Zuckerberg presented the announcements, and the timing, was obviously intended to play to a Republican audience.”
Meta also plans to recommend more political content and to focus its content moderation on high-severity violations, while significantly relaxing its policies on most other types of discourse.
“We’re getting rid of a number of restrictions on topics like immigration, gender identity and gender that are the subject of frequent political discourse and debate,” Meta said. “It’s not right that things can be said on TV or the floor of Congress, but not on our platforms.”
In recent years, Facebook has been increasingly blamed for fueling hate and violence against minorities around the world. A 2022 Amnesty International report blamed Facebook’s algorithm for contributing to the atrocities the Myanmar military perpetrated against the Rohingya people in 2017, and a 2021 Business Insider report described Facebook as complicit in the murder of Ethiopian professor Maereg Amare Abraha. More examples keep surfacing, and many observers believe Facebook’s new moderation approach will significantly worsen this existing problem.
“Journalists have a set of standards and ethics,” Maria Ressa, a Nobel Peace Prize winner, said. “What Facebook is going to do is get rid of that and then allow lies, anger, fear and hate to infect every single person on the platform.”