Is freedom of speech the same as freedom to lie?

Meta will stop checking falsehoods. Does that mean more free speech or a free-for-all?

“First, we’re going to get rid of fact-checkers,” Mark Zuckerberg, the founder of Meta, said in a video statement early this January. “Second, we’re going to simplify our content policies and get rid of a bunch of restrictions on topics like immigration and gender that are just out of touch with mainstream discourse.”

This statement marks another turn in the company’s handling of disinformation and hate speech on its widely used platforms Facebook, Instagram and Threads.

Meta built up its moderation capabilities and started its fact-checking program after Russia’s attempts to use Facebook to influence American voters in 2016, and after human rights groups such as Amnesty International partially blamed the company for allowing the spread of hate speech that fueled genocide in Myanmar.

Until now, according to Meta, about 15,000 people have reviewed content on its platforms in 70 languages to check whether it complies with the company’s community standards.

Adding information, not deleting

For other content, the company works with professional fact-checking organizations that employ journalists around the world. They independently identify and research viral posts that might contain false information.

Fact-checkers, like other journalists, publish their findings in articles. They compare what is claimed in a post with statistics, research findings and expert commentary, or they analyze whether the media in the post have been manipulated or generated by AI.

But fact-checkers have a privilege that other journalists don’t: they can add information to posts on Meta platforms that they find false or out of context. It appears in the form of a warning label. The user can then read the fact-checkers’ full article to see the reasoning, or dismiss the warning and interact with the post.

Fact-checkers can’t take any further action like removing or demoting content or accounts, according to Meta. That is up to the company. 

However, Meta now likens the fact-checking program to censorship. Zuckerberg also argued for ending the program, saying that fact-checkers “have just been too politically biased and have destroyed more trust than they’ve created.”

Can untrained people regulate the Web?

For now, the fact-checking program will be discontinued in the United States. Meta plans to rely instead on regular users to evaluate content under a new program it calls “Community Notes.” The company promises to improve it over the course of the year before expanding it to other countries.

In a way, Meta’s retreat from its commitments to fight disinformation wasn’t a surprise, said Carlos Hernández-Echevarría, associate director of the Spanish fact-checking outlet Maldita and a deputy member of the European Fact-Checking Standards Network, the governance body that assesses and approves European fact-checking organizations before they can work with Meta.

Zuckerberg had previously said that the company was unfairly blamed for societal ills and that he was done apologizing. But fact-checking partners were not warned of the plans to scrap the program ahead of the announcement, Hernández-Echevarría said.

It bothers him that Meta connects fact-checking to censorship.

“It’s actually very frustrating to see the Meta CEO talking about censorship when fact-checkers never had the ability and never wanted the ability to remove any content,” Hernández-Echevarría said. He argues that instead, fact-checkers contribute to speech by adding more information. 

Are fact-checkers biased?

Hernández-Echevarría also pushes back against the accusation that fact-checkers are biased. He said that mistakes do occur, but the organizations and people doing the work are carefully vetted, and the criteria can be seen in the network’s Code of Standards.

For example, fact-checkers must publish their methodology for choosing and evaluating information. Fact-checkers also can’t endorse political parties or enter into agreements with them. They also have to provide proof of who owns them and publicly disclose information about their employees and funding.

Meta’s own data about Facebook, which it discloses to EU institutions, also shows that erroneous decisions to demote posts based on fact-checking labels occur much less often than erroneous demotions for other reasons, such as nudity, bullying, hate speech and violence.

From April to September last year, Meta received 172,550 complaints about the demotion of posts with fact-checking labels and, after taking another look, reversed the decision for 5,440 posts, a little over 3%.

In all other categories combined, by contrast, demotions had to be reversed for 87% of the posts that were appealed.

The sharing of unverified information

Research suggests that the perception that different political groups are treated unequally may arise because people on the political right publish more unreliable information.

A paper published in the scientific journal Nature says that conservative users do face penalties more often, but they also share more low-quality news. The researchers therefore argued that even if the policies contain no bias, there can be an asymmetry in how they are enforced on platforms.

Meta is also making other changes. On 7 January, the company published a revised version of its hateful conduct policies. The platform now allows comparing women to household objects and “insulting language in the context of discussing political or religious topics, such as when discussing transgender rights, immigration, or homosexuality”. The revised policies also now permit “allegations of mental illness or abnormality when based on gender or sexual orientation”.

LGBTQ+ advocacy group GLAAD called these changes alarming and extreme and said they will result in platforms becoming “unsafe landscapes filled with dangerous hate speech, violence, harassment, and misinformation”. 

Journalists also report that the changes divided the company’s employees. The New York Times reported that when some upset employees posted on an internal message board, human resources workers quickly removed the posts, saying they broke the rules of a company policy on community engagement.

Political pressure

In a statement published on her social media channels, Angie Drobnic Holan, the director of the International Fact-Checking Network, which represents fact-checkers in the United States, linked Meta’s decision to political pressure.

“It’s unfortunate that this decision comes in the wake of extreme political pressure from a new administration and its supporters,” Holan said. “Fact-checkers have not been biased in their work. That attack line comes from those who feel they should be able to exaggerate and lie without rebuttal or contradiction.”

In his book “Save America,” published in August 2024, Donald Trump, whose term as U.S. president begins today, accused Zuckerberg of plotting against him. “We are watching him closely, and if he does anything illegal this time he will spend the rest of his life in prison,” he wrote.

Now, with the changes Zuckerberg announced, Trump is praising Meta, saying the company has come a long way. When asked during a press conference on 7 January whether he thought Zuckerberg was responding to his threats, Trump replied, “Probably.”

After Meta’s announcement, the journal Nature published a review of research, with comments from experts, on the effectiveness of fact-checking. For example, a 2019 study analyzing 30 research papers covering 20,000 participants found that fact-checks do influence beliefs, but the effects were weakened by participants’ preexisting beliefs, ideology and knowledge.

Sander van der Linden, a social psychologist at the University of Cambridge, told Nature that ideally people wouldn’t form misperceptions in the first place, but “if we have to work with the fact that people are already exposed, then reducing it is almost as good as it’s going to get”.

Hernández-Echevarría said that although the loss of Meta’s funding will be a hard hit to some organizations in the fact-checking community, it won’t end the movement. He said, “They are going to be here, fighting disinformation. No matter what, they will find a way to do it. They will find support. They will do it because their central mission is to fight disinformation.”


Questions to consider:

• What is now allowed under Meta’s new rules for posts that wasn’t previously?

• How is fact-checking not the same as censorship?

• When you read social media posts, do you care if the poster is telling the truth?


 
