Labeling Facebook accounts as state media results in reduced engagement


China, Russia, Iran, and other undemocratic countries work to keep their citizens from the truth and to spread misinformation and disinformation, or fake news, via social media.

To counter persuasion attempts by such governments, Facebook introduced a "state-controlled media" label in June 2020 to alert users when a post comes from a page affiliated with certain totalitarian governments.

In a new set of studies, researchers at Pennsylvania's Carnegie Mellon University (CMU), Indiana University (IU), and the University of Texas at Austin (UT) investigated the causal impact of these labels on users' intentions to read and engage with Facebook content. They found that the labels reduced engagement, but only when users noticed them or were trained to notice them, and only when the label named a country that was perceived negatively.

They published their findings in the journal Information Systems Research under the title “Countering State-Controlled Media Propaganda Through Labeling: Evidence from Facebook.” 

“Propaganda is a major concern on social media, but it has not received the same attention that mis- and disinformation have received, and it can be more insidious and even less obvious,” explained CMU digital economy Prof. Avinash Collis, who coauthored the research. “By understanding the impact of labeling propaganda, social media companies, news media companies, and users will be able to implement and respond to the labels more appropriately.”

How are dictatorships using social media platforms like Facebook?

Authoritarian countries amplify antidemocratic narratives. To determine whether the labels slow the spread of such content, the researchers examined how effectively the labels alter people's beliefs and behaviors on social media. They conducted two online experiments, and in a third study they analyzed field data from Facebook before and after the company began applying the labels.

In the first experiment, 1,200 individuals with US Facebook accounts were shown posts with and without state-controlled media labels. Users who saw headlines labeled as coming from Chinese and Russian state media were less likely to believe, like, read, share, and comment on the posts than users who saw unlabeled headlines, but only if they actively noticed the label.

In the second experiment, the researchers tested whether it was the label itself or the country listed in the label that influenced users. Nearly 2,000 Americans with Facebook accounts were shown posts with and without state-controlled media labels. Users’ behavior was tied to public sentiment toward the country listed on the label. For example, they responded positively toward content labeled as coming from Canadian state-controlled media and negatively toward content labeled as coming from Chinese and Russian state-controlled media.

In addition, training that notified users of the labels' presence and tested them on their meaning significantly boosted users' likelihood of noticing the labels and of believing content labeled as coming from Canadian state-controlled media.

In the third study, which analyzed field data, the researchers tested the efficacy of the labels by examining user engagement before and after June 4, 2020, when Facebook began using labels to identify Chinese and Russian state-controlled pages. The labeling policy had a significant effect: after it launched, labeled posts were shared 34% less and liked 46% less than before the labels were added, consistent with the two online experiments.

“Our three studies suggest that state-controlled media labels reduced the spread of misinformation and propaganda on Facebook, depending on which countries were labeled,” commented IU operations and decision technologies Assistant Prof. Patricia Moravec, who led the study.

“Although efforts are being made to reduce the spread of misinformation on social media platforms, attempts to reduce the influence of propaganda may be less successful,” concluded UT doctoral student Nicholas Wolczynski, who coauthored the study. “Given that Facebook debuted the new labels quietly without informing users, many likely did not notice the labels, reducing their efficacy dramatically.” The authors suggested that social media platforms clearly alert and inform users of labeling policy changes, explain what the labels mean, and display them in ways that users notice.