Facebook has been conducting an interesting experiment in “informed democracy” as it figures out how to deal with the huge number of posts containing climate change misinformation …
Background
Deliberate disinformation, along with misinformation rooted in ignorance or misunderstanding, is one of the biggest moderation challenges faced by social networks like Facebook and Twitter. Topics rife with false information range from COVID-19 vaccination to climate change, with the latter proving particularly difficult to tackle.
For out-and-out false information, both platforms apply a label noting that the claim is not true, and linking to a reputable source of accurate information.
Where things get trickier is where a post falls into a somewhat gray area. This, in the words of a consultancy hired by Facebook, is where a post “is not necessarily false, yet expresses views that may contain misleading, low-quality, or incomplete information that can likely lead to false conclusions.”
An obvious example of this would be the many posts along the lines of “Sure, climate change is happening, but it has always happened, and we just adapt to it.” That is technically true, but massively misleading: the pace and scale of change since the industrial revolution are vastly greater than anything that came before. Or “Wildfires have always happened” – also true, but it ignores the fact that their frequency and extent are steadily increasing.
A new approach to climate change misinformation
Casey Newton explains Facebook’s approach to the issue. In particular, the company wanted to understand what typical users would want in terms of moderation if they were properly informed about the issue.
For its experiment, Meta and the Behavioural Insights Team (BIT), the consultancy quoted above, worked to find about 250 people who were broadly representative of the Facebook user base. They brought them together virtually across two weekends to educate them about climate issues and platform policies, and offered them access to outside experts (on both climate and speech issues) and Facebook employees. At the end of the process, Facebook offered the group a variety of possible solutions to problematic climate information, and the group deliberated and voted on their preferred outcomes.
Facebook wouldn’t tell me what the groups decided — only that all three groups reached a similar consensus on what ought to be done. Their deliberations are now being taken under advisement by Facebook teams working on a policy update, the company told me.
It’s worth noting that while 250 people would be a tiny sample for a quantitative survey, it’s actually a very large one for this type of qualitative work, which aims at detailed understanding rather than simply putting multiple-choice questions to people.
Facebook parent company Meta says it plans to continue with this type of approach.
“We don’t believe that we should be making so many of these decisions on our own,” Brent Harris, vice president of governance at the company, told me in an interview. “You’ve heard us repeat that, and we mean it” […]
“We think that if you set this up the right way, that people are in a great position to deliberate on and make some of the hard decisions (around) trade-offs, and inform how we proceed,” Harris said. “It was actually really striking how many folks, when they came together, agreed on what they thought the right approach would be.”
Photo: Matt Palmer/Unsplash