Meta, Facebook’s newly named parent company, has launched a new initiative designed to combat the spread of online misinformation: a fact-checking mentorship program, developed in partnership with the Poynter Institute’s International Fact-Checking Network (IFCN), which aims to help fact-checking organizations scale their efforts and increase their impact.

Meta explains:

“Reducing the spread of misinformation is a challenge that no single organization can tackle alone. Strong partnerships with subject matter experts and sharing information on best practices play a big role in effectively addressing misinformation. That’s why, as part of this global mentorship program, the nonpartisan IFCN will select up to 6 experts from the fact-checking industry to serve as mentors for up to 30 organizations in Meta’s third-party fact-checking program.”

Meta is allocating $450,000 to fund the initiative, which will help improve fact-checking processes through shared education. The program also has a specific focus on bringing more organizations in more regions into the fight against harmful trends.

Facebook, specifically, has been the subject of constant, and often harsh, criticism over the role it plays in the spread of misinformation, with recent leaks underlining that its News Feed algorithm amplifies the negative impacts of fake news and similarly fallacious material.

In its defense, Meta’s Vice President for Global Affairs and Communications, Nick Clegg, has argued that the platform is not the divisive force its critics describe. As Clegg explained:

“The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.” 

Even so, it’s still hard to deny the claims that Facebook fuels polarization, especially when the ten most-shared links each day look like this. Facebook’s algorithms are driven by engagement, and the content that generates the most engagement tends to be the content that elicits the strongest emotional response. Ironically, the emotions associated with the most engaging content usually express themselves as either an ‘angry’ or a ‘happy’ reaction.
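To make that feedback loop concrete, here is a minimal, purely illustrative sketch of engagement-weighted ranking. To be clear, this is not Meta’s actual code: the `REACTION_WEIGHTS` table, `Post` structure, and `rank_feed` function are all hypothetical, assumed only to show how weighting emotional reactions more heavily than neutral ones pushes provocative content to the top of a feed.

```python
from dataclasses import dataclass, field

# Hypothetical weights: emotionally charged reactions are assumed to count
# for more than a plain 'like', so posts that provoke anger or delight
# accumulate engagement faster. These numbers are invented for illustration.
REACTION_WEIGHTS = {"like": 1.0, "happy": 2.0, "angry": 2.0, "share": 3.0}

@dataclass
class Post:
    title: str
    reactions: dict = field(default_factory=dict)  # reaction name -> count

    def engagement_score(self) -> float:
        # Sum each reaction count multiplied by its assumed weight.
        return sum(REACTION_WEIGHTS.get(name, 0.0) * count
                   for name, count in self.reactions.items())

def rank_feed(posts: list) -> list:
    # Order the feed by engagement score, highest first.
    return sorted(posts, key=Post.engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Calm policy explainer", {"like": 500}),
        Post("Outrage-bait headline", {"angry": 300, "share": 100}),
    ])
    for post in feed:
        print(f"{post.engagement_score():7.1f}  {post.title}")
```

Under these assumed weights, the outrage-bait post scores 900 against the calm explainer’s 500 despite drawing fewer total interactions, which is precisely the dynamic critics say rewards divisive content.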

The Wrap

Fighting the spread of misinformation is an essential task for any social platform, and an equally difficult one to sustain, since the very systems that power these platforms expand misinformation’s reach over time. Facebook, given how long it has been allowed to grow, has become ‘too big’ to effectively monitor across 100% of its current scope. TL;DR: it’s easy to come across and share false information, but much harder to verify what’s legitimate and credible.

This new initiative should thus play an essential role in helping Facebook, and Meta as a whole, establish a more secure and proven foundation, especially as the program expands into more regions. Even with these and other efforts, though, the company will have to up its game to effectively combat the impacts of localized misinformation.



Sources

https://bit.ly/3mH5i4W