How to Spot ‘Coordinated Inauthentic Behavior’ on Facebook, According to Snopes

“Coordinated Inauthentic Behavior,” a phrase coined by Facebook, is the use of multiple social media accounts or pages to mislead or influence people for political or financial ends. As part of the deception, these pages hide the identities and obscure the motives of the people behind them.

For example, the network of pages exposed by Snopes revolves around pushing coronavirus disinformation, “blue lives matter” propaganda, and other pro-police and pro-military content. The Facebook pages and groups have amassed over 2 million American followers but are run mainly by people in Kosovo. They collected followers and likes by promising giveaways of free groceries and free mobile homes, but entering the sham contests ultimately required users to hand over credit card info to a site run out of Cyprus.

It’s a big problem. According to a 2019 report by Jeff Allen, a former Facebook data scientist, content from troll farms was reaching around 100 million US users per week.

You probably can’t do much to shield yourself from this kind of stuff, short of swearing off social media forever. But you can learn to recognize it for what it is. Here are some tips on how to spot this particular brand of fake news, according to Snopes.com.

Check Facebook’s ‘page transparency’ section

Every page on Facebook has a section called “Page Transparency” that lets you see what country the page’s managers post from and any name changes the page has undergone. On desktop, the Page Transparency section is on the right side of the page; on mobile, you’ll see it as you scroll down to the page’s posts.

Check for verification

Verified pages have blue badges next to the page or profile name. If you see one, the page or profile likely represents who or what it claims to represent. If the badge is missing, something could be amiss. (An unverified page representing Lorenzo Lamas, for instance, probably isn’t actually Lorenzo Lamas.) To get verified, page administrators must send Facebook documentation proving their authenticity.

‘Like and Share’ posts could indicate a problem

If a page is filled with pictures and memes exhorting readers to like and/or share them, that could be a sign of coordinated inauthentic behavior. It isn’t definitive proof of anything shady, but Snopes’ staff notes that inauthentic pages often rely on these kinds of posts.

Check the ‘page creation’ date

You can tell a lot about a page from when it was created. Snopes warns to watch out for highly politically charged pages with very recent creation dates: if a page is devoted to hot-button American political issues but was created two days ago and is run by people in another country, be suspicious. To find the creation date, click the “Page Transparency” link on a page or the “About” section in a group.

Check a Facebook group’s administrators and moderators

Facebook groups (but not pages) list their administrators, moderators, and members. Click “Members” on a group to see who’s running it and judge whether the admins seem on the level.

Does any of this actually work?

It’s hard to say how effective providing this information actually is. Facebook says it’s committed to fighting the problem and has “taken aggressive enforcement actions against these kinds of foreign and domestic inauthentic groups,” but others, like ex-Facebook employee Allen, maintain that the company could be doing more.

“Adding even just some easy features like Graph Authority and pulling the dial back from pure engagement-based features would likely pay off a ton in both the integrity space and... likely in engagement as well,” Allen wrote.