Twitter changes hacked content rules after Biden story furor


Twitter said late Thursday it was changing its policy on hacked content after an outcry over its handling of an unverified political story that prompted cries of censorship from the right.

By The Associated Press

October 16, 2020, 11:20 AM




The social media company will no longer remove hacked material unless it’s directly shared by hackers or those working with them, the company’s head of legal, policy, trust and safety, Vijaya Gadde, said in a Twitter thread.

And instead of blocking links from being shared, tweets will be labeled to provide context, Gadde said.

“We want to address the concerns that there could be many unintended consequences to journalists, whistleblowers and others in ways that are contrary to Twitter’s purpose of serving the public conversation,” she said.

Twitter and Facebook had moved quickly this week to limit the spread of a story published by the conservative-leaning New York Post, which cited unverified emails attributed to Democratic presidential nominee Joe Biden's son that were reportedly obtained by allies of President Donald Trump. The story has not been confirmed by other publications.

Twitter initially responded by blocking users from sharing links to the article in tweets and direct messages, saying it violated the company's policy prohibiting hacked content. But it did not tell users why they couldn't share the link until hours later.

Twitter CEO Jack Dorsey tweeted that it was “unacceptable” the company hadn't provided more context around its action. A little over 24 hours later, Gadde announced the company was making changes after receiving “significant feedback (from critical to supportive)" about how it enforced the policy.

Facebook said it was “reducing” the story’s distribution on its platform while waiting for third-party fact-checkers to verify it, something it regularly does with material that's not banned outright from its service, though it risks spreading lies or causing harm in other ways.
