Doyle McManus: Congress wades into swamp of social media

Doyle McManus, Syndicated Columnist

Congress is grilling Facebook, Twitter and Google for allowing Russian agents to hijack their social networks. The grilling is long overdue.

When Mark Zuckerberg took his company public in 2012, he said his mission was "to bring people closer together" and "create a more open culture," not merely to make money. Facebook made plenty of money and reunited lots of old high school friends. But it also created a vast electronic swamp that mixes real information, misinformation, foreign propaganda and scammery, without helping users to distinguish one from the other.

Along the way, it became the country's most powerful news organization, but a news organization that seems to operate without an editorial brain -- unless you count the algorithms that fail to notice all the fake stuff.

Case in point: the 2016 election campaign.

As Facebook general counsel Colin Stretch told a Senate subcommittee recently, content produced by Russian operatives may have reached as many as 126 million Americans over the last two years. Their aim, he said, was "to sow division and discord and to try to undermine the election."

Twitter said more than 36,000 Russian bots sent 1.4 million tweets during the campaign. The company allowed one Russian account to label itself "the unofficial voice of the Tennessee Republican Party" for 11 months, even though the actual state party had told the company the account was fake.

Google said it had discovered 1,108 campaign-related videos from Russian sources on YouTube, the video service it owns.

No one knows if the election outcome would have been different without Russian interference. But it's impossible to argue that all that disinformation had no effect at all.

Besides, this isn't only about who won the election. Russia wanted -- perhaps above all -- to stir up divisions among Americans.

One Russian-authored Facebook group, "Blacktivist," urged African-Americans to retaliate against police violence, "an eye for an eye," according to researcher Jonathan Albright of Columbia University's Tow Center for Digital Journalism.

Another, aimed at white conservatives, said Black Lives Matter activists should "be immediately shot."

Some of that evil stuff has continued since election day. In March, a Russian account on Facebook attacked immigrants who are in the United States illegally, saying: "The only way to deal with them is to kill them all." And Twitter accounts linked to Russia have been attacking the credibility of special counsel Robert S. Mueller III, who's investigating Russia's role in the campaign.

It's not hard to see how we've come to this pass. Facebook and the other social media companies sell access to their networks to all comers; that's how they make their money. Much of their selling is automated and anonymous; they don't always know who's buying. And they've long insisted that it isn't their job to police the content that travels across their systems, at least when it comes to political speech.

The digital executives who testified recently all acknowledged, profusely, that they needed to do a better job. They've already shut many of the offending accounts down, they said. Facebook is hiring 3,000 human editors to help its algorithms figure out when noxious content is flowing across its network.

What else do they need to do?

More transparency is the easiest answer -- more information about advertisers, more clarity about the true sources of content, more access for outside researchers to analyze what's going on. Better enforcement of existing standards should be easy, too, even though it will require expensive human labor.

But the next step is more complicated: Should social media companies do something to distinguish between good information and bad -- between truths and falsehoods?

"Facebook understands that it has a responsibility to democracy," said Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, a digital rights group. "It would be very shortsighted of them to put short-term profit ahead of that."

Whether the social networks like it or not, they are already the de facto arbiters of what's true and false in American public discourse.

"What's posted on Facebook determines what society thinks is true or false," Eckersley said. "When we see something on Facebook that tends to confirm out viewpoint, we tend to share it."

So yes, please, let's take action.

DOYLE McMANUS writes for the Los Angeles Times. His column is distributed by MCT Information Services.