Bret Stephens on Facebook’s Unintended Consequences and the People Who Lack the Wisdom and Humility to Use Their Power Responsibly!

Dear Commons Community,

Bret Stephens comments on social media’s unintended consequences and specifically on people such as Mark Zuckerberg, who made headlines this week by banning individuals who use Facebook to promote hatred.  This is a complicated issue because no matter how “despicable” the haters are, fundamental values such as freedom of speech are involved.  He also observes that the people who own and operate major social media companies “lack the wisdom and humility to use their power responsibly.”  Below is the entire column.  It is well worth reading to understand the influence of those who oversee social media and the dilemma of trying to control it.

Tony

——————————————————————————————————-

Facebook’s Unintended Consequence

People who lack the wisdom and humility to use their power responsibly.

By Bret Stephens

Opinion Columnist

May 3, 2019

Over the past several years we’ve learned a lot about the unintended consequences of social media. Platforms intended to bring us closer together make us angrier and more isolated. Platforms aimed at democratizing speech empower demagogues. Platforms celebrating community violate our privacy in ways we scarcely realize and serve as conduits for deceptions hiding in plain sight.

Now Facebook has announced that it has permanently banned Louis Farrakhan, Alex Jones, Milo Yiannopoulos and a few other despicable people from its social platforms. What could possibly go wrong?

The issue isn’t whether the people in question deserve censure. They do. Or that the forms of speech in which they traffic have redeeming qualities. They don’t.

Nor is the issue that Facebook has a moral duty to protect the free-speech rights of Farrakhan, Jones and their cohorts. It doesn’t. With respect to freedom of speech, the First Amendment says nothing more than that Congress shall make no law abridging it. A public company such as Facebook — like a private university or a family-owned newspaper — has broad latitude to feature or censor, platform or de-platform, whatever and whoever it wants.

Facebook’s house, Facebook’s rules.

The issue is much simpler: Do you trust Mark Zuckerberg and the other young lords of Silicon Valley to be good stewards of the world’s digital speech?

I don’t, but not because conservatives believe (sometimes with good reason) that the Valley is culturally, politically and possibly algorithmically biased against them. As with liberalism in academia, the left-wing tilt in tech may be smug and self-serving, but it doesn’t stop conservatives from getting their messages across. It certainly doesn’t keep Republicans from winning elections.

The deeper problem is the overwhelming concentration of technical, financial and moral power in the hands of people who lack the training, experience, wisdom, trustworthiness, humility and incentives to exercise that power responsibly.

That much should have been clear by the way in which Facebook’s leaders attempted to handle their serial scandals over the past two years. Ordering opposition research on their more prominent critics. Consistently downplaying the extent of Russian meddling on their platform. Berating company employees who tried to do something about that meddling. Selling the personal information of millions of its users to an unscrupulous broker so that the data could be used for political purposes.

Now Facebook wants to refurbish its damaged reputation by promising its users much more privacy via encrypted services as well as more aggressively policing hate speech on the site. Come again? This is what Alex Stamos, Facebook’s former chief security officer, called “the judo move: In a world where everything is encrypted and doesn’t last long, entire classes of scandal are invisible to the media.”

In other words, it’s a cynical exercise in abdication dressed as an act of responsibility. Knock a few high-profile bigots down. Throw a thick carpet over much of the rest. Then figure out how to extract a profit from your new model.

If that’s Facebook’s deeper calculation — it’s hard to think of another — then it may wind up solving the company’s short-term problems. But it might also produce two equally dismal results.

On the one hand, Facebook will be hosting the worst kinds of online behavior. In a public note in March, Zuckerberg admitted that encryption will help facilitate “truly terrible things like child exploitation, terrorism, and extortion.” (For that, he promised to “work with law enforcement.” Great.)

On the other hand, Facebook is completing its transition from being a simple platform, broadly indifferent to the content it hosts, to being a publisher that curates and is responsible for content. Getting rid of Farrakhan, Jones and the others is the easy call for now, because they are such manifestly odious figures and they have no real political power.

But what happens with the harder calls, the ones who want to be seen publicly and can’t be swept under: alleged Islamophobes, militant anti-immigration types, the people who call for the elimination of Israel? Facebook has training documents governing hate speech, and is now set to deploy the latest generation of artificial intelligence to detect it.

But the decision to absolutely ban certain individuals will always be a human one. It will inevitably be subjective. And as these things generally go, it will wind up leading to bans on people whose views are hateful mainly in the eyes of those doing the banning. Recall how the Southern Poverty Law Center, until recently an arbiter of moral hygiene in matters of hate speech, wound up smearing Ayaan Hirsi Ali and Maajid Nawaz, both champions of political moderation, as “anti-Muslim extremists.”

Facebook probably can’t imagine that its elaborate systems and processes would lead to perverse results. And not everything needs to be a slippery slope.

Then again, a company that once wanted to make the world more open and connected now wants to make it more private. In time it might also become a place where only nice thoughts are allowed. The laws of unintended consequence can’t rule it out.
