Facebook and Democracy: Social Media’s Coarsening Impact on the Public Square

Could Twitter diminish your tolerance for opposing ideas (as well as your productivity)? Is Facebook bad for democracy?

Facebook, Twitter, YouTube, Reddit, and other social media platforms are set up to show people content that they are already likely to agree with, which is fine when you are checking out puppy dogs and meal ideas. But when the content turns toward politics or life-changing policies, social media algorithms on Facebook and elsewhere leave people seeing only content they “like,” trapping them in a self-reinforcing bubble with little exposure to alternative ideas.

The result? People with different opinions are drifting further and further apart, removed from intellectual challenges and less likely to engage with political opponents. This drop in the need for intellectual rigor is making it more difficult to find solutions to problems that impact everyone.

Harvard Law Professor Cass Sunstein’s latest book, “#Republic: Divided Democracy in the Age of Social Media,” outlines the role of social networks in representative government, and warns that the division of viewpoints into hardened us vs. them groupings is real, growing, and becoming more difficult to overcome with time.

Speaking to political columnist Michael Barone recently, Sunstein said that the blinders narrowing our minds are harming the American creed.

“Echo chambers and information cocoons are a real problem for democracy. It’s very important for people to step outside a kind of hall of mirrors, which they can construct with the aid of Facebook or Twitter or Instagram, and encounter both topics that are unfamiliar and maybe not especially interesting to them, and certainly points of view that aren’t congenial and that may be disruptive to what they already think. That is central to, let’s say, the American project.”

The average Facebook user gets about 20 percent of his or her news from Facebook, with younger people getting a higher percentage. Likewise, the data show that people on Twitter tend to follow people who agree with their points of view.

Sunstein says this phenomenon is no surprise. Visionaries like Bill Gates saw 20 years ago a new world in which people could get exactly what they want, effectively creating what Sunstein calls “The Daily Me,” a completely personalized online encounter in which everything on one’s computer or tablet reflects the owner’s preferred views. That’s exactly where society has headed.

Is there a danger in not turning the trend around, in people never demonstrating curiosity about what others outside their viewpoints think? And is the decision to seek out like-minded ideas on the Internet any different from the self-selected media that came before it, like cable news channels or news magazines?

Yes and no, Sunstein says. Self-selection has been going on for ages, but its scale has never been so large and so reinforced. As a result, despite their massive reach, social media have basically made it harder to solve problems. When it comes to policies like immigration, infrastructure, education, or economic mobility, the positions have become so rigid that “doing something about some of these issues would seem preposterous.”

Sunstein notes that human curiosity keeps some people engaged across the divide. The counter-effect of social media is that people on each side of a debate pay close attention to what the opposition is saying so that they can monitor and challenge it.

Though Sunstein describes his own book as downbeat and not cheerful, he suggested a few prescriptions that could turn the tide for American society. For one, providers of information, whether they be news outlets or Facebook itself, can get out of the business of reinforcing the barriers.

“Two ideas that would be on the list of proposals are, why not give Facebook users an Opposing Viewpoints button, where they can just click and then their newsfeed is gonna show them stuff that they don’t agree with. Or why not give Facebook users a Serendipity button, where if they click, then they’re gonna get stuff that is just coming to them through an algorithm which provides people with a range of stuff. So if you’re someone who is just focused on one set of issues, you’re gonna get the ‘Wall Street Journal’ and ‘New York Times’ also.”

“And Facebook, to its credit, doesn’t wanna pick winners and losers, so they shouldn’t promote one particular newspaper, but they could have a random draw of things; maybe it could be geographical.”
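The “Serendipity button” idea is easy to picture in code. The sketch below is purely illustrative: the function name `serendipity_feed`, the item format, and the 30 percent mix ratio are assumptions for the sake of the example, not anything Facebook has built. It simply shows how a feed could blend a random draw of items from unfamiliar sources into a user’s usual lineup.

```python
import random

def serendipity_feed(all_items, user_sources, n=10, mix=0.3, seed=None):
    """Hypothetical sketch of a 'Serendipity button' feed.

    Returns n items, of which roughly a `mix` fraction come from sources
    the user does not already follow -- a random draw, per Sunstein's idea.
    """
    rng = random.Random(seed)
    familiar = [it for it in all_items if it["source"] in user_sources]
    unfamiliar = [it for it in all_items if it["source"] not in user_sources]

    # Reserve a share of slots for sources outside the user's bubble.
    n_unfamiliar = min(len(unfamiliar), round(n * mix))
    n_familiar = min(len(familiar), n - n_unfamiliar)

    feed = rng.sample(familiar, n_familiar) + rng.sample(unfamiliar, n_unfamiliar)
    rng.shuffle(feed)  # interleave rather than segregate the draws
    return feed
```

A real system would weigh many more signals, but the core of the proposal is just this: deliberately sampling outside the set of sources a user has already chosen.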

One other approach to getting back to constructive debate is to challenge Americans to take the high road when they disagree in public online forums: not merely insulting their opponents, but explaining the positive aspects of the positions they support. Good luck with that, but courtesy used to be an American value.

Watch Barone’s interview of Sunstein below.

Fake News May Distract, But It Doesn’t Rig Elections

“Fake news is not a technical glitch.” This sentence is the headline of a recent article about the hysteria that has enveloped the nation over the “unexpected” presidential outcome. It also is a simple explanation that clears up much of the confusion being disseminated since the Nov. 8 vote.

Ironically, there has been a lot of misinformation about what “fake news” is. Is it false stories made up out of whole cloth? Yes. Is it misreporting about events that have happened? No, though that has become a much-discussed point about the journalism profession since the issue arose. Is it media opinion? No.

Blaming members of the media for expressing their opinion rather than just stating the facts of a news story has been a complaint for decades, if not centuries. Not reporting all the facts is poor journalism, but a lie of omission is not the issue at hand.

Fake news is “creative writing,” to be kind. It’s the act of crafting imaginary facts about people, facts their opponents are willing to believe. It’s pernicious, but it isn’t merely bad journalism. It is not based in fact at all.

Yet, people are willing to believe what they are told is news because Americans trust the format. “Crankish conspiratorial thinking has been a theme in America for a long time,” notes professional software engineer and blogger Ariel Rabkin.

But there has been an outcry against the platforms that have unwittingly served as dispensers of fake news. The messenger has been condemned as much as the fake news itself.

Blaming the messenger — the online platforms where this fake news appears — is not the answer, however. Getting angry at Google or Facebook for “throwing” the election by permitting fake news on their sites is a pretty big waste of breath.

Consider the complaints. Facebook repeatedly tweaks its algorithm to impact how news trends, for which it recently faced a fair backlash, but that does not equate to Facebook making up false stories that show up on the site. And it would hurt Facebook’s business model to try to decide what’s real and what’s not.

As Rabkin explains:

Facebook didn’t invent rumor-mongering. It doubtless has made the problem more visible, since what used to be merely asserted drunkenly in saloons or spoken on talk radio is now in publicly visible text online. But visibility is not the same as impact and we should not assume without evidence that technology has made false rumors more dangerous to society. (The election of Donald Trump is not evidence that falsehood has any new potency. Partisans have been repeating lies about their opposition since the birth of democracy.) …

Google and Facebook have a deep ethos of neutrality, and to the extent that they are credible, it is precisely because they do not make blatant editorial decisions that embed their preconceptions and beliefs about which sources to trust. If Google or Facebook were to anoint some limited set of news sources as “authoritative” and some others as “fake,” they would immediately be faced with quite an ugly controversy about who is who, and this is controversy they avoid for both business and philosophical reasons.

Getting to the top of Facebook or Google search returns is a contest, and contestants know how to play the game.

This is the era of digital marketing, where getting seen is as important as what is said. Many players are vying for the top spot, and are willing to pay for it. An entire industry has made its fortune teaching businesses how to climb Google’s rankings. They game, test, and analyze data to learn how to outbid the competition for that spot.

This is how these platforms make their money, and they aren’t going to jeopardize the funding stream. So while Facebook and Google may constantly be rewriting and reframing their algorithms to anticipate what people are looking for and deliver it to them, there are many, many guardians at the gate willing to point out what these platforms are doing wrong.

To wit: Being the editors of quality news is not the job description for Facebook and Google engineers.

If users are seeking carefully curated news, The New York Times and The Wall Street Journal are both available online, and there is no particular reason why Google ought to compete directly against them.

Americans do want reliable information on which to form opinions, and it’s in their best interest to have all the competing arguments coming at them, good and bad. This involves becoming educated not just by what’s on the screen, but by what’s in books, by real-life experiences, and by real-life witnesses.

Anybody can put anything on the Internet, for better or worse. It’s our responsibility as members of society to be able to develop and express well-considered, well-formed, and well-sourced positions.

And for all its faults, America was not “hacked” into electing Donald Trump. Some Americans may have believed fake news and used it to form their opinions, but that is not what “hacking” is. No evidence points to machines having been tampered with, despite Trump’s pre-victory claims that it could happen. The Wisconsin and Pennsylvania recounts requested by Green Party candidate Jill Stein only reinforce the validity of the vote.

So let’s be vigilant thinkers and put a little effort into determining the quality of information on which we form our opinions. We’ve no one to blame but ourselves if we fault the machines for doing a poor job of thinking for us.

Read Rabkin’s entire article on TechPolicyDaily.com.