Patient-Based Health Care … on Facebook?

Bertrand Might has a rare genetic disorder that his family confirmed in 2012 after almost four years of searching for an explanation. Bertrand was the first person ever documented with his disease, called NGLY1 deficiency.

When his family finally discovered what Bertrand was facing, they at least had an answer. But then they faced another problem — finding others coping with the same ordeal.

It’s a common problem for people with unusual illnesses. Because some diseases are so rare, when a family finally gets a diagnosis, they want to compare notes with others to learn tricks and tips for managing their situation. Unfortunately, in such cases, these others are hard to locate. Medical data networks are hard to access and usually don’t have much information in them.

Matt Might, Bertrand’s dad, had a background in tech and was able to optimize a blog post so that it ranked highly in search engines. The post went viral, and Might drew substantial news coverage of the problem his son was facing. He has since found 15 people in the world with the same disease Bertrand has.

But not everyone has that success, even as Google and other companies try to harness their technological power to make medical data easier to access and control. For all their efforts, David Shaywitz, Director of Strategic and Commercial Planning at Theravance, a publicly held drug development company in South San Francisco, says Facebook may already be the best-positioned platform to support the patient-centered health care that so many people envision as the future of medicine.

Facebook is where patients with rare conditions, and their families, often go to connect with others in similar situations – typically via private groups. Apparently, these can be extremely specific – the example the panelist cited was childhood epilepsy due to one or another individual genetic mutation. Families reportedly self-organize into private groups based on the specific mutation, and share experiences and learnings. …

The irony, of course, is that because of its features and popularity, Facebook has organically emerged as arguably the most attractive platform for patient groups to organize – despite the far more deliberate efforts of other companies and organizations that offer platforms aimed at bringing patients together. …

Now, everyone reading this post is probably familiar with Facebook. It’s quirky. It can manipulate what you see and don’t see, whether you can share your opinion or have your opinion banned. It tries to influence which viewpoints are supported and which are ignored. And it really provides only an illusion of privacy when, in fact, one wrong setting and you’ve gone “public” or, worse, “live.”

But then again, isn’t publicity what people in the Mights’ situation are looking for? And doesn’t Facebook have a whole lot of people looking for other people to “friend”? Facebook’s influence is unparalleled.

Facebook, at its core, is about cultivating relationships, in marked distinction to the transactional core of Google (search) and Amazon (deliver). The core mission of Facebook is to connect people, and to help good things emerge from these connections. What better forum than Facebook to bring patients together, and what better platform for health?

As Shaywitz notes, Facebook has already seen success in the health care arena, most notably allowing people to list their organ donor status, “an initiative which produced an immediate lift in organ donor registrations.”

Furthermore, as a platform to serve patients, Facebook already has the framework that other organizations are trying to build or replicate. Might told Shaywitz that Facebook could do a lot more, like create an opt-in “find patients like me” service. Shaywitz suggests other applications, like “user-friendly medical data import, sharing, visualization, and analysis.”

Ultimately, however, Facebook already has harnessed what patient-based health care is all about.

What many technologists fail to appreciate about health care is the importance and value of relationships, of human connection, of community. At its best and most foundational, medicine is about relationships, not transactions. Most of medicine, health, and wellness isn’t about showing up with a discrete question and leaving with a discrete answer. Our experience of illness and disease is so much more complex and nuanced, individualized and personal, a process of understanding that unfolds over time. The best physicians and care providers recognize this, and appreciate the importance of listening, and the value of longitudinal connection.

Do you think Facebook can appropriately manage health care databases and connections? Leave your comment.

Facebook and Democracy: Social Media’s Coarsening Impact on the Public Square

Could Twitter diminish your tolerance for opposing ideas (as well as your productivity)? Is Facebook bad for democracy?

Facebook, Twitter, YouTube, Reddit, and other social media platforms are set up to show people content that they are already likely to agree with, which is fine when you are checking out puppy dogs and meal ideas. But when the content turns toward politics or life-changing policies, social media algorithms on Facebook and elsewhere leave people seeing only content they “like,” trapping them in a self-reinforcing bubble with little exposure to alternative ideas.

The result? People with different opinions are drifting further and further apart, removed from intellectual challenges and less likely to engage with political opponents. This drop in the need for intellectual rigor is making it more difficult to find solutions to problems that impact everyone.

Harvard Law Professor Cass Sunstein’s latest book, “#Republic: Divided Democracy in the Age of Social Media,” outlines the role of social networks in representative government, and warns that the division of viewpoints into hardened us vs. them groupings is real, growing, and becoming more difficult to overcome with time.

Speaking to political columnist Michael Barone recently, Sunstein said that the blinders narrowing our minds are harming the American creed.

Echo chambers and information cocoons are a real problem for democracy. It’s very important for people to step outside the kind of hall of mirrors which they can construct with the aid of Facebook or Twitter or Instagram, and encounter both topics that are unfamiliar and maybe not especially interesting to them, and certainly points of view that aren’t congenial and that may be disruptive to what they already think. That is central to, let’s say, the American project.

The average Facebook user gets about 20 percent of his or her news from Facebook, with younger users getting a higher percentage. Likewise, the data show that people on Twitter tend to follow people who agree with their points of view.

Sunstein says this phenomenon is no surprise. Twenty years ago, visionaries like Bill Gates foresaw a world in which people could get exactly what they want, effectively creating what Sunstein calls “The Daily Me,” a completely personalized online experience in which everything on one’s computer or tablet reflects the views its owner already holds. That’s exactly where society has headed.

Is there a danger in not turning the trend around, or in not having people demonstrate curiosity about viewpoints other than their own? And is the decision to seek out like-minded ideas on the Internet any different from the self-selected media that came before it, like cable news channels or news magazines?

Yes and no, Sunstein says. Self-selection has been going on for ages, but its scale has never been so large or so reinforced. As a result, despite its massive reach, social media have basically made it harder to solve problems. When it comes to policies like immigration, infrastructure, education, or economic mobility, positions have become so rigid that “doing something about some of these issues would seem preposterous.”

Sunstein notes that human curiosity keeps some people from being completely sealed in. A counter-effect of social media is that people on each side of a debate pay close attention to what the opposition is saying so that they can monitor and challenge it.

Though Sunstein describes his own book as downbeat and not cheerful, he suggested a few prescriptions that could turn the tide for American society. For one, providers of information, whether they be news outlets or Facebook itself, can get out of the business of reinforcing the barriers.

Two ideas that would be on the list of proposals are, why not give Facebook users an Opposing Viewpoints button where they can just click and then their newsfeed is gonna show them stuff that they don’t agree with. Or why not give Facebook users a Serendipity button where they can just click and if they click, then they’re gonna get stuff that is just coming to them through an algorithm which provides people with a range of stuff. So if you’re someone who is just focused on one set of issues, you’re gonna get the “Wall Street Journal” and “New York Times” also.

And Facebook, to its credit, doesn’t wanna pick winners and losers, so they shouldn’t promote one particular newspaper, but they could have a random draw of things, maybe it could be geographical.

One other approach to restore constructive debate is to challenge Americans to take the high road when they disagree in public online forums: not merely insulting their opponents, but explaining the positive aspects of the positions they support. Good luck with that, but courtesy used to be an American value.

Watch Barone’s interview of Sunstein below.

Fake News May Distract, But It Doesn’t Rig Elections

“Fake news is not a technical glitch.” This sentence is the headline of a recent article about the hysteria that has enveloped the nation over the “unexpected” presidential outcome. It also is a simple explanation that clears up much of the confusion being disseminated since the Nov. 8 vote.

Ironically, there has been a lot of misinformation about what “fake news” is. Is it false stories made up out of whole cloth? Yes. Is it misreporting of events that actually happened? No, though that has become a much-discussed point about the journalism profession since the issue arose. Is it media opinion? No.

Blaming members of the media for expressing their opinion rather than just stating the facts of a news story has been a complaint for decades, if not centuries. Not reporting all the facts is poor journalism, but a lie of omission is not the issue at hand.

Fake news is “creative writing,” to be kind. It is the act of crafting imaginary facts about people, facts that those people’s opponents are willing to believe. It’s pernicious, but it isn’t merely bad journalism. It is not based in fact at all.

Yet people are willing to believe what they are told is news because Americans trust the format. “Crankish conspiratorial thinking has been a theme in America for a long time,” notes professional software engineer and blogger Ariel Rabkin.

But there has been an outcry at the platforms that have unwittingly served as dispensers of fake news. The messenger has been condemned as much as the fake news itself.

Blaming the messenger — the online platforms where this fake news appears — is not the answer, however. Getting angry at Google or Facebook for “throwing” the election by permitting fake news on their sites is a pretty big waste of breath.

Consider the complaints. Facebook repeatedly tweaks its algorithm to influence how news trends, for which it recently faced a fair amount of backlash, but that is not the same as Facebook making up the false stories that show up on the site. And trying to decide what’s real and what’s not would hurt Facebook’s business model.

As Rabkin explains:

Facebook didn’t invent rumor-mongering. It doubtless has made the problem more visible, since what used to be merely asserted drunkenly in saloons or spoken on talk radio is now in publicly visible text online. But visibility is not the same as impact and we should not assume without evidence that technology has made false rumors more dangerous to society. (The election of Donald Trump is not evidence that falsehood has any new potency. Partisans have been repeating lies about their opposition since the birth of democracy.) …

Google and Facebook have a deep ethos of neutrality, and to the extent that they are credible, it is precisely because they do not make blatant editorial decisions that embed their preconceptions and beliefs about which sources to trust. If Google or Facebook were to anoint some limited set of news sources as “authoritative” and some others as “fake,” they would immediately be faced with quite an ugly controversy about who is who, and this is controversy they avoid for both business and philosophical reasons.

Getting to the top of Facebook or Google search returns is a contest, and contestants know how to play the game.

This is the era of digital marketing, where getting seen is as important as what is said. Many players are vying for the top spot and are willing to pay for it. An entire industry has made its fortune teaching businesses how to climb Google’s rankings. They game and test and study data to learn how to outbid the competition for that spot.

This is how these platforms make their money, and they aren’t going to jeopardize that funding stream. So while Facebook and Google may constantly rewrite and reframe their algorithms to anticipate what people are looking for and deliver it to them, there are many, many guardians at the gate willing to point out what these platforms are doing wrong.

To wit: Being the editors of quality news is not the job description for Facebook and Google engineers.

If users are seeking carefully curated news, The New York Times and The Wall Street Journal are both available online, and there is no particular reason why Google ought to compete directly against them.

Americans do want reliable information on which to form opinions, and it’s in their best interest to have all the competing arguments coming at them, good and bad. This involves becoming educated not just by what’s on the screen, but by what is in books, by real-life experiences, and by real-life witnesses.

Anybody can put anything on the Internet, for better or worse. It’s our responsibility as members of society to be able to develop and express well-considered, well-formed, and well-sourced positions.

And for all its faults, America was not “hacked” into electing Donald Trump. Some Americans may have believed fake news and used it to form their opinions, but that is not what “hacking” is. No evidence points to machines having been tampered with, despite Trump’s pre-victory claims that it could happen. The Wisconsin and Pennsylvania recounts requested by Green Party candidate Jill Stein only reinforce the validity of the vote.

So let’s be vigilant thinkers and put a little effort into determining the quality of information on which we form our opinions. We’ve no one to blame but ourselves if we fault the machines for doing a poor job of thinking for us.

Read Rabkin’s entire article on TechPolicyDaily.com.

Censorship at Facebook? Maybe Not. Intellectual Diversity? Maybe Not

We all saw the report: Anonymous sources claimed that Facebook employees have deliberately censored stories from the site’s “trending” topics that favored the conservative outlook.

Conservatives across the country were frustrated and angry, and the reason why ran deeper than simple indignation at unfair treatment. The frustration was more intense because media bias is a documented fact that politically and culturally conservative Americans have been grappling with for decades. The traditional press, across both print and broadcast media, famously tilts to the left. This holds both in explicit opinion commentary and in subtler, implicit ways, such as which stories are deemed worthy of straight news coverage and which are seen as red herrings to ignore.

But new media seemed to hold new promise for a level playing field. From the young days of the blogosphere in the early 2000s, conservative- and libertarian-leaning blogs gained huge followings, inflected major debates, and kept the “mainstream media” newly accountable.

As social media such as Facebook and Twitter gained prominence, Americans with views disdained by the traditional coastal media again found cause for optimism and new ways to organize and discuss the news of the day.

This is why the Facebook allegations felt so disappointing to so many. A digital platform that had seemed to determine popular stories by a neutral algorithm was instead running a subjective editorial desk and reportedly staffing it with young, left-leaning college grads who openly put their thumbs on the scale.

That’s why, this past Wednesday, I joined a group of other conservative leaders at Facebook headquarters to meet with Mark Zuckerberg, Sheryl Sandberg, and others from management. I came in with an open mind, eager to help explain conservative frustrations and discuss future solutions. And the spirit of the meeting was cordial and productive. Personally, I am extremely skeptical (to put it mildly) that there is some top-down conspiracy to weaponize Facebook to intentionally censor conservative views, and I hope that this is the beginning of serious efforts to combat the risk of systemic bias.

Facebook has a tremendous opportunity to out-innovate old media models and win over customers who are hungry for ways to separate the signal from the noise. But questions of editorial oversight and — even more important — intellectual and ideological diversity within Silicon Valley remain important issues that deserve serious solutions.

Facebook and other young, innovative companies have a massive opening to change the status quo in news aggregation by disrupting old patterns and helping citizens bypass “gatekeepers.” They can greatly improve the marketplace of ideas. But to do this, it is vital that new media avoid making old mistakes. I hope that last week’s meetings were just the beginning of serious efforts to combat the risk of systemic bias. Silicon Valley talks a great deal about diversity, and rightly so. But that has to include intellectual, cultural, and religious diversity, or else a golden opportunity could easily be wasted.