During a 2017 rally in Barcelona, a woman reacts as she watches the Catalan Parliament session during which lawmakers voted to secede from Spain. In an effort to foster civil discourse, Politibot distributed well-reasoned opposing viewpoints on the divisive issue.

Social media platforms used to be places to discover interesting people far away, to get hints of breaking news, to connect with your audience, and even to serve as tools for democracy. They could still be all of those things. But in 2017 they have more often been seen as tools for propaganda, harassment, and the propagation of false or silly content. They have also become an enemy of the media, driving traffic up and down while capturing the overwhelming majority of ad revenue.

The low levels of trust in media and the polarization in the U.S. and elsewhere are intertwined with the deterioration of public discourse. Some of the issues at stake may require regulation, but the most powerful forces could be awareness and behavioral changes in the use of technology. And here is where journalists could still play a major role in improving the social conversation while showing why they deserve to be trusted.

I’ve covered four presidential campaigns in the U.S. and several more in Spain, Belgium, Italy, and the Netherlands. In 2016, I covered Brexit and the U.S. election and that experience—talking to skeptical voters online and (mainly) offline—inspired me to use my Nieman fellowship at Harvard and MIT to explore tools for journalists to recover trust and survive as a relevant voice in this noisy and fragmented world.

A few months into my studies, I am more optimistic. And, yes, technology could help, most of the time in simple ways. Here are a few ideas. Some come from my experience at Politibot, a chatbot on Facebook Messenger and Telegram that I co-founded with a group of journalists and developers to explain European and American issues. Others come from the MIT Laboratory for Social Machines and the MIT class Depolarization by Design, taught by Deb Roy as a thought-provoking debate among sociologists, engineers, economists, and journalists.

1. Messaging apps to break the bubble

At Politibot we tried an experiment with the debate around the independence of Catalonia, a very divisive issue that has been discussed in Spain with little tolerance for nuance among readers or even among journalists covering the news.

Politibot sends a daily conversation about one topic in the news to its users via Facebook Messenger or Telegram. The conversation allows them to understand a complex issue with short texts, images, gifs, and few links. The interaction is simple, mainly through buttons and emojis that keep the conversation moving.

In October, we asked the 8,000 users of Politibot if they agreed with the referendum organized by the Catalan government and then sent them articles with the opposite view to the one they declared. The pieces were carefully curated: we chose authors who argued their position with respect and substance. We called this experiment the “echo chamber” and we explained what the phrase meant with an article from Harvard Law professor Cass Sunstein.
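For readers curious about the mechanics, the routing logic behind the experiment is simple. Here is a minimal Python sketch of that logic; the article lists, user answers, and the send_message helper are invented for illustration, since Politibot’s actual code is not shown here.

```python
# Minimal sketch of the "echo chamber" routing: ask users where they stand,
# then send a curated article arguing the opposite position.
# The article URLs, answer labels, and send_message() helper are hypothetical;
# a real bot would call the Telegram or Facebook Messenger send APIs here.

CURATED_ARTICLES = {
    "pro_independence": "https://example.com/respectful-case-for-independence",
    "anti_independence": "https://example.com/respectful-case-against-independence",
}

def opposing_view(user_answer: str) -> str:
    """Return a curated article arguing against the user's declared position."""
    if user_answer == "agrees_with_referendum":
        return CURATED_ARTICLES["anti_independence"]
    return CURATED_ARTICLES["pro_independence"]

def run_echo_chamber_step(user_id: str, user_answer: str, send_message) -> None:
    """Send the opposing-view article with a short framing message."""
    send_message(user_id, "Thanks for answering. Here is a thoughtful piece "
                          "from the other side of the debate:")
    send_message(user_id, opposing_view(user_answer))

# Example usage with a stand-in send function that just prints.
run_echo_chamber_step("user-42", "agrees_with_referendum",
                      lambda uid, text: print(uid, text))
```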

On December 21 Catalonia held regional elections, and the results again showed a society split into two halves: voters who support parties in favor of independence from Spain and voters who back parties against it. It’s not clear whether the pro-independence parties will try to hold another referendum or will engage in talks with the national government.

Throughout this crisis it has been difficult to find balanced coverage in national or Catalan media, so the effort to present different views in a very polarized context has been particularly appreciated by Politibot readers.

We have done “the echo chamber” experiment several times and the reactions have been very positive. Although few readers say they have changed their minds, they appreciate the effort in contrast to other news outlets that simply reinforce what they already believe. One of the most common pieces of feedback was that a news organization actively trying to break the echo chamber deserves support from readers (Politibot raises funds through Patreon). The fact that a news organization helps identify bias reinforces its role as a fair player, especially in a partisan media context.

This has been possible because of the editorial content, the friendly tone of the chatbot, and the strong relationship developed with our readers over the last two years. But it is also related to the platform. Messaging apps—where people speak with friends and family—are the most personal space in the digital world. They are places with few distractions that usually offer a very different experience from websites or open social media. One of the aspects often highlighted by our users is the civility of the experience and the ability to read opposing views in a safer, calmer environment where “you don’t have to block people.”

2. Platform indicators for healthy conversation

One of the most interesting ideas discussed at MIT around polarization is the creation of a set of “health” indicators to measure a conversation. These indicators could be used to reward media or individuals that promote better debates.

In an effort at MIT with the nonprofit Cortico, Deb Roy has come up with four parameters to measure the conversation of one or several groups:

  • Shared attention: do they care about the same issues?
  • Shared reality: if they do care about the same issues, do they agree on core facts?
  • Diversity of opinion: is there a variety of interpretations?
  • Receptivity: are those exposed to diverse opinions listening or shouting back?

There could be other health indicators. One of the challenges is how to measure the dominance of loud voices; one option could be to track gender balance.

The next step would be to quantify the content of the conversations, assign values, and establish ranges for each indicator.
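As a purely illustrative exercise, a toy version of such scoring could look like the Python sketch below. The input format, formulas, thresholds, and example values are assumptions made for this article; they are not Cortico’s or the MIT Laboratory for Social Machines’ actual metrics.

```python
# Toy sketch of conversation "health" indicators inspired by the four
# parameters above. The inputs (per-group topic counts, stance labels,
# author genders) and the scoring formulas are invented for illustration.
from collections import Counter

def shared_attention(topics_a: Counter, topics_b: Counter) -> float:
    """Overlap between the topics two groups talk about (0 = none, 1 = identical)."""
    common = set(topics_a) & set(topics_b)
    total = set(topics_a) | set(topics_b)
    return len(common) / len(total) if total else 0.0

def diversity_of_opinion(stances: list) -> float:
    """Fraction of distinct stances among all expressed stances."""
    return len(set(stances)) / len(stances) if stances else 0.0

def gender_balance(authors_gender: list) -> float:
    """Approaches 1.0 when voices are evenly split, 0 when one group dominates."""
    counts = Counter(authors_gender)
    if len(counts) < 2:
        return 0.0
    values = sorted(counts.values())
    return values[0] / values[-1]

# Example: two groups covering partly different topics, with a mix of stances.
group_a = Counter({"independence": 40, "economy": 10})
group_b = Counter({"independence": 35, "corruption": 15})
print(shared_attention(group_a, group_b))                      # 1/3 of topics shared
print(diversity_of_opinion(["for", "against", "for", "undecided"]))
print(gender_balance(["f", "m", "m", "f", "m"]))
```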

With these or similar criteria in place, we can easily imagine how Facebook or Twitter could reward media organizations that promote a varied, open, respectful debate by simply showing their stories in news feeds, without buying ads or doing Facebook Lives of exploding fruit.

To reach scale on a project like this you now need to involve Google or Facebook. But even without them on board, the idea of news organizations fostering positive engagement and finding attentive users in the process seems valuable and worth a try.

3. A filter for control

Ethan Zuckerman, the MIT professor who invented the pop-up ad and teaches the Future of News class, is on sabbatical this year writing a book on trust, media, and platforms. In one of his visits to MIT, he explained his latest project, still in beta: gobo.social, a tool to filter the posts shown from Facebook and Twitter according to several criteria. The user can choose to have “my perspective” or “lots of perspectives” on politics, more or less seriousness, more or less virality, or to “mute all the men.” The tool lets the user see the posts that have been taken out of the feed after applying the filters, and there is also an option to silence content from corporations.
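To make the idea concrete, a stripped-down user-controlled feed filter might look like the sketch below. The post fields, filter names, and thresholds are invented for illustration and do not reflect Gobo’s actual implementation.

```python
# Illustrative sketch of a user-controlled feed filter in the spirit of
# gobo.social. Post fields, settings, and thresholds are invented.
from dataclasses import dataclass

@dataclass
class Post:
    author_gender: str      # "m", "f", "other", or "unknown"
    is_corporate: bool      # posted by a brand or corporate account
    virality: float         # 0..1, e.g. normalized share count
    political_lean: str     # "left", "right", "center", or "none"

@dataclass
class FilterSettings:
    mute_all_men: bool = False
    hide_corporations: bool = False
    max_virality: float = 1.0
    politics: str = "lots_of_perspectives"   # or "my_perspective"
    my_lean: str = "center"

def keep(post: Post, settings: FilterSettings) -> bool:
    """Decide whether a post stays in the feed; removed posts can still be
    shown separately so the user sees what was filtered out and why."""
    if settings.mute_all_men and post.author_gender == "m":
        return False
    if settings.hide_corporations and post.is_corporate:
        return False
    if post.virality > settings.max_virality:
        return False
    if settings.politics == "my_perspective" and \
            post.political_lean not in (settings.my_lean, "none"):
        return False
    return True

feed = [Post("f", False, 0.2, "left"), Post("m", True, 0.9, "right")]
settings = FilterSettings(hide_corporations=True, max_virality=0.5)
visible = [p for p in feed if keep(p, settings)]
removed = [p for p in feed if not keep(p, settings)]   # shown to the user on request
```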

Letting the reader know more about content, advertising, and bias is not necessarily what the platforms want. But it should be one of the media’s most important goals, especially for journalists who are concerned about preserving their unique public service role.

4. Tools to ask questions

Todd Rogers, a behavioral scientist from the Harvard Kennedy School and a guest speaker at MIT, has shown in his research how people hold more extreme views as a result of an illusion of understanding. When confronted with questions that force them to explain how something really works, they moderate their views as they realize they know less than they thought.

This idea has been applied to moderating comments on news sites, particularly in cases of hateful and extreme speech. The Norwegian broadcaster NRK tried asking readers three questions about a story before they could comment on the content.
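The mechanic is easy to picture: the comment form unlocks only once the reader answers a short comprehension quiz correctly. The sketch below illustrates it with invented questions and structure; it is not NRK’s actual implementation.

```python
# Sketch of a "quiz before you comment" gate in the spirit of NRK's
# experiment: readers answer factual questions about the story before
# the comment form unlocks. Questions and data structures are invented.

QUIZ = [
    {"question": "Which parliament voted on the declaration?",
     "options": ["Catalan", "Spanish", "European"],
     "answer": "Catalan"},
    {"question": "When were the regional elections held?",
     "options": ["October 1", "December 21", "March 5"],
     "answer": "December 21"},
]

def can_comment(reader_answers: list) -> bool:
    """Unlock the comment form only if every quiz answer is correct."""
    if len(reader_answers) != len(QUIZ):
        return False
    return all(given == item["answer"]
               for given, item in zip(reader_answers, QUIZ))

print(can_comment(["Catalan", "December 21"]))   # True: comment box unlocks
print(can_comment(["Spanish", "December 21"]))   # False: reader is sent back to the story
```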

There are ways to scale this effort of asking readers more questions. At MIT, Jonathan Haidt, author of “The Righteous Mind,” made us try an app he is working on called OpenMind. The app asks you a few questions about your political views, suggests an issue, asks for an opinion on that, and then guides you through the opposing arguments. It offers praise for people who are more open to opposing views and urges college students to be more receptive toward different opinions.

Similarly, Katherine Cramer, a University of Wisconsin professor who was studying rural resentment toward cities before the rise of Trump, talks about “active listening.” Just making people answer questions and interact in groups could change the dynamics of trust. Her challenge is scale. That’s one of the concerns at the MIT Media Lab, which is trying to develop an automated interface as an alternative to labor-intensive personal interviews.

Reporting is about asking questions, so journalism shouldn’t be left out of these efforts. Media organizations can do it in groups through email or through their own platforms, as the German media group Presse-Druck- und Verlags-GmbH is trying in a project financed by the Google fund for media innovation in Europe. Or it could happen inside a closed Facebook group, as in the experiment developed by Spaceship Media and AL.com with 25 women from San Francisco who supported Hillary Clinton and 25 women from Alabama who supported Donald Trump in 2016.

5. Audio to slow down

One of the main factors in the spread of falsehoods and hate speech on social media is the lack of friction. One simple answer is to slow down, even introducing a delay in the publishing of comments, just as news organizations apply a series of steps and standards before an article is ready to be published.
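As a minimal illustration of that kind of friction, a publishing queue that holds each comment for a cooling-off period might look like this sketch; the 15-minute delay and the data structures are arbitrary assumptions.

```python
# Sketch of a cooling-off delay for comments: instead of appearing instantly,
# each comment is queued and only published after a fixed hold period,
# leaving time for reflection or moderation. The 15-minute delay is an
# arbitrary choice for illustration.
import time
from collections import deque

HOLD_SECONDS = 15 * 60   # cooling-off period before a comment goes live

pending = deque()        # (publish_at_timestamp, comment_text)
published = []

def submit(comment: str) -> None:
    """Queue a comment with the timestamp at which it may be published."""
    pending.append((time.time() + HOLD_SECONDS, comment))

def release_due_comments() -> None:
    """Move comments whose hold period has expired into the public thread."""
    now = time.time()
    while pending and pending[0][0] <= now:
        _, comment = pending.popleft()
        published.append(comment)

submit("First reaction, typed in thirty seconds.")
release_due_comments()   # nothing is published yet; the comment is still cooling off
```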

The social backlash against mobile/screen dependency could help slow the consumption of news and the urge to comment without much reading or reflecting.

One element in this less compulsive environment could be audio, which could potentially bring back the lost sense of conversation. It has already worked for popular podcasts such as The Daily from The New York Times or NPR Politics. Audio is not just a way to reach the audience in a new medium but also a great channel to create a personal relationship with reporters and lead listeners to trust them more.

One of the challenges for 2018 is how to get more journalistic content in the daily conversations with the smart audio devices made by Amazon and Google. The old habit of discussing the news for hours may be lost, but these new interfaces could offer a way to engage in a world with more voice and less noise.
