When Machines Decide What We ‘Think’

The Internet helps like-minded citizens find each other, but does it foster democracy? Eli Pariser, board president and former executive director of the liberal advocacy group MoveOn.org, was once optimistic that it would. "For a time, it seemed that the Internet was going to entirely redemocratize society," he writes in his new book, "The Filter Bubble: What the Internet Is Hiding From You." Now he is far less sure.

In the days following September 11, 2001, Pariser, then a recent college graduate, launched an online petition calling for a restrained and multilateral response to the attacks. More than half a million people quickly signed the petition, and Pariser joined MoveOn, a liberal online advocacy organization, shortly afterward.

Since then he's watched the Internet evolve in ways that alarm him. "Democracy requires citizens to see things from one another's point of view, but instead we're more and more enclosed in our own bubbles," he writes. "Democracy requires a reliance on shared facts; instead we're being offered parallel but separate universes."

Pariser's concerns are rooted in part in his own experience with Facebook. Most of his friends lean left, as he does, but because he is interested in what conservatives are thinking, he friended several of them. Yet their posts stopped appearing in his Facebook news feed. "Facebook was apparently doing the math," he notes, and had discovered that he clicked links from his progressive friends more often than links from his conservative ones. Even though he wants to follow conservatives' thinking, the algorithm's decisions prevent him from doing so.
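The mechanism Pariser describes can be illustrated with a toy sketch. This is not Facebook's actual ranking code (which is proprietary); it simply shows how ordering a feed by past click counts per author, as Pariser infers Facebook was doing, quietly squeezes out the friends a viewer clicks least. The `rank_feed` function, the sample posts, and the click history are all invented for illustration.

```python
# Toy sketch of click-frequency feed filtering (hypothetical, not
# Facebook's actual algorithm): friends whose links the viewer has
# clicked most often dominate the feed.
from collections import Counter

def rank_feed(posts, click_history, top_n=3):
    """Order posts by how often the viewer has clicked each author's links."""
    clicks = Counter(click_history)  # author -> number of past clicks (0 if never)
    return sorted(posts, key=lambda p: clicks[p["author"]], reverse=True)[:top_n]

posts = [
    {"author": "progressive_friend", "text": "Op-ed on climate policy"},
    {"author": "conservative_friend", "text": "Column on tax reform"},
    {"author": "progressive_friend", "text": "Petition link"},
    {"author": "neutral_friend", "text": "Vacation photos"},
]
# The viewer has clicked progressive friends' links far more often.
history = ["progressive_friend"] * 8 + ["conservative_friend"] * 1

for post in rank_feed(posts, history):
    print(post["author"], "-", post["text"])
```

With `top_n=3` the conservative friend's column barely survives at the bottom of the feed; tighten the cutoff to `top_n=2` and it disappears entirely, even though the viewer never chose to hide it.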

Facebook isn't the only online business that "thinks" it knows what people want and acts accordingly. Pariser tells of an experiment in which he had two well-educated, politically progressive friends who live in the Northeast do a Google search for "BP" after last year's Gulf of Mexico oil spill. Each got a different result—one received investment information about BP while the other saw news about the oil spill. It turns out that Google monitors 57 signals about a Web user's profile and online behavior to personalize search results for each user.
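Google has never published what those 57 signals are, but the general idea, scoring the same results differently for different users, can be sketched in a few lines. Everything here (the two results, the "finance"/"news" topic weights, and the two user profiles) is a made-up illustration of how identical queries can yield different rankings, not Google's actual system.

```python
# Toy illustration of signal-based result personalization
# (hypothetical signals and weights; Google's actual 57 signals
# are not public).

# Candidate results for the query "BP", each tagged with topic weights.
RESULTS = {
    "BP investor relations": {"finance": 0.9, "news": 0.1},
    "Deepwater Horizon oil spill coverage": {"finance": 0.1, "news": 0.9},
}

def personalize(query_results, user_signals):
    """Rank results by how well their topics match the user's signal profile."""
    def score(topics):
        return sum(user_signals.get(topic, 0) * w for topic, w in topics.items())
    return sorted(query_results, key=lambda r: score(query_results[r]), reverse=True)

# Two users issue the same "BP" query with different behavioral profiles.
investor = {"finance": 0.8, "news": 0.2}   # mostly clicks market stories
activist = {"finance": 0.1, "news": 0.9}   # mostly clicks environmental news

print(personalize(RESULTS, investor)[0])   # "BP investor relations"
print(personalize(RESULTS, activist)[0])   # "Deepwater Horizon oil spill coverage"
```

Neither user asked for a filtered view; the divergence falls out of the dot product between their inferred profile and each result's topic weights, which is the quiet, automatic character of personalization that worries Pariser.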

This personalization of the Internet, Pariser argues, is what creates the troubling filter bubble. Trapped inside it, users encounter less that surprises or challenges them while being fed a steady diet of information that confirms what they already believe.

"Ultimately, democracy works only if we citizens are capable of thinking beyond our narrow self-interest. But to do so, we need a shared view of the world we cohabit. We need to come into contact with other peoples' lives and needs and desires," he writes. "The filter bubble pushes us in the opposite direction—it creates the impression that our narrow self-interest is all that exists. And while this is great for getting people to shop online, it's not great for getting people to make better decisions together."