This is the adapted text of the Hedy Lamarr Lecture Meyer delivered at the Austrian Academy of Sciences on October 3. The lecture was also sponsored by Medienhaus Wien, a think tank based in Vienna, Austria. Philip Meyer is Emeritus Professor in the School of Journalism and Mass Communication at the University of North Carolina at Chapel Hill. A 1967 Nieman Fellow, he has written a number of books, including “Precision Journalism: A Reporter’s Introduction to Social Science Methods” and “Ethical Journalism: A Guide for Students, Practitioners, and Consumers.” He was a reporter with the Detroit Free Press when its staff won a 1968 Pulitzer Prize for its detailed investigation into the causes of the Detroit race riots.

It is an honor to be here and share my thoughts about the changing news business. I told my friend Alvin Shuster, who reported from Europe for The New York Times, “They want me to give the Hedy Lamarr Lecture.” He was surprised.

“Nobody,” he kidded, “wants to hear an 80-year-old lecturer not named Einstein.”

But being old has its advantages. It is easier to make sense of events that have occurred over a long span of time. Age allows us to view things from different perspectives. As Steve Jobs said in his Stanford commencement address, it is easier to connect the dots looking backward. I have been reading Carl E. Schorske’s “Fin-de-Siècle Vienna,” an account of the swift political and cultural changes that the industrial revolution brought to this city late in the 19th century. To those living at the time, he said, “no one quite knew how to distinguish between what was above and what below, between what was moving forward and what backward.”

And, today, trying to find patterns in the current confusion in my own country, I see strange things happening to democracy. I wonder how much can be attributed to the way that new forms of media, especially social media, enable the clustering of polarized viewpoints. Is this a temporary aberration or a permanent consequence of the technology?

Making predictions is hard.

But it’s not impossible. I have read Ithiel de Sola Pool’s book “Forecasting the Telephone,” in which he looked at the many different consequences that experts were foreseeing when the telephone was in its early adoption stage, from 1876 to 1940. Some of the forecasts were quite accurate. For example, one made in 1906 held that the telephone would facilitate the central management of political organizations and campaigns. One that was off the mark, also made in 1906, was that telephone mouthpieces would collect germs and spread disease, especially tuberculosis.

Scholars today have different interpretations of what technology is doing to journalism, and I cannot be certain that any of them are wrong. Clay Shirky of New York University sees an inevitable period of chaos. “The old stuff gets broken faster than the new stuff is put in its place,” he has said. “The importance of any given experiment isn’t apparent at the moment it appears; big changes stall, small changes spread.”

I have some experience to support that observation, having been present at a big change that stalled. From 1978 to 1981, I helped Knight Ridder prepare for its Viewtron venture. It was an electronic home information service that we thought would revolutionize the news business. My job was to research the market and supervise writing of the user manual. I gave enthusiastic speeches claiming that this new technology would prove as revolutionary as the invention of movable type. But it wasn’t quite there yet. Although the information moved electronically, it moved slowly, on the telephone wire, at 300 baud. Its graphics, displayed on a home TV set, were crude. The feature that users liked best was e-mail, but that failed to arouse our interest because, we thought, e-mail had nothing to do with newspapers.

Other observers have focused on interactivity as the main value created by new technology. My friend Dan Gillmor gave us the concept of “the former audience.” The news business, he has said, is less like a lecture and more like a seminar. That sounds right. And the difference between a seminar and a lecture is that a lecture is better organized. If you have ever had the experience of organizing the transcript of a seminar into a coherent published product, you’ll know what I mean.

Organizing Information

But does information need to be organized for us? Jeff Jarvis, at the City University of New York, is not so sure. He believes that the velocity of information reduces the need for formal composition in the form of articles. Today, he says, news is more of a process than a product. Instead of being limited to articles, consumers can just reach into “the free flow, the never-starting, never-ending stream of digital.” In this view, just keeping the flow going is the main job of journalism, with articles sometimes useful as a byproduct. I like the image of the never-starting, never-ending stream. But I think its users are going to need some help with finding value in it. That help could end up looking very much like “articles.”

I suspect that all of us can agree that the technologies of the information age produce data faster than they produce understanding. Instead of replacing journalism, the Internet is creating a new market need: for synthesis and interpretation of the ever-increasing stream of facts.

In “The Vanishing Newspaper: Saving Journalism in the Information Age,” I compared the digital age to the development of agriculture. When hunting and gathering were the main sources of food, few had the luxury to be very particular about its quality. But farming created such abundance that a greater variety of processing became feasible. We worried less about getting enough to eat and started thinking about vitamin content, fiber, taste, texture, all sorts of variables. And in the United States, we passed a milestone in 1983. In that year, the portion of gross domestic product spent on food manufacturing, i.e., processing, began to exceed the portion spent on farming.

So it is with information. Now that it is abundant, our interest turns to processing. Two kinds are economically important. One organizes information for efficient retrieval by the end user. I call this “ease of use,” and it is facilitated through well-crafted articles, multi-media offerings, and web architecture. The other process is further upstream in the system. All those facts, in the “never-starting, never-ending stream of digital,” need to be integrated into meaningful themes and conclusions. Unorganized facts are not enough.

We need structure to see “the truth about the facts.”

The truth about the facts: That is a nice phrase, but it is not mine, nor is it a redundancy. It comes from the 1947 report of the Commission on Freedom of the Press led by Robert M. Hutchins. With technology making so many more facts available today, it argued six decades ago, finding the truth about them is both more difficult and more important.

If that prescience is impressive, consider Walter Lippmann. He had seen the same need 25 years earlier. In 1922, he published “Public Opinion,” a classic still in print today. In the first chapter, he warned that “direct exposure to the ebb and flow of sensation” is not enough. Here’s how he put it:

For the real environment is altogether too big, too complex, and too fleeting for direct acquaintance. We are not equipped to deal with so much subtlety, so much variety, so many permutations and combinations. And although we have to act in that environment, we have to reconstruct it on a simpler model before we can manage with it.

Lippmann suggested the use of what he called “fictions” to manage data, and offered, as one example, the schematic models of science. “A work of fiction may have almost any degree of fidelity,” he said, “and so long as the degree of fidelity can be taken into account, fiction is not misleading.”

Walter Lippmann was still alive in the second half of the 20th century, when journalists began experimenting with two new ways of making the quest for truth more manageable. Precision journalism borrowed the tools of science. Narrative journalism was based on art. In their early stages, these two approaches seemed to be in conflict. My argument today is that, in the 21st century, we should consider the possibility that we need both.

What they have in common is recognition that raw data require structure to be made coherent. Science creates structure with what Lippmann called schematic models, which come from theory. Art creates structure through the narrative design in storytelling. Is it possible for us to find ways to merge these two strategies and tell stories about the data that are grounded in verifiable theory? Before we think about that possibility, let’s consider a little history.

‘New Journalism’ Meets ‘Precision Journalism’

In 1974, the year that Walter Lippmann died, Everette E. Dennis and William L. Rivers published a book titled “Other Voices.” It catalogued what their subtitle called “The New Journalism in America.” They labeled one of their categories “the new nonfiction.” Today it is known as “narrative journalism” or “creative nonfiction.” In recognition that it has become a mainstream technique, Harvard University’s Nieman Foundation ran a well-attended annual conference on narrative journalism from 2001 to 2009. Attendance peaked at nearly 900 in 2008, before the recession discouraged discretionary travel. The genre first started to appear in the 1960s. Its fiction-like techniques include internal monologue—what a newsworthy person was thinking—and detailed character development and scene building. Early practitioners included Gay Talese and Tom Wolfe and Jimmy Breslin. Some novelists tried it working from the other direction, creating the nonfiction novel. There was Norman Mailer’s “The Executioner’s Song” and Truman Capote’s “In Cold Blood.”

It was also in the 1960s that some of us began to apply social science research methods—including statistical analysis and hypothesis testing—to the practice of news reporting. This genre is often called “precision journalism,” a term coined by Dennis. He and Rivers saw the conflict. The narrative journalists, they said, “are subjective to a degree that disturbs conventional journalists and horrifies precision journalists. In essence, all the other new journalists push reporting toward art. Precision journalists push it toward science.”

One of the salient characteristics of science is that it is conducted in a way that its results can be verified. It means asking a question of nature in a way that you will not be fooled by the answer. I participated in an early example.

In the summer of 1967, Knight Newspapers sent me to help the Detroit Free Press and its overworked staff cover an inner-city race riot. I had just completed a fellowship year at Harvard University’s Nieman Foundation, where I had immersed myself in scientific method, particularly as practiced by social scientists. The social sciences then were on the brink of a revolution. Quantitative methods, once an exotic subset, were being enabled by the rapid drop in the price of computing power and the development of higher-level statistical languages. Harvard taught me to use one of those languages, called DATA-TEXT, written at Harvard for the IBM 7090.

While the riot was in progress, at the Free Press we covered the event with interviews and direct observation. But if you follow Lippmann’s logic, you see that news is about more than events. It is also about patterns and the underlying structure that produces those patterns.

It was the underlying structure that we were worried about in our Detroit post-riot coverage. Detroit had enjoyed a reputation of relatively good race relations. What changed to cause the riot? We needed some hypotheses. A good place for journalists to look for hypotheses to test is in the conventional wisdom. In 1967, there was no shortage of it. Conventional wisdom, as expressed by editorial writers and social scientists, offered several:

  • The riff-raff theory. It held that rioters came from the bottom of the socioeconomic scale and were simply incapable of any other means of getting ahead. (You might have heard echoes of this one in popular discussion of the recent rioting in England.)
  • The non-assimilation theory. A sizeable portion of Detroit’s African-American population consisted of migrants from the rural South. They experienced difficulty in being socialized to the urban environment of the North, with its very different living conditions, and so resorted to rioting.
  • The rising aspirations theory. As one approaches a desired goal, the frustration of not reaching it increases. Relative deprivation is more painful than absolute deprivation, and the rioting was an expression of that pain.

At the Free Press, we hired consultants from the University of Michigan and organized a sample survey of households in the riot area, using personal interviews. The questions were designed to test those conjectures.

We found that residents who were raised in the North were more likely to riot than their neighbors from the South. Three times as likely, in fact. So much for the non-assimilation theory.

We found that education and economic status were not predictors of riot behavior. That falsified the riff-raff theory.

By elimination, we were left with the rising aspirations theory. Relative deprivation was behind the structure that led to the pattern that led to the riot, the event. If we had not gone into that project with specific theories to test, our stories might have been chaotic collections of just loosely related facts.
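
Today, a conjecture like that can be tested in a few lines of code. Below is a minimal sketch in Python using the scipy library; the counts in the table are invented for illustration and are not the 1967 survey figures.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical cross-tabulation (NOT the 1967 Detroit survey data):
# rows = where the respondent was raised, columns = riot participation.
observed = np.array([
    [30, 70],   # raised in the North: [participated, did not participate]
    [10, 90],   # raised in the South: [participated, did not participate]
])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.4f}")

# A small p-value rejects the hypothesis that region of upbringing is
# unrelated to riot participation; the same falsification logic we
# applied to the non-assimilation and riff-raff theories.
```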

The civil rights movement and the anti-war movement of the late 1960s and the 1970s were ideal topics for the application of precision journalism, and it has since become a standard tool for investigative reporters.

Another way to report pattern and structure is through storytelling. That is where the narrative journalists excel. Their creative nonfiction has a more personal voice than conventional news writing. Some believe that it has the potential for making newspapers seem friendlier to their readers.

For decades, as a precision journalist I considered narrative journalists my natural enemies. It didn’t help that the early practitioners sometimes got caught making things up. For example, Gail Sheehy wrote an article for New York magazine in 1973 that described in great detail the sexual and financial escapades of a prostitute in New York who was called “Redpants.” Then the Wall Street Journal revealed that there was no “Redpants.” Sheehy had used a composite of several different prostitutes to provide the dramatic compression needed to give her story the pace and depth of fiction.

In 1980, a Washington Post reporter, Janet Cooke, wrote a heartbreaking narrative piece about a 13-year-old heroin addict named Jimmy. Unfortunately for her, it won the 1981 Pulitzer Prize for Feature Writing, and that led to questions about Jimmy and requests for a follow-up. It turned out that he didn’t exist. Cooke lost her job, and the prize was withdrawn.

The defense of those who make news too much like fiction is that they are conveying “a higher truth.” Indeed, fiction will do that. But the reader deserves to know when fiction is the vehicle for the purported truth.

Narrative journalism, like precision journalism, is a lot of work. To give reporting the rich, creative structure of fiction, the writer must first assemble a very large body of facts, and then choose those that can fit a narrative structure. Jimmy Breslin, my favorite news writer, had a piece in New York magazine in 1968 that I liked to use in my news writing classes—as an excellent example of writing—and in my ethics classes as a poor example of reporting.

Breslin covered a portion of Robert Kennedy’s campaign for the Democratic nomination for president and was with him when he spoke to students in the field house at Kansas State University. Months later, after Kennedy had been killed, Breslin went back to Kansas State as an invited speaker. He walked across the campus with a group of students who were still talking about Kennedy.

His narrative recalled the passion of those Midwestern students and how Kennedy had aroused their latent political activism with his opposition to the war in Vietnam. As he listened to them reminisce about the speech, he spotted a limestone building, its walls charred and its interior gutted by fire.

“What was in that building?” he asked the students.

“The ROTC.”

ROTC, of course, is the Reserve Officers’ Training Corps, a four-year program that allows an undergraduate to earn a military commission at the same time he or she receives the baccalaureate degree. Anti-war sentiment in the 1960s led some universities, including Harvard, to ban it from their campuses.

So the theme of Breslin’s narrative was that anti-war radicalism had reached all the way to conservative Kansas in the middle of the continent, and he drove the point home with the ending, what journalists call “the kicker.” Student protesters burned the ROTC building.

Except they didn’t. I am a graduate of Kansas State University. The ROTC building was then, and is now, at the other end of the campus. The burned-out building was Nichols Gymnasium. It contained a gym, a swimming pool, and the student radio station. The arsonist was an emotionally disturbed undergraduate whose grievance had nothing to do with the war in Vietnam.

I’m not accusing Breslin of making the ROTC part up. Perhaps he was misinformed. The ROTC had stored its rifles in Nichols Gym until 1942 when the Military Science Department got its own building. Maybe somebody mentioned that, and he leaped to the wrong conclusion. The problem with narrative journalism is that it tempts writers to avoid fact checking for fear that the facts might ruin a good story.

You may have noticed by now that all of my examples of errors in narrative journalism are old ones. Maybe its practitioners are getting better.

‘Evidence-based Narrative’

Both genres, narrative journalism and precision journalism, are special forms requiring special skills. If we were to blend the two, what should we call it? I like the term “evidence-based narrative.” It implies good storytelling based on verifiable evidence.

Yes, that would be an esoteric specialty. But I believe that a market for it is developing. The information marketplace is moving us inexorably toward greater and greater specialization.

Since the end of World War II, journalism has been evolving from a mass production model to one of more intimate communication. Traditional media were manufacturing products. They required economies of scale to cost-justify their means of production – a printing press or a broadcast transmitter. And so journalism was a matter of creating a few messages designed to reach many people. But as technology increases the number of channels, the new information economy supports more specialized content—many messages, each reaching a few people. That means, as a public, we have fewer common experiences that build common values.

I used to startle my students by declaring, “I can remember [dramatic pause] when there was no television!” And a time when there was no FM radio. On Sunday nights in the USA when I was 11 years old, 75 percent of the radio audience listened to the same program broadcast nationally on the AM band: the Jack Benny show, 30 minutes of music and comedy. When President Franklin D. Roosevelt was running for his fourth term in 1944, one of his speeches reached 83 percent of those listening to radio. Those are very large audience shares, and their sheer size made them immensely valuable to advertisers. Indeed, the value of a medium for advertisers was judged by what ad salesmen called CPM, which stands for cost per thousand consumers reached.

This calculation gave all audience members equal weight. The transition to more specialized media messages made it feasible to consider the characteristics of the audience, and it allowed advertisers to target their messages to specialized groups. CPM is less important today as a measure of the value of advertising. The quality of the audience is more important than its quantity.
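
The CPM arithmetic itself is trivial; a minimal sketch, with figures invented purely for illustration, makes the measure concrete:

```python
def cpm(ad_cost: float, audience_size: int) -> float:
    """Cost per mille: dollars spent per thousand audience members reached."""
    return ad_cost / (audience_size / 1000)

# Invented figures, for illustration only: a spot costing $50,000
# that reaches 20 million listeners.
print(cpm(50_000, 20_000_000))  # -> 2.5 dollars per thousand listeners
```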

Now extend this trend toward specialization to the present day, and what do you find? Twitter and texting. Many short messages, each aimed by their sources at individuals or small groups that choose to follow them. This is the most minute and specialized portion of what Jarvis called “the never-ending stream.”

An environment that rewards specialization need not limit itself to subject matter specialization. It can also build a specialty based on methodology. Both precision journalism and narrative journalism appeal to a sophisticated audience, one that appreciates the need for information to be structured in a way that focuses attention on the truth.

That is why it is not so wild a dream that evidence-based journalism, incorporating precision with narrative, could fill a need for trustworthy interpretation and selection of the relevant truth from the eternal data stream.

The most successful technology forecasts, as Ithiel de Sola Pool found, have been those that combined study of the technical possibilities with market analysis. The successful entrepreneurs will analyze the market need and then work backward from that, building a system to meet a verified need, not an imagined need. We should have thought of that when we built the Viewtron service.

Information Overload

The need for systems that synthesize and process data into shared knowledge will, I predict, become obvious. Unprocessed data is indistinguishable from noise. As the unending stream of data increases, so will the demand for institutions and better methods to process it.

How will they be formed? Last spring, the agency that regulates the flow of electronic information in the USA, the Federal Communications Commission, released a report on the information needs of communities in a broadband world. It suggested that the needed institutions will be the spontaneous result of market forces. The old and new media will improve each other’s effectiveness, it said.

Crowds can pore through document dumps and reporters can find the source within the agency to describe the significance or reveal which documents were withheld. Citizens can offer Twitter updates from a scene, and reporters can look for patterns and determine which tweets might be self-serving or fraudulent.

It may be. But I don’t think it will be that easy. The patterns will not always be obvious. Greater analytic and narrative skills will be needed. It won’t be often that the two skills, analysis and narrative, can be combined in one reporter, so we’ll need more team reporting and editors capable of recruiting and managing the necessary talent. In other words, the old media will have to change, too.

The case of the Detroit Free Press in 1967 foreshadowed this development. The Pulitzer Prize was awarded to the staff of the newspaper for the totality of its work, the spot coverage, the narrative interpretations, the investigations into the 43 deaths, and the survey. At the time, the Pulitzer committee preferred to recognize individual effort, but staff awards have become more common as well-managed news organizations bring a variety of skills to bear on major stories.

I see it happening recently in some of the nonprofit investigative media that have begun to fill the vacuum left by dying newspapers. ProPublica, based in New York City, teamed with The Seattle Times last December to report on the home mortgage crisis in the USA. ProPublica supplied a computer specialist, Jennifer LaFleur, to do the heavy lifting in the data analysis, and the Times provided a reporter, Sanjay Bhatt. They drew a probability sample of about 400 foreclosure filings in each of three widely separated metropolitan areas: Seattle, Baltimore, and Phoenix. Their jointly written story combined quantitative analysis with human interest reporting, and it showed vividly how the combination of relaxed lending standards and inflated home prices caused the crisis. And it demonstrated that better record keeping by government regulators could have provided some early warning.
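
Drawing a probability sample like theirs is simple to sketch in code. Here is a minimal example; the case identifiers and counts are hypothetical, standing in for one metro area’s foreclosure filings:

```python
import random

# Hypothetical identifiers standing in for one metro area's filings.
filings = [f"case-{i:05d}" for i in range(12_000)]

random.seed(42)                       # fixed seed so the sample is reproducible
sample = random.sample(filings, 400)  # simple random sample of 400 filings

# Because every filing had an equal chance of selection, rates measured
# in the sample generalize, within a margin of error, to all filings.
```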

Institutions like ProPublica have an enormous opportunity. They can stand out so clearly from the noisy confusion of information overload, and their value can be so rare, that everyone will want to pay attention to them. Do you see a pattern here? Once again, we might be able to send a few messages to many people, a reversal of the de-massification of the media. It would be possible once again to pay attention to common concerns.

I am not the first to suggest this. W. Russell Neuman, the American political scientist, made the same point in 1991 when he published “The Future of the Mass Audience.” Newspapers were powerful because their expenses on the production side created a bottleneck that led to natural monopoly. New media, even though production is cheap, could create scarce value by improving quality farther upstream in the process, at the creative end. That is where the new bottleneck will be.

Such a development might save us from further fragmentation of the information marketplace. And that fragmentation, as events in my country demonstrate, can be dangerous.

Social psychologists have long been concerned by what was first known as “the risky shift syndrome.” A graduate student in 1961 discovered that decisions made by groups tend to lead to more dangerous actions than decisions made by individuals. More recently it has been called “group polarization,” recognizing that the shift can also go to the opposite extreme: excessive, fearful inaction. The prolongation of the war in Vietnam has been attributed to this phenomenon.

Today’s information technology, as you know, encourages the formation of like-minded groups. It became easier for the members to find one another when communication shifted from a one-to-many model to a many-to-many system. Social networks encourage polarization by confirming and amplifying extreme views.

So is information technology the cause of the current gridlock in the national government of the United States? We have grounds to speculate. Now that media are “social, global, ubiquitous and cheap,” as Clay Shirky says, we can no longer think of “the public” in the same way. Instead, politicians have to deal with “issue publics,” groups that focus on single issues. Perhaps you have heard of Grover Norquist, the American citizen who has made it his mission to track down lawmakers and persuade them to sign his pledge against tax increases. As of late September, a majority of the House of Representatives, 235 out of 435, and nearly half the Senate, 41 out of 100, had signed that pledge.

Do you see what that does to democracy? By definition, representative democracy is a deliberative process. You cannot run a democracy by referendum, because a referendum presents each issue in isolation. People want conflicting things. We want lower taxes and more services. We want less regulation except where regulation protects our special case. Resolving those conflicts requires deliberation, and that’s what a legislative body is for. Benjamin Franklin expressed it well during the Constitutional Convention of 1787.

“When a broad table is to be made, and the edges of the plank do not fit,” he said, “the artist takes a little from both and makes a good joint.”

Of course, since Franklin’s day, the velocity and volume of information have increased by orders of magnitude. When media are “social, global, ubiquitous and cheap,” single-interest groups can focus with laser intensity on politicians and tie their hands before attempts to smooth the edges even begin. Is this a temporary aberration or a permanent structural barrier that will change the way democracy works?

Only time will tell, but information ought to help democracy, not hurt it. We need new institutions to build new media forms that will let truth stand out from the noisy babble and command attention because they are trusted and comprehended. Narrative journalism combined with precision journalism could do that job. Let’s get started.
