Measuring the Effects of Changing the Way Violence Is Reported

Reporting on crime and violence has been a staple of the newspaper diet since before the penny press. In that time, one by-product of this coverage has remained consistent: readers have been presented with a distorted picture of the world. For example, today nearly seven in ten Americans tell pollsters that violent crime in the United States is on the rise; only a quarter say that the United States is making progress in battling crime. Yet violent crime has been decreasing steadily over the past three years. Although some people might get information about violence from personal experience as victims or witnesses, most do not experience crime and violence personally. Instead, much of their information and perceptions about crime come from the news. And recent research confirms that people react to reading and hearing news about crime and violence by fearing their world and remaining ambivalent about the best course of preventive action.

Testing a Newspaper’s Effects on Readers: The Old-Fashioned Way

A survey of American newspaper editors found that 95 percent of the responding papers have conducted readership research to “guide editorial decision-making.” But such research usually is focused on what readers like and read the most. Rarely does it investigate what readers believe or know as a function of the news they are reading.

How news affects people has been studied, but this information is seldom utilized by those whose job it is to report and edit the news. Unfortunately, too, this research seldom identifies specific stories or news outlets. Instead it tends to focus on the relationship between general knowledge, such as people’s ability to name political candidates, and their daily use of a newspaper. But if we fail to study directly the ways in which newspaper readers understand and integrate specific reporting into their own lives, how can reporters and editors know how best to gather and present information, or assess its impact?

Surveys can be used to help newspapers understand the ways in which the presentation of news influences readers’ perceptions of the world in which they live and whether it empowers them or creates inappropriate fear, helplessness and cynicism. By using different tools of discovery, newspapers can find out what their readers do know about particular topics, where their knowledge is lacking, and how their knowledge compares to others who rely more heavily on other news sources.

Having this kind of information at hand can help editors and reporters with decisions about coverage that they must make every day. Such an analysis can identify possible areas for in-depth reporting. For example, it could show that sensationalistic crime stories have a much greater impact on fear and helplessness than such crimes would warrant. It might show that the misrepresentation of crime patterns in news is reflected in erroneous reader beliefs about where or to whom crime happens. Or it might show that the dearth of solution stories is associated with the common belief that in America a high crime rate is inevitable.

Testing a Newspaper’s Effects on Readers: A New Approach

Both content analysis of the papers themselves and appropriate readership studies are necessary to understand how readers interpret crime and violence news. For this reason The Violence Reporting Project uses both. In our readership studies we compare heavy-, light- and non-newspaper readers. After a newspaper changes its reporting of violence, we can go back to measure the effects. We’d expect to find the most significant changes to have occurred among heavy readers. We also filter out the effects of alternative sources of news, especially television. Another important control in our research involves asking about crime and violence information that is known not to have been present in the newspaper. Clearly, this knowledge should be less likely to show change than knowledge the newspaper reported.
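To make the logic of that design concrete, the sketch below (in Python, with invented data and column names of our own choosing, not the project’s actual instruments or results) compares pre- and post-change knowledge scores across reader groups, alongside a “control” score for information the paper never covered.

    # Illustrative sketch only: compares knowledge-score changes across reader groups
    # between a pre-change and post-change survey. All data and column names are hypothetical.
    import pandas as pd

    # Each row is one respondent: their reader group, survey wave, score on items
    # the newspaper actually covered, and score on "control" items it did not cover.
    surveys = pd.DataFrame({
        "group":   ["heavy", "heavy", "light", "light", "non", "non"] * 2,
        "wave":    ["pre"] * 6 + ["post"] * 6,
        "covered_score": [3, 4, 2, 3, 1, 2, 6, 5, 3, 4, 1, 2],
        "control_score": [2, 3, 2, 2, 2, 1, 2, 3, 2, 2, 2, 1],
    })

    # Mean scores by reader group and wave; the expectation described above is that
    # covered_score rises most for heavy readers, while control_score barely moves.
    means = surveys.groupby(["group", "wave"])[["covered_score", "control_score"]].mean()
    change = means.xs("post", level="wave") - means.xs("pre", level="wave")
    print(change)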

In our content analysis, we catalog and examine the kinds of articles that a newspaper publishes as well as what it omits in its coverage of crime and violence. We are also interested in more complex features of individual stories as well as any patterns we find across stories. Does the reporting illuminate possible precipitating reasons for the violence, or does the violence appear random? Are violations of social norms, such as desecration of the dead, involved and, if so, how sensationalized is the reporting of them? How much crime reporting is there relative to other news? Is a particular area of the city, or the whole urban region, portrayed as criminal in an exaggerated and repetitious way? How often are crime patterns, rather than single incidents, the focus of coverage? How often are possible solutions reported in comparison to individual occurrences of crime? Are community-wide prevention policies discussed as well as precautions individuals can take? Is there information about the consequences of the crime?
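One way to picture the coding behind such a content analysis is as a structured record filled out for each story. The Python sketch below is purely illustrative; the field names are our own shorthand for the questions above, not the project’s actual coding instrument.

    # Hypothetical coding record for one crime story, mirroring the questions listed above.
    from dataclasses import dataclass

    @dataclass
    class StoryCoding:
        story_id: str
        crime_type: str                      # e.g. "homicide", "assault", "robbery"
        precipitating_factors_noted: bool    # does the story explain why it happened?
        norm_violation_sensationalized: bool # sensational treatment of norm violations
        neighborhood: str                    # area of the city associated with the crime
        covers_pattern: bool                 # pattern/trend story vs. single incident
        mentions_solutions: bool             # prevention policies or individual precautions
        mentions_consequences: bool          # aftermath for victims, families, community
        notes: str = ""

    # Example record for a single (invented) story.
    story = StoryCoding(
        story_id="metro-03",
        crime_type="homicide",
        precipitating_factors_noted=False,
        norm_violation_sensationalized=True,
        neighborhood="downtown",
        covers_pattern=False,
        mentions_solutions=False,
        mentions_consequences=True,
    )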

Our next step is to compare actual crime and violence statistics with the pattern in which these crimes are being reported. Recent analyses reveal over-reporting of homicide and under-reporting of assaults, domestic violence, burglaries and robberies. For example, an extensive recent analysis of The Los Angeles Times by researchers at UCLA found that certain crimes were more likely to be reported than their relative occurrence would suggest. These crimes involve homicides of women, children and the elderly, as well as cases with multiple victims, with suspects who were strangers to the victims, and those that occurred in wealthier neighborhoods. Homicides of African-Americans, Hispanics, the less educated, and those involving a weapon other than a firearm received less coverage than their relative frequency would lead us to expect.
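The arithmetic behind such a comparison is straightforward: contrast each crime type’s share of coverage with its share of actual incidents. The Python sketch below uses invented numbers solely to illustrate the calculation; it does not reproduce the UCLA findings.

    # Illustrative only: contrasts each crime type's share of coverage with its share
    # of actual incidents, flagging over- and under-reporting. All figures are made up.
    coverage_counts = {"homicide": 120, "assault": 40, "domestic violence": 10,
                       "burglary": 15, "robbery": 25}
    incident_counts = {"homicide": 80, "assault": 900, "domestic violence": 600,
                       "burglary": 1500, "robbery": 700}

    total_stories = sum(coverage_counts.values())
    total_incidents = sum(incident_counts.values())

    for crime in coverage_counts:
        coverage_share = coverage_counts[crime] / total_stories
        incident_share = incident_counts[crime] / total_incidents
        ratio = coverage_share / incident_share
        label = "over-reported" if ratio > 1 else "under-reported"
        print(f"{crime:18s} coverage {coverage_share:5.1%} vs. incidence "
              f"{incident_share:5.1%} -> {ratio:4.1f}x ({label})")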

In our work with newspapers we do two readership surveys, one before any changes in coverage are made and one after, so we can measure their effect. Does covering crime and violence differently result in readers acquiring new beliefs, attitudes and knowledge about these issues, including a deeper understanding of risk factors for violence and possible methods of prevention? This is the kind of information we try to elicit by how we question readers.

The Center for Advanced Social Research, a research unit in the School of Journalism at the University of Missouri, conducts these surveys. Questions we ask probe in some depth what people know and believe about the occurrence of crime and violence, particularly in their own city. We try to learn what readers know about causal influences on violence patterns, what the impact of violence is in their city, and what solutions have been conceptualized or tried. These surveys also measure how fearful people are and what they’ve done to keep themselves safe. Both before and after we do the initial survey, we discuss with editors and reporters what will be asked and how to interpret results. Readers’ responses are evaluated within the context of the attention the newspaper has paid to coverage of various crimes and violence. In the analysis we factor out exposure to other media news and violence-based entertainment.
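“Factoring out” exposure to other media can be approached with a standard regression. The sketch below, using hypothetical variable names and invented data with the statsmodels library, estimates the association between newspaper reading and a knowledge score while holding television news and crime-drama viewing constant; it is an illustration of the idea, not our actual analysis.

    # A minimal sketch of controlling for other media exposure via ordinary least squares.
    # Variable names and data are hypothetical illustrations.
    import pandas as pd
    import statsmodels.formula.api as smf

    respondents = pd.DataFrame({
        "knowledge":   [5, 7, 3, 6, 2, 8, 4, 6, 3, 7],
        "paper_days":  [6, 7, 2, 5, 0, 7, 3, 6, 1, 7],   # days/week reading the paper
        "tv_news_hrs": [2, 1, 3, 2, 4, 1, 3, 2, 4, 1],   # hours/week of TV news
        "crime_drama": [1, 0, 4, 2, 5, 0, 3, 1, 4, 0],   # hours/week of crime dramas
    })

    model = smf.ols("knowledge ~ paper_days + tv_news_hrs + crime_drama",
                    data=respondents).fit()
    # The paper_days coefficient estimates the association between newspaper reading
    # and knowledge with the other media sources held constant.
    print(model.params)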

The second readership survey is conducted about two to four months after a newspaper initiates its editorial changes. This survey will give us indications of whether these changes are achieving the newspaper’s desired results. Our project is so new that we have not yet completed this cycle of surveying with any particular newspaper, but when we do, what we learn about the effects of violence reporting will be shared with any interested newspaper or researcher.

The workshops we’ve done with newspapers, and the conversations we’ve had with editors and reporters since then, have produced some exciting ideas for improving coverage of violence. However, while we think the ideas make sense, because this experiment is the first of its kind we don’t yet have empirical evidence that these ideas, once they are put into practice, will help readers arrive at a more comprehensive and balanced understanding of these topics. But as we gather evidence of their effectiveness, this research can become an enormously valuable tool in explaining to other newspaper editors the value of rethinking their coverage of violence.

Lori Dorfman is Director of the Berkeley Media Studies Group, a project of the Public Health Institute in Berkeley, California. She edited “Reporting on Violence,” a handbook for journalists published by the Berkeley Media Studies Group.

Esther Thorson is Associate Dean for Graduate Studies and Research at the Missouri School of Journalism and is Senior Consultant in the Center for Advanced Social Research, a survey facility housed in the School.