Science Journalism
Those who report on science have never been better prepared to do so, according to Los Angeles Times science and technology writer Robert Lee Hotz, whose insights open our section on science journalism. But as Hotz also observes, the challenges these reporters confront have never been greater: Newsroom cutbacks mean the reporters “are stretched to cover increasingly complex science stories ….” And their task is made harder by the dearth of impartial sources, forcing them “to look as hard at the scientists as we look at the science itself.” – Melissa Ludtke, Editor
Once upon a time science writing was simple: A reporter would read published studies in the scientific literature and write about the latest wonder of research or miracle of medicine.
Things have gotten more complicated since those early days of science journalism. The spread of pollution, the Vietnam War, the Chernobyl meltdown, the Challenger explosion, and the emergence of AIDS and antibiotic-resistant bacteria have all revealed a darker, more vulnerable side of science. This is not to say science has gone bad: Our lives have been extended through medical advances and improvements in diet and made more convenient with personal computers and inventions so ubiquitous we take them for granted. Science has become a complex story that can no longer be portrayed as an isolated or idealistic pursuit. What happens in science affects us all and is influenced—even shaped—by money, special interests, and politics. In short, we need to report science as part of the real world.
Given that reality, we teach our graduate students at the Knight Center for Science and Medical Journalism at Boston University to report science in a more interwoven way than it was covered in the old days. Increasingly our discussions focus on context—fleshing out the scientific, economic, and social aspects of issues to illuminate their relevance and meaning.
A few years back, students in science journalism courses would be asked to find newsworthy journal articles and “translate” them for the public. Nowadays, we no longer do this exercise since the most skillful journalists act as analysts, not translators. That is not to say that reporters shouldn’t follow the literature—with extensive science backgrounds, most of our students already do. But rather than focus their work on a single study, our students use such reports as a point of departure for interviews and other research to reveal the broader currents in the field.
In doing so, they investigate the work on several levels. First, they flesh out the intent of the study—for example, whether it demonstrates correlation or causation, a straightforward distinction that reporters sometimes miss. They determine whether a study’s conclusions follow logically from the methods. (You’d be surprised how often they don’t.) They challenge the statistics: When a study reports a 50 percent increase in brain cancer among laboratory rats exposed to a certain chemical, the results might sound alarming until the reporter asks about the sample size. If the researcher replies “three,” the finding is less significant.
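To see why the reporter’s question matters, consider a minimal sketch, written in Python with hypothetical tumor counts and the function name of my choosing: running Fisher’s exact test shows that the same 50 percent relative increase that means essentially nothing with three rats per group becomes an unmistakable finding with a thousand.

```python
# A back-of-the-envelope significance check a reporter might ask about.
# Counts are hypothetical; requires scipy (pip install scipy).
from scipy.stats import fisher_exact

def report_increase(tumors_exposed, n_exposed, tumors_control, n_control):
    """Compare tumor rates in exposed vs. control groups with Fisher's exact test."""
    table = [[tumors_exposed, n_exposed - tumors_exposed],
             [tumors_control, n_control - tumors_control]]
    _, p = fisher_exact(table)  # two-sided p-value by default
    increase = (tumors_exposed / n_exposed) / (tumors_control / n_control) - 1
    print(f"n = {n_exposed} vs {n_control}: {increase:.0%} increase, p = {p:.3g}")

# Three rats per group: a "50 percent increase" indistinguishable from chance.
report_increase(3, 3, 2, 3)             # -> 50% increase, p = 1
# The same relative increase across a thousand rats per group is real news.
report_increase(300, 1000, 200, 1000)   # -> 50% increase, p well below 0.001
```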
They also ask contextual questions about how this particular study compares with others in the field. What similar studies have come before? How is this one different? And perhaps most importantly: How does this study add to or contradict the existing body of scientific opinion? Such questions situate the work and help separate genuine news from institutional or media spin.
Nowhere is context-setting more important than in medical and nutrition reporting. Readers infer advice from our articles, whether or not we intend it. Plagued by fad diets and simplistic solutions, consumers are often confused by successive articles suggesting conflicting advice. For this reason, it’s especially important to spell out the difference between a definitive study and a work in progress and to compare the current work to the broad consensus of scientific opinion.
Two questions are particularly helpful:
- Are the studies so powerful that readers should change their medication, diet or behavior?
- What would be the effect of changing those behaviors versus keeping them as they are?
The perils of ignoring such questions became apparent in a cover story last summer in The New York Times Magazine. This bold and contrarian article alleged that the conventional food pyramid we’ve come to rely on has actually caused the national obesity epidemic, and as a solution it resurrected the largely discredited Atkins high-protein diet. The article presented an intriguing theoretical case but left out the evidence that readers need most—definitive studies showing the Atkins diet to be safe and effective. The article did indicate that a few preliminary studies suggested the Atkins diet caused weight loss, but none of those studies had been published or peer-reviewed. Nor did the article include crucial information about the people who had taken part in those studies—how many participated, under what conditions, whether they had particular health problems or other characteristics, and whether the results could be generalized to other people. Yet the argument came through in such an unequivocal tone that some readers later reported switching to a diet that most nutritionists would consider unhealthy.
Such contrarian articles are popular among editors—they’re surprising and edgy. But they disserve the public if they substitute one simplistic explanation for another. The best such stories invite readers to cast aside preconceptions and see an issue with more subtlety and depth.
It’s not always reporters who lead readers astray. In the increasingly privatized world of corporate and university research, the release of information may have more to do with money than scientific relevance. Last January, for example, PPL Therapeutics, the British company that cloned Dolly the sheep, issued a press release announcing the birth of five cloned pigs whose organs lacked a gene that triggers rejection. The development would represent an important advance in the effort to transplant pig organs into humans and sounded like a great story.
It later turned out that the company had announced the results before submitting them to a peer-reviewed journal, a key filter of scientific credibility. Furthermore, an American company had actually beaten PPL to the discovery; it was holding its announcement pending publication in the peer-reviewed journal Science, due out just two days later. It also was revealed that Dolly, the cloned sheep, suffered from arthritis.
Normally such developments would cast doubt on PPL’s cloning technology and perhaps send the company’s stock into a decline. Yet the cleverly timed press release short-circuited the real story. After reporters wrote about the cloned pig results, PPL stock shot up 46 percent. “The company’s spin doctors may have raised hackles in the scientific community but they undoubtedly caught the attention of financiers,” according to the Financial Times of London. Those journalists who reported the pig cloning as an isolated breakthrough became unwitting accessories to the company’s spin.
After years of teaching about context in science writing, my co-director, Ellen Ruppel Shell, and I have modified some of our most cherished journalistic beliefs. These include:
- Balance: Traditional practice teaches us to provide balance by giving both sides their due. The practice might work in stories involving our political system, but rarely on the science beat. Many science issues have more than two sides; others cannot be posited as equal and opposite sides of an argument. On the issue of global warming, for example, should we give as much weight to the handful of naysayers known to be supported by the fossil fuel industry as we give to the more than 2,000 climatologists from 120 countries represented on the Intergovernmental Panel on Climate Change?
The answer is not merely to present these opinions, but to weigh them. The majority opinion might not always be right, but it is important to state where the consensus of scientific opinion lies and to reveal the sources of support behind the various opinion-makers. This contextual reporting avoids the “he says, she says” dilemma of traditional reporting and gives readers a true sense of balance by providing depth.
- Uncertainty: Most editors shy away from uncertainty, worried that it leads to vague, unfocused stories. We encourage students to pursue it. Areas of uncertainty represent the cutting edge of science and provide insights into scientific debate. Part of what makes the global warming debate so compelling, for example, is the question of what society should do given the uncertainty about the dimensions of the problem.
- Complexity: “Boil it down” is the advice of most editors, and we agree that clarity is essential. Yet to ignore complexity is to present only a partial and, at times, misleading story. Such was the case last winter, when scientists reported that certain parts of the Antarctic ice sheet were thickening. As journalist Keay Davidson of the San Francisco Chronicle points out, some newspapers simplistically editorialized that the findings cast doubt on the theory of global warming. Actually, the findings shed light on the incredibly complex movement of polar ice sheets, including the likelihood that global warming will produce unstable weather patterns.
The requirement to report science in context creates a lot of extra work for our students, but it has helped them write in a probing and sophisticated way. Some years ago, one of our graduate students was working on a story about PCB pollution in the harbor of New Bedford, Massachusetts. Local people had tried for years to get the EPA to pay attention to the problem. Their efforts were finally rewarded with an indictment against the polluter and a cleanup of the harbor. No sooner had the cleanup begun than they bitterly protested the incinerator the EPA was using to destroy the harmful chemicals.
Fleshing out the science, our student found that the kind of high-temperature incinerator the EPA installed did in fact destroy the PCBs. Still, the protests continued. Not wanting to write a typical “he said, she said” article, this student dug deeper for meaning in the city’s economic and social history. He learned about New Bedford’s prosperous whaling days and its subsequent downward spiral. He realized that the present-day Portuguese fishing community had seen one promise after another of prosperity and urban renewal fade away. In their eyes the EPA was no different from the factory owners—just one more group of outside experts who claimed to know what was best for the town.
The story wasn’t only about whether fumes from the incinerator were poisoning the neighbors. Underlying these fears was a trail of broken promises and betrayal. It was the story of a local community that had learned to trust no one, not even the agency that was working to improve their lives.
The story ran as the cover article of a regional Sunday magazine. The arduous process of piecing together its meaning illustrated a valuable lesson: The relentless search for context leads us closer to the truth.
Douglas Starr is co-director of the Knight Center for Science and Medical Journalism at Boston University.