Throughout the evening and early morning hours of November 9th and 10th, 1989, I stood near the Brandenburg Gate and watched, transfixed, as the Berlin Wall came down. To the extent that I was aware of my own sensations, I felt privileged to be there at that climactic moment.
What put me there was a decision by several colleagues at NBC News with whom I had been closely tracking developments in Eastern Europe during the preceding months. Good fortune is often a component of success, but in this case our good fortune had been earned. Our attention had not wavered throughout that remarkable year. Surely, there was guesswork involved in the decision to broadcast Nightly News from Berlin at that particular moment. But it was educated guesswork. So, mixed in with a sense of awe at what I witnessed that night, there was also, I confess, a wave of professional and competitive gratification.
All through that night, I stood at the base of the wall, interviewing Germans in various stages of delirium. Their brandy and champagne bottles seemed unnecessary; freedom was stimulant enough. As I called up to the revelers, and they shouted down to me, there never was a pause in the chip, chip, chip of their hammers and chisels, nor any escape from the concrete dust that billowed off the wall.
At dawn, I made a brief visit to my hotel room to freshen up. As I bent over the bathroom sink and splashed cold water on my face and head, the images of the preceding hours played through my mind. Distracted by these thoughts, I was at first perplexed by the gray, granular liquid circling in the basin. Then, of course, it struck me: I was watching the Berlin Wall go down the drain.
More than 18 years have passed since that November morning. Of course, a great deal has happened; that’s always the case, even during relatively quiet periods in history, and these have hardly been quiet times. The news divisions of NBC, ABC, and CBS have not disappeared, despite much talk about the threats posed by cable television, the Internet, and other alternative media. For about 12 of those years, I continued doing what I had been doing for so long. Then, for reasons I could no longer ignore, I decided it was time to stop.
Throughout the 1990’s and continuing into the new century, I had nursed a growing conviction that something important — perhaps basic — had changed in the world of television news since that night when the world changed. Although I had no way to realize it at the time, I have since come to believe that the broadcast news divisions’ commitment to the news — covering, reporting and explaining it in at least some of its complexity — went down the drain that November morning, along with the bipolar world.
It is inescapable that there never was a truly golden age of television news. For every “NBC White Paper” or “CBS Reports,” there was also a “Person to Person.” How could it have been otherwise? A commercial enterprise makes commercial choices. Those of us who grew up in the early days of television, and whose nascent sense of the world was formed as much by video images as by textbooks, had no quarrel with the equation. We happily watched it all.
Later, as some of us decided to try our own hand at broadcast news, the same equation held. Except for the most naive and deluded among us, we anticipated that there would be trade-offs and bargains that, if not precisely Faustian, were sure to trouble us. In this we were correct, prompting some of us to drop out early on. Others, the vast majority, stayed on and fought on. It seemed a fair enough trade-off: win one, lose one; win one, lose two; even, win one and lose three. For me, the halcyon days were the 1970’s and 80’s, diminished, to be sure, by the stories not covered, but halcyon days nonetheless.
Thereafter, the calculus changed. There were a variety of reasons. I want to reflect on three of them: one based on world events, a second on American corporate culture, and a third on technological advances.
The first, of course, was the end of the cold war. Whether one considered Communism’s collapse to be sudden or glacial, the new reality was breathtaking. Events and phrases that had felt contemporary only yesterday took on the coloration of history: MAD, U-2, Quemoy and Matsu, missile gap, Bay of Pigs, Vietnam, Nicaragua, Salvador, Iran-Contra, and so many more.
These buzz phrases faded into the white noise of 1990’s America. There was a corollary development: for mainstream media, in general, and broadcast news, in particular, a long-standing sense of urgency went into steep decline. It was natural and understandable, as a half-century of menace — the nuclear specter — seemed to have evaporated. In the superpower competition, one side had been routed and replaced by a new paradigm and a single power. The most piquant coinage addressing this at the time was Francis Fukuyama’s “The End of History.” For him, the phrase was metaphorical, but for broadcast news, whether consciously or not, it was literal: contemporary history — with its intimations of world-shaking events — had ended. Now we had been released from such onerous stuff. The world would survive, and we were free to turn our attention elsewhere.
The Rise of ‘Talkability’
I never heard a discussion that explicitly laid out this point of view. But at about this time, I began to hear a new buzz phrase that I eventually came to consider the most obscene word in the relatively short history of broadcast news. The word was “talkability.” It was invoked repeatedly, almost daily, and it referred to any story, person, development, or situation, regardless of its significance, that would elicit great interest and comment among viewers. We had entered the age of the water-cooler metaphor: The idea was to search for stories that, after they aired, would prompt someone standing at the office water cooler to say to colleagues: “Hey, did you happen to see … ?”
Traditional news values, of course, were not pitched overboard. Stories of transcendent significance, with or without talkability, had to be covered, and they were. But the formula had changed. When it was a choice between two stories of roughly equal merit, the winner was invariably the one perceived to have that all-important quality. Increasingly, stories that had no particular merit, but which were drenched in talkability, wound up on television.
Inevitably, the trend began to have a significant effect on the balance between domestic and international news coverage. This was a discussion that I did hear — and participated in — many times over. It was assumed that talkability emanated from the viewer’s natural interests. It was further assumed that those interests were strongest when the story was closest to home. It followed that interest flagged as distances increased.
A fundamentally misguided argument was thus engaged, between those pushing “domestic” news and those advocating “foreign.” What had been a free-for-all between competing priorities — commercial vs. editorial imperatives — was now replaced by another model altogether, that of the archetypal American viewer, whose interests were known, whose imagination had been mapped, whose wishes had been ascertained, and whose gratification was now the North Star of television news.
There was an irony here. As this process unfolded, another, larger process had also been in play. The commercial networks were taken over by huge and vastly successful corporations: General Electric (GE), Viacom, Disney, the News Corporation, Time Warner. More than any other sector of American society, these entities had grasped the global realities upon which their success depended: the interconnection between “domestic” and “foreign,” between “national” and “international,” between “here” and “there.” Simultaneously, the news divisions of these corporations, whose job was to examine the impact of the wider world on the nation, were instead moving in the opposite direction. Parochialism replaced globalism. And with that, network news began its slide toward irrelevance.
In November 1985, scores of NBC colleagues and I were ensconced at the Noga Hilton hotel in Geneva to cover the first Gorbachev-Reagan summit. News teams from the other networks were situated, in similar numbers, at other hotels. The huge contingents were necessitated, in part, by the technical and logistical realities of the time. Although satellite transmissions were commonplace by then, ordering the limited number of transponders available in the mid-1980’s was competitive and expensive. It made economic sense to bring editing equipment into the field, so that transponders would be used only to transmit concise edited stories, rather than the raw footage that would have to be fed in bulk. (Today the situation is reversed. Transponders are plentiful and cheap, while moving gear and people around the world is expensive.)
Another technical factor was also in play. Along with evolving satellite technology, video equipment was becoming lighter, smaller, more portable. This made live broadcasts from the field increasingly feasible. Not surprisingly, what was feasible became irresistible. News extravaganzas, lasting days or even a week, proliferated.
Thus, the battalions: four or five or even six editing systems, along with editors and engineers, would occupy as many hotel rooms (apart from the rooms in which the personnel were housed). Carpenters and technicians would erect sets on location, in order to dramatize the event being covered and to impart a “you are there” patina to the proceedings. Unit managers would hire local personnel — drivers, gofers, guides, translators, facilitators. Anchormen (only men) had assistants. Programs had bookers. News coverage required researchers.
All of this required a second set of logistics: the troops had to be fed. Food — mountains of food — was always available. Early one morning, I arrived at the makeshift newsroom at the Hilton, a hotel suite containing desks, television monitors, typewriters, and Teletype machines (this was 1985, after all). Along one glass wall overlooking Lake Geneva, a groaning board stretched half the length of the room. It was piled high with cheeses, assorted meats, fresh fruit, eggs, sausages, hot and cold cereals, crepes, assorted breads and rolls, jams, several different juices, cocoa, tea and coffee. I happened to be watching as a colleague entered the room, slowly walked the length of the table and examined what was on offer, then turned to another table, picked up a house phone, and ordered room service.
It proved to be an instructive incident when it came back to me much later, just as the corporate culture of GE began to take hold at NBC News. Taken in isolation, it was only an anecdote, but similar instances of extravagance were commonplace and legendary. Everybody had stories of their favorite boondoggle, when no expense was spared. I began to muse on how such scenes must have struck the fresh eyes of not only GE, but also those of the McKinsey management consultants brought in by GE to scrutinize the way things were done at NBC News.
The era of hard-nosed corporate management was upon us, and its momentum was enhanced by an uncomfortable reality: The cost of covering the news, already ample, was multiplied by the lavishness of our lifestyle. Despite our resentment at being reined in, we all knew that to some extent we had brought it on ourselves.
As time passed, the logic of the bottom line pervaded all aspects of the news business. As never before, the news divisions had to demonstrate their commitment to efficiency and their opposition to waste. Productivity, a central and venerable tenet of corporate culture, began to occupy the world of news to an extent it previously had not. And at this juncture, I believe, the new culture of the broadcast news business — so justified and potentially beneficial in its inception — began to drive the process in an unfortunate direction.
Covering News ‘Efficiently’
Productivity is a simple enough concept when applied to any tangible product, whether it’s a light bulb or an airplane engine. The idea is to define what level of quality is necessary for a product, then to design a production system that yields such a product in the minimum time possible, with the least amount of manpower.
The difficulty arises when the product is the news. What is a quality product? More practically, what is the formula for determining whether too much time and manpower have been invested in producing a news story? Ultimately, of course, such questions are unanswerable because of the capricious nature of what happens in the world. A light bulb is produced in a highly organized environment; a news story is often produced in difficult, even chaotic, and time-consuming conditions. Each light bulb is, or should be, identical; all news stories are different, even when they have similarities. Efficiency is a laudable goal in both light bulbs and news stories. But measuring efficiency and productivity is a lot easier in the former than the latter.
The parent companies of the network news divisions addressed this problem in an oblique way. Since news programs are the delivery systems for the product — the news story — it’s a simple task to monitor those programs and measure who contributes most and who least. The process is clearest when one examines the news bureaus, both domestic and foreign, which provide the stories that wind up on television. If, in a given week, or month, or year, the bureau in Moscow produces X stories, and the bureau in Chicago produces 2X stories, it follows that the Chicago bureau is twice as productive.
In reality, it does not follow at all. Aside from the advantages of familiarity, ease of working, and superior logistics that a domestic bureau enjoys, there is another factor that is more revealing and, upon examination, just as obvious. In the new paradigm of the post-cold war world, stories from Russia rarely have the urgency that stories from the menacing, nuclear-armed Soviet Union used to have. With some exceptions, they are not as appealing as stories from the home front. Put another way, they lack talkability.
So the conundrum is total: efficiency demands productivity, productivity depends on the interest of the programs, the programs are driven by what they perceive will be interesting to viewers, viewers make do with what they are offered, and the result — great interest in some things and little interest in others — is driven by decisions that often have nothing to do with the newsworthiness of the stories or the skills of the storytellers. The irony is total, too: a process set in motion by the most successful global corporations in the world, whose lifeblood is the international arena, results in a contraction of the very entities meant to examine and report on that arena and the forces that drive it: the news divisions.
The results have been predictable and dramatic at all the networks. At NBC, for example, there used to be bureaus in Frankfurt, Paris and Rome. When they were closed, the rationale was that these prestigious and luxurious outposts had to yield to grittier, more newsworthy datelines elsewhere in the world. A fair point, given that there used to be just one news bureau for all of Africa, in Johannesburg. Paradoxically, that imbalance was addressed by closing that bureau as well. To date, there is no news bureau anywhere on the continent.
The Digital Factor
Early in the 1990’s, the dawning digital era transformed the way TV news was presented. The visual possibilities seemed, and turned out to be, virtually limitless. The “look” of a program, as well as the stories within a program, became a central consideration in production. The graphics department became a de facto coequal of the newsroom. It wasn’t as if cosmetic considerations had suddenly sprung into existence. Since the 1950’s, the look of television news had always been important for the obvious reason that it was a visual medium. The hallowed prescriptions for a failing program were always the same: change the set, change the desk, change the lighting, change the color scheme.
But the digital age was different. Whereas, in the old days, color schemes might be blue or green, and lighting might be softer or brighter, the new technology offered tools that fundamentally altered the way in which a viewer received information. The opening sequence of the various evening news programs provides a striking illustration.
In ancient black-and-white days, John Cameron Swayze or Douglas Edwards would talk into the camera with few supporting visuals. Later came the static photo over the anchor’s shoulder; if the story was about the president, there’d be a picture of the president. Still later came moving video that served the same informational purpose in a more compelling and visually pleasing way. There were many other such incremental improvements that enabled an increasingly sophisticated presentation.
Then suddenly, digitally, the change became exponential. If, say, the lead story was the war in Bosnia, this is what the viewer might see behind the anchorman: a column of tanks crossing left to right, combined (in a technique called a half-dissolve) with another shot of somber refugees trudging right to left. Often a third, even a fourth shot, would be introduced — an irate politician or statesman shaking a fist; a skyline of blasted buildings. Typically, a few words — “Bloody Day,” for example — would be superimposed over these overlapping images, the equivalent of a headline.
While these editorially driven images were flashed, other effects, aesthetic in nature, were also inserted. A digitally induced pulse might impart a strobe effect to the scene, providing a sense of urgency or ominousness. The video might be given a sepia tone or a black and white treatment, or some other effect, depending on the nature of the story. That in turn invited an emotive response, whether it was rage, satisfaction, nostalgia or patriotism.
An over-arching phrase — “War In the Balkans,” say — might be woven into the visual mix. Finally, there would be an animated border that advertised the program — ABC World News Tonight, CBS Evening News, NBC Nightly News — framing and circling the multi-element tableau. As the anchorman spoke for 20 or 25 seconds about the evening’s lead story, I often monitored my reaction to the combined audio-visual display. Invariably, I noticed that the more I watched the many elements on the screen, the less able I was to follow what the anchorman was saying. Despite my television experience, it struck me that my response probably was broadly representative of how a typical viewer might react. If anything, I thought, my insider knowledge would shield me from distraction and confusion. But the opposite proved to be the case, and I wondered then — as I do today — about the impact of such presentations on viewers trying to concentrate on the details of a complicated story.
Analogous visual tropes appeared in many other situations, not only on the network news programs, but on the cable news outlets as well. It became customary — and remains so — to provide a title for a live televised event, written on a “ribbon” across the bottom of the screen. Often, the equivalent of a subheadline appears on a second ribbon directly below the first. On the cable programs, there is invariably a third ribbon, this one animated, providing headlines about other stories of the day. In addition, the screen might offer the time of day, the temperature, a promo for a program later that day, and a logo — often animated — of the outlet providing the broadcast.
All this raises questions about distraction, confusion and the obstacles to concentrating on the topic at hand. It also raises another question, one that points toward another irony in the evolution of television news: Why would a visual medium willingly relinquish 15 or 20 percent of its canvas — the television screen — in order to present phrases that, in most cases, are self-evident (“School Siege,” “Presidential News Conference”)? I have never found a persuasive answer to this question, although one is often offered: This is how people (specifically, young people) process information today. Everything, according to this argument, needs to be an amalgam of video and audio streams. The goal is to convey something impressionistic, not specific, in order to create a “feel” for the story at hand. The thesis carries the imprimatur of received wisdom. But is it? Is this what we want?
The cable news networks have been catalysts for all these developments, and this has produced another set of ironies. At first blush, could there have been anything more enticing to news junkies than 24-hour TV operations dedicated to their narcotic of choice? When MSNBC and Fox News joined CNN on the air, the possibilities seemed limitless. Finally released from the constraints imposed by 30- or 60-minute programs, news coverage could have the best of two worlds: It could be both broad and deep. But the prevailing culture of broadcasting — composed of its editorial preferences, economic goals, and technical abilities — has produced a different outcome. Two anecdotes serve to illustrate what the reality has proved to be.
The first concerns the lead-up to the 2005 Michael Jackson trial in southern California. I had already left NBC by that time, so I watched the coverage with a fair amount of “civilian” perspective combined with the leftover habits of a television news producer. One afternoon I turned on the TV to find the Jackson entourage just leaving the Neverland Ranch to attend a pretrial hearing in Santa Maria.
An armada of traffic helicopters hovered over the vehicles as they traversed the freeways. All three cable networks covered the trip live, with reporters speculating breathlessly on which of the several freeway routes the entourage might take to reach its destination. A sense of déjà vu nagged at me, evoking a similar scene from June 1994, when the nation had a bird’s eye view of another SUV on a southern California freeway. Back then, a set of shocking questions accompanied our viewing. Was that O.J. himself at the wheel? What was he doing? Where was he going? The Mexican border? What was his state of mind? Did he have a gun? Did he mean to do harm to himself or others?
For all the histrionics, the questions were valid and appropriate to the circumstances. Now, a decade later, the scene looked similar, but none of these questions — or any other questions — applied. Michael Jackson was on his way to a routine legal procedure of a few minutes’ duration, after which he would return home to await his next court date.
Why the blanket reportage of a mundane slice of reality on a weekday afternoon? One reason, of course, is that it was easy. The helicopters were there, the cameras were there, the reporters were there. What was feasible became irresistible. But transcending all such logistical and economic considerations was the most obvious imperative of all: Michael Jackson was the very essence of talkability. So, in a pattern that has become endemic, the cable networks were guided by every value but one: news value.
The second anecdote concerns politics. As the 1990’s progressed, the network news divisions began to question a tradition dating to the advent of television: gavel-to-gavel coverage of the political conventions. There was ample cause for this reconsideration; the political parties, having become media-savvy themselves over the years, had transformed the four-day events into extravaganzas of self-promotion. And there was less and less news to report, since the caucuses and primaries leading up to the conventions had already chosen the likely nominees.
I remember a conversation in which a colleague and I agreed that it made sense to cut back on convention coverage. I mentioned one regret, that viewers would lose access to the flavor of the proceedings — the unknown politicians who get their 90 seconds at the podium, the arcana of platform debate, all the minutiae that sustain political addiction. “Ah,” my colleague responded, “that’s what the cable networks will provide.” I took the point and took comfort in the thought.
I recalled this conversation years later, in 2004, when I tuned in to opening day of a convention and found identical approaches on MSNBC, Fox News, and CNN: panels of political commentators talking virtually nonstop, debating what was or wasn’t significant, and rarely pausing to show any highlights from the convention floor. There was only one way to get more of the actual proceedings: C-SPAN.
The Iraq War Coverage
All these trends coalesced in the television coverage of the Iraq War. On the one hand, every network and cable news outlet deployed platoons of military, diplomatic and Middle East specialists even before the war began. Surely no story since the advent of broadcasting had received such massive and instantaneous analysis. On the other hand, advances in video and satellite technology made live transmission from the battlefield not only plausible, but also easy. That was demonstrated soon after the invasion, along a roadside in Umm Qasr in southern Iraq. A few Iraqi forces holed up in a building were firing on coalition troops, with a British television crew from Sky TV transmitting live from the scene. CNN, Fox News, and MSNBC carried the confrontation — which was militarily inconsequential — live, without commentary or explanation. Forty years after the phrase was coined and imprecisely applied in Vietnam, “the living room war” became a reality in Iraq.
Technological advances had escorted television news across a new threshold. As it happened, the moment passed with scarcely a notice, for two reasons: The confrontation, although it occurred in bright daylight, was televised in the middle of the American night, and only on cable. Also, there was no newsworthy development — no tide-turning battle — and there was no dramatic story arc; after perhaps an hour of desultory shooting, a tank arrived and blasted the Iraqi building in the distance. That was the end of it.
But what about the what-ifs? What if an Iraqi bullet had found its mark, and we had witnessed the death of a coalition soldier as it was happening? Would the live video feed have continued, or would someone have cut it off? Who would that someone have been? Would anyone in a stateside studio have jumped in to offer context or explanation? And what about several hours later, when the network morning news programs went on the air? Would they have shown videotape of what, just hours earlier, had been a live picture of death unfolding? These questions have not yet been engaged, either by those who produce television news or by those who watch it.
For the most part, however, television coverage of the war drew much of its impetus from the newsrooms and executive offices in New York and Washington. It was catalyzed by the hothouse urgency generated by the attacks of 9/11. I offer two examples, whose defining characteristics were that the demands of dramatic storytelling trumped the actual story and that insufficient attention was paid to important facts that would have complicated the storytelling.
The first became apparent in May and June of 2004, as the Coalition Provisional Authority prepared to transfer sovereignty to an Iraqi interim government. Daily briefings at the Pentagon were often aired in their entirety, after which the reporters on-scene would be debriefed by the anchors in the studios. Usually, the Pentagon spokesperson — whether civilian or uniformed — would talk about casualties and note that a spike in violence was to be expected at such a pivotal moment, often referred to as “a turning point.” Later, the reporter would return to that theme, reminding the audience of a supposed connection between the approaching turning point and an increase in casualties.
The problem was that the linkage was demonstrably false. U.S. military deaths illustrate the point. In April 2004, there were 135 American fatalities, a very high number. In May, there were 80; in June, 42; in July, 54; in August, 66. Thereafter, in a pattern that has held for the duration of the war, the fatality statistics rose and fell, rose and fell, then rose and fell again. Why wasn’t this reported? Obviously, a variety of factors were in play: the strain of filling so much air time, the challenge of citing pertinent data in a timely fashion, and so on. But there was also the seductive quality of a neat story line: When a momentous event was at hand, one that the insurgents would surely try to prevent, it seemed inevitable that there would be an increase in violence. Logical, yes, but it wasn’t so.
This dynamic appeared again at another dramatic moment, the elections of January 2005. The TV pictures were inspiring: Iraqis risking their lives by going to their polling places, casting their votes, then proudly displaying their thumbs bathed in purple ink to prove their participation in the election. Naturally and fittingly, the airwaves were filled with stories about the triumph of the democratic process. Very quickly, an editorial pecking order was established. The process itself became the central, overriding story, while the results of the process — the outcome of the voting — became a secondary theme. It was universally assumed that people would vote for their own ethnic group, and that proved to be the case. It was predicted that some Sunnis would refuse to vote, either out of fear or a sense of disenfranchisement that the ballot box could not redress, and this also proved to be the case. As a result, Sunni candidates who represented about 20 percent of the population emerged with even less than that minority share.
These facts were reported, but the irony of the situation was largely ignored. The democratic process had institutionalized, even deepened, the ethnic divisions that by that time were becoming more obvious and lethal by the day. If both sides of this electoral coin had been analyzed in tandem, the story line would have been ambiguous and perplexing, certainly, but closer to the significance of the event. As it happened, the great preponderance of television news coverage focused on just one side of the equation: the irrefutable appeal of people voting freely for the first time in their lives.
Competing With the Internet
In war and in peace, television news has developed a highly structured approach to the way stories are told and, for that matter, in deciding which stories are told. Naturally, economics is an important part of the process, since the goal is to attract and keep viewers. Over the years, increasingly sophisticated polling and research by the networks have served that purpose. But the profit motive, present since the dawn of the television age, hardly explains all that has happened since then.
Technology also figures, along with an evolving philosophy, often unexamined, about the best way technology can be deployed to transmit information, tell stories, and serve the needs of those who depend on television news. The emergence of the Internet has created a new culture, and television news, fearful of becoming an anachronism, has rushed to be part of the process. At first blush, this makes sense, since the computer user and the television watcher have similar experiences: both look at electronic screens and both are exposed to two potent phenomena — instantaneity and limitlessness. In both media, technology makes immediately available a flow of information that is, in practical terms, infinite.
But the differences between the computer and the television are rarely examined. Using a computer is an “active” experience, since the user controls what is flashed on screen, what length of time it remains there, and what comes next. The hyperlink is the sine qua non of computer usage. Watching television news is a fundamentally different experience, a “passive” one, assuming that the average viewer would prefer not to constantly switch among three or more news programs. The viewer cannot shape or control the experience without resorting to the on/off or change-channel button.
The television news viewer is, of course, simultaneously listening, and this further complicates the computer-television analogy. Instant messaging and e-mail have created not only new abbreviations (“lol” and “btw”) but also a new culture of abbreviation. Sentences are replaced by phrases, explanations by allusions. Television news has sought to imitate this tendency by using techniques such as eliminating the active verb. (“A frigid morning in Omaha. Snow plows out before dawn. Deserted streets.”)
Without a television separating storyteller and listener, such idiosyncrasies would never be used. Nobody tells a story that way. By doing so, television discards a central rule of TV storytelling — that the tale be told conversationally, as if the storyteller were sitting in the room with the listener.
Paradoxically, television is trying to remain relevant by appropriating the techniques of the computer, while ignoring its own unique qualities. In so doing, television news is delegitimizing itself. And while tossing out one rule of storytelling, it simultaneously embraces another, also with unfortunate results. That rule is that all stories must have an arc — a beginning, a middle, and an end. It needs to be clear and, if possible, have a touch of inevitability, as great stories often do. Stories that cannot adhere to this formula need not apply; by and large, they would remain untold.
The problem here lies in the difference between the art of literature and the craft of journalism. Describing the requirements of drama, Chekhov once famously observed that a gun seen on the mantelpiece in Act I had better be fired in Act III. No such requirement applies in the real world; the gun sometimes goes off, sometimes not. In its natural and commendable desire to present the news in a comprehensible form, television conflates simplification with clarification. In doing so, it refuses to acknowledge a self-evident truth, that complexity and confusion are sometimes intrinsic to the story being told.
And so, driven by ever-tougher economic imperatives, seduced by the digital marvels at its disposal, motivated by an enshrined notion of what an audience wants to see, and fearful that nuance and ambiguity will drive that audience away, television news is at war with itself: what it tells is too simple, what it shows is too complicated. Television journalists have debated and agonized about these questions for a long time. I recall one newsroom discussion many years ago in which a colleague concluded, to universal agreement and approval, “Look, you can’t look down on the American people.” But that is exactly what has come to pass.
Marc Kusnetz, a former NBC News producer, is a freelance journalist and a consultant to Human Rights First.