Inside the BBC’s Verification Hub

A group of soldiers speaking Arabic shovel sand into a pit while a disembodied voice wails. After a few seconds it becomes apparent that the desperate voice is coming from a man buried in the pit; only his head is visible.

The soldiers—a number dressed, incongruously, in sneakers—appear to reply with gloating taunts. But they are mainly concentrating on the job at hand: covering the victim's head in earth. They do their grisly job well; in less than a minute his head is completely buried. The video then ends abruptly—the rest is silence.

One rain-swept morning in April, Trushar Barot, assistant editor at the BBC's User-Generated Content (UGC) Hub in London's rather bleakly monolithic BBC Television Centre, was studying the anonymously posted footage on YouTube. His Twitter feed was buzzing with news of the clip. Jon Williams, the BBC's world news editor, had also raised it at the 9 o'clock news meeting. What everyone wanted to know, on Twitter and in the newsroom, was this: Was the video real or fake? That is the kind of question the Hub is there to investigate.

A Fateful Error

Started in 2005 to sift through the unsolicited contributions previously handled by many different teams, the Hub has grown to a complement of 20 staffers. Initially, the team focused heavily on images, footage and eyewitness accounts e-mailed to the BBC, but in the past few years people have become much more inclined to distribute material themselves through Twitter, YouTube and Facebook. As a result, the number of contributions sent directly to the BBC has declined to about 3,000 a day, and the Hub's task has moved toward semi-conventional newsgathering with a Web 2.0 twist. Staffers now use search terms, see what's trending on Twitter, and look at the images and footage trusted contacts are discussing on their Twitter streams.
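
The Hub's monitoring setup isn't described in technical detail, but the kind of routine the team runs (watch what's trending, then search for the terms witnesses are likely to use) can be sketched in a few lines. The example below is a hypothetical illustration only, polling Twitter's v1.1 REST API for worldwide trends and recent matching tweets; the bearer token and the search term are placeholders, not anything the BBC actually uses.

```python
# Illustrative sketch only: poll Twitter's v1.1 REST API for trending topics
# and for recent tweets matching a search term. BEARER_TOKEN and the example
# query are placeholders, not BBC credentials or workflow.
import requests

BEARER_TOKEN = "YOUR_APP_BEARER_TOKEN"  # placeholder
HEADERS = {"Authorization": f"Bearer {BEARER_TOKEN}"}


def worldwide_trends():
    """Return the names of currently trending topics (WOEID 1 = worldwide)."""
    r = requests.get(
        "https://api.twitter.com/1.1/trends/place.json",
        params={"id": 1},
        headers=HEADERS,
        timeout=10,
    )
    r.raise_for_status()
    return [t["name"] for t in r.json()[0]["trends"]]


def recent_tweets(query, count=20):
    """Return (screen_name, text) pairs for recent tweets matching `query`."""
    r = requests.get(
        "https://api.twitter.com/1.1/search/tweets.json",
        params={"q": query, "result_type": "recent", "count": count},
        headers=HEADERS,
        timeout=10,
    )
    r.raise_for_status()
    return [(s["user"]["screen_name"], s["text"]) for s in r.json()["statuses"]]


if __name__ == "__main__":
    print(worldwide_trends()[:10])
    for name, text in recent_tweets("#Syria video"):  # placeholder search term
        print(name, text)
```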

The golden rule, say Hub veterans, is to get on the phone with whoever has posted the material. Even the process of setting up the conversation can speak volumes about the source's credibility: unless sources are activists living under a dictatorship who must remain anonymous to protect their lives, genuine witnesses to events are usually eager to talk. In any case, anyone who has taken photos or video needs to be contacted to request permission to use the material, since they hold the copyright.

The risk of posting non-authenticated images is high, as the Hub was reminded on Sunday, May 27. As a breaking news story about a massacre in Houla, Syria, unfolded, staff members spotted a powerful photo circulating on Twitter, showing shrouded bodies laid out in rows and apparently sourced from activists in Syria. "The original distributor of the photo on Twitter was tracked down, we spoke to them, and they gave us information about its sourcing," says Chris Hamilton, the BBC's social media editor since 2011. "So the picture was published on the BBC News website, with a disclaimer saying it could not be independently verified."


This photo, taken in Iraq in 2003, was posted on the BBC’s homepage on May 27, 2012 to accompany an article about a massacre in Houla, Syria. It had been distributed by Syrian activists on social media. Photo by Marco Di Lauro/Reportage by Getty Images.

Seeing the BBC News website, Getty photographer Marco Di Lauro almost fell off his chair, he later told The Daily Telegraph—the image supposedly showing the shocking aftermath of the Houla massacre was a photo he'd taken in Iraq in 2003. He posted on Facebook: "Somebody is using my images as a propaganda against the Syrian government to prove the massacre." Meanwhile, alerted by users, the BBC took down the image—90 minutes after it had been posted. But the damage was done: The Daily Telegraph and other publications reported on the error, and the blogosphere went wild over accusations that the BBC was pushing the anti-Syria position of the British government. Interestingly, few readers or commentators gave much weight to the disclaimer posted with the photo, a key element of how many news organizations today handle the fact that there are few independent reporters in countries such as Syria and that activists' accounts and footage often cannot be verified.

"Posting this photo was a mistake, there is no question," says Hamilton. "We should have made more checks, as is normal practice, and the decision to publish should have been delayed, something we are very happy to do in an environment where being right is more important than being first." He adds: "But this was not a systematic error. We have a strong track record of stopping numerous examples of incorrect material making it to air or online."

While frustrating, the intentional "redistribution" of the Iraq photo illustrates that "governments don't have a monopoly on spinning the media," says Hamilton. "There is a lot of potential for activists to be faking and spinning things in a way that puts their cause forward. It is something we are all aware of. But it has to be navigated anew each and every time we look at footage. There are very few things that can give you 100 percent certainty."

Raising Doubts

Authenticating photos and video, in other words, can be a tricky business, even for senior staff at the Hub. During my visit in May, assistant editor Barot recounted how on that rainy April morning he went about vetting the grisly video of the man being buried by people who appear to be Syrian soldiers: By 9:20 a.m. he had e-mailed the video to a colleague, an Arabic-speaking Syrian at BBC Monitoring, which uses language specialists to gather information from media outlets across the world. At 10:12 a.m. the colleague e-mailed back: The soldiers' accents are Alawite, the sect from which Syria's ruling elite and many of its soldiers are drawn. And sneakers are commonly worn in some Syrian units. However, the colleague wondered how the voice of a man whose face is being covered in sand could remain so consistently audible—unless he had been fitted with a microphone.

Barot had noticed this too, plus another cause for suspicion: Why does the video end only a few seconds after the victim's head has been completely buried? Could it be because the clip has to be short enough for him to hold his breath?

His hunch corroborated by this second source, Barot e-mailed colleagues around the BBC within 10 minutes to tell them the Hub had reservations about the clip. It had failed the test.

The dubious video illustrates a point made repeatedly by Barot and his boss Chris Hamilton: The business of verifying and debunking content from the public relies far more on journalistic hunches than snazzy technology. While some call this new specialization in journalism "information forensics," one does not need to be an IT expert or have special equipment to ask and answer the fundamental questions used to judge whether a scene is staged or not.

"People are surprised to find we're not a very high-tech, CSI-type of team," says Barot. He and Hamilton, like the Hub's other members, have conventional journalism backgrounds. Hamilton, for example, has done stints as reporter and editor during his 12 years at the BBC.

Streamlined Future

It's time for the Hub's 10 a.m. news meeting, which has the feel of any morning confabulation of journalists at a media outlet, including a palpable sense of impatience to stop conferring and get on with the day's work.

After setting priorities for the rest of the day, Hamilton finds a cramped office in which to discuss the future of verification.

Is the Hub here to stay? "We're seeing correspondents and producers building up their verification skills, and you've got to work out whether it's something you need specialists for," Hamilton says. But, he adds, "in some form you'll always need them," if only for the sake of efficiency.

Hamilton can, however, foresee a time when the size of the BBC's Hub team might shrink as verification is "industrialized." By that, he means that some procedures are likely to be carried out simultaneously at the click of an icon.
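
Hamilton doesn't say what an "industrialized" workflow would look like in practice. One way to read "carried out simultaneously at the click of an icon" is a tool that fans a single piece of content out to several independent checks at once. The sketch below illustrates that pattern with Python's standard concurrent.futures; the check functions are placeholders standing in for real procedures, not anything the BBC has described.

```python
# Illustrative sketch: fan one piece of user-generated content out to several
# verification checks at once and collect whatever each of them reports.
# The checks are placeholders (a real tool might run a reverse image search,
# inspect metadata, or surface the uploader's contact details).
from concurrent.futures import ThreadPoolExecutor, as_completed


def reverse_image_check(item):
    # Placeholder: would query a reverse image search service.
    return "no earlier copies of this image found online"


def metadata_check(item):
    # Placeholder: would read any embedded metadata for dates and locations.
    return "no metadata available"


def source_contact_check(item):
    # Placeholder: would queue the uploader's details for a journalist to call.
    return "uploader contact details queued for a call"


CHECKS = [reverse_image_check, metadata_check, source_contact_check]


def run_all_checks(item):
    """Run every check in parallel and return {check_name: result}."""
    results = {}
    with ThreadPoolExecutor(max_workers=len(CHECKS)) as pool:
        futures = {pool.submit(check, item): check.__name__ for check in CHECKS}
        for future in as_completed(futures):
            results[futures[future]] = future.result()
    return results


if __name__ == "__main__":
    print(run_all_checks("suspect_photo.jpg"))  # placeholder file name
```

The point of the pattern is organizational rather than computational: one click kicks off every routine procedure at once, leaving the journalist to weigh the results.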

He also expects that technological improvements will make the automated checking of photos more effective. Useful online tools for this include Google's reverse image search and TinEye, which look for images similar to the photo uploaded or pasted into the search box. Barot used TinEye to debunk one of several gory fake images of Osama bin Laden's head that circulated online soon after his death last year. He tracked down the original photo of another corpse, onto whose face bin Laden's features had been grafted using Adobe Photoshop.

This fake photo of the dead Osama bin Laden was debunked by the BBC’s User-Generated Content Hub. Using TinEye, the team revealed that bin Laden’s features had been digitally superimposed on the head of a dead Afghan fighter. Photo by Philip Hollis.
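
TinEye's matching technology is proprietary and the Hub's exact workflow isn't spelled out, but automated "have we seen this picture before?" checks of the kind Barot describes can be approximated with perceptual hashing. The sketch below, using the open-source Pillow and imagehash packages, is an illustration under that assumption; the file names and threshold are invented.

```python
# Illustrative sketch: compare two images by perceptual hash. Near-identical
# images (resized, recompressed, lightly edited copies) produce hashes that
# differ in only a few bits, which is one way automated checks can flag an
# "old" photo resurfacing under a new label. Requires: pip install Pillow imagehash
from PIL import Image
import imagehash


def hamming_distance(path_a, path_b):
    """Return the number of differing bits between the two images' perceptual hashes."""
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    return hash_a - hash_b  # imagehash defines subtraction as Hamming distance


if __name__ == "__main__":
    # Placeholder file names; a distance near 0 suggests the circulating photo
    # is really an older picture, as with the 2003 Iraq image.
    distance = hamming_distance("circulating_photo.jpg", "archive_photo.jpg")
    print(f"hash distance: {distance}")
    if distance <= 8:  # threshold is an arbitrary illustrative choice
        print("images are probably the same picture")
```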

Responding to the tendency for social media to act as a rumor mill for outlandish theories, the Hub steers clear of tweets that ask the public whether something is true—in contrast to some journalists who use Twitter for crowdsourcing. Hamilton justifies this by pointing out that the mere fact the BBC is investigating a rumor "lends credence to the idea that it might be true."

However, there is no question at the Hub about the role journalists should play in verifying online information with their trusted tools and techniques. "UGC and verification are no longer a side operation," says Hamilton. "They have become part of the journalistic toolbox, alongside agency pictures, field reporters, background interviews. It's critical for any big newsroom that wants credibility in storytelling."

David Turner is a freelance journalist and author based in London. He was a correspondent at the Financial Times for 10 years.