In The New York Times today, Julie Bosman examines how print reporters are approaching articles in scientific journals following the revelations that Hwang falsified data purporting to show he had created stem cells from a cloned human embryo. For the most part, Bosman finds, the scandal has led not to less reliance upon such journal articles but to more skepticism in reporting on them, which itself presents a conundrum for reporters. Rob Stein, a science reporter for the Washington Post, told Bosman: "… we're still in sort of the same situation that the journal editors are, which is that if someone wants to completely fabricate data, it's hard to figure that out."
Correspondent Elizabeth Kaledin, who covers medical issues for CBS News, told me in an e-mail that, like most medical and science reporters, she relies "heavily" on research from scientific journals. "Most big, breaking discoveries and studies are first published in the journals precisely because it gives them a certain level of professional credibility," she said. When producing stories based on journal articles, which are generally provided to reporters in advance of publication, Kaledin said she goes to "big names in the field who can put the article in perspective and tell us … is this really a big deal? Is this really news? What do the results mean? And perhaps most important, do you think this is a well-designed study? Are the results credible and statistically significant enough to warrant reporting?"
Following the revelations about Hwang's work, she said that she would "definitely read journal articles with more skepticism and will ask many, many more questions from impartial observers about the integrity of the research." The problem for reporters, as Bosman notes, is that "there are limits to the vetting that science reporters, who are generally not scientists themselves, can do." Kaledin echoed this concern:
"Let me say how hard it will be to separate fact from fiction here. I am a reporter … not a stem cell scientist … or a cardiologist … or an oncologist. No one can be a master of all scientific specialties. We rely on the journals and their panels of experts to weed out the diamonds from the dust. So the fact that a peer-reviewed journal can't discern what is fraudulent data definitely makes me feel vulnerable."

Asked whether time constraints restrict the ability of reporters to adequately vet data, Kaledin argued that "time is not the issue. Even if I had five years it would be difficult for me to go to Hwang Woo Suk and say, 'Hey … your data looks fishy to me.' Especially in an incredibly specialized -- I might even say secretive -- field like stem cell research, we are at the mercy of other scientists looking at the data. You're basically talking about my being able to re-create his experiments before reporting on them as credible. Maybe the journal Science didn't have enough time to do that."
Journal editors are, of course, not infallible. While publication in a well-respected journal carries the benefit of peer review, that alone doesn't make a study iron-clad truth. "Publication of a paper only means that, in the view of the referees who green-light it, it is interesting and not obviously false. In other words, all of the results in these journals are tentative," Nicholas Wade, a science reporter for The New York Times, told Bosman. Knowing that such flaws are inherent to the process, the best reporters can do is call attention to the areas where skepticism is warranted. "I am a generalist," said Kaledin, "a communicator of often complex, dense medical and scientific issues to a lay audience who wants to understand. If we had a former stem cell scientist on the staff here who switched careers into broadcasting, that might have been helpful … but what newsroom in America has that luxury? I think the answer is to pile on an extra layer of skepticism … and ask a lot more questions."