Misc: June 2009 Archives


June 19, 2009

For some reason, I'm a big fan of radical summarization. In that vein, the best example I've seen is ShrinkLits: Seventy of the World's Towering Classics Cut Down to Size. Here, for instance, is Beowulf.

June 7, 2009

Check out this NYT article about a suspicious medical paper:
The disputed journal article was written by a former Army orthopedic surgeon, Dr. Timothy R. Kuklo, who is now a medical professor at Washington University in St. Louis. Dr. Kuklo, the investigation found, forged the signatures of Dr. Andersen and other Army doctors on his study and never showed it to them before it was published.


The Walter Reed episode also shows how medical journals may fail to conduct adequate due diligence on the studies they publish - information that other doctors rely on for guidance. As happened in the Kuklo case, for example, they often deal only with a study's principal author, rather than all the credited contributors. In his study, Dr. Kuklo, who has not responded to repeated interview requests, reported that a bone-growth product sold by Medtronic, called Infuse, performed "strikingly" better than the traditional bone-grafting technique used to heal soldiers' shattered shin bones. Other Walter Reed doctors told an Army investigator that claim was overblown.

It's not clear to me what standards the NYT thinks journals ought to be following. I've published a number of joint papers and I can't think of any case where the conference attempted to independently contact my co-authors, except incidentally (e.g., when notifying all of us of acceptance or rejection). Even in those cases, the conferences just relied on whatever contact information the original submitter provided. In order to verify author identities in the face of intentional deception, you can't rely on the submitted information: the submitter could just provide contact information that was forwarded to him and then pretend to be the other authors. The journal/conference would need to start with each author's name and then independently contact their (alleged) institutions to verify that they really were authors on the paper.

This seems like a pretty heavyweight process to use to detect the unusual case of fake authorship. Moreover, if your theory is that you can't trust your (alleged) authors, you've got much bigger problems than just whether the author list is correct. As I've observed before, journals/conferences don't independently check people's data; occasionally they'll check their analysis, but that's only really useful for detecting mistakes, not deception, since it's not that hard to make your data consistent if you're just making it up.

Similarly, it's not easy to detect undisclosed conflicts of interest:

Medtronic financed some of Dr. Kuklo's research and travel while he was at Walter Reed and hired him as a consultant in August 2006 when he took his current academic post. But Dr. Kuklo did not disclose his Medtronic relationship in the journal article, which was published in August 2008.

I can't see how you would plausibly expect journals/conferences to discover conflicts of interest when the authors don't disclose them. It's not like USENIX or IEEE employs an extensive staff of private investigators to run background checks on people.

Dr. Andersen, curious about what Dr. Kuklo had actually submitted, asked Dr. Heckman for copies of those reviews. But the editor turned him down, even though Dr. Andersen was supposedly one of the study's authors. In a recent interview, Dr. Heckman said that his journal, like many others, considered such reviews confidential and shared them only with a study's lead author.

This, however, strikes me as pretty odd. If you're an author on a paper, why wouldn't you be allowed to see the reviews?