medpundit
Thursday, November 03, 2005

Unfortunately, what looks too good to be true often is, Montori's team writes. The authors cite a 1999 study of a beta blocker in vascular surgery patients at risk for a heart attack. No one on the beta blocker had a non-fatal heart attack, and only two died of heart problems. Of those not on the drug, nine had a non-fatal heart attack and nine died. Montori calls that "an unbelievably large treatment effect" because the study was halted after only 20 patients had had a heart attack or died. Subsequent similar trials of beta blockers that enrolled more patients, and therefore saw more deaths and heart attacks, have not found a benefit, let alone such a dramatic one, Montori says. Yet based on the 1999 study's findings, patients at risk for a heart attack are routinely given beta blockers before surgery.

The authors found that the fewer "events," such as heart attacks, a trial had recorded before it was stopped, the greater the apparent benefit. Researchers must resist looking at their data too early, Montori says, "because you will pick up trends, not truth."

Why do doctors fall for it? For one, we aren't trained very well in statistics, so we aren't likely to sift through these papers with a critical eye. For another, most of these misleading studies were published in "high-impact journals." According to the authors, the majority of the articles came from the five leading lights of medical publishing: the Lancet, The New England Journal of Medicine, The Journal of the American Medical Association, Annals of Internal Medicine, and the British Medical Journal. A busy practicing physician rarely has time to sit down and read the journals carefully, so as a group we tend to take the papers at face value - especially if they've appeared in a prestigious journal.

Shame on us for swallowing these results whole, and shame on the journal editors for not scrutinizing them more closely before publishing them.
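To see concretely why peeking early picks up "trends, not truth," here is a small simulation sketch. It is my own illustration, not code or data from Montori's paper: it assumes a treatment with no real effect at all, a 10% event rate in both arms, and a deliberately naive stopping rule that halts a trial for "benefit" whenever, after at least 20 events, the observed relative risk happens to dip below 0.5.

```python
import random

def simulate_trial(n_per_arm=500, true_rate=0.10, min_events=20, stop_rr=0.5, rng=random):
    """Simulate one two-arm trial in which the treatment truly does nothing.

    Patients are enrolled in pairs (one per arm). After every pair we 'peek'
    at the accumulating data; once at least `min_events` events have occurred,
    the trial is stopped for 'benefit' if the observed relative risk (treated
    vs. control) has drifted below `stop_rr` by chance.
    Returns (observed relative risk, stopped_early).
    """
    treated_events = control_events = 0
    for n in range(1, n_per_arm + 1):
        treated_events += rng.random() < true_rate   # same true risk in both arms
        control_events += rng.random() < true_rate
        if treated_events + control_events >= min_events and control_events > 0:
            rr = (treated_events / n) / (control_events / n)
            if rr < stop_rr:
                return rr, True                      # stopped early "for benefit"
    # Trial ran to completion; equal arm sizes, so RR is just the event ratio.
    return treated_events / max(control_events, 1), False

if __name__ == "__main__":
    random.seed(1)
    results = [simulate_trial() for _ in range(10_000)]
    stopped = [rr for rr, early in results if early]
    finished = [rr for rr, early in results if not early]
    print(f"stopped early for 'benefit': {len(stopped)} of {len(results)} trials")
    if stopped:
        print(f"mean relative risk in early-stopped trials: {sum(stopped) / len(stopped):.2f}")
    print(f"mean relative risk in completed trials:     {sum(finished) / len(finished):.2f}")
```

By construction, the trials that run to completion cluster around a relative risk of 1.0 (no effect), while the handful that stop early report a relative risk below 0.5 - a dramatic "benefit" produced entirely by the stopping rule, not by the treatment.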
posted by Sydney on 11/03/2005 07:29:00 AM