Wednesday, December 13, 2006
The study examined 2004 data from 3,657 hospitals, comparing their performance on quality-of-care measures for treating heart attacks, heart failure and pneumonia with the death rates for the same patients. The quality measures charted such matters as whether patients who had a heart attack received aspirin within 24 hours of being admitted and how soon patients with bacterial pneumonia were given antibiotics.
Patients at hospitals that scored near the top on the quality-of-care measures did do better than those at hospitals near the bottom -- but not dramatically so.
For every 1,000 heart attack patients, there were about five fewer deaths at the better-performing hospitals than at the lower-performing ones, the study found. The figures were similar for patients with heart failure and pneumonia.
The study looked at the rates of compliance with the following evidence-based medicine recommendations:
For heart attack patients: Did they receive aspirin on admission, and did they go home with advice to take aspirin once a day? Did they get beta-blockers on admission, and go home with a prescription for one? Did they get an ACE inhibitor if their heart wasn't pumping quite up to par?
For heart failure patients: Did they have an echocardiogram to assess how well their heart pumps? Did they get an ACE inhibitor if it wasn't pumping so well?
For pneumonia patients: Did they have their oxygen saturation checked? Did they receive a pneumonia vaccine, or have the date of their last one documented? Did they get antibiotics in a timely fashion?
All of the above measures are supposed to improve survival, at least according to studies. And they do, when applied to research populations. However, the improvement is not a large one. For example, in this study of ACE inhibitors in heart failure, the ACE inhibitors improved survival by four percent. That's not a dramatic difference, which is probably why the study failed to find dramatic differences between hospitals that met the standards and those that didn't.
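A rough back-of-the-envelope sketch shows why a modest relative benefit produces such small absolute differences. The baseline mortality rate below is an assumed figure for illustration only, not a number from the study:

```python
# Illustration (assumed numbers): how a ~4% relative survival
# improvement translates into deaths averted per 1,000 patients.

baseline_mortality = 0.10    # ASSUMED 10% in-hospital death rate
relative_reduction = 0.04    # ~4% relative improvement, as cited above

treated_mortality = baseline_mortality * (1 - relative_reduction)
deaths_averted_per_1000 = (baseline_mortality - treated_mortality) * 1000

print(deaths_averted_per_1000)  # 4.0 fewer deaths per 1,000 patients
```

Under that assumed baseline, the compliant hospital averts only about four deaths per thousand patients, which is in the same range as the roughly five-per-thousand difference the study actually found.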
Here's what counts for quality in hospitals: cleanliness and good nursing care. Hospitals don't measure those parameters, though. It's much harder to measure the worth of a nurse than to send someone around to check off documentation points on a chart. You can enact every principle of evidence-based guidelines and it won't do squat for the patient if they are only attended by a nurse's aide with six weeks of training who can't recognize a turn for the worse, while the fully trained nurses are preoccupied with fulfilling the documentation requirements.
We have vastly over-rated the improvements we get from easy pharmacological fixes while simultaneously under-rating the value of basic medical care and judgment. And with the coming of pay for performance, the mismeasure will only get worse.
posted by Sydney on 12/13/2006 09:07:00 PM 6 comments
The article forgot to point out something very important: the actual scores that were in the top and bottom deciles. They are so close that even the bottom ten or twenty percent still have compliance rates of over 90% for most measures.
at 2:41 PM
I didn't actually make my point, which is this:
at 2:44 PM
This reminds me of a story I read some time ago. An American couple were traveling in the UK when the wife had a heart attack. Taken to the emergency room, she was treated, and after she was stable and the worst was over she was moved to a room.
at 7:13 PM
at 2:21 PM
The role of measurement should be to determine what adds value -- tracking that, and ignoring the rest. If good nursing care is more significant to outcome effectiveness than medical methodology, then do what you want, medically; just have good nurses. And if good nursing care can't be quantified, you have to rely on anecdotal evidence. Surely, you're not proposing either?
I agree. Trying an easy fix like this is likely to create more controversial issues without yielding much benefit.