Biomed Middle East

Change Promised for Hospital Quality Measures

Many of the metrics currently used to assess quality of hospital care fail to provide true and valuable accountability, according to leaders of a key organization that helped develop them.

Quality measures now in place include some that "only give us a false sense of accomplishment and reward 'gaming'" or otherwise fail to meet basic criteria for meaningful assessments of patient care, Mark Chassin, MD, MPP, MPH, president of the Joint Commission, and colleagues wrote in the Aug. 12 issue of the New England Journal of Medicine.

The group promised that the quality-measurement system would be revamped, at least at the Joint Commission, to eliminate metrics that inadvertently promote bad medicine or fail to reflect desired outcomes.

It was the Joint Commission that, in 1998, launched the first national effort to collect and publish data on the performance of individual hospitals. It now has a slate of 57 inpatient process and outcome measures, with 31 of these publicly reported.

But some of these measures miss their intended targets, Chassin and colleagues — who included two Joint Commission staff members along with a prominent academic researcher in hospital medicine, Robert Wachter, MD, of the University of California San Francisco — acknowledged in a “Sounding Board” essay. (They also emphasized that they were stating their own opinions, not official policy of the Joint Commission.)

The authors outlined the following four criteria that all measures of hospital care processes should meet:

• There is a strong evidence base showing that the care process leads to improved outcomes.
• The measure accurately captures whether the evidence-based care process has, in fact, been provided.
• The measure addresses a process that has few intervening care processes that must occur before the improved outcome is realized.
• Implementing the measure has little or no chance of inducing unintended adverse consequences.

Chassin and colleagues conceded that some existing measures do not meet all these standards. For example, they noted that physicians simply check a box on a form to indicate that comprehensive discharge planning or smoking-cessation counseling was provided, but this does not indicate whether the planning or counseling was effective enough to achieve the desired goals.

“We know that for patients with heart failure, comprehensive education at discharge and coordination of care after discharge lead to improvements in functional outcomes, reductions in emergency department visits, and fewer hospitalizations, but our current measure is incapable of judging the quality of the process,” according to Chassin and colleagues.

“We were, therefore, not surprised when researchers recently found no relationship between hospital performance on the discharge-instruction measure for heart failure and readmission rates,” they wrote.

They also noted that another current measure of heart-failure care quality — assessment of left ventricular function — is not very useful because too many other unmeasured factors also have to be present for it to make a difference.

“Although all patients with heart failure should have their ventricular function measured at some point, many other correctly performed clinical processes must occur after the test has been performed for the patient to have an improved outcome,” Chassin and colleagues explained.

They also admitted that implementing at least one quality measure had a wholly unintended and undesirable effect on clinical practice.

“Some evidence suggests that administering the first dose of an antibiotic to a patient with community-acquired pneumonia within the first several hours after the patient’s arrival at the hospital improves outcomes,” they wrote.

“However, the initial Joint Commission and CMS measure of that process (first dose of antibiotic within four hours [later relaxed to six hours] after arrival at the hospital) undoubtedly led to the inappropriate administration of antibiotics to patients who did not truly have pneumonia.”

But these failures are the exception, they argued. Of 28 Joint Commission “core” measures also used by CMS, “we believe that 22 meet all four criteria” for useful accountability metrics, the authors indicated.

They also noted that, according to 2009 data, 85.9% of the 3,123 hospitals providing data to the Joint Commission had more than 90% compliance with these measures.

Chassin and colleagues indicated that the Joint Commission was adopting the four criteria as the basis for its quality measurements.

“Fortunately, as the science has advanced, we now have a surfeit of measures that meet all four accountability criteria with which to populate accreditation, public reporting, and pay-for-performance programs,” they argued.

They also promised to retire measures that fail to meet these criteria.

“A vital part of this program, largely absent today, will be a formal process of assessing experience with the measures and using that information to improve the development of measures and decisions regarding deployment,” they wrote.

source: New England Journal of Medicine
