Commentary

October 05, 2006

Voluntary Accountability Doesn't Work


Late last week, the Toronto Star asked a very interesting question: Why can’t Ontarians compare or assess the quality of care inside their province’s hospitals? Indeed, while The Fraser Institute’s recently released Hospital Report Card: Ontario 2006 provided a direct comparison of the care inside Ontario’s acute care hospitals, only 43 of the 136 hospitals included allowed their names to be published alongside their results. The other 93 decided that they would rather not be held accountable for their performance.

While many critics claim the report is “useless” for patients and the public because only about one-third of the province’s hospitals are named, others, like the Toronto Star, have latched onto the more important issue at hand: although The Fraser Institute’s report compares the quality of care in all of Ontario’s hospitals, a first for Canada, almost two-thirds of those hospitals did not want to be accountable for their performance.

Claims that the report is based on poor methodology can be quickly dismissed. The report employs a methodology for reporting on hospital performance developed by the Agency for Healthcare Research and Quality (AHRQ), part of the United States Department of Health and Human Services. The same methodology is currently used to measure hospital performance in more than a dozen US states, including New York, Texas, Florida, Oregon, and Colorado, and was recently used in a report on Manitoba’s hospitals by the Manitoba Centre for Health Policy. The methodology The Fraser Institute used for reporting on Ontario’s hospitals is valid and of high quality.

So, if a hospital’s anonymity cannot be explained by concerns over the caliber of the report’s methodology, what can explain it? What was the great concern that kept so many hospitals from agreeing to be named?

The report’s findings shed some light on the answer. The Fraser Institute’s report card on Ontario’s hospitals included a summary measure of death rates, known as the Hospital Mortality Index, or HMI. The HMI was created to allow an examination of a hospital’s overall performance, relative to other hospitals in Ontario, across nine measures of mortality. Looking at the average rankings for Ontario’s hospitals from 2002/03 to 2004/05, one point becomes quite clear: hospitals with poor performance were less likely to be named.

Specifically, six of the 20 top-ranked hospitals were not identified in the report. By contrast, only one of the 20 bottom-ranked hospitals was named, meaning that 19 of the bottom 20 chose not to be held accountable for their performance on these measures.
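For readers curious how a summary index of this kind can be assembled, the short sketch below shows one way to average a hospital’s rank across several mortality measures. It is an illustration only: the hospital names and rates are hypothetical, and the actual HMI described in the report may adjust or weight its nine measures differently.

# A hypothetical average-rank calculation across mortality measures.
# Lower mortality is better, so rank 1 goes to the lowest rate.
rates = {
    "Hospital A": [2.1, 4.0, 1.5],  # made-up risk-adjusted mortality rates
    "Hospital B": [3.4, 3.2, 1.9],  # (the report uses nine measures;
    "Hospital C": [2.8, 5.1, 1.2],  #  three are shown here for brevity)
}

num_measures = len(next(iter(rates.values())))
avg_rank = {name: 0.0 for name in rates}

for m in range(num_measures):
    ordered = sorted(rates, key=lambda h: rates[h][m])  # best rate first
    for rank, name in enumerate(ordered, start=1):
        avg_rank[name] += rank / num_measures

# Print hospitals from best to worst average rank.
for name, score in sorted(avg_rank.items(), key=lambda kv: kv[1]):
    print(f"{name}: average rank {score:.2f}")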

In Canada, individuals can research any problems they have with their automobiles or home stereos using information willingly supplied by consumers, manufacturers, and industry experts. Yet when it comes to health care, consumers have been left with remarkably little information about where the best services are available, and many hospitals (particularly those with low HMI rankings) seem to want to keep it this way.

In the age of information, a time when new Canadian legislation has been written to increase the public’s access to information, are residents of Ontario willing to accept this? Do we think it is acceptable to know which compact car on the market is the safest in a collision, but not to know whether we are more likely to be harmed or to die at a local hospital than at another one up the road?

This should be troubling for Ontarians. The majority of Ontario’s acute care hospitals, funded through tax dollars and delivering care to residents through a government program, would rather the public not have access to the information needed to decide whether to go to hospital X, Y, or Z.

At the same time, Ontarians should applaud the 43 hospitals that agreed to be identified in The Fraser Institute’s report for their efforts to empower patients with information about the health care they receive, and for their ongoing commitment to quality improvement through accountability and transparency.

And public reporting does indeed make hospital care better for all. Outside Canada, reports like the one published by The Fraser Institute have had a number of measurable impacts on performance and the quality of patient care. A notable example occurred in New York State, where a 41 per cent decrease in the death rate among patients undergoing bypass surgery followed the publication of performance data for that surgery. Pennsylvania and New Jersey experienced a similar overall trend following the publication of their report cards.

As the editor of Maclean’s magazine commented on the decision to continue publishing its annual University Rankings issue despite a boycott by 22 of 47 universities: “We are hard-pressed to see how increased public knowledge…can come from less public information…That’s why we will be publishing our 16th annual University Rankings Issue…”

It is for the same reason that The Fraser Institute will continue to publish its Hospital Report Card in Ontario and in other Canadian provinces. We believe the report will contribute to the improvement of Ontario’s acute care hospitals by providing detailed and objective performance measurements directly to patients and to the general public. Whether more hospitals agree to be identified will depend on the public’s interest in holding its hospitals accountable.
