Measuring NHS productivity
How much health for the pound, not how many events for the pound
The October 2004 report from the UK Office for National Statistics (ONS) invites potentially harmful misunderstandings.1 The report concludes that NHS productivity declined by 3-8% (depending on the method of calculation) between 1995 and 2003. "Production of what?" is the key question here. If we ask the wrong question the answer may lead us to the wrong policy conclusion.
The job of the health service is to produce health—to relieve suffering. In the words of the National Academy of Sciences in the United States, "The purpose of the health care system is to reduce continually the burden of illness, injury, and disability, and to improve the health status and function of the people..."2 Ideally, the term productivity, as applied to the NHS, ought to refer to the ratio of that output to inputs (such as labour, capital, and supplies), not just to counts of activities.
Of course the burden of illness, injury, and disability is very hard to measure, and so we use surrogates when we assess healthcare systems; therein lies the hazard. But the difficulty of assessing productivity is no excuse for using misleading shortcuts. By definition, a more productive healthcare system delivers more health for the same inputs than a less productive one. Useful measurements ought to help us understand how well the NHS is doing just that. The measurements offered by the ONS do not; they merely describe its activities.
The ONS calculates productivity as a ratio of outputs to inputs, defining outputs as a weighted average of 16 types of care activity, such as hospital cases, visits to doctors, and ambulance journeys. The weights reflect some view of the relative importance of each output: the ONS output index, for example, weights an inpatient treatment 14 times as heavily as an outpatient treatment. The measurement takes no account, however, of the degree to which those events accomplish their purpose—healing. It does not, for instance, register improvement in the mix of these so-called outputs, such as when innovations in care allow patients to be treated successfully in outpatient settings rather than in hospital. To its credit, the ONS notes carefully that "the output estimates do not capture quality change."1 Its interpreters need to show equal caution.
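To see why this matters, consider a deliberately simplified sketch. The figures below are hypothetical: only the 14:1 weighting of inpatient to outpatient treatment comes from the ONS index, while the patient volumes, the spending figure, and the function names are invented for illustration, and the real index spans 16 activity types rather than two. The sketch shows how a cost-weighted activity index can fall when care shifts from inpatient to outpatient settings, even though the same patients are treated and spending is unchanged.

```python
# Illustrative sketch only: hypothetical volumes and spending, not the ONS's
# actual 16-category calculation. Only the 14:1 inpatient:outpatient weighting
# is taken from the text above.

def activity_index(volumes, weights):
    """Cost-weighted sum of activity counts (a stand-in for an output index)."""
    return sum(volumes[k] * weights[k] for k in volumes)

# Relative weights: an inpatient episode counted 14 times as heavily as an
# outpatient attendance.
weights = {"inpatient": 14.0, "outpatient": 1.0}

# Year 1: 100 patients, all treated as inpatients.
year1 = {"inpatient": 100, "outpatient": 0}

# Year 2: innovation allows 40 of the same 100 patients to be treated
# successfully as outpatients instead.
year2 = {"inpatient": 60, "outpatient": 40}

inputs = 1000.0  # spending held constant in both years (hypothetical units)

for label, volumes in [("year 1", year1), ("year 2", year2)]:
    output = activity_index(volumes, weights)
    print(f"{label}: output index = {output:.0f}, "
          f"measured productivity = {output / inputs:.2f}")

# The output index falls from 1400 to 880, so measured "productivity" drops
# by roughly 37% -- even though the same patients were treated and, arguably,
# treated better. The health produced per pound is unchanged or improved.
```

The point of the sketch is not the particular numbers but the direction of the distortion: any substitution of cheaper, lower-weighted care for dearer, higher-weighted care registers as a loss of output, whatever happens to patients' health.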
Measuring productivity without regard to quality or value is a risky foundation for wise policy. In globally competitive markets, manufacturing and service companies that take that route often find themselves in deep trouble, because their customers know better. Consider the assertion of one UK newspaper columnist, who wrote of the ONS report, "Whether quality should properly be counted in the value for money calculations is open to question."3 Will she feel the same when she buys her next car, refrigerator, or restaurant meal? Probably not.
In fact, objective assessments supported or conducted by the Nuffield Trust,4 the Royal College of Physicians,5 and several university research groups have confirmed reductions in waits and delays, improvements in the reliability and outcomes of care for cancer, heart disease, and probably orthopaedics, and smoother flow through emergency departments in the NHS between 1995 and 2003.6 Moreover, access to primary care in particular has soared over this period, and innovations in non-visit care, especially NHS Direct, make the United Kingdom an international pioneer. Other areas are lagging behind—including smoking rates, obesity, and standardised population death rates7—and need to be tackled.
When the Labour government set out to improve the NHS through its modernisation process, politicians had every reason to believe—although they might not have recognised it—that the form of productivity measured by the ONS would and should fall. They were investing more money, raising inputs for certain, and they were consciously changing the nature, quality, and intent of the units of service that the ONS report counts as "outputs." Appropriate resources and sound improvements in process should, and did, make care far more effective and far more valuable for patients.
The NHS faces major hurdles and is engaged in risky innovations such as foundation trusts; the new, performance-based contract for general practitioners; a massive investment in information technology; and some dabbling in health care imported from offshore organisations. These innovations are far more consequential than changes in the type of productivity reported by the ONS, and they need to be carefully managed, treated as social experiments, adjusted as time passes, and assessed objectively. Proper assessment requires that policy makers rely not on simple, potentially misleading metrics of numerical throughput but instead seek answers to the tougher and far more important question of value for money. The people of the UK should not be asking, "How many events for the pound?" but rather, "How much health for the pound?" At least, that is what they should ask if they want an NHS that can keep them healthy and safe at an affordable price for as long as is feasible.
Donald M Berwick, president
Institute for Healthcare Improvement, 20 University Road, Cambridge MA 02138, USA (dberwick@ihi.org)
Competing interests: None declared.
References
1. National Statistics Online. Public service productivity: health. October 2004. www.statistics.gov.uk/articles/economic_trends/ET613Lee.pdf (accessed 14 Apr 2005).
2. Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington, DC: National Academy Press, 2001.
3. Hall C. Statistics do not try to measure improved quality. Daily Telegraph 2004 October 20: 10.
4. Leatherman S, Sutherland K. The quest for quality in the NHS: a mid-term evaluation of the 10-year quality agenda. London: Stationery Office, 2003.
5. How the NHS manages heart attacks: third public report of the myocardial infarction national audit project. London: Clinical Effectiveness and Evaluation Unit, Royal College of Physicians, 2004.
6. Department of Health. Chief executive's report to the NHS December 2004. www.dh.gov.uk/PublicationsAndStatistics/Publications/PublicationsPolicyAndGuidance/PublicationsPolicyAndGuidanceArticle/fs/en?CONTENT_ID=4097366&chk=AGojLE (accessed 14 Apr 2005).
7. Department of Health. Health survey for England 2003. London: Stationery Office, 2004. www.dh.gov.uk/PublicationsAndStatistics/Publications/PublicationsStatistics/PublicationsStatisticsArticle/fs/en?CONTENT_ID=4098712&chk=F4kphd (accessed 14 Apr 2005).