Performance monitoring should take costs to heart
EDITOR—The three papers on the performance of cardiac surgeons in the issue of 21 August show that expertise combined with statistical thinking can lead to rational performance monitoring without bullying overtones, disputed targets, or misconceived "naming and shaming."1-5
Publicly naming individual surgeons as "not meeting" even the 99.99% standard proposed by Keogh et al remains problematic if the underlying problem commonly turns out to be specific to a process or organisation rather than to a surgeon.3 Until an empirical database exists that clearly persuades cardiac surgeons and the public that resolutions from "taking a closer look" under the proposed new monitoring scheme are more likely to be surgeon specific than institutional, politicians and others should not publicly name the surgeons whose results trigger such a closer look. Instead, it should be reported only that a closer look is being taken, with the results fairly and frankly reported at a specified date. The Healthcare Commission should also reserve the right to take a closer look, randomly as well as responsively.
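The 99.99% standard can be read as a statistical control limit: a surgeon's observed deaths are compared with the number expected for his or her case mix, and a closer look is triggered only when the excess is too large to be plausibly attributable to chance. The sketch below illustrates that kind of rule with hypothetical figures; the function name and numbers are assumptions for illustration, not the actual specification of the proposed scheme.

```python
# A minimal sketch, assuming risk adjusted expected mortality and
# independent binomial outcomes. It illustrates the kind of control
# limit behind a 99.99% standard, not Keogh et al's exact method.
from scipy.stats import binom

def closer_look_needed(deaths: int, operations: int,
                       expected_rate: float, level: float = 0.9999) -> bool:
    """True if observed deaths breach the upper control limit.

    expected_rate is the surgeon's risk adjusted expected mortality
    for their case mix; level is the control limit coverage.
    """
    # binom.ppf gives the smallest death count whose cumulative
    # probability reaches `level` under the expected mortality rate.
    upper_limit = binom.ppf(level, operations, expected_rate)
    return deaths > upper_limit

# Hypothetical figures: 400 operations with 2% expected mortality
# (eight expected deaths). Twelve observed deaths sit inside the
# 99.99% limit, so they would not, alone, trigger a closer look.
print(closer_look_needed(deaths=12, operations=400, expected_rate=0.02))
```

Even under so extreme a limit, a surgeon operating on a higher risk case mix than the adjustment captures could breach it for institutional rather than personal reasons, which is why the resolution of each closer look, not the trigger alone, should determine public naming.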
Cost effectiveness was missing from all of the papers.5 How much did the original review of data quality in 10 units cost? How much did the national clinical audit support programme in the NHS Information Authority invest to incorporate and augment the Society of Cardiothoracic Surgeons' database, and how much does the appointment of local data managers cost annually? What are the annual costs of flagging patients for mortality, of analysing data, and of validating results? And why, if a performance monitoring protocol was in place as recommended,5 was the implementation phase so underestimated that the first data trickled into the central cardiac audit database only in October 2003, too late for the production of validated, risk adjusted, surgeon specific results in 2004?
The effectiveness of performance monitoring is more difficult to measure than its costs, but costly performance monitoring should be designed so that its effectiveness can be measured; this is one of 11 key recommendations of the Royal Statistical Society's working party on performance monitoring in the public services.5
Sheila M Bird, senior statistician
MRC Biostatistics Unit, Cambridge CB2 2SR sheila.bird@mrc-bsu.cam.ac.uk
Competing interests: None declared.
References
1. Bridgewater B, Grayson AD, Au J, Hassan R, Dihmis WC, Munsch C, et al. Improving mortality of coronary surgery over first four years of independent practice: retrospective examination of prospectively collected data from 15 surgeons. BMJ 2004;329:421. (21 August.)
2. Zamvar V. Reporting systems for cardiac surgery. BMJ 2004;329:413-4. (21 August.)
3. Keogh B, Spiegelhalter D, Bailey A, Roxburgh J, Magee P, Hilton C. The legacy of Bristol: public disclosure of individual surgeons' results. BMJ 2004;329:450-4. (21 August.)
4. Campbell NC, Murchie P. Treating hypertension with guidelines in general practice. BMJ 2004;329:523-4. (4 September.)
5. Royal Statistical Society Working Party on Performance Monitoring in the Public Services. Performance indicators: good, bad, and ugly. Available at: www.rss.org.uk/archive/performance/index.html