Gaming New Zealand’s Emergency Department Target: How and Why Did It Vary Over Time and Between Organisations?

Document Type: Original Article

Authors

1 Faculty of Medical and Health Sciences, The University of Auckland, Auckland, New Zealand

2 Auckland District Health Board; Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand

3 Auckland District Health Board, Auckland, New Zealand

4 School of Population Health, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand

Abstract

Background
Gaming is a potentially dysfunctional consequence of performance measurement and management systems in the health sector and more generally. In 2009, the New Zealand government initiated a Shorter Stays in Emergency Department (SSED) target in which 95% of patients would be admitted, discharged or transferred from an emergency department (ED) within 6 hours. The implementation of similar targets in England led to well-documented practices of gaming. Our research into ED target implementation sought to answer how and why gaming varies over time and between organisations.
 
Methods
We developed a mixed-methods approach. Four organisations were selected as case study sites. ED length-of-stay (ED LOS) data were collected from all sites over a 6-year period (2007-2012), and indicators of target gaming were developed (see the illustrative sketch after the abstract). Two rounds of surveys of managers and clinicians were conducted. Interviews (n = 68) were conducted in two phases with clinicians and managers in EDs and the wider hospital across all sites. The interview data were used to explain the patterns of variation over time and between sites detected in the ED LOS data.
 
Results
Our research established that gaming behaviour – in the form of ‘clock-stopping’ and decanting patients to short-stay units (SSUs) or observation beds to avoid target breaches – was common across all 4 case study sites. The opportunity to game was due to the absence of independent verification of ED LOS data. Gaming increased significantly over time (2009-2012) as the means to game became more available, usually through the addition or expansion of short-stay facilities attached to EDs. Gaming varied between sites, but those with the highest levels of gaming differed substantially in terms of organisational dynamics and motives. In each case, however, high levels of gaming could be attributed to the strategies of senior management more than to the individual motivations of frontline staff.
 
Conclusion
Gaming of New Zealand’s ED target increased after the real benefits (in terms of process improvement) of the target were achieved. Gaming of ED targets could be minimised by eliminating opportunities to game through independent verification, or by monitoring and limiting the means and motivations to game.
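Illustrative note on gaming indicators: the abstract does not detail how the indicators of target gaming were derived from the ED LOS data. Purely as a sketch, and not a description of the authors' indicators, one common screening approach in the ED-target literature is to compare the number of recorded stays falling just below the 6-hour threshold with the number falling just above it; a pronounced spike below the line is consistent with clock-stopping, although it is not proof of gaming on its own. The window width, function name, and synthetic data below are assumptions for illustration only.

```python
# Illustrative sketch only: a simple "spike below the threshold" indicator of
# possible target gaming, computed from a vector of recorded ED lengths of stay.
# The threshold window and the synthetic data are assumptions, not the
# indicators developed in the study.
import numpy as np

TARGET_HOURS = 6.0    # SSED target: admit, discharge or transfer within 6 hours
WINDOW_HOURS = 0.25   # compare the 15 minutes just below vs. just above the target


def spike_ratio(ed_los_hours: np.ndarray,
                target: float = TARGET_HOURS,
                window: float = WINDOW_HOURS) -> float:
    """Ratio of recorded stays just below the target to stays just above it.

    A smooth LOS distribution should give a ratio near 1; a much larger ratio
    is consistent with 'clock-stopping' (recording departures just inside the
    target), though other explanations are possible.
    """
    just_below = np.sum((ed_los_hours >= target - window) & (ed_los_hours < target))
    just_above = np.sum((ed_los_hours >= target) & (ed_los_hours < target + window))
    return just_below / max(just_above, 1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic LOS data: a right-skewed distribution of ED stays...
    los = rng.gamma(shape=2.0, scale=2.0, size=10_000)
    # ...with some stays shifted to just under 6 hours to mimic clock-stopping.
    shifted = (los >= 6.0) & (los < 6.5) & (rng.random(10_000) < 0.6)
    los[shifted] = rng.uniform(5.75, 6.0, size=shifted.sum())
    print(f"Spike ratio around the 6-hour mark: {spike_ratio(los):.2f}")
```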

Highlights

Commentaries Published on this Paper

  • Beyond Targets: Measuring Better and Rebuilding Trust; Comment on “Gaming New Zealand’s Emergency Department Target: How and Why Did It Vary Over Time and Between Organisations?”
  • Improve the Design and Implementation of Metrics From the Perspective of Complexity Science; Comment on “Gaming New Zealand’s Emergency Department Target: How and Why Did It Vary Over Time and Between Organisations?”
  • Games People Play: Lessons on Performance Measure Gaming from New Zealand; Comment on “Gaming New Zealand’s Emergency Department Target: How and Why Did It Vary Over Time and Between Organisations?”

Authors' Response to the Commentaries

  • If Gaming is the Problem, Is “Complexity Thinking” the Answer? A Response to the Recent Commentaries


Volume 9, Issue 4
April 2020
Pages 152-162
  • Receive Date: 21 February 2019
  • Revise Date: 18 October 2019
  • Accept Date: 18 October 2019
  • First Publish Date: 01 April 2020