Framing Bias in the Interpretation of Quality Improvement Data: Evidence From an Experiment

Document Type: Original Article


School of Public Affairs and Administration, Rutgers University, Newark, NJ, USA


A growing body of public management literature sheds light on potential shortcomings of quality improvement (QI) and performance management efforts. These challenges stem from the heuristics individuals use when interpreting data. Evidence from studies of citizens suggests that individuals' evaluation of data is influenced by the linguistic framing or context of that information, which may bias the way they use it in decision-making. This study extends prospect theory into the field of public health QI by using an experimental design to test for equivalency framing effects in how public health professionals interpret common QI indicators.
An experimental design with randomly assigned survey vignettes is used to test for the influence of framing effects on the interpretation of QI data. The web-based survey assigned a national sample of 286 city and county health officers to either a "positive frame" group or a "negative frame" group and measured their perceptions of organizational performance. The majority of respondents self-reported holding organizational leadership positions.
Public health managers are indeed susceptible to these framing effects, and to a similar degree as citizens. Specifically, they tend to interpret QI information presented in a "positive frame" as indicating a higher level of performance than the same underlying data presented in a "negative frame." These results are statistically significant and remain robust when regressed against control variables and alternative sources of information.
This study helps identify potential areas of reform within the reporting aspects of QI systems. Specifically, there is a need to fully contextualize data when presenting it, even to subject matter experts, in order to reduce bias in decision-making, and to introduce training in data presentation and basic numeracy before organizations fully engage in QI initiatives.


Volume 8, Issue 5
May 2019
Pages 307-314
  • Receive Date: 20 June 2018
  • Revise Date: 30 November 2018
  • Accept Date: 16 February 2019
  • First Publish Date: 01 May 2019