---
topic: [[Assessing]]
people:
created: 2023-02-03
---
*Using long-term guiding principles as the outcome instead of short-term SMART objectives.*
Principles-based program evaluation is an innovation described at Children's Hospital of Philadelphia to help guide programmatic development and evaluation (Balmer, 2023). The goal is to determine whether the program is moving towards its shared principles and values, as opposed to closing the loop on specific objectives.
This reminds me of the behavior change principle around habits: [[To build better habits, change your identity]] (Clear, 2018). If our identity is collaboration and wellness, we can ask how we did in terms of collaboration and wellness, instead of specific numerical [[SMART objectives]].
It's kind of like [[the answers you get come from the questions you ask]], and how [[the theme system]] provides flexible guidance through [[reflection]] instead of "yes or no" objectives. If one of our questions leaves room for [[emergence]] and [[Learning]], such as "What is happening? Are there other explanations as to what is happening?" (Haji, 2013), we get [[🐓 Idea Farm/Backlog/beyond "did it work?"]] and a movement towards more holistic outcomes and [[logic models]], such as those in Frye and Hemmer, AMEE Guide No. 67.
![[CleanShot 2023-02-03 at 09.57.20.jpg]]
This matters because when adherence to [[Developing Values]] is the goal, we can focus on whether we live our values, [[co-design]] iteratively, and use [[design-based research]] (Novak, 2022) to continuously move the needle towards those values. This is similar to the LEAN management principle that "the people doing the work have the best solutions," and to the [[Appreciative Inquiry]] (Loty, 2014) approach of growing sustainable practices by identifying what already works.
Finally, one key problem in assessment is "early negative closure": you try an intervention and decide it doesn't work because it doesn't meet the SMART objectives in the time you allotted. If you approach your educational interventions with a growth [[Mindset]] (Dweck, 2006; Yeager & Dweck, 2012) and instead apply a "not yet" assessment to the intervention, then as long as you are moving towards your values you can keep learning and measuring, and still make progress towards your SMART objectives.
##### What would the opposite argument be?
Our learners, patients, and administrators expect outcomes, not values. We can only sell solutions that reach outcomes. By focusing on long-term values, we limit our ability to prove our value, and the bosses and administrators who need us to show an outcome difference will not continue to support us. We live in the system we are in, and that system values outcomes over values.
%%
tags: #note/idea | #on/evaluation | #on/assessment | #on/objectives | #on/progress | #on/codesign
%%
##### Sources:
Allen, L. M., Hay, M., & Palermo, C. (2021). Evaluation in health professions education—Is measuring outcomes enough? _Medical Education_, medu.14654. [https://doi.org/10.1111/medu.14654](https://doi.org/10.1111/medu.14654)
Balmer, D. F., Anderson, H., & West, D. C. (2023). Program Evaluation in Health Professions Education: An Innovative Approach Guided by Principles. _Academic Medicine_, _98_(2), 204–208. [https://doi.org/10.1097/ACM.0000000000005009](https://doi.org/10.1097/ACM.0000000000005009)
Dweck, C. S. (2006). _Mindset: The new psychology of success_ (1st ed.). Random House; eBook Collection (EBSCOhost). [https://login.treadwell.idm.oclc.org/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=737546&site=eds-live&scope=site](https://login.treadwell.idm.oclc.org/login?url=https://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=737546&site=eds-live&scope=site)
Clear, J. (2018). _Atomic habits: Tiny changes, remarkable results: an easy & proven way to build good habits & break bad ones_. Avery, an imprint of Penguin Random House.
Frye, A. W., & Hemmer, P. A. (2012). Program evaluation models and related theories: AMEE Guide No. 67. _Medical Teacher_, _34_(5), e288–e299. [https://doi.org/10.3109/0142159X.2012.668637](https://doi.org/10.3109/0142159X.2012.668637)
Haji, F., Morin, M.-P., & Parker, K. (2013). Rethinking programme evaluation in health professions education: Beyond ‘did it work?’: Rethinking health professions programme evaluation. _Medical Education_, _47_(4), 342–351. [https://doi.org/10.1111/medu.12091](https://doi.org/10.1111/medu.12091)
Johnston, S., Coyer, F. M., & Nash, R. (2018). Kirkpatrick’s Evaluation of Simulation and Debriefing in Health Care Education: A Systematic Review. _Journal of Nursing Education_, _57_(7), 393–398. [https://doi.org/10.3928/01484834-20180618-03](https://doi.org/10.3928/01484834-20180618-03)
Loty, J. (2014). The Power of Appreciative Inquiry: A Practical Guide to Positive Change. _Performance Improvement_, _53_(8), 45–48. [https://doi.org/10.1002/pfi.21433](https://doi.org/10.1002/pfi.21433)
Novak, D. A., & Hallowell, R. (2022). Design-Based Research: A Methodology for Studying Innovation in Teaching and Learning in Medical Education. _Academic Medicine_, _97_(7), 1088–1088. [https://doi.org/10.1097/ACM.0000000000004601](https://doi.org/10.1097/ACM.0000000000004601)
Yeager, D. S., & Dweck, C. S. (2012). Mindsets That Promote Resilience: When Students Believe That Personal Characteristics Can Be Developed. _Educational Psychologist_, _47_(4), 302–314. [https://doi.org/10.1080/00461520.2012.722805](https://doi.org/10.1080/00461520.2012.722805)