topic: [[Leading]]
created: 2023-08-14
*When adding another layer of safety doesn't make things safer, but instead adds complexity that unintentionally makes things work less well.*
This reminds me of [[remove to gain]], the basic idea that [[Everything should be make as simple as possible, but not simpler]]; usually the benefits lie in simplifying, not in inventing unique solutions.
It's kind of like [[assume my instincts are wrong]]: the first reaction to a problem is the "easy," "intuitive" response. Don't go with it. It seems like _of course we should make a hard-stop pop-up, so that every time someone tries to order a PE protocol CT, a pop-up walks through the risk factors and risk assessment and requests a D-Dimer._ However, the unintended impact is that people learn to click through it, because most of the time a lack of access to that information is not why they are ordering a PE protocol CT. In the process, they are taught to ignore **ALL** pop-ups. The move toward expertise involves discovering, through failure, the [[counter-intuitive]] ideas.
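To make that failure mode concrete, here is a minimal sketch of what such a hard-stop rule amounts to. This is my own illustration, not from the article, and the names (`Order`, `PE_PROTOCOL_CT`, `hard_stop_alert`) are hypothetical: the rule keys only on the order type, so it fires even when the clinician has already done the risk assessment, which is exactly the condition that trains people to click through.

```python
# Hypothetical sketch of an interruptive "hard stop" CDS rule (illustrative
# names, not a real EMR API).
from dataclasses import dataclass

PE_PROTOCOL_CT = "CT chest, PE protocol"

@dataclass
class Order:
    procedure: str
    d_dimer_on_file: bool       # a D-dimer result already exists
    risk_assessment_done: bool  # a risk score was already documented

def hard_stop_alert(order: Order) -> bool:
    """Fires an interruptive pop-up on *every* PE protocol CT order.

    Because it ignores whether the risk assessment or D-dimer already
    exists, the alert is irrelevant most of the time, so clinicians learn
    to click through it, and eventually through all pop-ups.
    """
    return order.procedure == PE_PROTOCOL_CT

if __name__ == "__main__":
    # The clinician has already done the workup, but the alert fires anyway.
    order = Order(PE_PROTOCOL_CT, d_dimer_on_file=True, risk_assessment_done=True)
    print(hard_stop_alert(order))  # True -> pop-up shown, clicked through
```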
Safety Clutter is the accumulation of policies and processes that are intended to contribute to safety but do not. This article (1) addresses what safety clutter is, why it persists, and how to reduce it.
- “What are the mechanisms that create clutter? (Section 3)
- What causes safety clutter and makes it hard to remove? (Section 4)
- What are the effects of safety clutter? (Section 5)
- What can we do about safety clutter? (Section 6)” (Rae et al., 2018, p. 5)
This matters because [[safety clutter]] is a major driver of [[burnout]]. People start to see leadership and decision-makers as incompetent, because all day long they receive messages pushed out from above that are invalid in their world. "Click this, click that, get through the day" distracts from safety instead of adding to it.
- Here the answer is difficult, and involves getting to [[ask five whys to learn the real issue]]. Don't try to solve the top-level issue. Ask whether the fifth why, the real issue, can be solved with the solution at hand.
- [[the philosophy of digital minimalism]] should be instituted here.
- Ensure with [[Design thinking]] that you have tested, that you have found out the real-world impact, and that changes are released as pilots, with the [[epistemic humility]] to recall the initial design.
- [[e-iatrogenesis is the unintended harm from use of an EMR]]: think about how you might unintentionally cause e-iatrogenesis with your solution. What barriers have you put in place that, instituted over the long term, will cause undesirable [[second order effects]]?
##### What would the opposite argument be?
When there is a critical incident, with a root cause analysis showing a problem with the system as it is created, how **do** you resist the urge to "make sure it doesn't happen again?" Can we let go of that as a goal? What does [[Safety I and Safety II]] tell us about this approach?
tags: #note/idea | #on/design | #on/solutions | #on/safety
##### Sources:
1: Rae, A. J., Provan, D. J., Weber, D. E., & Dekker, S. W. A. (2018). Safety clutter: The accumulation and persistence of ‘safety’ work that does not contribute to operational safety. _Policy and Practice in Health and Safety_, _16_(2), 194–211. [https://doi.org/10.1080/14773996.2018.1491147](https://doi.org/10.1080/14773996.2018.1491147)