[[Safety]]
tags:: #note/idea | #on/risk | #on/theory | #on/systems
people:: #people/naseemtaleb
dates:: 2022-10-01

*Important events that have such outsized impacts that their rarity doesn't matter as much as their effects.*

This reminds me of [[Termination Shock Book]], and how the possibility of a large, non-linear event causing an unexpectedly large outcome is actually very high; you just don't know where or when. It's kind of like [[second order effects]]: in a complex world, you just won't be able to predict when the non-linear effects will occur. Since [[Systems]] contain balancing and positive reinforcing loops, it will be hard to spot those loops in advance and to see how they will combine to cause large effects.

This is a theoretical way of looking at risk: instead of looking at the likelihood of an event, you should look to protect yourself from rare but outsized events, if possible.

[[black swan events]] have the following dynamics:
- Outlier events
- Outsized impact
- Predictable in retrospect, but not prospectively

These dynamics combine in a way that makes [[dealing with complexity]] rise in importance when considering risk. It matters because if we treat everything like an investment portfolio, including [[research as an investment portfolio]], [[Safety]], etc., we will have more [[Antifragility]].

_What we do not know becomes more important than what we do know!_ If we knew it, the black swan event could not happen, because it would be predictable and we would protect ourselves from it. Avoid the [[illusion of understanding]]. Work towards [[Blue Ocean Strategy]]: harness the unknown and [[don't chase the soccer ball]], move to where the crowd is _not_.

### What would the opposite argument be?
By their very nature, black swan events are low likelihood, so the work and resources required to "protect" against them are only available to people who already have so many resources that they won't need the protection.

## Sources:
[[The Black Swan]]
[[Antifragility]]