Category: Food for thought

  • Second-Order Thinking

    First-order thinking is fast and easy. It happens when we look for something that solves only the immediate problem, without considering the consequences. Think of it as: "I'm hungry, so let's eat a chocolate bar."

    Second-order thinking is more deliberate. It is thinking in terms of interactions and time, understanding that despite our intentions our interventions often cause harm. Second-order thinkers ask themselves the question "And then what?" This means thinking about the consequences of repeatedly eating a chocolate bar when you are hungry, and using that to inform your decision. If you do this, you're more likely to eat something healthy.

    First-order thinking looks the same for everyone: everyone reaches the same conclusions. This is where things get interesting. The road to out-thinking people can't come from first-order thinking. It must come from second-order thinking. Extraordinary performance comes from seeing things that other people can't see.

    Second-Order Thinking: What Smart People Use to Outperform

    A more succinct version by the same author.

    Second-order thinking is the practice of not just considering the consequences of our decisions but also the consequences of those consequences. Everyone can manage first-order thinking, which is just considering the immediate anticipated result of an action. It’s simple and quick, usually requiring little effort. By comparison, second-order thinking is more complex and time-consuming. The fact that it is difficult and unusual is what makes the ability to do it such a powerful advantage.

    Chesterton’s Fence: A Lesson in Second Order Thinking

  • Why we fail to prepare for disasters

    Considering the current COVID-19 pandemic, this is one of the best things I have read on why we failed to get a grasp on it.

    Part of the problem may simply be that we get our cues from others. In a famous experiment conducted in the late 1960s, the psychologists Bibb Latané and John Darley pumped smoke into a room in which their subjects were filling in a questionnaire. When the subject was sitting alone, he or she tended to note the smoke and calmly leave to report it. When subjects were in a group of three, they were much less likely to react: each person remained passive, reassured by the passivity of the others.

    As the new coronavirus spread, social cues influenced our behaviour in a similar way. Harrowing reports from China made little impact, even when it became clear that the virus had gone global. We could see the metaphorical smoke pouring out of the ventilation shaft, and yet we could also see our fellow citizens acting as though nothing was wrong: no stockpiling, no self-distancing, no Wuhan-shake greetings. Then, when the social cues finally came, we all changed our behaviour at once. At that moment, not a roll of toilet paper was to be found.

    Why we fail to prepare for disasters

    But at the end it gets scary, really scary.

    Because Covid-19 has spread much faster than HIV and is more dangerous than the flu, it is easy to imagine that this is as bad as it is possible to get. It isn’t. Perhaps this pandemic, like the financial crisis, is a challenge that should make us think laterally, applying the lessons we learn to other dangers, from bioterrorism to climate change. Or perhaps the threat really is a perfectly predictable surprise: another virus, just like this one, but worse. Imagine an illness as contagious as measles and as virulent as Ebola, a disease that disproportionately kills children rather than the elderly.

    What if we’re thinking about this the wrong way? What if instead of seeing Sars as the warning for Covid-19, we should see Covid-19 itself as the warning?

    Next time, will we be better prepared?

    Why we fail to prepare for disasters

    If COVID-19 is a warning, then what will the actual disaster look like?

  • The Stockdale Paradox

    You must never ever ever confuse, on the one hand, the need for absolute, unwavering faith that you can prevail despite those constraints with, on the other hand, the need for the discipline to begin by confronting the brutal facts, whatever they are.

    The Stockdale Paradox

  • The Ellsberg Paradox

    You’re in a room with two large urns.

    The urns are covered so you can’t see inside them. But you know the urn on the left contains 50 white marbles and 50 black marbles. The urn on the right also contains 100 marbles, but the ratio of white to black marbles is unknown, with every ratio as likely as any other.

    […]

    What Ellsberg found is that people overwhelmingly choose to draw the marble from the urn with a known set of probabilities, rather than take a chance on the urn with an unknown ratio.

    This is despite the fact that the second urn could have better odds of drawing black marbles, such as 99 black to 1 white, or even 100 black and no white marbles at all. Of course, the ratio in the unknown urn could also be tilted in the other direction. There's no way of knowing.

    The fact is, the probability of drawing a black marble from either urn is identical.

    To verify this for yourself, just simplify the example.

    Instead of 100 marbles, imagine there are only 2. In the known urn, there is 1 black and 1 white. In the unknown urn, one-third of the time you’d be picking out of an urn with 2 black marbles. Another third of the time, 2 white marbles. And another third of the time, the urn has 1 of each.

    When you sum these probabilities up, you see that the chance of picking a black marble in the second urn is identical to picking one from the first urn: 50%.
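    The arithmetic above is easy to check yourself. Here is a quick sketch (not from the article) using exact fractions, assuming, as in the excerpt, a uniform prior over the unknown urn's possible compositions:

```python
from fractions import Fraction

# Known urn: 1 black, 1 white marble
p_known = Fraction(1, 2)

# Unknown urn, 2-marble version: three equally likely compositions,
# (2 black), (2 white), and (1 black + 1 white)
p_unknown = sum(Fraction(1, 3) * p
                for p in [Fraction(1), Fraction(0), Fraction(1, 2)])

# 100-marble version: uniform prior over 0..100 black marbles
p_unknown_100 = sum(Fraction(1, 101) * Fraction(k, 100) for k in range(101))

print(p_known, p_unknown, p_unknown_100)  # 1/2 1/2 1/2
```

    The same cancellation happens for any number of marbles: averaging over all equally likely compositions always lands back on 50%, which is exactly why preferring the known urn is a bias rather than better odds.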

    […]

    The takeaway? People exhibit strong aversion to ambiguity and uncertainty, meaning they have an inherent preference for the known over the unknown.

    The Ellsberg Paradox: Why You Don’t Take Risks and Settle for the Mediocre

    Thanks to Finshots for dropping this one in my inbox.