Rationality is Generalizable

In order to make some extra money during this pandemic, I’ve been doing a bit of work fixing up one of my mom’s rental houses. All the work is within my preexisting skillset, and it’s pretty nice to have a physical job to give me a break from my mental work.

When fixing up a house between tenants, the first item on the to-do list is to create the to-do list, called a “punch list”. To create it, I walked through the house and noted down the problems: this railing is loose, this stair tread is broken, these tiles are cracked, etc. By the time I finished the walkthrough, I had a list of work that would probably take me about a week to complete.

And then I tripled that time, and sent it to my mom as my time estimate.

I did this to combat something known as the planning fallacy, one of many endemic flaws in the human brain.

When humans make plans, they envision a scenario where nothing goes unexpectedly. But reality doesn’t work the way human-brains-making-plans do. Fixing those cracked tiles ends up requiring ripping out eight layers of rotted wood underneath, filling in the resulting two-inch-deep gap with concrete, then leveling it out with plywood before laying the new tiles. Repairing the crack in the living room wall ends up requiring replacing the whole gutter system, which was causing water to run through the bricks on the outside and into the drywall. When we just see some cracked tiles or some chipping paint, we don’t imagine the root problem that might need to be fixed: we just consider replacing a few tiles or repairing a bit of cracked wall.

This generalizes far beyond fixing houses. When a group of students was asked to estimate when they would complete their personal academic projects,

  • 13% of subjects finished their project by the time they had assigned a 50% probability level;
  • 19% finished by the time assigned a 75% probability level;
  • and only 45% finished by the time of their 99% probability level.

As Buehler et al. wrote, “The results for the 99% probability level are especially striking: Even when asked to make a highly conservative forecast, a prediction that they felt virtually certain that they would fulfill, students’ confidence in their time estimates far exceeded their accomplishments.”

Humans planning things envision everything going according to their plan, with no unforeseen delays: exactly the same as if you ask them for a best-case estimate, in fact. In real life, what happens is somewhat worse than the worst-case estimate.

There are some useful debiasing techniques for combating the planning fallacy. The most useful of these is to utilize the outside view instead of the inside view: to consider how similar projects have gone in the past, instead of considering all the specific details of this particular project. (Considering specific details drives up inaccuracy.)
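The contrast between the two views can be sketched in code. This is an illustrative toy, not anything from the post: the task estimates and project history below are made-up numbers, and the function names are my own.

```python
def inside_view_estimate(task_days):
    """The optimistic bottom-up approach: sum the per-task guesses,
    implicitly assuming nothing goes wrong between tasks."""
    return sum(task_days)

def outside_view_estimate(past_durations):
    """Ignore this project's specific details; use the median of how
    long similar past projects actually took (reference-class data)."""
    ordered = sorted(past_durations)
    mid = len(ordered) // 2
    if len(ordered) % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

tasks = [1, 2, 1, 3]            # hypothetical per-task guesses, in days
history = [10, 14, 9, 21, 12]   # hypothetical actual durations of similar jobs

print(inside_view_estimate(tasks))     # 7
print(outside_view_estimate(history))  # 12
```

The point of the sketch is that the outside view never looks at this project’s task list at all, which is exactly what makes it resistant to the bias.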

I’ve used this technique often, but in this particular circumstance, I couldn’t. While I’d done every individual thing that would need to be done to finish this house before, I had never done all of them in sequence.

Considering this problem, you might advise me to ask someone who has done whole houses before. I have easy access to such a person, in fact. The issue with this solution is that this person consistently makes overly optimistic estimates of how long projects will take.

You would think her experience would make it easier for her to take the outside view, to consider this house in the context of all the other houses she had fixed. This doesn’t happen, so what gives?

Roy et al. propose a reason: “People base predictions of future duration on their memories of how long past events have taken, but these memories are systematic underestimates of past duration. People appear to underestimate future event duration because they underestimate past event duration.”

In light of all this, my best course of action, whenever I cannot take an outside view myself, is to take my normal (= optimistic) estimate and triple it.

I made that three-week time estimate around the first of June. Today is the 16th, and I’m finishing the last of the work today.

This whole situation, like the planning fallacy in particular, is generalizable. Learning about cognitive psychology, and the persistent flaws in all human brains, is oftentimes more useful for coming up with correct answers than experience.

You might not think that the tactics for “how to not be stupid” would be as generalizable to every field as they are. Each field has its own tips and tricks, its own separate toolbox; carpentry is not like neurosurgery is not like musical composition… But we are all human brains, us carpenters and neurosurgeons and composers, and we are all using the same flawed circuitry generated by the same optimization process. Here, as in planning, the special reasons why each field is different detract from an accurate conclusion.

Knowing about one cognitive bias produced a better time estimate than thirty years of experience in the field. Rationality does not always produce such a large improvement, but it does prevent humans from making the types of stupid mistakes we are prone to. In my personal opinion, if a decision is worth making, it is worth making rationally.
