Rationality is Generalizable

In order to make some extra money during this pandemic, I’ve been doing a bit of work fixing up one of my mom’s rental houses. All the work is within my preexisting skillset, and it’s pretty nice to have a physical job to give me a break from my mental work.

When fixing up a house between tenants, the first item on the to-do list is to create the to-do list, called a “punch list”. To create it, I walked through the house and noted down the problems: this railing is loose, this stair tread is broken, these tiles are cracked, etc. Once I was finished, I had a list of work that would probably take me about a week to complete.

And then I tripled that number and sent it to my mom as my time estimate.

I did this to combat something known as the planning fallacy, one of many endemic flaws in the human brain.

When humans make plans, they envision a scenario where nothing goes unexpectedly. But reality doesn’t work the way human-brains-making-plans do. Fixing those cracked tiles ends up requiring ripping out eight layers of rotted wood underneath, filling in the resulting two-inch-deep gap with concrete, then leveling it out with plywood before laying the new tiles. Repairing the crack in the living room wall ends up requiring replacing the whole gutter system, which was causing water to run through the bricks on the outside and into the drywall. When we just see some cracked tiles or some chipping paint, we don’t imagine the root problem that might need to be fixed: we just consider replacing a few tiles or patching a bit of cracked wall.

This generalizes far beyond fixing houses. When a group of students was asked to estimate when they would complete their personal academic projects,

  • 13% of subjects finished their project by the time they had assigned a 50% probability level;
  • 19% finished by the time assigned a 75% probability level;
  • and only 45% finished by the time of their 99% probability level.

As Buehler et al. wrote, “The results for the 99% probability level are especially striking: Even when asked to make a highly conservative forecast, a prediction that they felt virtually certain that they would fulfill, students’ confidence in their time estimates far exceeded their accomplishments.”

Humans making plans envision everything going according to plan, with no unforeseen delays; in fact, their predictions come out about the same as when you explicitly ask them for a best-case estimate. In real life, what actually happens tends to be somewhat worse than the worst-case estimate.

There are some useful debiasing techniques for combating the planning fallacy. The most useful of these is to take the outside view instead of the inside view: to consider how similar projects have gone in the past, instead of dwelling on the specific details of this particular project. (Counterintuitively, considering the specific details makes estimates less accurate, not more.)
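To make the technique concrete, here’s a minimal sketch in Python of what taking the outside view amounts to. The function name and the numbers are hypothetical, not from any real dataset; the point is that the prediction comes from the track record of similar past projects, never from this project’s details.

    from statistics import median

    def outside_view_estimate(inside_view_days, past_projects):
        """Correct an optimistic inside-view estimate using the track
        record of similar past projects.

        past_projects: list of (estimated_days, actual_days) pairs.
        """
        # How badly did past estimates undershoot reality?
        overrun_ratios = [actual / estimated
                          for estimated, actual in past_projects]
        # The typical overrun of the reference class is the correction factor.
        return inside_view_days * median(overrun_ratios)

    # Hypothetical track record: estimated vs. actual durations, in days.
    history = [(5, 12), (10, 19), (3, 9), (7, 15)]
    print(outside_view_estimate(7, history))  # about 16 days, not 7

Notice that the estimator never gets to hear why this project is supposed to be different. That is exactly the information the inside view misuses.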

I’ve used this technique often, but in this particular circumstance, I couldn’t. While I had done every individual task this house needed before, I had never done all of them in sequence on a whole house.

Given this problem, you might advise me to ask someone who has done whole houses before. In fact, I have easy access to such a person. The issue with this solution is that she always makes overly optimistic estimates of how long projects will take.

You would think her experience would make it easier for her to take the outside view: to consider this house in the context of all the other houses she has fixed. But that isn’t what happens, so what gives?

Roy et al. propose a reason: “People base predictions of future duration on their memories of how long past events have taken, but these memories are systematic underestimates of past duration. People appear to underestimate future event duration because they underestimate past event duration.”

In light of all this, my best course of action, whenever I cannot take an outside view myself, is to take my normal (= optimistic) estimate and triple it.
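In code, that fallback is nothing fancier than a constant multiplier. Here is a sketch, with the caveat that the factor of three is my own rough calibration, not a universal constant:

    def padded_estimate(inside_view_days, safety_factor=3):
        """Fallback when no reference class of similar past projects
        exists: multiply the optimistic inside-view estimate by a
        safety factor. The factor of 3 is a personal calibration."""
        return inside_view_days * safety_factor

    # The one-week punch list becomes a three-week estimate.
    print(padded_estimate(7))  # 21 days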

I made that three-week time estimate around the first of June. Today is the 16th, and I’m finishing the last of the work today.

This whole situation generalizes, just like the planning fallacy itself. Learning about cognitive psychology, and the persistent flaws built into every human brain, is often more useful for reaching correct answers than experience is.

You might not expect the tactics for “how to not be stupid” to generalize to every field as well as they do. Each field has its own tips and tricks, its own separate toolbox; carpentry is not like neurosurgery is not like musical composition… But we are all human brains, we carpenters and neurosurgeons and composers, and we are all running the same flawed circuitry generated by the same optimization process. Here, as in planning, the special reasons why each field is different only detract from an accurate conclusion.

Knowing about one cognitive bias produced a better time estimate than thirty years of experience in the field did. Rationality does not always produce such a large improvement, but it does prevent the kinds of stupid mistakes humans are prone to. In my opinion, if a decision is worth making, it is worth making rationally.

Another Reason to Get Straight to the Work World

I’ve discussed in previous posts some reasons you should get a real-world job either before or instead of going to college. For one thing, college has an extremely high opportunity cost, in both time and money. For another, the purpose of college has become so muddled that the reasons people tell you to go are almost entirely disconnected from the actual reasons you might want to go.

Today, I have another reason that you should at least take a gap year to work a bit first. And this one applies even if you’re 100% sold on college.

When I took a marketing job, I expected to do, well, marketing. Yeah, the job was in San Francisco, so I expected (and wanted) to do marketing for tech companies, but that didn’t change my fundamental assumption. My job title was “Digital Marketer” and so I thought I was going to do digital marketing.

As I found out over the course of the next few months, an employer will use any skill you have if they can find a use for it. By the four-month mark, I had done everything from graphic design to sales to web design to JavaScript programming.

This isn’t just because I work for a micro-company, although that probably made it happen faster and more thoroughly. Any company will do this. And that’s the key distinction between the work world and college.

If you sign up for a college class in marketing, you won’t accidentally end up programming in JavaScript or creating website wireframes. You’ll do the coursework – nothing more, nothing less. When you go to college, you get exactly what you sign up for. When you get a real-world job, your responsibilities may start out as what you expected, but eventually you’ll probably end up doing a whole bunch of stuff that wasn’t in the original job description, based on a combination of what the company needs and what you can do.

In short: College is static; the work world is flexible.

Often, the fact that college works this way feeds the harmful “that’s not my job” mentality, which will poison your career and narrow your options. If you’re reluctant to take on any responsibility beyond the bare minimum you were hired for, you’ll never be given any additional responsibility. Even if you avoid this mentality, getting some real-world work experience early on will serve you well, in or out of college.

If you’re in the sort of profession where you need a college degree, or you’ve otherwise decided you’re Going To College, consider taking a gap year, or getting a part-time job in your field early in your degree. The flexibility you develop from doing real work is invaluable.