A Letter to my Cousin Rose

I wrote this for unrelated reasons, but I’m posting it here as an update to “I am a 4-Year College Opt-Out. Here’s Why.” It’s not necessary for you to read the original post in order to understand this one; in fact, I’ve restated most of the original post here. Still, I’m leaving the original post alone, because I believe it’s important for people to see my progression over time.


Dear Rose,

It’s been a long time since we’ve seen each other; my memory of you is frozen at the age of 5. But I know you’re 14 now and starting high school. Your mother told me that you were considering your future: if and where you’ll go to college, what you’ll do for a career, and all those major life-determining questions we’re expected to answer in our adolescence. These answers are more complex and nuanced than most people realize, and since I’m closer to you in age than your parents are, I thought I would share my experiences in this area with you.

For me, awareness of college started the summer I turned 8. My siblings and I were debating with my parents about where to go out to dinner, and as with most families trying to decide on things, we voted on it. The kids initially had the majority, but then the parents threw a wrench into the rules: “adults get five votes.”

I didn’t mind, but I was curious about the reasoning. “So we get five votes when we turn 13?” I asked, being a Jew who becomes culturally an adult at 13. (By the way, Rose: congratulations on your bat mitzvah; I’m sorry I couldn’t be there!) “No,” said my mom, “it would be silly if you could age into it. For our purposes, an adult is someone who’s graduated from college.”

From this and other similar conversations, I decided I was going to college. But when a child decides to pursue something, it’s not because they’ve done a cost-benefit analysis and found it’s the logical conclusion based on their knowledge and past experience. A child thinks something is worth pursuing if it sounds impressive, fun, or cool.

In and of itself, this wasn’t a problem. Children choose to pursue plenty of silly ideas: when I was 8 I also wanted to make a career out of inventing a time machine. 

But then society perpetuated the problem by steering me away from ever questioning my belief. “Of course college is the right choice for you,” spoke the voice of the populace. “You’re smart, capable, and a hard worker. And you want a good job, right? You need college to get a good job.” I didn’t question these comments: they came from people I knew and trusted, people I believed understood the world better than I did.

So as I was entering my senior year of high school, I was just assuming I would go to college. That’s what you do, right? But despite this, I had gotten really sick of taking classes. The things I worked on in my courses seldom related to the real world, and if they did at all, they reflected real-life work through a funhouse mirror. Due to dual-enrollment, I was close to graduating high school with my Associate’s in computer science. At last, I thought, I could start doing meaningful work and creating value for real people! Wait, no, I couldn’t. I had to go to college. Didn’t I?

Finally, I realized that I had pursued college with partially mistaken and mostly absent reasoning. This was not the right way to go about making a major life choice. When I began genuinely considering my options with a fully sentient brain, I came to the conclusion that I did not have nearly enough evidence on which to base a decision that would cost me years, and tens to hundreds of thousands of dollars, if I chose wrong.

This terrified me, and there was no quick way to remedy it. I didn’t have enough data to decide whether or not college was the right choice, and the only way to gather that data would be to get a job in or near the area in which I wanted to work, figure out what types of degrees the people in my desired field had, and then decide for or against a 4-year college based on that.

So, at the end of high school, that’s exactly what I did. I went through a selective program that matches young people with startups, and chose to move to San Francisco and work as a digital marketing consultant. 

While living in SF, I met a lot of technology professionals: programmers, business analysts, technical writers. Before I moved, I’d never thought about the differences between these professions, nor had I made any effort to choose one. Now, because I understood what they were and knew people who did them, I could find out which I would be good at, which I would enjoy, and which paid the best, and use that combination to choose a target career. (I decided on business systems analysis.)

Now that I had an idea of what career I wanted, I could work backwards. Do people working in that career have college degrees? What types of degrees do the best new hires in similar roles at their companies have? If they have degrees, where are they from, and what are their majors?

Based on all the data I gathered, pure programmers often didn’t have degrees at all, or had Associate’s or Bachelor’s degrees in unrelated things from schools I’d never heard of. Pure writers were the same way. Consultants and analysts were much more likely to have Bachelor’s degrees. Finally, data scientists, especially those in research-intensive roles, often had Master’s degrees or PhDs.

After a year of living as a self-supporting, independent adult and working full time for a technology company, I decided, with input from friends and associates, that the best fit for me was to be a business systems analyst, most of whom have Bachelor’s degrees. Therefore, I decided to get a Bachelor’s degree in Computer Information Systems, which, as you may have heard, I’m now working on.

I have three points of advice for you, Rose, from my experience.

First, if you haven’t already looked into dual-enrollment during high school, I recommend it. It’s much more cost-effective in terms of both time and money to get as much of your college work done as possible while you’re still in high school, and I know you’re smart and hardworking enough to do it.

Second, and perhaps most importantly, don’t go to college just because everyone does it. Even if you get a full-ride scholarship, it will still cost you 3-4 years of your life, which might be better spent working.

Third, you may have heard from various adults that you start by choosing what you want to study, then where you want to study it, then what career you want to pursue. That was the way our parents approached college, but it’s the opposite of how we should approach it. Begin by researching what career you want, then use that to determine whether or not you need a college degree, and if so, which type. From there, you can choose a major that best suits the career you’ve chosen, and use that to decide between colleges.

I know this has been a long letter and is probably a lot to absorb, but I hope it has been useful. If you have any questions, or just want to chat – I would love to catch up – just let me know.

Love,
Your cousin,
Jenya

Another Reason to Get Straight to the Work World

I’ve discussed in previous posts some reasons you should get a real-world job either before or instead of going to college. For one thing, college has an extremely high opportunity cost, in both time and money. For another, the purpose of college has become so muddled that the reasons people tell you to go are almost entirely disconnected from the actual reasons you might want to go.

Today, I have another reason that you should at least take a gap year to work a bit first. And this one applies even if you’re 100% sold on college.

When I took a marketing job, I expected to do, well, marketing. Yeah, the job was in San Francisco, so I expected (and wanted) to do marketing for tech companies, but that didn’t change my fundamental assumption. My job title was “Digital Marketer” and so I thought I was going to do digital marketing.

As I found out over the course of the next few months, an employer will use any skill you have if they can find a use for it. By the four-month mark, I had done everything from graphic design to sales to web design to JavaScript programming.

This isn’t just because I work for a micro-company, though that probably made it happen faster and more thoroughly. Any company will do this. And that’s the key distinction between the work world and college.

If you sign up for a college class in marketing, you won’t accidentally end up programming in JavaScript or creating website wireframes. You’ll do the coursework – nothing more, nothing less. When you go to college, you get exactly what you sign up for. When you get a real-world job, your responsibilities may start out as what you expected, but eventually you’ll probably end up doing a whole bunch of stuff that wasn’t in the original job description, based on a combination of what the company needs and what you can do.

In short: College is static; the work world is flexible.

Often, the fact that college works this way feeds the harmful “that’s not my job” mentality, which will poison your career and narrow your options. If you’re reluctant to take on any responsibility beyond the bare minimum of what you were hired to do, you’ll never be given any additional responsibility. Even if you avoid this mentality, getting some real-world work experience early on will serve you well, in or out of college.

If you’re in the sort of profession where you need a college degree, or you’ve otherwise decided you’re Going To College, consider taking a gap year, or getting a part-time job in your field early into your degree. The flexibility you acquire from doing real work is worth its weight in gold.

Why Rationality?

I’ve identified as a rationalist for about five years now. The dictionary definitions are a bit off from what I mean, so here’s my definition.

Epistemic rationality: believing, and updating on evidence, so as to systematically improve the correspondence between your map and the territory.  The art of obtaining beliefs that correspond to reality as closely as possible.  This correspondence is commonly termed “truth” or “accuracy”, and we’re happy to call it that.

Instrumental rationality: achieving your values.  Not necessarily “your values” in the sense of being selfish values or unshared values: “your values” means anything you care about.  The art of choosing actions that steer the future toward outcomes ranked higher in your preferences.  On LW we sometimes refer to this as “winning”.

Eliezer Yudkowsky, “What Do We Mean By ‘Rationality’?”, LessWrong

Of course, these two definitions are really subsets of the same general concept, and they intertwine considerably. It’s somewhat difficult to achieve your values without believing true things, and similarly, it’s difficult (for a human, at least) to search for truth in absence of wanting to actually do anything with it. Still, it’s useful to distinguish the two subsets, since it helps to distinguish the clusters in concept-space.

So if that’s what I mean by rationality, then why am I a rationalist? Because I like believing true things and achieving my values. The better question here would be “why isn’t everyone a rationalist?”, and the answer is that if it were both easy to do and widely known about, I think everyone would be.

Answering why it isn’t well known is more complicated than answering why it isn’t easy, so here are a handful of reasons for the latter. (Written in the first person, because identifying as a rationalist doesn’t make me magically exempt from any of these things; it just means I know what they are and do my best to fix them.)

  • I’m running on corrupted hardware. Looking at any list of cognitive biases will confirm this. And since I’m not a self-improving agent—I can’t reach into my brain and rearrange my neurons; I can’t rewrite my source code—I can only really make surface-level fixes to these extremely fundamental bugs. This is both difficult and frustrating, and to some extent scary, because it’s incredibly easy to break things irreparably if you go messing around without knowing what you’re doing, and you would be the thing you’re breaking.
  • I’m running on severely limited computing power. “One of the single greatest puzzles about the human brain,” Eliezer Yudkowsky wrote, “is how the damn thing works at all when most neurons fire 10-20 times per second, or 200Hz tops. […] Can you imagine having to program using 100Hz CPUs, no matter how many of them you had?  You’d also need a hundred billion processors just to get anything done in realtime. If you did need to write realtime programs for a hundred billion 100Hz processors, one trick you’d use as heavily as possible is caching. That’s when you store the results of previous operations and look them up next time, instead of recomputing them from scratch. […] It’s a good guess that the actual majority of human cognition consists of cache lookups.” Since most of my thoughts are cached, when I get new information, I need to resist my brain’s tendency to rely on those cached thoughts (which can end up in my head by accident and come from anywhere), and actually recompute my beliefs from scratch. Otherwise, I end up with a lot of junk. (For what caching looks like in code, see the sketch just after this list.)
  • I can’t see the consequences of the things I believe. Now, on some level being able to do this (with infinite computing power) would be a superpower: in that circumstance, all you’d need is a solid grasp of quantum physics and the rest would follow from there. But humans don’t just lack the computing power; we can believe, or at least feel like we believe, two inherently contradictory things. In psychology, the tension of holding such contradictions is called “cognitive dissonance”.
  • As a smart human starting from irrationality, knowing more information can easily hurt me. Smart humans naturally become very good at clever arguing (arguing for a predetermined position with propositions convoluted enough to confuse and confound any human arguer, even one who is right), and can thus use their intelligence to defeat itself with great efficiency. They argue against the truth convincingly, and can still feel like they’re winning while running away from the goal at top speed. Therefore, in any argument, I have to dissect my own position at least as carefully as I dissect my opponents’, if not more so. Otherwise, I come away more secure in my potentially faulty beliefs, and more able to argue those beliefs against the truth.
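To make the caching trick from the quote concrete, here’s a minimal sketch of it in Python. This is my own illustration, not something from Yudkowsky’s post, and the function names are invented; the point is simply that a cached answer gets reused without being rechecked, which is exactly what makes cached thoughts so convenient and so dangerous.

```python
import time

cache = {}  # results of previous computations, keyed by input


def expensive_computation(x):
    """Stand-in for anything that's slow to recompute from scratch."""
    time.sleep(0.1)
    return x * x


def cached_computation(x):
    if x not in cache:                      # cache miss: do the real work once
        cache[x] = expensive_computation(x)
    return cache[x]                         # cache hit: reuse the stored answer


cached_computation(12)  # slow: computed from scratch
cached_computation(12)  # fast: looked up, never recomputed
```

In code you can clear a stale cache in one line; clearing cached thoughts takes considerably more work.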

This is a short and incomplete list of some of the problems that are easiest to explain. It’s by no means the entire list, or the one that would lend the most emotional weight to the statement “it’s incredibly difficult to believe true things”. But I do hope it sheds at least a little light on the problem.

If rationality is really so difficult, then, why bother?

In my case, I say “because my goal is important enough to be worth the hassle”. In general, I think that if you have a goal that’s worth spending thirty years on, that goal is also worth trying to be as rational as humanly possible about. However, I’d go a step further. Even if the goal is worth spending a few years or even months on, it’s still worth being rational about, because not being rational about it won’t just waste those years or months; it may waste your whole career.

Why? Because the universe rarely arrives at your doorstep to speak in grave tones, “this is an Important Decision, make it Wisely”. Instead, small decisions build to larger ones, and if those small decisions are made irrationally, you may never get the chance to make a big mistake; the small ones may have already sealed your doom. Here’s a personal example.

From a very young age, I wanted to go to Stanford. When I was about six, I learned that my parents had met there, and I decided that I was going to go too. Like most decisions made by six-year-olds, this wasn’t based on any meaningful intelligence, let alone the full cost-benefit analysis that such a major life decision should have required. But I was young, and I let myself believe the very convenient thought that following the standard path would work for me. That was not, in itself, the problem. The problem was that I kept on thinking this simplified six-year-old thought well into my young adulthood.

As I grew up, I piled all sorts of convincing arguments around that immature thought, rationalizing reasons I didn’t have to do anything difficult and change my beliefs. I would make all sorts of great connections with smart, interesting people at Stanford, I thought, as if I couldn’t do the same in the workforce. I would get a prestigious degree that would open up many doors, I thought, as if working for Google weren’t just as prestigious, and didn’t pay you for the trouble besides. It will be worth the investment, the cached thoughts of society thought for me, and I didn’t question them.

I continued to fail at questioning them every year after, until the beginning of my senior year. At that point I was pretty sick of school, so this wasn’t rationality so much as a motivated search. But it was a search nonetheless, and I did reject the cached thoughts I’d built up in my head for so long. As I took the first step outside my bubble of predetermined cognition, I instantly saw a good number of arguments against attending Stanford. I realized that it carried a huge opportunity cost, in both time and money. Four years and hundreds of thousands of dollars are not things to part with that lightly.

And yet, even after I realized this, I was not done. It would have been incredibly easy to reject the conclusion I’d made because I didn’t want all that work to have been a waste. I was so close: I had a high SAT, I’d gotten good scores on 6 AP tests, including the only two computer science APs (the area I’d been intending to major in), and I’d gotten National Merit Commended Scholar status. All that would have been left was to complete my application, which I’m moderately confident I would have done well on, since I’m a good writer.

That bitterness could have cost me my life. Not in the sense that it would have killed me outright, but in the sense that everyone pays in life for anything they spend significant time on, because everyone is dying. And it was here that rationality was my saving grace. I knew about the sunk cost fallacy. I knew that at this point I should scream “OOPS” and give up. I knew that at this point I should lose.

I bit my tongue, and lost.

I don’t know where I would end up if I hadn’t been able to lose here. The optimistic estimate is that I would have wasted four years, but gotten some form of financial aid or scholarship such that the financial cost was lower, and further, that in the process of attending college, I wouldn’t gain any more bad habits, I wouldn’t go stir-crazy from the practical inapplicability of the material (this was most of what had frustrated me about school before), and I would come out the other end with a degree but not too much debt and a non-zero number of gained skills and connections. That’s a very optimistic estimate, though, as you can probably tell given the way I wrote out the details. (Writing out all the details that make the optimistic scenario implausible is one of my favorite ways of combatting the planning fallacy.) There are a lot more pessimistic estimates, and it’s much more likely that one of those would happen.

Just by looking at the decision itself, you wouldn’t think of it as a particularly major one. Go to college, don’t go to college. How bad could it be, you may be tempted to ask. And my answer is: very bad. The universe is not fair. It’s not necessarily going to create a big cause for a big event: World War I was set off, as the story goes, by some dude stopping for a sandwich. Just because you feel like you’re making a minor life choice doesn’t mean you are, and just because you feel like you should be allowed to make an irrational choice just this once doesn’t mean the universe isn’t allowed to kill you anyway.

I don’t mean to make this excessively dramatic. It’s possible that being irrational here wouldn’t have messed me up. I don’t know, I didn’t live that outcome. But I highly doubt that this was the only opportunity I’ll get to be stupid. Actually, given my goals, I think it’s likely I’ll get a lot more, and that the next ones will have much higher stakes. In the near future, I can see people—possibly including me—making decisions where being stupid sounds like “oops” followed by the dull thuds of seven billion bodies hitting the floor.

This is genuinely the direction the future is headed. We are becoming more and more able to craft our destinies, but we are flawed architects, and we must double- and triple-check our work, lest the whole world collapse around us like a house on a poor foundation. If that scares you, irrationality should scare you. It sure terrifies the fuck out of me.

The Purpose of College, Past and Present

A lot of people complain that college isn’t doing a good job preparing people for the workforce. They toss around statistics like “only 27% of college grads have jobs related to their majors” and “only 62% of college grads have a job that requires a college degree”. Evidently, the only goal of college nowadays is career preparation, and colleges (or maybe college graduates) are failing at this goal.

Why is that? What’s wrong?

To answer that, let’s look back nearly 250 years, to the era of the Revolutionary War. When the first publicly funded colleges came into existence, their purpose was to educate the top 5-10% of highly gifted white boys, so that they could become leaders and exceptional men. Our modern sensibilities might be offended by the exclusiveness of “a small percentage of highly gifted white boys” (it leaves out girls, people of color, and non-highly-gifted white boys), but for the time, it was very progressive. In Britain, college was explicitly a privilege reserved for the very wealthy. The Americans wanted to move away from that; they decided that college should be available to any highly gifted white boy, regardless of income.

One of the main contributors to this ideal was Thomas Jefferson, who wrote Bill 79, “A Bill for the More General Diffusion of Knowledge”, which was presented numerous times through the 1770s and 80s before a heavily revised version was put into law in 1796.

This was still the general attitude through the 1800s. When W.E.B. Du Bois argued for the education of African Americans, he expressly said that college was only for the “talented tenth”, or the “best minds of the race”, so that they could “reach their potential and become leaders”.

“The Negro race, like all races, is going to be saved by its exceptional men. The problem of education, then, among Negroes must first of all deal with the Talented Tenth; it is the problem of developing the best of this race that they may guide the mass away from the contamination and death of the worst, in their own and other races.”

This view remained until the turn of the twentieth century, when it gradually started to change. College became a status symbol; after all, it was only available to the top 5-10% of people, so having gotten into college meant you had something special. Something special that employers wanted working for them. College graduates, then, got their pick of the best jobs.

Since everyone wants a great job, everyone wanted college. So the GI Bill was passed, opening college to a much larger population following WWII. The problem was that college itself was not causing people to get great jobs; it was a status symbol that was merely correlated with getting great jobs, and people committed the fallacy of post hoc ergo propter hoc. (I’ve written about this before.)

As college became available to more and more people, it became less useful as a symbol. The symbol was, after all, about being part of an elite few. When this happened, college stopped guaranteeing good jobs, or in fact any jobs at all. And yet the cultural zeitgeist had shifted: college was for jobs now, not for educating future leaders. It doesn’t matter that the curriculum has hardly changed at all since 1851. College is supposed to magically procure jobs, despite the fact that it has absolutely no way to do that.

College was never designed to prepare people for specific careers that required specific skills. For the elite future leaders it was designed for, it taught more general things, like developing a cultured, educated, intellectual character. This is great and all, but it doesn’t give you diddly in the way of actual marketable skills.

During the actual time period, when the top 5% were going to college to learn how to be great leaders or whatever, everyone else was learning actual job skills through trade apprenticeships. In fact, many of the leaders did this too: they both went to college and apprenticed at a trade, so they could have a means of making a living while they worked on shaping the nation. Being a person who shapes a nation doesn’t come with a paycheck.

So. Why is college failing at getting people jobs? It wasn’t designed to do that in the first place. During the brief period that it did do that, it was because of correlation based on rarity and status, not causation based on education. And now, despite being basically the same thing that it’s always been, college is saddled with the artificial purpose of getting jobs for graduates, which it is incapable of doing.

Once you know the real history, all the artificial, retroactive explanations of the modern day fall away. All the justifications of the continued existence and high price of college fail to make sense. You start to notice that there is literally nothing that colleges do or pretend to do that can’t be done more effectively somewhere else for a tiny fraction of the cost.

You want a liberal arts education? Sure! Go to the library. Read the classics. Read Shakespeare and Dante and then learn Latin and read Caesar and Catullus and Cicero. Go to museums and learn about art. Go to symphonies and learn about music. You don’t need a university for a liberal arts education.

You want to learn more pragmatic stuff, like math and coding and writing? Sure! Take some online courses. I recommend Gotham Writers, Udemy, Free Code Camp, Edhesive, and Khan Academy.

You want a great career? Sure! Look over the job market, see what kinds of skills are marketable, see what kinds of skills you’ve got, and start looking. If you’re just getting started, I recommend the Praxis program.

You want to network with other smart, interesting, accomplished people? Sure! There are a huge number of online groups and forums, as well as tons of conventions and other in-person gatherings.

You want a status symbol to put on your resume? Well, okay, maybe for that you want college. Get good grades in high school, get 5s on APs, ace the hell out of the SAT, get National Merit, don’t forget your extracurriculars… basically work your butt off for four years in high school, then apply to a bunch of colleges. If you’re both lucky and successful, you’ll get the opportunity to pay a bunch of money in order to work your butt off for four more years so you can put Harvard, Stanford, Yale, or whatever on your resume. And yes, this is a very useful and valuable status symbol. But it will have taken eight years of your life and possibly hundreds of thousands of dollars.

If you want a different route, try this: apply for jobs at prestigious companies in your chosen field, and if you don’t get in, develop your resume and skill set until you do. It’ll require about as much hard work as college, but unlike college, you’ll actually learn useful skills in the process. Also unlike college, you won’t be in debt at the end; rather, you’ll have been paid for your trouble.

The only reason you should consider going to college is if you’re planning on going into a field that is very, very strict about requiring one. (“All the job listings say they require one” does not count.) For example, if you want to be an accountant, you need a CPA. You cannot sit the CPA exam without 150 credit hours of college, so you’re kinda stuck there. Similar concept with doctors, lawyers, and the like.

Still, in those circumstances, it’s just because the world hasn’t caught up to the uselessness of college yet.

So, should you go to college? Maybe. It depends on your specific goals. But is college the right path for everyone? Absolutely not. And is it a surprise that college isn’t doing a good job at preparing people for careers, given the history? Absolutely not. Actually, it’s a surprise that it’s doing as well as it is.

For everyone to have a good path, the entire educational system needs to be overhauled. But for any given individual, just think long and hard about your career. Don’t march in lockstep down the college path just because college.


A version of this post was published on Praxis’s blog on Nov 5, 2018. Check it out!

What Is a Tech Cert, and What Is It For?

Last month, as part of my portfolio project, I got three MTA (Microsoft Technology Associate) certifications, in Java, JavaScript, and HTML/CSS. I documented this fact through my project updates, but if you don’t know what these certifications are, I didn’t offer much explanation.

So, let me take this opportunity to explain what a technology certification is, why it matters, and why if you had to choose between some certs and a college degree, you should choose the certs.

There is a wide variety of certs, offered by a large number of companies, demonstrating proficiency with a ton of disparate technologies. Each cert is accorded a certain level of respect in the tech space, based on how central the company offering the cert is to the area of technology it tests. For example: SQL Server, one of the most widely used SQL database systems, is owned by Microsoft. As such, the MCSE (Microsoft Certified Solutions Expert) in SQL is one of the most highly respected SQL certs.

Most companies that offer certifications offer them in tiers. I’ll use Microsoft as an example. The MTA is the lowest level of Microsoft certification. The next level up is the MCP (Microsoft Certified Professional), and then the MCSA (Microsoft Certified Solutions Associate). To get an MCSA, you need to pass three or four related MCP exams. The top tier is the MCSE. To get an MCSE, you need to already have an MCSA, then pass one or two additional (much more difficult) exams.

These certifications demonstrate specific things to employers. For example, “I have an MTA in this” means “I am skilled enough at this to get an entry-level job doing it”. “I have an MCSE in this” means “I am an expert at this and have a lot of experience using it”.

Tech certs and college degrees occupy very different niches. Where tech certs demonstrate that you have a certain level of hands-on skill with a specific technology or technological area, a college degree demonstrates (hopefully) that you have broad proficiency with technology in general. Where a certification is product-specific, company-specific, and clearly leveled, a degree is focused on a subject area and doesn’t necessarily cover any particular language or technology, and while it does come in levels (AS, BS, MS, PhD), those levels correlate with time spent more than with skills earned.

If I say “I have a CCIE”, you know that I have a very high level of technical knowledge about routers and networking in general, and Cisco routers specifically (Cisco being one of the main manufacturers of routers). This is incredibly useful knowledge to employers, who now know exactly what I can do for them. If, however, I said “I have a Master’s degree in computer information technology”, you only know that I’ve spent six years immersed in technology. You don’t know what kind of technology, and if you’re an employer looking to hire someone who knows how to work with Cisco routers, you’ve got no clue if I can do that. My degree might have required one class in networking, or ten, or none at all. You have no idea.

It’s not just that degrees can be highly unspecific and not very useful to employers looking for specific skills. When someone says “I have a degree”, you don’t even know if their knowledge is up-to-date.

Certifications are always up-to-date, in one of two ways. Some certifications are only valid for a certain amount of time. For example, the CISSP (Certified Information Systems Security Professional, a cybersecurity certification) expires after three years, at which point it needs to be renewed. Other certifications are version-specific. For example, if you get an MCSE in SQL Server 2012-14, you have it forever, but you’ll probably want to get one in SQL Server 2016 as well once the newer version becomes ubiquitous.

But a degree doesn’t work like this. Once someone is taught a thing in a class, there is no requirement that they maintain that knowledge to keep the degree up-to-date. Furthermore, the thing taught in the class may not even have been up-to-date at the time it was taught. A lot of colleges have reputations for teaching things that are out of date, not out of malice, but because tech changes faster than colleges can. It’s rare to find college classes that teach the latest and greatest, and it’s common to find colleges teaching material that’s out of fashion, unnecessary, or just plain obsolete.

There’s yet another problem with degrees: they aren’t vendor-specific. I mentioned the CCIE before: the networking certification offered by Cisco. That last part is important. The benefit of a CCIE isn’t just that it says “This person knows routers.” It’s that it says “This person knows routers. Sincerely, the guy who makes the routers.” With a degree, it’s like “This person knows how tech works. Sincerely, a college that you may or may not have ever heard of.” So it’s not just the lack of information, it’s also the lack of credibility behind that information.

It is also important to note something kind of unique about tech: many of the seasoned working professionals in the tech space don’t have degrees in tech. This is because when these professionals were starting out in tech, the field was so new that there weren’t many degrees in it. They got degrees in other things: math, electrical engineering, underwater basket weaving, whatever. The thing that made them tech professionals was their technical knowledge, not what came after “BS in” on their resume. And nowadays, with the advent of certifications, these professionals have MCSEs or CCIEs, not college degrees.

Basically, if you have to choose between a candidate who has a college degree in technology and a candidate who has no degree but has several certifications and a good portfolio, you’re going to pick the latter. Even so, there are reasons for degrees in tech to exist besides “colleges wanted to jump on the tech bandwagon”.

Firstly, college degrees in tech demonstrate a commitment to learning technology over an extended period. Since you can get certs over however long you want so long as you keep them up-to-date, an employer has no idea whether the candidate with three certs has spent one year or five learning that material. College, by contrast, has a consistent, fast pace, so an employer can infer from a college degree that the candidate is capable of maintaining a high workload for an extended period of time.

Second, college degrees are designed to give you an understanding of the fundamentals of the field you’re entering, including at least the basics in the breadth of the field, plus some significant depth in at least one area. A major downside of having only certs is that you may not have a good foundation: you may be a one-trick pony who knows only one thing, and may do that one thing well, but may not know about how it fits into the bigger picture, what it depends on, or what depends on it.

The big advantage of a degree is that you’re going to come out of it with a general understanding of every major sub-discipline. You come out of a degree in tech with a general knowledge of networking, operating systems, electronics, math and logic, and programming. This is generally how every degree works, and it is in fact a valuable service that colleges provide.

The theoretical ideal candidate has both a degree in technology (for the breadth of knowledge and commitment) and multiple certifications (for specific, up-to-date knowledge about vendor technology). Often in the real world, though, employers will pass up someone with a nice shiny degree in favor of someone who has the right certs and a good portfolio.

Why College Should Not Be Free

At the moment, there is a debate over whether or not college should be made free for everyone. And at first glance, the obvious answer is yes. College is outrageously expensive, and making it free would allow everyone access without forcing anyone into debt.

But there’s a presumption backing this “obvious” answer, and the presumption is that college is necessary. The answer would be perfectly reasonable in a context where the thing involved is a basic human need (e.g., food or clean water), but college is not one. I covered some of the reasons in my essay, I Am a Four-Year College Opt-Out (high monetary cost, high opportunity cost, lack of applicability of the coursework to the real world, etc.), but in essence, college is not the only path to success, and for many people, it’s not even a very good one.

But even if college isn’t necessary, what harm would it do for college to be free for those who want to attend? After all, debt is crippling the nation’s youth, and wouldn’t it be nice if that went away? Of course only a stingy old fart who doesn’t think young people “deserve” an education would say free college is bad, right? Why am I writing this essay?

To answer these questions, let me skip back in time and tell you a story. It is 1920. The uppermost level of compulsory education isn’t 12th grade; it’s 8th. High school as we know it now does not exist; in its place sits something that looks more like college: an elite, expensive program that only accepts the top 5% of applicants. Because high schools accept only the very best, those who graduate are almost guaranteed high-paying jobs.

This began to change in 1954. Some guys saw all the high school graduates getting great jobs and had a bright idea. If we make high school free, they thought, then everyone will be able to get a high school education, and thus a great job! The problem was that the only reason high school graduates had gotten great jobs was the rarity of their education. And over the next fifty years, as high school was made free and subsequently mandatory, a high school education became completely useless. The only thing gained was four more years of compulsory schooling before children, now more properly young adults, could begin working.

This has happened before! This whole argument and discussion (“Should we take this elite program and make it free? Of course we should, because it’ll give everyone good jobs!”) has happened before! It will be just as ineffective this time as it was last time, because nothing has changed. Post hoc ergo propter hoc is still a logical fallacy, and people are still committing it, in exactly the same way as before. Those who fail to learn from history are, as they say, doomed to repeat it.

Now let me present a possible vision of the future. It is 2056. College has been made free and mandatory, and the little value it still had has been completely erased. The only thing gained has been four more years of compulsory schooling before young adults, now more properly adults, can begin working. The societal definition of “child” goes from “under 18” to “under 24”. The average human lifespan doesn’t change, it simply becomes normal that humans spend the first full quarter of their lives in the artificial school environment, which is just as pointless as ever: the students care as little about learning as the teachers care about teaching, nobody gets paid enough, and everyone is miserable. One day, a 24-year-old kid reads an article which says people are trying to make graduate school free, and she thinks, “Huh. That seems like a good idea. Then, everyone could have a good education for free.”

In writing this, I don’t want to cast my ballot on this issue as “the system is fine the way it is”, because the system is not fine. But the way to fix it is not to commit the same logical fallacy that we already made less than a century ago. The problem is complicated, and a complicated problem cannot be fixed with a three-word solution. “Make college free” is not the answer.

I hope this essay can open a discussion on the real answer. It will have to contain a solution to the college debt crisis. It will have to take into account the fact that our current public school system was designed to churn out good factory workers, despite the fact that we now need entrepreneurs instead. Preferably, it should contain a solution to the public school system’s current problem of not teaching important life skills (how to pay taxes, what laws exist and how to change them, etc.), but I know better than to get my hopes up. I’ll settle for finding a way to teach skills that are legitimately important for starting a career, such as cheerfulness and good writing. But even if this essay can do none of that, I hope it has at least made you consider this debate in a different light.

See you tomorrow.

I am a 4-year-college opt-out. Here’s why.

A few days back, a family friend asked when I planned on going to college. I said, “I’m not. At least not right now.” I didn’t have the time to explain my reasoning to her, so I don’t think she understood. But here, I have the time and the words, and I’ll try to explain the reasoning behind this massive and unconventional life choice.

Let’s skip back ten years, to the summer of 2008. My siblings and I are debating with our parents about where to go for dinner. As with most families trying to decide on things, we vote on it. By purely counting heads, the option the kids want should win, but my parents throw a wrench into the rules: “adults get five votes”. Suddenly, the kids are outnumbered.

I don’t mind all that much – I still get free food, after all – but I’m curious as to the reasoning. “So we get five votes when we turn 13?” I ask, being a Jew, who gets her bat mitzvah and becomes an adult at 13. “No,” says my mother, “it would be silly for you to be able to just age into it. You have to earn your five votes. For our purposes, an adult is someone who’s graduated from college.”

From that point onwards, I made it my goal to get into Stanford, where both my parents had gone, and in fact where they met. It seemed an achievable goal: from a genetic standpoint, I had everything I needed. Furthermore, I considered, my parents were not genetic flukes in terms of intelligence: most of my grandparents had gone to high-end schools. My maternal grandfather went to Harvard, my paternal grandfather to Yale.

I took my first class at my community college at 14, thinking it would up my chances for getting into Stanford if I already had an Associate’s by the time I graduated from high school. My brother, who had decided on a similar track, took the class with me. I wasn’t sure about a major yet, but it also didn’t really matter: there were a ton of prerequisites I had to take, for both high school and college, before I needed to worry about it. So, we took Spanish 1.

I had a great time in that class for a number of reasons. I was absolutely stoked to be going to college, albeit a podunk community college. My professor was great (only later did I find out that this was a blessing rather than the rule), the coursework required a lot of study but was nonetheless fun, and I got awesome grades. I felt I was preparing well to go to Stanford in four years.

The knowledge that I was going to a four-year college, and furthermore a top-tier one (Stanford preferably, but Yale, Harvard, or something comparable would also do), saturated my entire childhood. I made every decision based on what would get me into the colleges I wanted to attend. By my sophomore year, I’d either taken or planned for seven AP tests. When it came time to study for the PSAT, I spent nine months doing so to the near-exclusion of all else. For my Associate’s, I chose only those electives that would prepare me to apply to the colleges I wanted to attend.

Until sometime in the spring of 2018, when everything changed.

Unlike a lot of major life shifts, it didn’t happen slowly. It happened in one fell swoop of three chaotic weeks, as I realized three fundamental things.

Firstly, I was sick of taking classes. It had been four years, and community college courses had turned from a joy to a slog. Seldom did anything I worked on in my courses relate to the real world, and if it did at all, it reflected real-life work through a funhouse mirror. I was close to graduating high school with my Associate’s in computer science, and I felt I saw the light at the end of the tunnel. At last, I thought, I could start doing meaningful work and creating value for real people! Wait, no, I couldn’t. I had to go to college. Didn’t I?

I started to doubt my rationale for pursuing college so ardently. I’d decided I would do it when I was a child, mostly because my parents had both done it. When a child thinks something is worth pursuing, it’s not because they’ve done a cost-benefit analysis and decided that it’s the logical conclusion based on their knowledge and previous experience. A child thinks something is worth pursuing because it sounds impressive, fun, or cool.

Further, societal expectations had pushed me away from questioning the idea of going to college. Even when I questioned the usefulness of college, I needed only to look at any book or article, or to talk to any human being, and I would have my wishes to attend college validated. On top of that, even the people who said college might not be a necessity for everyone continued to say it was the best option for smart people. And given our societal propensity for scoring children on standardized tests, it was always very apparent to me how smart I was, at least from an intellectual standpoint.

But now it became apparent that college was not the best option for me, or even a terribly good one. Everyone knows that the cost of college in dollars is excessive and often prohibitive, but on top of that, I had to face the opportunity cost. My goals in life, like most other people’s, had to do with the real world, with making money in real life, with having a career. If I went to college, I would put all that off for four more years. And for what? A name on a résumé and a few connections. The former might not even be necessary: I didn’t know enough about the work world yet to know whether any of my future employers would even care whether I had a degree or not!

Lastly, I realized that I had another option. Sometime in the spring, I heard about a business internship program called Praxis. Their business model: create a more practical college alternative by giving young people a six-month professional bootcamp, followed immediately by a chance to apply what they’ve learned through a six-month internship at a startup.

The process of learning about Praxis was what kickstarted my questioning of the path I’d presumed, since childhood, that my life would take. I had to face the facts: lately, despite my stated goal and plan of getting into a top-tier school, I had been moving towards it like a duty, an obligation. When I was younger, learning had been a joy; now, I yearned to apply what I learned. I had kept going because I saw college as an inevitable end for a smart person like me; if not that, what else?

The answer to that previously-unanswerable question became Praxis. The application process was intensive, with a multitude of essays and interviews on a very tight timeframe, but I came out the other end with a scholarship and a plan. A very different plan than the one I’d had before, but also a plan I liked a lot better. A plan that brought the light at the end of the tunnel closer, instead of further away.

It was still hard to cope with my decision. For the next few months after my turning-point, I doubted myself a lot. It felt horrible that I’d spent so long working monomaniacally towards a goal only to quit at the last second. But I had to remind myself, I wasn’t quitting. I was choosing a better alternative, since I had more information at seventeen than I’d had at eight (surprise surprise!). I reminded myself that the statistics showed the uselessness of college as a preparation for real-world jobs. That tons of people, entrepreneurs especially, became very successful without degrees. That the field I was going into—technology—didn’t have a strict degree requirement (unlike, say, accounting, where you cannot practice without a CPA, and to sit the CPA exam you need ~150 credit hours of college). That Praxis provided me with the sort of community I was hoping to get from a top-tier school.

At the time of this writing, I’m a month into the six-month professional bootcamp. So far, I’ve hand-coded my personal website (the one you’re on right now!), fixed up my LinkedIn and résumé, and created a personal pitch deck (more on that in this article). Everything I’ve done is immediately applicable to my career.

Contrast this with the inapplicable classes and assignments from last year: AP Latin, during which I badly translated texts by Caesar and Virgil that had been translated much better by others, and tried impossibly hard to be a little less horrible at literary analysis; AP English Composition, during which I wrote a ton of essays and analyses I’m never going to publish, because the prompts were so obscure and the topics would be boring to read about, and also tried to be a little less horrible at literary analysis; and AP Java, which consisted mainly of writing code on paper, by hand, with a pencil: something no programmer in their right mind ever does.

Finally I’m working on projects and learning skills that will actually matter to me in the long run. While I was in school, I frequently had to say to myself, “This may seem obscure or stupid or useless, but it’s moving me towards my eventual goal, so it’s worth it.” Now, I don’t need to: everything I do has an obvious connection to my goal. I was dreading the next four years of my future; now, I have a fresh start.

I’m looking forward to it.