Why My Rabbi Asked, “Who Here is an Atheist?”

One morning in autumn many years ago, I was sitting in synagogue with my family. My granddad used to drag us there when he came over for the high holidays. Most of the service was spent on ritual prayers and readings in Hebrew, so I wasn’t paying much attention.

That is, until my rabbi asked a very odd question. “Who here is an atheist? Please raise your hands.”

I blinked in confusion as I watched the hands go up around me. From my vantage point (standing on top of the chair so I could see when the rabbi blew the shofar, which was always my favorite part of every service), I could see that maybe three-quarters of the synagogue had put their hands up.

Seeing the hands of my family raised around me as an indication that it was socially acceptable to do so, I put mine up as well. None of us had ever really believed the God stuff, after all, but I’d always thought we were a minority in this respect. Evidently not.

The rabbi nodded. Though his speech has eroded in my memory, it went something like this. “Faith is a tool to be used towards the goal of doing good deeds. If you wish to use that tool, you may; though I see many of you are not in need of it. But all of us must remember that it is just a tool. If you have all the faith and love for God in the world, but you are cruel to your fellow man, you are not a good Jew. You cannot fall into the Christian trap of worshipping the tool in absence of its purpose; you would not praise a hammer except for its ability to pound in nails.”

I came away from this with the realization cemented in my mind that Judaism is not fundamentally a religion. It is fundamentally an ethnicity and a culture.

If Judaism were primarily a religion, it would have some pretty major problems. For one, Jews aren’t allowed to proselytize: we can’t go around trying to convert people the way Christians do. Nowhere in our holy books does it say that you’ll go to Hell if you’re not a Jew. And behind that lies another reason Judaism wouldn’t work well as a major religion: converting to Judaism is really hard. The two main ways of converting are marrying a Jew and being adopted by a Jewish family.

If you look at Judaism as a culture and ethnicity that simply arose from a religion, though, these things make sense. The quality of “Jewish-ness” is within my family, within my bloodline, and unless I choose to marry or adopt you (either of which would add you to my family), I can’t convert you.

Further, all Jews have what’s called the right of return. Since I have it, I would be able to immigrate to Israel and gain Israeli citizenship if I wanted to, because it is my homeland, albeit indirectly. This right couldn’t exist if Judaism were much of anything besides an ethnicity.

Because Judaism isn’t primarily a religion, being a good Jew is the same thing as being a good person in general: be kind, obey just laws, have good morals, and so on. Again, this makes pretty intuitive sense: we can’t be judged against our faith, so the only thing we can be judged against is our morality.

By contrast, when you have an actual religion (I’m going to use Christianity as an example, but I’m not picking on Christians; many religions work this way), there tends to be a problem with morality. A good Christian is someone who puts their love of God first. But sometimes, people tack “to the exclusion of all else” onto the end of that sentence, and the religious leaders don’t seem to mind. Actually, frequently the people who think that way are the religious leaders.

As a result, you have a lot of Christians (some of whom I’ve met) who say they follow Christ, but who seem to have completely missed the whole “love thy neighbor” thing. They were praised for their faith instead of for being a good person.

But, as my rabbi said, you shouldn’t praise the tool in absence of its purpose. Don’t praise faith in absence of its ability to help you be kind.

Explain Your Culture

I answered a lot of questions about culture growing up. As an American Jew, I belonged to a minority culture, so nobody really knew about it. They didn’t know what I believed, what foods I ate on what holidays, what purpose those foods or holidays served within the culture, and so on.

Like many people in minority cultures, I was always happy to answer these questions. My family has hosted several non-Jews for our holidays over the years, and when our goyish (an informal term for non-Jewish) guests inevitably ask questions about the rituals or foods, we tell them. One time I brought kosher macaroons in to work for Rosh Hashanah and got to explain both the holiday and the concept of kosher.

These are highly informal and easy explanations. Our goal isn’t to proselytize—Jews aren’t allowed to proselytize anyway, but even if it were allowed, that’s not our goal, so we wouldn’t do it—our goal is simply to educate. For example:

“This little funny hat is called a yarmulka, and men are supposed to wear it to bring them closer to God. Women don’t need to wear them because the ability to give birth brings us closer to God.”

“We prepare these foods because they’re culturally significant, or just because we like them. But we need to make sure that if we make something just because we like it, that it follows our dietary rules for holidays. Those rules are called kosher.”

“Rosh Hashanah is the Jewish new year. Our holidays run on a lunisolar calendar rather than a purely solar one, so they shift around on the Christian calendar. And the current Jewish year is 5779, because our years aren’t counted from the birth of Jesus; they’re counted from the traditional date of the creation of the world.”

Christians in America have it completely the opposite way. They can practically assume that their culture is ubiquitous, which has a lot of implications.

If your culture is ubiquitous, you never have to explain your holidays. You can just presume that people know about them. You can talk in depth about highly specific issues with just about anyone, because you can presume they have the necessary cultural background. Every business closes its offices in observance of your holidays.

To help my American Christian pals understand what it’s like to not be a cultural majority, consider this.

Imagine you had to ask your boss for time off to celebrate Christmas, which he has never heard of. Imagine driving over an hour to get to the only church in your area, when at the same time there are three different synagogues within a two-mile radius of your house. Imagine your entire culture decides to make Labor Day into a huge celebration, because you’re all sick of not doing anything while the rest of the country celebrates Rosh Hashanah. (This is exactly what happened with Chanukah. It’s actually a very minor holiday that American Jews made into a much bigger deal because they wanted something to do at Christmastime.)

Unless you move somewhere Christianity isn’t the dominant culture, you’re probably not going to experience any of this personally, but that’s fine. There’s nothing inherently wrong or right about being a member of either a majority or a minority culture.

There is, however, one thing that members of majority cultures could learn from members of minority cultures: an attitude of explanation.

Growing up Jewish, I never really understood Christianity. Not for any lack of Christians around me, but for a lack of Christians around me who were willing to answer questions. People in majority cultures aren’t used to answering simple questions about their culture; if I asked who Jesus was, people would look at me like I’d just said I’d never heard of toilet paper. In their eyes, I’d just said I didn’t know about something they thought was both ubiquitous and completely impossible to live without. By contrast, I’ve had a ton of people ask me who Moses is.

Similarly basic question, different culture.

But if every member of a majority culture has this attitude, then the small percentage of the population that wasn’t raised with that culture is left out of the loop. They didn’t learn about the culture growing up, and they never will.

So, the best thing to do if you’re a member of a majority culture is to be willing to answer questions. Even questions that seem like they ought to be obvious.

The (Perceived) Problem with Long-Distance Relationships

Is love an emotion or a choice?

If you’re like many people, you’ll say that it’s an emotion. It’s the floaty, bubbly feeling you get around someone. It’s the perfection of every little thing they do. It’s the pointlessness of the rest of the universe when you’re together. To quote Dean Martin, when the world seems to shine like you’ve had too much wine, that’s amore.

What if I told you you’re wrong? And what if I told you that this definition of love is the biggest cause of failed relationships?

Hear me out.

Relationships are hard. Lots of people say that. And on a surface level, if you’ve been in a relationship, you understand the truth of the statement intuitively. But let’s look deeper. If love is an emotion, how can relationships be hard? Deciding to keep working on a project even though it’s complicated and difficult is hard. Deciding to not give up on your little sister even though she’s being an entitled brat is hard. Deciding to apologize to your lover after a fight is hard.

Being happy isn’t hard. Being sad isn’t hard. Being angry isn’t hard. And being in love isn’t hard.

What’s hard is maintaining a relationship.

Thus, there have to be multiple components to love. One part is, of course, that awesome feeling at the beginning, but another is what many people call commitment: the choice to be together, to care about each other, to support each other through thick and thin and such. From experience, the latter is much more important. Your brain acclimates to anything after a while, even the company of The Perfect Person™, and eventually the emotion will fade. Your commitment will not. Do you honestly think that those couples who’ve been together for 70+ years are still love-drunk?

What does this have to do with failed relationships, long distance ones especially?

If you think that love is an emotion, you’ll just quit when you acclimate to your partner’s presence and the emotion leaves. You’ll think you don’t love them anymore. In reality, you’re simply no longer infatuated – infatuation being the word I’ve come to use for that initial state of love-drunkenness. You’re perfectly capable of continuing to love that person if you simply commit.

Since distance can prolong infatuation – you simply don’t see the person often enough to acclimate – long-distance couples who finally move in together are the most susceptible to this problem.

If, on the other hand, you know that love is a choice, you won’t need to worry about what happens when the world stops shining. It’ll keep on turning nonetheless. Your love will go on. You’ll actually be able to work through the logistics of a real relationship as opposed to simply drifting through it because nothing except that person’s presence matters.

It’s not like once the infatuation wears off, you have no feelings for this person anymore – you’re still affectionate and loving – but you no longer feel like the only sustenance you need is their company. You start to decide that no, that habit isn’t endearing, it’s annoying. You start to notice stuff they do that isn’t perfect. And over time they see those things in you, too. But if you’re committed, you both work around the things that you can’t change, and you work on fixing the things you can, and ideally you both become better people in the process.

That is what relationships are about. That is the difference between love and mere infatuation.

Do People Want To Learn?

I’ve written before about how the public school system doesn’t teach the right things. But there’s a bigger problem underlying the whole rotten mess of concrete and bureaucracy that is the modern public school system. There’s one single assumption that underlies the whole thing, and that one assumption is untrue.

That false assumption? “People don’t naturally want to learn.”

If you believe people don’t naturally want to learn, then what about babies and toddlers? Nobody formally teaches little kids to sit and crawl and walk and talk, but everybody knows that all little children learn these things. It’s really obvious that little children are wired to learn and to learn voraciously. Just look at any two-year-old who annoys the grownups by asking so many questions.

So if the aversion isn’t innate, when exactly does “people don’t want to learn” set in? If you look at kids, it seems to happen right around school age. Children who a year earlier were annoying the grownups with their extreme curiosity mellow out, then proceed to sink further and further into “I hate learning”.

Still, the same exact children who don’t want to learn in school continue to learn voraciously about things that interest them. It may be things adults don’t approve of, like cartoon characters, or video game stats, or how to bypass the screen time lockouts on their phones. But this is still learning, and it’s still curiosity. It’s learning in absence of being forced to learn, which is why it continues to be fun. So evidently, people can and do learn things that they’re motivated to learn and interested in learning, at all ages.

“But people don’t learn the things they need to learn!” you may exclaim. Let me ask you, what exactly is it that we teach in school that people need to learn? And how do we know that they’re not going to learn those things naturally, outside of school?

What do people need to learn? Reading. Writing. Basic arithmetic. How to exist as an adult. But everyone learns these things of necessity; you can’t function in the world without them. You don’t need school to teach that. And after they have the minimum knowledge they need to function in the world, individuals follow their specific interests to logical conclusions.

Still, what about all those other things that we teach in schools? Spanish, differential equations, mitochondria, whatever? What about how to get into college?

Interestingly, there is a strong and growing subculture of people who raise their kids with no enforced education. And the research shows that these kids get into college and have successful careers at rates equal to or even greater than those of the publicly or privately schooled population. (Sources: Smithsonian, KQED)

So if just letting kids do what they want is so great, why do we all think instinctively that it shouldn’t work?

John Holt wrote this in his book How Children Learn. “All I am saying in this book can be summed up in two words—Trust Children. Nothing could be more simple—or more difficult. Difficult, because to trust children we must trust ourselves—and most of us were taught as children that we could not be trusted. […] What we have to do is break this long downward cycle of fear and distrust, and trust children as we ourselves were not trusted.”

We don’t think unschooling should work even though it does because the societal wisdom about children, which we all have somewhere in our brains, is wrong. We were taught not to trust how children naturally learn. But we were taught by the very system that profits off not allowing children to learn naturally; we were taught propaganda.

If you don’t need to force people to learn, then, is there no place for teachers, classes, students?

Not at all. There is still a place for all of that. Just look at all the non-mandatory classes that people take over their lives. People take classes in music and art and tech and science and history and every other thing. Classes can be a very effective way to learn… if the people in them want to learn.

When I was getting started as an artist, I experimented with a number of media by taking classes. The ones I signed up for were explicitly “for adults”. Not because they had any risqué content, but because nobody there wanted to play the schoolteacher, the authority figure. They were meant for adults because the organizers trusted adults. They didn’t trust children.

With some combination of my mom’s persuasive skills and my dashing charm (just kidding, I was like twelve; it was 100% my mom’s persuasive skills) I got into these classes “for adults”. One of them was a wildlife drawing class.

It was a ton of fun and a great experience. I’d been out of school for a while at that point, so I didn’t think it was strange that the teacher just walked around giving advice and making critiques, telling us to help ourselves to complimentary cookies and soda while we drew. I made a few friends in that class, most of whom were many times my age.

A few years later, I took a ceramics class. This one was explicitly “for teenagers”; I think the age range was 15-18 or 13-18 or something like that. The kind of thing that’s meant as an extracurricular for high schoolers.

It was a weird experience. Besides the complete lack of age diversity, there were a ton of really weird rules and expectations. No more than one person was allowed to leave the studio at a time to use the bathroom. I wasn’t particularly annoyed, since it didn’t inconvenience me; I was just baffled. It was so unnecessary.

Not only was the class setup weird, but the teacher was also weird. They (I don’t remember their gender) were really distant and not friendly at all, and they seemed to expect this kind of deference. You know those pompous customers you get working retail, where they just expect you to hand them the universe on a silver platter? This teacher acted a bit like that.

I talked to my mom about it on the ride home, and she informed me that it wasn’t that the class or the teacher was weird. It was because it was a class for teenagers.

With classes for adults, you can be sure that 100% of the people there are there because they want to be. Nobody forces an adult to take an art class. If the student has learned what they wanted to learn, the objective of the class has been achieved. But with classes for teenagers, it’s a completely different story. The teacher can’t be sure that the student wants to be there, or wants to learn. Further, they don’t have to answer to the student; the real master for a teacher of teens is those teens’ parents. The teacher tries their best to make the class interesting and fun, but they have to control what the kids do so that the parents are pleased, and generally act like a schoolteacher – which severely limits their ability to make the class interesting and fun.

There is still a place for classes and teachers. These are valuable things. But the public school environment, where the students don’t want to learn and the teachers don’t want to teach and literally nobody wants to be there at all, that is not useful.

So where do we go from here? How does the establishment change?

I propose taking the funds currently being funneled into the public school system and using them to fund optional classes, held at public libraries. Once the “school subjects” are optional, we can make mandatory the things that are important for everyone to know regardless of their interests – things necessary for functioning in modern society. Teaching basic technology, psychology, and economics would be a good start: after all, there’s an awful lot of people, tech, and money in the world right now. It also makes sense to teach things like basic self-care and first aid, what laws there are, how to pay taxes, how to get insurance, and so on. These mandatory classes can then fill the psychological void left by the public school system (appeasing all the grownups who love telling kids what to do), as well as the physical void of the empty school buildings.

What do you think? If you’ve got ideas for how the system could be changed, or reasons why it shouldn’t be, stick them in the comments. I’d love to hear from you.

View From the Bicycle Café

Bicycle Café, North Park, PA

For my painting class, I had to go around town and do five “thumbnail sketches” of landscapes. Only one of those sketches would be used to do a landscape painting, which was the next assignment.

This came out of one of the sketches my professor rejected.

The Purpose of College, Past and Present

A lot of people complain that college isn’t doing a good job preparing people for the workforce. They toss around statistics like “only 27% of college grads have jobs related to their majors” and “only 62% of college grads have a job that requires a college degree”. Evidently, the only goal of college nowadays is career preparation, and colleges (or maybe college graduates) are failing at this goal.

Why is that? What’s wrong?

To answer that, let’s look back to the era of the Revolutionary War. When the first publicly funded colleges came into existence, their purpose was to educate the top 5-10% of highly gifted white boys, so that they could become leaders and exceptional men. Our modern sensibilities might be offended by the exclusiveness of “small percentage of highly gifted white boys”—it leaves out girls, people of color, and non-highly-gifted white boys—but for the time, it was very progressive. In Britain, college was explicitly a privilege reserved for the very wealthy. The Americans wanted to move away from that; they decided that college should be available to any highly gifted white boy, regardless of income.

One of the main contributors to this ideal was Thomas Jefferson, who wrote Bill 79, “A Bill for the More General Diffusion of Knowledge”, which was presented numerous times through the 1770s and 80s before a heavily revised version was put into law in 1796.

This was still the general attitude through the 1800s. When W.E.B. Du Bois argued for the education of African Americans, he expressly said that college was for only the “talented tenth”, or the “best minds of the race”, so that they could “reach their potential and become leaders”.

“The Negro race, like all races, is going to be saved by its exceptional men. The problem of education, then, among Negroes must first of all deal with the Talented Tenth; it is the problem of developing the best of this race that they may guide the mass away from the contamination and death of the worst, in their own and other races.”

This view remained until the turn of the twentieth century, when it gradually started to change. College became a status symbol; after all, it was only available to the top 5-10% of people, so having gotten into college meant you had something special. Something special that employers wanted working for them. College graduates, then, got their pick of the best jobs.

Since everyone wanted great jobs, everyone wanted college. And so the GI Bill was passed, opening college to a much larger population following WW2. The problem was that college itself was not causing people to get great jobs; it was a status symbol that was merely correlated with getting great jobs, and people committed the fallacy of post hoc ergo propter hoc (I’ve written about this before).

As college became available to more and more people, it became less useful as a symbol. The symbol was, after all, about being part of an elite few. When this happened, college stopped guaranteeing good jobs, or in fact any jobs at all. And yet the cultural zeitgeist had shifted: college was for jobs now, not for educating future leaders. It doesn’t matter that the curriculum has hardly changed since 1851. College is supposed to magically procure jobs, despite the fact that it has absolutely no way to do that.

College had never been designed to prepare people for specific careers requiring specific skills. For the elite future leaders it was designed for, it taught more general things, like developing a cultured, educated, intellectual character. This is great and all, but it doesn’t give you diddly in the way of actual marketable skills.

Back in that period, while the top 5% were going to college to learn how to be great leaders or whatever, everyone else was learning actual job skills through trade apprenticeships. In fact, many of the leaders did this too: they both went to college and apprenticed at a trade, so they would have a means of making a living while they worked on shaping the nation. Being a person who shapes a nation doesn’t come with a paycheck.

So. Why is college failing at getting people jobs? It wasn’t designed to do that in the first place. During the brief period that it did do that, it was because of correlation based on rarity and status, not causation based on education. And now, despite being basically the same thing that it’s always been, college is saddled with the artificial purpose of getting jobs for graduates, which it is incapable of doing.

Once you know the real history, all the artificial, retroactive explanations of the modern day fall away. All the justifications of the continued existence and high price of college fail to make sense. You start to notice that there is literally nothing that colleges do or pretend to do that can’t be done more effectively somewhere else for a tiny fraction of the cost.

You want a liberal arts education? Sure! Go to the library. Read the classics. Read Shakespeare and Dante and then learn Latin and read Caesar and Catullus and Cicero. Go to museums and learn about art. Go to symphonies and learn about music. You don’t need a university for a liberal arts education.

You want to learn more pragmatic stuff, like math and coding and writing? Sure! Take some online courses. I recommend Gotham Writers, Udemy, Free Code Camp, Edhesive, and Khan Academy.

You want a great career? Sure! Look over the job market, see what kinds of skills are marketable, see what kinds of skills you’ve got, and start looking. If you’re just getting started, I recommend the Praxis program.

You want to network with other smart, interesting, accomplished people? Sure! There are a huge number of online groups and forums, as well as tons of conventions and other in-person gatherings.

You want a status symbol to put on your resume? Well, okay, maybe for that you want college. Get good grades in high school, get 5s on APs, ace the hell out of the SAT, get National Merit, don’t forget your extracurriculars… basically work your butt off for four years in high school, then apply to a bunch of colleges. If you’re both lucky and successful, you’ll get the opportunity to pay a bunch of money in order to work your butt off for four more years so you can put Harvard, Stanford, Yale, or whatever on your resume. And yes, this is a very useful and valuable status symbol. But it’s taken eight years of your life and possibly hundreds of thousands of dollars.

In case you want a different route than that, try this: apply for jobs at prestigious companies in your chosen field, and if you don’t get hired, develop your resume and skillsets until you do. It’ll require about as much hard work as college, but unlike college, you’ll actually learn useful skills in the process. Also unlike college, you won’t be in debt by the end; rather, you’ll have been paid for your trouble.

The only reason you should consider going to college is if you’re planning on going into a field that is very, very strict about requiring one. (“All the job listings say they require one” does not count.) For example, if you want to be an accountant, you need a CPA license, and most states require 150 credit hours of college before you can be licensed, so you’re kinda stuck there. It’s a similar story with doctors, lawyers, and the like.

Still, in those circumstances, it’s just because the world hasn’t caught up to the uselessness of college yet.

So, should you go to college? Maybe. It depends on your specific goals. But is college the right path for everyone? Absolutely not. And is it a surprise that college isn’t doing a good job at preparing people for careers, given the history? Absolutely not. Actually, it’s a surprise that it’s doing as well as it is.

For everyone to have a good path, the entire educational system needs to be overhauled. But for any given individual, just think long and hard about your career. Don’t march in lockstep down the college path just because college.


A version of this post was published on Praxis’s blog on Nov 5, 2018. Check it out!

Public School: Rethink the Concept

Let me ask you a question. If you could magically instill every youth in America with specific knowledge, what would you teach them?

Presumably, you’d want to teach them something that would be useful to every one of them. So, what kinds of things are important for every American? How about you teach them how the American government works. The world economy. The Fortune 500 companies. You could tell them which things are legal and illegal, because though everyone knows murder is illegal, there are other things that are more complicated and less obvious. You could teach them their human rights.

Perhaps you could also teach people how to take care of themselves. You could explain what medicines to take for what problems, symptoms for common ailments, and under what circumstances to go to the doctor. You could tell them about things that are harmful to their health: smoking, vaping, unprotected sex, etc. You could talk about symptoms of mental illnesses and healthy ways to cope. You could teach them first aid.

Why not also talk about practical life skills? How to get a job, vote, pay taxes, get a mortgage, get and maintain insurance, or budget finances. Most people are going to become parents, so how about we teach them how to raise children?

These are not theoretical questions. We have a method of instilling knowledge into American youth. It’s called public school.

If you think about it, the basic concept is ingenious. We have a program with mandatory attendance, for which purpose we have the resources to transport children to and from a truly gargantuan number of individual buildings. At each building, we have a standardized curriculum, which has specific yearly checkpoints for completion. For twelve whole years, from age six to eighteen, we have the undivided attention of the nation! The undivided attention of the future!

Yet alas, we squander our opportunity. We teach pointless trivia that, in the age of the internet, can be found out instantly. We force people to learn things that aren’t useful to the majority of them.

Why do we do this?

Governments move slowly. The things we teach in school today would have been much more useful to memorize sixty or so years ago, when you actually didn’t have a calculator on you at all times. Part of the problem is that the bureaucracy just hasn’t caught up yet.

But there’s another problem. Though people are pushing to change schools, they’re all pushing in different directions. Many of them aren’t asking the fundamental question: “what is the point of this period of mandatory education, anyway?” And of those that are asking, most reply that the goal is college. As if that does anything other than pass the buck.

It seems to me that the buck should stop immediately. The purpose of educating youth is to prepare them to be adults. One part of being an adult is making a living. Another part of being an adult is being a good citizen (knowing what laws exist and how the government works, perhaps also learning history and civics). Adults need to be financially self-sufficient. Adults need to know how to avoid scams. Adults need to know how to raise children – even if they themselves don’t have children, they will inevitably be around kids at some point. Adults need to know how to care for themselves and others.

We teach none of that in high school or college.

A lot of people have it stuck in their heads that it has to work this way. That public school is supposed to be useless, as if it’s a necessary evil. That teaching everybody calculus and teaching nobody first aid is a reasonable state of affairs. It’s not.

There needs to be a complete rethinking of the purpose of the school curriculum. Not just “how do we do a better job of preparing more people for college”. Not just “how do we tweak the existing formula to make it a little better in some areas”. We need to completely rethink the concept.

Why You Should Mix Your Own Black

Mix your own black what? Mix your own black paint.


You can buy black paint from a store. It will be just about the purest possible black, the exact color of “black as the pit”. But here’s why you shouldn’t do that.

First of all, what black you use is important. You’ll use it as a base for all your darkest colors. Since without dark tones, light ones don’t stand out, what color you use to mix those shadows is going to be one of the most important parts of your painting.

A lot of artists like to talk about the “soul” in a piece of art. That seems confusing, but here’s what it means. Each person sees the world differently, so what you choose to paint (in terms of both the subject you paint and what colors you use to paint it) depends on how you see the world. How exactly you choose to make your black depends on how you see the world, too. So, if you mix your own black, it will fit in better with the rest of your painting. They both have your “soul”.

One of the apparent downsides to mixing your own black is that it will never be consistent. But this is actually an upside in disguise. For example, for this painting, the black I made was tinted purple. The black on my palette above (which is for a different painting), however, is tinted brown. A purple-black suited the former painting, whereas a brown-black suits the latter. If I’d used store-bought black, I wouldn’t have gotten to make the decisions that led to blacks suited to their respective paintings: both would have been banal and generic.

Could I have mixed one single color into store-bought black to attain a tint? Yes. But in that case, I as the artist would only have chosen one color; the paint company chose the rest of the colors to make the black for me. For me to have the most control over my own painting, I choose instead to mix my black.

So basically, you should mix your own black because then, it fits with your painting better. You made them both, and not only that, you made the black to suit the painting.

There’s one more reason to mix your own black: you get better greys. When you use store-bought black, your grey turns out as a very lifeless, generic, neutral grey. But the problem is that most greys are not generic neutral: they’re tinted with something. The tablecloth on my dining table is a light grey tinted with yellow. The paper of my Oxford Classical Dictionary is a light grey tinted with orange. The surface of my electric keyboard is a mid grey tinted with blue. If I were painting these things, I would want my greys to reflect all these differences, which is why I would mix my own. Store-bought black will not give you interesting greys.

Now that you know why you should mix black, let me briefly tell you how.

To mix black, just take your darkest colors and mix them in different proportions, depending on what you want the resulting black to look like. When I mixed my purple-black, I took ultramarine blue and crimson in equal parts, then mixed in a bit of dark green and burnt sienna. By contrast, when I mixed my brown-black (above), I swapped the roles of the crimson and the burnt sienna and added proportionately a lot more of the dark green. Same colors (I only own nine total), but the different proportions produced a different result.
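If it helps to see the same-pigments-different-proportions idea laid out concretely, here’s a rough sketch in Python. The pigment RGB values and recipe weights are approximations I made up for illustration, and averaging RGB channels is only a loose stand-in for how physical paint mixes (real pigment mixing is subtractive), but it shows how shifting the proportions nudges the resulting dark toward purple or brown.

    # Rough illustration only: these pigment RGB values are approximations
    # picked for the sketch, and a weighted RGB average is just a loose
    # analogy for physical (subtractive) pigment mixing.
    PIGMENTS = {
        "ultramarine blue": (15, 5, 70),
        "crimson":          (90, 10, 35),
        "dark green":       (10, 45, 25),
        "burnt sienna":     (90, 45, 25),
    }

    def mix(recipe):
        """Blend pigments by a weighted average of their (approximate) RGB values."""
        total = sum(recipe.values())
        return tuple(
            round(sum(PIGMENTS[name][ch] * w for name, w in recipe.items()) / total)
            for ch in range(3)
        )

    # Purple-black: equal parts blue and crimson, a touch of green and sienna.
    print(mix({"ultramarine blue": 4, "crimson": 4, "dark green": 1, "burnt sienna": 1}))
    # -> green stays low relative to red and blue, so the dark reads cool and purplish

    # Brown-black: swap the crimson and sienna proportions, add more green.
    print(mix({"ultramarine blue": 4, "crimson": 1, "dark green": 3, "burnt sienna": 4}))
    # -> green comes up and blue drops, so the dark reads warmer and browner

The point isn’t the numbers themselves – it’s that the same short list of ingredients yields noticeably different blacks once you change the ratios, which is exactly the control you give up with a tube of store-bought black.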

That’s all there is to it! Now, go forth and mix yourself some black.

What Is a Tech Cert, and What Is It For?

Last month, as a part of my portfolio project, I got three MTA (Microsoft Technology Associate) certifications, in Java, JavaScript, and HTML/CSS. I documented this fact through my project updates, but if you don’t know what those are, I didn’t offer a lot of explanation.

So, let me take this opportunity to explain what a technology certification is, why it matters, and why if you had to choose between some certs and a college degree, you should choose the certs.

There are a wide variety of certs, offered by a large number of companies, demonstrating proficiency with a ton of disparate technologies. Each cert is accorded a certain level of respect in the tech space, based on how central the company offering the cert is to the area of technology the cert tests. For example: SQL Server, one of the most common flavors of SQL, is owned by Microsoft. As such, the MCSE (Microsoft Certified Solutions Expert) in SQL is one of the most highly respected SQL certs.

Most companies who offer certifications offer them in tiers. I’ll use Microsoft as an example. The MTA is the lowest level of Microsoft certification. After that, the next level is the MCP (Microsoft Certified Professional), and then the MCSA (Microsoft Certified Solutions Associate). To get an MCSA, you need to pass three or four related MCP exams. The top tier is the MCSE. To get an MCSE, you need to already have an MCSA, then pass one or two additional (much more difficult) exams.
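To picture the ladder at a glance, here’s a tiny sketch in Python. The tier names and requirements come from the description above; the data structure itself is just my illustration, not any official Microsoft schema, and the one-exam wording for the MTA and MCP levels is my rough paraphrase.

    # A toy model of the Microsoft certification ladder described above.
    # Tier names and requirements follow the text; the structure is only
    # illustrative, not an official schema.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CertTier:
        name: str                           # e.g. "MCSA"
        how_to_earn: str                    # what you have to pass
        prerequisite: Optional[str] = None  # tier you must already hold, if any

    LADDER = [
        CertTier("MTA",  "pass an entry-level exam"),
        CertTier("MCP",  "pass a qualifying Microsoft exam"),
        CertTier("MCSA", "pass three or four related MCP exams"),
        CertTier("MCSE", "pass one or two additional, much harder exams",
                 prerequisite="MCSA"),
    ]

    for tier in LADDER:
        extra = f" (requires already holding an {tier.prerequisite})" if tier.prerequisite else ""
        print(f"{tier.name}: {tier.how_to_earn}{extra}")

The takeaway the sketch tries to make visible is that the higher tiers aren’t standalone tests so much as accumulations of the lower ones: an MCSA is built out of MCP exams, and an MCSE sits on top of an MCSA.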

These certifications demonstrate specific things to employers. For example, “I have an MTA in this” means “I am skilled enough at this to get an entry-level job doing it”. “I have an MCSE in this” means “I am an expert at this and have a lot of experience using it”.

Tech certs and college degrees occupy very different niches. Where tech certs demonstrate that you have a certain level of hands-on skill with a specific technology or technological area, a college degree demonstrates (hopefully) that you have a broad proficiency with technology in general. A certification is product-specific, company-specific, and clearly leveled; a degree is more subject-area focused, doesn’t necessarily involve any particular language or technology, and while it does come in levels (AS, BS, MS, PhD), those levels correlate more with time spent than with skills earned.

If I say “I have a CCIE”, you know that I have a very high level of technical knowledge about routers and networking in general, and Cisco routers specifically (Cisco being one of the main manufacturers of routers). This is incredibly useful knowledge to employers, who now know exactly what I can do for them. If, however, I said “I have a Master’s degree in computer information technology”, you only know that I’ve spent six years immersed in technology. You don’t know what kind of technology, and if you’re an employer looking to hire someone who knows how to work with Cisco routers, you’ve got no clue if I can do that. My degree might have required one class in networking, or ten, or none at all. You have no idea.

It’s not just that degrees can be highly unspecific and not very useful to employers looking for specific skills. When someone says “I have a degree”, you don’t even know if their knowledge is up-to-date.

Certifications are always up-to-date, in one of two ways. Some certifications are only valid for a certain amount of time. For example, the CISSP (Certified Information Systems Security Professional, a cybersecurity certification) expires after three years, at which point it needs to be renewed. Other certifications are version-specific. For example, if you get an MCSE in SQL Server 2012-14, you have it forever, but you’ll probably want to get one in SQL Server 2016 as well once the newer version becomes ubiquitous.

But a degree doesn’t work like this. Once someone is taught a thing in a class, there is no requirement that they maintain that knowledge to keep the degree up-to-date. Furthermore, the thing taught in the class may not even have been up-to-date at the time it was taught. A lot of colleges have reputations for teaching things that are out of date, not out of malice, but because tech changes faster than colleges can. It’s rare to find college classes that teach the latest and greatest, and it’s common to find colleges teaching material that’s out of fashion, unnecessary, or just plain obsolete.

There’s yet another problem with degrees: they aren’t vendor-specific. I mentioned the CCIE before: the networking certification offered by Cisco. That last part is important. The benefit of a CCIE isn’t just that it says “This person knows routers.” It’s that it says “This person knows routers. Sincerely, the guy who makes the routers.” With a degree, it’s like “This person knows how tech works. Sincerely, a college that you may or may not have ever heard of.” So it’s not just the lack of information, it’s also the lack of credibility behind that information.

It is also important to note something kind of unique about tech: many of the seasoned working professionals in the tech space don’t have degrees in tech. This is because when these professionals were starting out in tech, the field was so new that there weren’t many degrees in it. They got degrees in other things: math, electrical engineering, underwater basket weaving, whatever. The thing that made them tech professionals was their technical knowledge, not what came after “BS in” on their resume. And nowadays, with the advent of certifications, these professionals have MCSEs or CCIEs, not college degrees.

Basically, if you have to choose between a candidate for your job who has a college degree in technology, and a candidate who has no degree but has several certifications and a good portfolio, you’re going to pick the latter. Even so, there is a reason for degrees in tech to exist besides “colleges wanted to jump on the tech bandwagon”.

Firstly, college degrees in tech demonstrate a commitment to learning technology over an extended period. Since you can get certs over however long you want so long as you keep them up-to-date, an employer has no idea whether the candidate with three certs has spent one year or five learning that material. College, by contrast, has a consistent, fast pace, so an employer can infer from a college degree that the candidate is capable of maintaining a high workload for an extended period of time.

Second, college degrees are designed to give you an understanding of the fundamentals of the field you’re entering, including at least the basics in the breadth of the field, plus some significant depth in at least one area. A major downside of having only certs is that you may not have a good foundation: you may be a one-trick pony who knows only one thing, and may do that one thing well, but may not know about how it fits into the bigger picture, what it depends on, or what depends on it.

The big advantage of a degree is that you’re going to come out of it with a general understanding of every major sub-discipline. You come out of a degree in tech with a general knowledge of networking, operating systems, electronics, math and logic, and programming. This is generally how every degree works, and it is in fact a valuable service that colleges provide.

The theoretical ideal candidate has both a degree in technology (for the breadth of knowledge and commitment) and multiple certifications (for specific, up-to-date knowledge about vendor technology). Often in the real world, though, employers will pass up someone with a nice shiny degree in favor of someone who has the right certs and a good portfolio.

11pm, First St.

A selection from a recent painting of mine. WordPress wouldn’t let me upload the full photo so I had to crop it 🙁

I met my fiancé at a convention in Baltimore, at one of the last events of the second day. After it was over, we couldn’t find anything else to go to or do, so we walked through the rooftop garden. Eventually we ended up sitting on a concrete bench at the edge of the sidewalk, looking into the cloudy sky. We sat together and talked until it started raining.

This is painted mostly from my memory and partially from the few photos I found of the Baltimore Convention Center rooftop garden (it’s a shame there aren’t more; it’s beautiful). Originally I tried to draw this with markers, but it wasn’t working the way I wanted, so I decided to pull out the oil paints and paint it instead.