Drawbacks to Formal Education

By Brian Tomasik

First published: . Last nontrivial update: .

Summary

This piece enumerates some downsides of educating oneself in a formal degree program. Of course, whether these reasons are decisive depends on the specific situation.

Introduction

It was May 2009. I sat in a hot room in the Lang Performing Arts Center at Swarthmore College, waiting for my last Honors exam to begin. I thought to myself with excitement: "After 11 years of graded tests, this may be the last written exam I ever have to take in my life." And so far, that fact has remained true.

I really like school. I've had a deep love of learning ever since around 8th grade, when I began taking altruism seriously and realized that what I learned had real-world importance. But I also found school to be somewhat tiring. It required lots of memorization, skills practice, and assignments that didn't have relevance to my personal goals.

Now that I'm not engaged in formal education, I'm more deeply in love with learning than ever. Some days I become obsessed with exploring a topic and dread the moment when I'll have to put it down to do something else. I'm able to absorb many topics that expand my view of the world and give me a clearer picture of the landscape of reality.

Many times during 2013-2014, I asked myself whether I should pursue formal continued education so that all the learning I'm doing on my own could be translated into a signal of expertise—either to show future employers or just to quote in my biographical summaries. Each time I decided that formal education was suboptimal given my situation. The following section lists the reasons.

Why not formal education?

Cost

Many reputable master's degree programs cost at least ~$30-40K per year. (This may not be true in some European countries, where master's programs are fully funded by the state.) There's also a time cost to apply and possibly move.

Teaching load

In a PhD program, the financial cost of tuition may be compensated by the time cost of teaching. I'm told that in some European countries, even humanities PhDs are well funded, so the pay can be decent. In the US, philosophy PhD programs involve steep competition for comparatively limited funding.

As an example, in the University of Minnesota (UMN) philosophy PhD program, teaching assistants (TAs) earn $15,000 per year plus health care plus tuition for a nominal 20 hours of work per week. (Personally I'm slow and expect I'd take longer. One friend of mine in another kind of program said his TA duties took 20-40 hours/week; however, another friend reports spending only ~6 hours/week.) Say the health insurance is worth, generously, $10,000. Ignoring tuition, the package is worth $25,000 for 20 hours/week. By comparison, if you were instead working in software, you could earn over $100,000 per year at a time cost of 40-50 hours/week. (Tuition can be ignored on both sides, because if you don't attend grad school, you don't need it.) So it looks like earning for a year or two and then living on the savings would allow for more total research time than being a grad student who does TA work.[1]
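
To make the comparison concrete, here's a rough back-of-the-envelope calculation in Python using the figures above. The ~50 working weeks per year and the 45-hour midpoint for the software job are my own assumptions for illustration, not data from any source.

    # Back-of-the-envelope comparison of effective hourly compensation,
    # using the illustrative figures from the text (not exact data).

    WEEKS_PER_YEAR = 50  # assumed number of working weeks per year

    def hourly_rate(annual_value, hours_per_week):
        """Effective compensation per hour worked."""
        return annual_value / (hours_per_week * WEEKS_PER_YEAR)

    # TA: $15K stipend + ~$10K health insurance, ignoring tuition, ~20 hours/week
    ta_rate = hourly_rate(15_000 + 10_000, 20)

    # Software: ~$100K/year at ~45 hours/week (midpoint of the 40-50 range)
    software_rate = hourly_rate(100_000, 45)

    print(f"TA:       ~${ta_rate:.0f}/hour")        # ~$25/hour
    print(f"Software: ~${software_rate:.0f}/hour")  # ~$44/hour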

Of course, there may be significant altruistic benefit to serving as a TA because you can influence young minds, and you do learn important material in the process. These are real countervailing factors. But other grad-student tasks, like grading papers/exams and applying for grants, are less edifying. On the other hand, UMN also offers paid research-assistant positions, which could be even more useful than teaching, depending on the topic.

My irregular sleep schedule

I'm not good at maintaining a regular sleep schedule. Even if I do keep one, it often involves staying up too late and being tired for early-morning classes. Being tired in a class is a waste of time. In contrast, when I learn on my own, I'm always fully alert, because if I'm not, I can just go to sleep until I feel refreshed.

Fixed syllabi

I learn best by studying whatever I want whenever I want, based on intrinsic motivation. For instance, maybe I've gotten really excited about cognitive architectures recently and want to read about those. Other times I might desperately want to learn physics. When you follow your motivations, learning is effortless because you're always doing exactly what you want to do. In contrast, learning in a formal course means accepting the pace of the instructor. And even if you've gotten sick of the material, you still have to learn it. I feel that when I'm doing something I'll want to do later but don't feel like doing now, I'm wasting willpower points.

Loss of intrinsic motivation

Some studies suggest that if you pay people more for a job, the intrinsic satisfaction they derive from it declines. In a similar way, learning is fun until someone makes you do it. Once you get graded and have to complete assignments, the focus shifts from learning for its own sake to learning because you have to.

Narrow focus

Exploring the best ways to make an impact on the world means taking a big-picture view and knowing something about everything. Getting a degree to impress employers in a particular field means focusing narrowly. These goals are not particularly compatible.

Locked-in focus

Enrolling in a degree program commits you to spending the next 2 to 7 years on that topic. Historically, my views on the most important subject for me to learn about have changed on a timescale more like once a year. It doesn't make sense to get partway through a program only to decide that another focus area may have higher priority.

Wrong focus

In a degree program, you direct your attention to what material you'll need in order to complete the homework and do well on the exams. When I learn, I seek big-picture understanding of a subject and look for connections between the material and altruism. These are two different types of tasks, and in view of inattentional blindness, focusing on one may hinder focus on the other.

In July 2014 I re-learned some biology material I had been taught in 9th grade. This time, rather than aiming to memorize vocabulary words, I was looking for a high-level sense of how molecular biology works and what kinds of sentience-like complexities such systems contain. Learning the same material gave me a significantly different perspective based on what I was trying to get out of it.

As another example, I've occasionally toyed with studying physics in a formal setting, but then I realize that manipulating symbols in problem sets is not my main purpose. Rather, I want an intuitive picture of how physics works at a high level so that I can adjust my overall philosophical orientation. This is better served by casual dabbling in physics to get the gist of the subject than by, say, learning the exact methods of solving equations.

Can't skip less relevant material

I think it's useful to expose myself to some material that might at first glance seem unimportant because my initial assessment might have been wrong. However, once I have enough experience in a subject, I can generally tell what's more important and what's less important for my purposes. When I learn on my own I can skip the less relevant details, but this isn't true in a structured course.

Tests are silly

Memorization doesn't make sense. Time limits don't make sense. In some subjects there can be value in developing fluency with the material, such as, perhaps, memorizing multiplication tables. But generally in the real world, when you're working on a task, you naturally develop facility with whatever you do a lot of. There's no need to memorize the countries of Africa or the names of cell organelles or how to solve differential equations; you can just look them up if they become important later. Most material isn't important to retain in one's head, which is why students quickly forget it after the course ends. Forgetting is an important part of intelligence. What matters is whether you get the "gist" of the material so that you know what to look up if it becomes relevant later on.

Whither formal education?

I think the reasons to attend college rather than studying autodidactically are, in priority order:

  1. signaling intelligence and assiduity
  2. networking with others, especially people who may become influential later on
  3. actually learning the material.

Most of what I learned in college I could have absorbed on my own at least as easily. To be sure, what I learned was very important, but there's no reason to pay tens of thousands of dollars per year for something you could get for free (or for the few hundred dollars it costs to buy the textbooks). College is essentially an expensive status symbol. I wonder how long conventional college education will persist in the face of online courses.

The prestige of elite universities is overblown. I know a number of people who attended non-elite schools, or who didn't even go to college at all, who are really smart. Presumably there's a statistical difference on average between elite and non-elite students, but it's hard to discern in a real-life context. When people's alma maters aren't explicitly named, I often can't tell which people in a group have elite backgrounds and which don't. I find the same for elite vs. non-elite professors: I can't tell from their papers and lectures whether I should regard them as at the top of their fields until I look at their affiliations.

Similar comments apply to being a specialist in a field. In the age of Wikipedia and the Internet, ordinary people can become deeply informed about academic topics, and I'm aware of several autodidacts who know much more about a particular subject area than many who studied that general field at school. Of course, there's a selection effect at work here: The autodidacts whom I read are obviously more capable than the average person, given that they're writing about the topic. But my point is that one doesn't need a degree in a field before one can make contributions to that field. In general, I find that those who express the most awe for advanced degrees are those who know least about the subject matter in question.

I don't think that pursuing continuing education would be worth it for me just for the signaling value. In general, I feel that people should do what they know is most useful at any given time and not worry about long-term plans to signal their competence. Prove what you know by what you do. Make use of your knowledge. Too many instrumental subgoals over too long a time horizon can lead to getting sidetracked. I think it's better to adopt an agile approach in which you directly start tackling problems and see where that takes you, rather than making elaborate plans that become irrelevant when the winds change.

In any case, I've never found that I couldn't do something because I lacked formal credentials. If you want to publish a book or give a TED talk, you should probably start writing and lecturing for public audiences right now, rather than spending 7 years to add three letters after your name. There is a correlation between having a PhD and being intellectually influential, but this is easily explained by confounding variables.

All of that said, I do think college is extremely important in society at large, especially liberal-arts education. Not everyone will explore diverse topics on their own, and colleges' distribution requirements help create more well-rounded students. There's a social benefit to pairing liberal-arts course requirements with the signal of a college education that employers want. Without college, society would probably be more narrow-minded.

Learning from colleagues vs. self-study

Sometimes it's assumed that you'll learn a lot more about a topic by being immersed in an organization where people are working on it than you could from the outside. This is then seen as a significant argument for pursuing careers that give you access to the inner workings of a company or research field.

I think if your goal is to learn some very specific things being done by a particular group, then yes, there is no substitute for being there directly. Often organizations have a lot of "tribal knowledge" that's not written down, and if it is written down, it's probably in internal chats and private documentation.

If you have a personality such that you're energized when people around you are working on something, and you find it unmotivating to read things on your own, then it also becomes more likely that immersion in a particular environment will amplify your knowledge of that domain significantly.

However, I think there are many cases where being immersed in a particular environment is not necessary when your goal is merely to learn about a given field at a relatively high level of abstraction. That's because there's often a great deal of information publicly available to learn from: academic papers, blog posts, interviews, social-media posts, Reddit discussions, open-source software projects, etc. I find that Reddit is often a good place to see people in a given field speaking candidly about a topic with a minimum of corporate-speak obfuscation.

When I worked at Microsoft from 2009 to 2013, I learned a great deal about the particular things our team was doing, but I didn't learn that much about technology in general (or even, for that matter, about parts of Microsoft outside of my core team). Since leaving professional software work, I've continued learning about information technology on my own, filling in many gaps in my knowledge. I feel like the information I've gleaned on my own has been of higher quality than what I gleaned in a company environment, because when I learn on my own, I can seek out explanations for beginners and really try to understand why things are as they are. In a company, the primary imperative is producing output, not learning the theory behind how things work. At Microsoft I used a lot of technologies that I didn't understand and didn't have time to unpack. Thanks to self-study in the years since, I now have much better insight into some of the tools I had been using there.

In summary, I think joining a particular community is great if your goal is to optimize for understanding what that particular community is doing. But the insight this provides into the rest of the world is limited; most of what we know about the wider world comes from public information sources.

Footnotes

  1. Incidentally, this same sort of logic explains why I haven't considered a career as a professor, even though I think I would be well suited to such a job. Professors work a lot (an average of 61 hours/week), and not a lot of that is spent on research: "just 17 percent of the work week and 27 percent of weekend work" (see the rough calculation below). A professor's salary for 30+ hours/week of teaching and administrative duties is worse than a programmer's salary for 40-50 hours/week of programming work. Once again, it seems better to earn to fund yourself unless the teaching or reputation benefits of being a professor are significant enough. Also, professor salaries will likely only decline going forward if, as seems probable, MOOCs take off substantially more.
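
     To make the time arithmetic a bit more concrete, here's a rough sketch in Python. The split of the 61 hours into 51 weekday hours and 10 weekend hours is my own assumption for illustration, not a figure from the cited survey.

         # Rough sketch of how a professor's 61 weekly hours might divide into
         # research vs. everything else. The weekday/weekend split is an
         # assumption for illustration, not data from the survey.

         total_hours = 61     # reported average work week
         weekday_hours = 51   # assumed
         weekend_hours = total_hours - weekday_hours  # assumed ~10

         research = 0.17 * weekday_hours + 0.27 * weekend_hours  # ~11 hours/week
         non_research = total_hours - research                   # ~50 hours/week

         print(f"Research:            ~{research:.0f} hours/week")
         print(f"Teaching/admin/etc.: ~{non_research:.0f} hours/week")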