Throughout my education, I had the repeated feeling that
everything I had done before was pointless. It was as though there was no accumulation,
no building on prior knowledge. Cross the finish line and graduate high school,
and all those silly extracurriculars are forgotten and nobody cares what you
learned. Graduate college, and then you can go into any graduate studies
program. It helps to have majored in the subject you're going to grad school for, but only a little. There is plenty of time to make up for your deficiencies, anyway.
This has been a constant theme in my life. I graduated high school with a good start. I'd had AP chemistry, calculus, and biology, so I probably started college with about one college semester's worth of course work completed. (Though not the actual credit: I declined to take the AP chemistry and biology exams for credit.) But I quickly learned that there wasn't much accumulation or overlap. Taking upper-level physics, it probably helped me a little to have taken linear algebra (for solving problems that collapsed into simultaneous linear equations) and calculus and some differential equations (for equations of motion in classical mechanics and solving for the wave function in quantum mechanics). But in reality I simply re-learned that stuff ad hoc when I needed it. I don't know to what extent I actually "applied" the math I had learned. If my education is any kind of guide, there is very little "transfer of learning" in the world. Almost everything is ad hoc, with extremely limited application.
My humanities and social science courses in high school left me poorly prepared for my college courses. Again, there was almost no building on past learnings. I re-learned world history, having little recollection of my high school courses on the same topic. After college, I began to read tremendous amounts of non-fiction. In grad school I read everything I could get my hands on from Steven Pinker, Dan Dennett, and Richard Dawkins. And I thought, "How was I missing all of this? How did my college education fail to give me a solid grounding on these basic topics of 'philosophy of science'? How did my 'general education' fail to synthesize all of these separate topics of philosophy, biology, history, and psychology into a coherent whole?" But there I was, learning it on my own.
It gets better. Around the end of my graduate school career, I started studying for the actuarial exams. My friend and I printed off some practice problems, which were really basic probability and statistics questions. I found that most of the questions were fairly simple, and the more difficult questions had some simple concept behind them that I just hadn't learned yet. But I'd never taken a stats course in my life. Even the limited amount of "probability theory" from my physics classes, implicit in statistical mechanics and quantum mechanics, was never taught to me at a basic level. If anything prepared me for this exam, it was my philosophy course on logic and reasoning, which did cover some basic probability theory. I studied for the first actuarial exam with a cheap study guide and a large sample of free practice questions I got online. I did read a couple of books on probability theory and distributions, which I borrowed from the university's library. By now I've finished all of the actuarial exams (all nine, or ten depending on how you count the multi-part exam). Plus two online "modules", which are shorter exams that cover a lot of simple insurance concepts. Plus a class each on corporate finance, linear regression, time series analysis, and microeconomics.
After taking my microeconomics class, I had something of a viewquake. It was another "How in the hell was I missing these concepts?!" kind of moment. I consider it an utter failure of my general education, at the high school and college level, that I never had to learn economics. I had a very good textbook. It was Steven Landsburg's price theory text. It lays out everything very nicely and contains some of the most clear-headed reasoning I have ever seen. I have since read anything and everything by Landsburg, and I read tons of books about economics by other authors. One such book was Jeffrey Miron's Drug War Crimes. It was the first place where I encountered the distinction between a "positive statement" and a "normative statement." A positive statement is a statement that's either true or false, something that is fundamentally an empirical question. A normative statement is a "should" question, requiring some kind of value judgment. So many people blather on, egged on by counter-blatherers, because they can't be clear about which kind of statement they are making. Suddenly I had a framework for thinking about the news, analyzing government policy, sometimes even reaching conclusions. There is no coherent way to do this without economics, so I don't know how so many people can finish a formal education without learning any. I keep hearing the argument that "education is about having informed citizens." I don't know what the advocates for this position have in mind. Are students supposed to agitate for the sake of agitating? Are they supposed to learn a canonical set of conclusions from their liberal arts education, without really knowing how to defend those conclusions? Are they supposed to feel more confident in their political conclusions (because now they are "educated") without actually being given the tools to analyze policy? Help me out here. I'm genuinely stumped.
All this brings me up through my actuarial education. Surely with those exams completed, I had learned all the skills required to do my job, right? Nope. My day-to-day job occasionally uses an equation or concept from the formal actuarial syllabus. But I'm more of a data scientist than an actuary. There is some material on generalized linear models and clustering and principal component analysis on one of the upper-level actuarial exams. But to do my job I had to learn the R language, some SQL, the hundreds (thousands?) of little tricks and pitfalls of data manipulation, and various statistical concepts underlying predictive modeling. It's something I've picked up on the job as a result of having to do this or that project. When you need a skill to complete a task, you can generally acquire it fairly quickly. And if you work a modern job, you will most likely be given a problem that nobody has figured out before. It may be something fairly trivial even, like analyzing a novel data set that is specific to your company. But some tasks will stretch you beyond what you've learned in your formal education. If you're lucky and your job is the appropriate level of "challenging", you will constantly be stretched beyond your on-the-job experience, too. I expect this to keep happening to me. My R skills will become irrelevant and I'll have to learn Python instead, or whatever replaces Python. SQL will become obsolete and I'll have to learn MDX. And then someone will invent a clever graphical user interface for all these tools, rendering my hard-won programming skills obsolete once again. I'll have to constantly retool to keep up with emerging trends. I feel like I've gone through at least one cycle like this already. Early on at my first actuarial job, I got pretty good at Excel and VBA. I never use VBA anymore, and Excel just won't cut it for most of the analyses I do.
If your model for education is that you go to school, learn to assemble widgets, then get a job making widgets for the rest of your life, your model is wrong. It is at least a half century out of date. I still hear stunningly naive statements by adults who are several years or decades into their careers.
"How did you pass those actuarial exams without taking a bunch of statistics courses?"
"I think I'll use my irrelevant, 20-year-old foreign language skills to get a different job."
"I don't like my job, so I'll go back to school for another masters degree."
And sure, there's nothing wrong with taking some formal stats courses to help you with statistics-heavy exams, or reviving an old and rusty skill set, or studying a new topic in school. But in each of these cases I sensed a hidden assumption that one needs formal education to actually do stuff, or that stuff from your formal education is inherently useful for landing a job. Nope. Nope on both counts.
If you want a good liberal arts education, pick a topic that you enjoy and read deeply about it. Follow discussions, relevant blogs, academic arguments, lectures on YouTube, even internet shouting matches; track down citations and underlying data. Have a range of interests beyond your core topic. That kind of hobby will round you out as a person, even if it doesn't necessarily build any job-relevant skills. If you want a good career, get a college degree in something that's as technical as you can tolerate. Can't hack math or engineering? Do chemistry. Or biology, or economics, or philosophy, or whatever, and take some courses in other stuff that interests you. Look into industry exams, like the actuarial exams, or the CPCU exams (there are several exam series and certifications geared toward the insurance industry), or Six Sigma, or a nursing degree. Look for good supplemental education. For me that was Datacamp, plus a bunch of textbooks on machine learning, plus Andrew Ng's Coursera course, plus any interesting webinars I could find, plus playing with every interesting new R library I found. The certifications are great for signalling "I'm a competent person," and you'll need this to land the right job. But you'll learn more practical skills just doing your job and reading supplemental materials on your own.
Two things prompted me to lay this all out. Bryan Caplan's book The Case Against Education is coming out soon (February 2nd in fact!). The book will argue that most of formal education is about signalling your existing talent, not building human capital. (You can check out some of his posts at Econlog about his book.) Given my career trajectory, that all rings true. David Henderson's book The Joy of Freedom (a joy to read, by the way) has a long chapter about the uselessness of education. It's a pretty radical indictment of formal schooling, but it rings true. He starts out by asking you to write down the 10 most important things you've learned in your life. Most of them are probably practical skills unrelated to schooling (how to drive, dress, tie shoes, etc.). You may even write down a few items that you actually did learn about in school, but Henderson makes the point that your formal education probably just barely got you started. Learned to type in school? Sure, but you probably learned to type well by practicing on your own time. Learned math in school? Sure, but you probably learned to apply it to real-world problems by figuring out something at work or by running your own business. So maybe school gives you a bare-bones foundation that you then build upon, but it's hard to believe you wouldn't acquire that on your own as needed.