Don't start a start-up or apply for seed funding. There are lots of great hackers out there, and only so many spots for top websites. The odds of putting together a solid product, good marketing, and the right timing all at once are incredibly slim. I talk to people all the time who are working on start-ups with no benefits, no salary, and only a small chance of hitting it big and becoming profitable.
Even if you do get accepted to one of the top seed start-up firms, you're still not guaranteed to be successful. You have to realize that you're really just an exploitable resource for venture capitalists.
On a more serious note, I'd say that anyone who's thinking of going to grad school should read this. Then, if they're still certain they want to go, they should go for it. Just because something's hard or might not work doesn't mean you shouldn't try.
Start-up:
FAIL: On track in your career as a software engineer, but with a couple of years of lost earnings.
Humanities academic:
WIN: tenure-track position. Salary caps out at entry-level software engineer levels, but great lifestyle.
FAIL: adjunct professor or barista. You are 30-40 years old, and less employable than you were at age 22. (Some humanities PhDs do well for themselves, but getting the PhD was just a side errand.)
Note: STEM academic is vastly better than humanities, but be prepared to get out early. If you aren't at the top at any stage of the game, quit.
> FAIL: On track in your career as a software engineer, but with a couple of years of lost earnings.
You might be at -2 years of earnings, but you'll gain the equivalent of +4 years of experience, maybe even more, just from the extra hours you'll put in. It's still a net win.
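To put rough numbers on that (the hour counts are my own assumption, nothing rigorous): two years at ~70-hour startup weeks versus a standard 40 works out to 2 × 70/40 ≈ 3.5 years of equivalent hours, before you even count the breadth of problems a startup forces you to touch.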
I'd say the "startup fail" scenario is even less negative, because I have the feeling that having the experience of even a failed startup makes you more attractive.
I agree that this is true within academia. The maximum accepted age for a first-year graduate student is about 27; first postdoc, 32; tenure-track, 36. If you don't have tenure at 40, you're flushed.
This does not seem to be true outside of academia. If you're good, your career does not take a dive in your 40s. However, the mediocre or uninterested programmers need to move into management before this, because management is more ageist than technology.
In the sciences, I disagree with "If you aren't at the top at any stage of the game, quit". It can be the right decision to get a PhD in a program ranked 11-20, so long as you have options aside from an R1 academic job.
1) You don't have to spend 3-7 years wading through grad-school work before you can start a start-up.
2) Your entrepreneurial experience will most likely be viewed by businesses as valuable. (Failure is acceptable in Silicon Valley.) On the other hand, how do businesses view 30+ year olds with esoteric, non-business-related over-education?
3) The maximum number of profitable businesses or websites out there is some % of the number of people on Earth. There is a very limited demand for professors, capped by the number of schools out there.
4) If you get funding, you'll probably make a lot more than an assistant professor. (Sad but true. There's a big difference between humanities and engineering/professional academic positions.)
5) The upside for a start-up is at least two orders of magnitude higher than academia.
That's false, if you define top by revenue. There is no reason to believe there's any limit to the amount of newly created wealth consumers can absorb.
There's no limit to the wealth that can be created, but the ability of consumers to consume does have limits. Consumables, though, are far from the only way to create value in an economy.
Academia is the right job environment for a certain set of people, but they deserve decent living conditions. They shouldn't be exploited just because society views them (correctly or otherwise; mostly otherwise, I believe) as naive pansies.
Few are arguing that academics ought to be millionaires (although they're certainly more deserving of that than most of the assholes running Wall Street, the Fortune 500, and Hollywood), but they should be paid well enough to support a lifestyle comfortable enough to enable a "life of the mind" (which most professors, for what it's worth, don't get to have). The phrase "we don't pay you to work here, we pay you so you can work here" seems to have been lost on academia.
Also, academia (in the humanities) has much more of a downside.
If your startup fails, you spend a year or two of your life learning a lot of useful skills, but end up out of a job when the money runs out. It sucks, but you end the travail much better off than when you started, and probably better off than if you had toiled in a mediocre corporate job.
In the humanities, if your academic career fails, you've spent a decade of your life getting a relatively useless degree, you have weak job skills for someone your age, and probably some damaged self-esteem. You're not better off than you were when you began; you're in your mid-30s with nothing to show but a publication no one cares about.
This problem is not limited to the humanities. Academia in all fields depends on an army of young grad students and postdocs to staff the research trenches, many times more than there ever will be tenured positions for.
In fact it is more insidious in the sciences, in that people think they are gaining marketable skills that they can fall back on as a Plan B. For example, many delude themselves that they can program, because they had to write a few scripts to manipulate data in their research career. They get weeded out at early stages for software engineering positions ("Have you ever used a version control system?" is a Damoclean interview question).
"They get weeded out at early stages for software engineering positions ("Have you ever used a version control system" is a Damoclean interview question)."
If by "Damoclean", you mean "capricious", then I agree, but it sounds more like an interview anti-pattern. That question doesn't tell you anything about the candidate's capability of understanding a version control system, which is a far more important property than their particular experience with one VCS or another.
This year alone, I've learned at least a half-dozen new tools and technologies that I'd never used in grad school. Yet here I am, using them to do productive work. Would you really not hire me because I'd never used a tool before?
I know this sounds unkind, but it is not a question of whether someone knows a particular tool, which I agree is not always essential. The overwhelming chances are that if a person has never used any version control system at all, they have not written software of any complexity and/or they have not written software in collaboration with others. That they could, in fact, happen to be brilliant at both of those given a chance is an unknown quantity - and a pivotal part of hiring is to minimise the risk of someone not working out.
Of course there are always good answers to questions even when the skill is not there; personally, I have yet to get one on this question.
My advice to anyone who is a non-comp-sci grad (or lacks real software engineering experience for other reasons) and wants to become a programmer is to join an open source project and cut your teeth on that. Not only will it introduce you to many of the things that count, it does so in a way that makes it very easy for a potential employer to verify them.
A counter argument: if they are getting the PhD as evidence of mastery in a topic they love, it doesn't matter what comes after it, unless circumstances will force them to give up what they love until they die.
If one can see just this far, they will be able to plan accordingly. And supposing they do, there is no reason to change course. The point is to know what you want and why you want it; having figured this out, screw what other people think, unless your plans require that as a parameter.
It's fine to say that having a PhD in the humanities for the "love" of the subject is enough, but in at least some of the humanities, that won't get you anywhere. Philosophy, for instance, is an ongoing discussion, and unless you're in academia attending conferences and writing papers and getting published in the same journals that everyone else is, you're not really participating in the field. And you can't really do that unless you're faculty somewhere. It sucks, but that's also where the good philosophy gets done, and you can't do renegade philosophy outside of academia without losing a lot of important context and good work.
> A counter argument: if they are getting the PhD as evidence of mastery in a topic they love, it doesn't matter what comes after it, unless circumstances will force them to give up what they love until they die.
The intrusion of bad "circumstances" is the norm in academia. Actually, it's the norm in life, but academia is set up in a way that makes it nearly impossible to recover.
Also, the "love" people have for a subject or field is highly conditional (not that it should be any other way; that wouldn't make sense). This is mere attachment. Unconditional love for something so abstract probably cannot exist, and if it can, is not desirable in any case. It's legitimate to be intellectually fascinated by finite fields or 19th-century literature, but to say that one loves these things is a stretch. A person who's capable of developing a fascination with X is equally capable of finding a "love" for Y; that's a core trait irrespective of niche-bound particulars.
So when someone is forced into an unappealing niche by academic politics, lives a crappy lifestyle on account of the university's stinginess, and has a fractured family life due to overwork and constant geographic uncertainty, all while having to cope with distracting administrative and bureaucratic chores, it makes sense that this "love" of subject matter would disappear entirely. Most people realize, upon growing up, that they love their families and decent lifestyles more than intellectual abstractions.
In my academic career, I've worked on foundations of quantum mechanics, numerical scattering theory, and now I'm working on medical imaging and computational geometry. On the side, I'm doing some web development (database stuff and also business optimization). I could do basically any technical work and enjoy it. There are some things I like slightly more and am moderately better at (not always the same thing), but those differences are marginal.
To answer your specific question, it's definitely a place worth checking out. There are great CS theory/math people here. The culture is friendly but a bit isolating. You will need to actively work at meeting people you don't directly work for.
I don't disagree with you, although that doesn't contradict what I said: I use "love" in a very strict sense, in the same way you use it in the unconditional sense.
So another issue I didn't make clear was that many people who declare their love for a subject don't really know to what extent this love stretches. In my book it means being alive implies doing X, until life ends.
I can't vouch for any other generations, but it's hard to tell people my age (born 1985) that they /can't/ accomplish anything even if they try their hardest. Life is a lot more fatalistic and free will a lot weaker than we've been taught.
It's very sobering and depressing to learn that scarcity is a very real thing. There are only so many spots for the jobs that people want, and most of the time people get there through personal relationships, luck, or other things most people just can't realistically be expected to have.
This is not unique to your generation. Mine (I'm 49) had different but similar myths -- the rock star, the uber-athlete, the astronaut. This is part of our collective human culture, and to some extent it's sort of a larger extension of what the article is about. It's very difficult to be honest with young people about their odds of making a large-scale difference in a world of six billion people. Even just the opportunity to 'do what you love' is simply not in the cards for the majority. I'm not sure what we can do about that if anything, but it's a fact and it often takes 30+ years before it starts to sink in. Maybe that's evolution's way of keeping us from feeling hopeless before we've had children.
Once you have kids, all that ambition can be transferred to them. That might explain why some parents seem to be completely insane when it comes to their kids.
I call the mindset of my peer group the "Disney culture". It is characterized by magical thinking, such as "if you follow your heart, everything will turn out for the best!"
To some extent, this is valuable. It encourages people to take risks and be creative. The wonderful explosion of technology and prosperity in the 1990s was an expression of magical thinking.
But I think we place too little value on financial stability, independence, and other more boring traditional values. This could be because we've never really had to face hard times until now. I wonder if we will pick up these traits during the recession.
Pretty depressing reading, but good on him for trying to spread the word.
I'm sorry that the chances of getting a professorship in the humanities are small, but I'm even sorrier that there don't seem to be many (or any) viable careers for a humanities student aside from professorship.
That's problem #1 in the current format of humanities graduate education. The assistantships train students to teach classes and grade papers, and the research and dissertation process teach students how to get an academic paper published. It's optimized for generating new professors (tenure-track or otherwise), and there's not much demand to tailor it in other directions since the dream career of most humanities grad students is being a tenured professor. This is less of a problem in subject areas where entering students already have a different career in mind, e.g. engineering.
I won't speculate what those alternative careers could be, but in other improbable fields that attempt to reassert their relevance, applied career paths have a tendency to pop out and eventually be reincorporated into academia as a new, related discipline. For example, psychology has already forked and merged a few times.
> the dream career of most humanities grad students is being a tenured professor
Well that kinda sorta is the problem. The dream career of a science grad student is to make some great discovery and win a Nobel prize - academia is a means, not an end in itself. Academia for the sake of academia is fine in principle - but who's going to pay for it?
Our society as a whole benefits from diversity, in the same way that humanity benefits from a wide gene pool. If you bias heavily towards similarity, you end up with a highly specialised but weak society.
Like etal, I won't speculate on what alternatives might be, but not having them is our collective loss.
I'm pretty sure that society doesn't benefit from all possible diversity. (And then there's the small detail that the humanities aren't all that diverse, and that we can get their diversity from a much smaller population.)
I note that "mother nature" weeds out most mutations.
> I note that "mother nature" weeds out most mutations
'Mother nature' actually creates mutations; without them, we wouldn't be able to adapt to nature's changing circumstances.
> And then there's the small detail that the humanities aren't all that diverse
I didn't say that the humanities were diverse; I said that a diverse society needs to encompass the humanities as well, because that diversity is necessary for it to thrive.
I'd argue that Socrates and Plato (philosophy) are as well remembered as Pythagoras (maths). Surely that says something about the importance of the humanities?
I'm not saying that all diversity is bad. I'm saying that some/most diversity is useless at best.
Note that I'm also not saying that the humanities are useless or bad. I'm saying that it's quite possible that we could get all the benefits from the humanities and spend far less on them. No, that doesn't imply that we shouldn't let people study the humanities and end up driving cabs.
On the other hand, I have a hard time thinking of a better grad degree than a CS master's program. Your job prospects are better than an MBA's or a JD's, you will spend a lot less time in school than an MD, and you will have a much broader employment market than your friends in other engineering and science disciplines.
I was quite impressed by the number of firms at UCSD's job fair interested in recruiting CS students.
A master's degree in computer engineering could give CS a run for its money. There's enough overlap with CS that a computer engineering grad student can learn as much CS as they like, while still having job prospects in a bunch of nice hardware-related jobs. And really, computer hardware is just plain fun.
My hardware will probably break your software. Beat that.
This is actually one of the drawbacks of learning both hardware and software: when something breaks, you can never say that's not your department. You can get away from that by simply buying stuff from other people, but you can't always trust it; I was using some off-the-shelf hardware yesterday that literally exploded. Like in Star Trek.
I guess that is what separates software people from hardware people. You say "my hardware literally blew up yesterday". I think that's a little scary; a hardware guy thinks that's pretty cool.
Are you sure an advanced CS degree will matter as much in 5 or 10 years' time as it does today? Then again, it's only a 2-year commitment, so it's a pretty different level of investment than a PhD.
Does anyone not see the irony in the author's premise when applied to an undergraduate's search for their first big job? Just replace "universities" with "companies" in the sentence below:
"It's hard to tell young people that universities recognize that their idealism and energy — and lack of information — are an exploitable resource."