An Aspiring Scientist’s Frustration with Modern-Day Academia (junod.info)
211 points by mad on Sept 9, 2013 | 155 comments



It's kind of hard to get through this, not because the points ring untrue (I agree with many), but because they seem a bit naive.

I think there's a certain type of person that is particularly attracted to advanced work in science and technology -- the systems purist, who believes strongly that the world "should" work in mostly objective ways. This is a strength in that the world doesn't change very much without these idealists pushing for their vision, but it's also a horrible weakness because these sorts of people don't tend to internalize that any project among groups of humans is going to be messy and inefficient and frustratingly subjective. The successful people adapt, and learn to bend the messy ecosystem to their will. The others...don't (they do things like writing dramatic, self-important missives about how the system is failing them, and cc it to the entire department. ahem.)

If you make it all the way to the end of a PhD program and you haven't realized that it's mostly about how to make steady, measurable progress in a messy, complicated, political world, you haven't yet reached the end of the process.


I see this a lot with new programmers that come to work for us. Many of them understandably think that some of our practices are obviously wrong and should be changed, but few of them figure out how to be effective in spite of the obvious correctness of their opinions.


Yep, agreed. It's a youthful affliction, and it isn't specific to PhD programs.

Time either beats it out of you, or you become this very pathetic, unfortunate sort of character as you get older. There are few things sadder than adults who blame the world for being insufficient for their needs.


Well that was an uninspired display of passive-aggressive ad hominem and cliche.


One can complain AND work on fixing things at the same time.


Can someone please change `wity' to `with' in the title?

This critique is well written and was interesting for me to read, having just finished my PhD a few months ago. I can relate to a number of the issues he raised. However, I must take issue with his point (8). While I think it certainly is important to make sure that academics contribute back to larger society in some capacity, I think it is a bit premature to conclude that this means turning the focus towards more applied work. There are simply too many examples of basic research with seemingly no value to anything of practical interest that turns out to be very valuable in industry and medicine some years down the road to dismiss incubation of purely theoretical work offhand. I think this fact is more or less appreciated by society at large, and they are very happy to support Edward Witten and Terry Tao and so forth, with the understanding that it is a long-term investment with some large expected return. Of course these returns have long tails, so most results will be useless, but that is beside the point if you are interested in total returns.

EDIT: typos


The difficulty with 'basic research with seemingly no value' (a good example is the Fourier transform, which at the time was seen as a mathematical curiosity) is that it lacks accountability. How do you make sure that the academic that is selling their project as 'basic research' isn't just pulling a fast one on you and sucking your funds to tickle their intellectual jollies?

This problem is even worse when you take taxpayer funds, because transparency and accountability are fundamental attributes of good governance, and the claim that 'it will be useful someday' potentially pushes the accountability aspect out to t=infinity.


It's very difficult to predict ahead of time what will be useful and what won't. Given this limitation, there is already a reasonable amount of accountability: many different people in the field are involved in both deciding whom to fund and whom to publish. Moreover, people track how well past research does (based on citations as well as impact more generally) to evaluate any individual researcher.

This is already more accountability than in some government departments or parts of large corporations!

Since nobody can predict everything ahead of time, we have to take risks and explore possible dead ends. Moreover, we really want to avoid getting everyone stuck in the same mindset, so we need people to challenge the status quo occasionally. Think of research as a large search problem--you sometimes have to take seemingly bad moves in order to avoid local maxima. Punishing risk-taking just because you're worried about somebody tickling their "intellectual jollies" is rather short-sighted.
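To make the search analogy concrete, here's a toy sketch (the landscape and the 30% acceptance rule are made up purely for illustration): pure hill climbing gets stuck on the first peak it finds, while a climber that occasionally accepts worse moves can cross the valley to a higher one.

```python
import random

# Toy fitness landscape: a local peak (value 3) and the global peak (value 9).
landscape = [1, 2, 3, 2, 1, 5, 9, 5, 1]

def greedy_climb(start):
    """Pure hill climbing: move to a better neighbour until none exists."""
    i = start
    while True:
        neighbours = [j for j in (i - 1, i + 1) if 0 <= j < len(landscape)]
        best = max(neighbours, key=lambda j: landscape[j])
        if landscape[best] <= landscape[i]:
            return i  # stuck: every neighbour is downhill
        i = best

def climb_with_bad_moves(start, steps=200, seed=0):
    """Sometimes accept a worse neighbour, so the walk can cross valleys."""
    rng = random.Random(seed)
    i = best_seen = start
    for _ in range(steps):
        j = rng.choice([n for n in (i - 1, i + 1) if 0 <= n < len(landscape)])
        # Always accept improvements; accept a worse move 30% of the time.
        if landscape[j] > landscape[i] or rng.random() < 0.3:
            i = j
        if landscape[i] > landscape[best_seen]:
            best_seen = i
    return best_seen

print(landscape[greedy_climb(0)])  # → 3 (trapped on the local peak)
print(landscape[climb_with_bad_moves(0)])  # usually 9: risky moves escape the trap
```

The punitive-funding version of this is forcing `allow bad moves` to zero: you guarantee nobody wastes a step, and also guarantee everyone converges on the nearest mediocre peak.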


>Moreover, people track how well past research does (based on citations as well as impact more generally) to evaluate any individual researcher.

Do you not see how this could be bad? You're implicitly subsidizing a select few - so who gets to be the gatekeeper that decides what is or isn't bad? Surely not people who are selected because of their high training, where "high training" winds up being indirectly judged as "people who were trained by the closed society of individuals who just happen to be the selectees"?

I mean, really, that's what we're doing. As the system becomes more closed and self-interested, and dependent on extracting funds from the government, the more the danger of becoming detached from reasonableness - and then you DO have to worry about people tickling their intellectual jollies.

Now, if we had the very same system, except it were funded as, say, a private non-profit entity, it would surely have the same problems. But it's less morally suspect, because at least the people donating their money into it should have known that risk beforehand - and can choose to directly defund their part in the rotten enterprise when they figure out what's going on.

> Punishing risk-taking just because you're worried about somebody tickling their "intellectual jollies" is rather short-sighted.

You're absolutely right. But that choice is not one that government should be making, because of its conflict of interest (accountability). Incidentally, there are problems with transparency in government funding reviews (NSF and NIH panels are conducted in secret with 'anonymous reviewers'), too, but those are implementation details, not fundamental problems.

I'm not suggesting that the risk-taking involved in judging basic science should not be done. I am suggesting that it should not be done by government.


> How do you make sure that the academic that is selling their project as 'basic research' isn't just pulling a fast one on you and sucking your funds to tickle their intellectual jollies?

The term "basic research" refers to the study of fundamental principles. It is undertaken precisely to "tickle intellectual jollies". This is not a bad thing. Basic research is important because it is critical for the advancement of Science to understand why something works the way it does. This approach is distinct from applied research, which is very often concerned with making a buck from the derivation of Scientific results.


> How do you make sure that the academic that is selling their project as 'basic research' isn't just pulling a fast one on you and sucking your funds to tickle their intellectual jollies?

what's the difference? would you want to fund a Feynman to tickle his intellectual jollies?


Yes. But I would not want someone to force me to fund Feynman to tickle his intellectual jollies. Moreover, your example has survivorship bias. There are plenty of completely ridiculous people you probably wouldn't want to fund. For me, one such person is Thomas Cavalier-Smith.


Am I supposed to know why you have a problem with Thomas Cavalier-Smith, or is that just a non-sequitur?


no, just that I have a problem with him. But if you are curious: so much wasted effort. I am not a fan of siloing sciences, but TCS puts into stark perspective Rutherford's epithet "All science is either physics or stamp collecting" - and what TCS has done is stamp collecting at its worst. Over, and over again.


Physics, physics ... isn't that just a branch of applied math?


A good friend of mine said: anything that is science is just a special case of either mathematics or physics (and anything that does not satisfy this criterion is bullshit or rambling on and on; but surely not science).


I don't agree with the taxpayer funds aspect. Why can we not agree that employing people to perform research is a good thing? There are lots of things that are taken as good investments by governments based on the fact that they will be useful in future years. Take for example teaching people to read and write, preventative healthcare, disaster planning, sex education (it's a long list when you think about it). My point is that research for research's sake isn't actually a bad thing. It's generally cheap (when compared to employment, social welfare, or institutions like prisons or hospitals) and provides useful outputs (I think everyone can agree that at least some positives come out of research). Anyway, I think in future years people will consider the fact that people aren't freely educated to 3rd and 4th level in much the same way as we view not getting educated to secondary level (age 18) in days gone by.


I'm not sure "it being a good thing" is sufficient rationale for government to pay for any given thing. For starters, you'll have arguments about what's good and what's not. Why do you decide to fund telescopes pointing at distant galaxies, versus anthropological expeditions to the Andaman islands? Or do we fund it all?

If it's so obviously good, then why aren't people paying out of their own pockets to help out?

I'm going to anticipate your answer: because they're too selfish. But is there any reason to believe that our elected officials are any less selfish? Look at the narcissistic idiots we elect - and ask, are these the people whom you really want holding the purse strings? I often see complaints, "petition congress not to defund science". Well you know what, scientists shouldn't be so darned entitled, it wasn't their money to begin with. Putting their faith in the continued patronage of a capricious bunch of authoritarians beholden to questionable special interests may not have been the smartest way to secure a sustainable (or a moral) stream of funds.

In the end, taxes are backed by "if you don't pay them, you're gonna be taken away by agents who will throw you in jail". That's blood money you're taking. Deficit spending, which is what governments do these days, is even worse - at least with taxes there is the pretense of the voter having a say; when you borrow money on the credit of the nation, you are mortgaging the assets of future generations, who cannot go back in time and cast a vote against spending their money on a stupid idea.


In most democracies we elect politicians to decide if things are good and worth funding.

Please don't anticipate my answers :) at present people are paying out of their own pockets for 3rd and 4th level education. So this is happening.

Taxes are not yours. They belong to the state. There is a fallacy that you own your taxes and "pay them" to the state. This is not the reality. The reality is that the state has always owned the taxes and, by virtue of the system it has created (which the taxes are used to fund), enables you to earn a living. You owe taxes because you have used the resources of the state, be it the land, the security forces, the markets, or the transport networks.


See definition of public good(economics):

In economics, a public good is a good that is both non-excludable and non-rivalrous in that individuals cannot be effectively excluded from use and where use by one individual does not reduce availability to others.[1] Examples of public goods include fresh air, knowledge, lighthouses, national defense, flood control systems and street lighting. Public goods that are available everywhere are sometimes referred to as global public goods.


your statement is an existence statement. Yes, public goods exist. In some cases government regulation of public goods against abuse is reasonable e.g. "clean air". But your statement doesn't give moral justification for government subsidy of public goods.

Let's put it in really stark terms. I want to build a lighthouse. Because it's a public good; it will keep ships in the ocean safe. Is it morally justified for me to hold a gun to your head and threaten to blow it off if you don't give ten dollars for the construction of the lighthouse? What about one dollar? Ten cents? Half a cent? Let's ratchet back a little bit. Maybe I don't threaten to blow your head off. I'll just build a cell and throw you in there. For ten years. Is it any better if it's six months? Six weeks? Does that make it any better? Does it make it any morally better if instead of taking the personal responsibility for detaining you, I delegate that responsibility to agents of a byzantine bureaucracy?


I don't think the provision of public goods is a moral choice; if anything it is an economic choice, where its subsidy is justified by two things: the benefits of the public good outweigh its costs, and the general public will not sufficiently fund public goods of their own free will. The degree of benefit for the future is sometimes not measurable; however, if the past is any indication, then it has definitely provided major contributions to innovation.


>the general public will not sufficiently fund public goods of their own free will.

May I remind you:

>I'm going to anticipate your answer: because they're too selfish. But is there any reason to believe that our elected officials are any less selfish? Look at the narcissistic idiots we elect - and ask, are these the people whom you really want holding the purse strings?

>The degree of benefit for the future is sometimes not measurable; however, if the past is any indication, then it has definitely provided major contributions to innovation.

It's not like privately funded applied and basic science is chopped liver here. Applied: Salk and Sabin developed the polio vaccines without a public dime (pun intended) and didn't patent them either. Basic: Peter Mitchell proposed and validated the chemiosmotic hypothesis (1970s) and even won a Nobel Prize for the discovery, without public help. Obviously, if you go far back enough, to where the state was essentially uninvolved in science, most discoveries were made with private funding. Yes, of course, publicly funded science has made discoveries, but what you are arguing is the broken window fallacy.


I agree with you that governments can be very suboptimal, but that doesn't diminish the existence or usefulness of research as a public good. Some research cannot be readily commercialised but nevertheless forms basic building blocks for other innovation; in such cases government funding is crucial.


>Some research cannot be readily commercialised but nevertheless forms basic building blocks for other innovation; in such cases government funding is crucial.

http://en.wikipedia.org/wiki/Peter_D._Mitchell


Quit being such an ultra-capitalist arse. The funding of public goods is not and should not be an opt-in system. You contribute whether you like it or not because that's in the best collective interest.


And if an individual decides to opt out, what then? The current solution is "throw them in jail for a decade", which seems far too excessive to me. I keep hearing people complain about how others get sent to jail for drug-related crimes, but not many similar complaints about tax evasion. Is this crime so bad that we have to throw a person away for years?


That's all fine and good until "we" decide that the "best collective interest" is to defund gradstudent's research in favor of something else more politically expedient. Then gradstudent throws a hissyfit because the collective interest is not his interest. Who's the selfish one then?


All of us.

The purpose of government is to marshal these selfish urges such that we benefit as a whole.

Whether or not that happens is a political problem. That the political system can be captured by politically expedient interests is a political problem.

Of course gradstudent should throw a hissyfit, if it is in their interest - would you expect anything less?


"The purpose of government is to marshal these selfish urges such that we benefit as a whole."

I think a lot of people think that's what government should do, but you should really think hard about why that's a bad idea. Who is "we"? Who gets to really decide what is "benefit"? Who gets to decide what is "selfish"? And how can I be that person so that I can screw over gradstudent, and bring about benefits, which, oh by the way, also REALLY benefit my friends and family? Some animals are just.... More equal than others.


> Who is "we"?

society

> Who gets to really decide what is "benefit"?

society

> Who gets to decide what is "selfish"?

individuals

> And how can I be that person ?

society is not one person.

> Some animals are just.... More equal than others.

Which is why the pigs need constant replacement, power must be diffusely held and the executive must be separate from the judiciary.


Spot on.


>Why can we not agree that employing people to perform research is a good thing?

There are lots of good things that aren't worth what they cost.


This is a very good point and I think I would like to refine my comment in response to it.

I think it is important to distinguish between individual vs. collective accountability, as well as accountability within a field and the accountability of the field to larger society. My argument for supporting basic research in a field assumes that there is internal accountability within that field and that the research makes sense to those within it even if there is no direct application. In this view accountability of the individual to society is mediated by the accountability of the field at large to society. This is what allows for various sorts of bet hedging (if we think of a field as a firm) to improve the chances of hitting the rare event where a basic result has a big impact. If there is a deterioration of transparency and accountability within a field, perhaps due to the mechanisms the OP describes in points (1-7), then (8) does become a problem. So I have perhaps a different understanding of (8) than the OP. I do not argue that some particular flavor of esoteric mathematics be supported because 'it will be useful someday'. Rather I argue that, so long as there is intellectual integrity within mathematics (or whatever other field), one should expect that some small fraction of the esoteric theoretical research will yield big returns, and that these returns will subsidize the failed efforts. This kind of mechanism allows for the possibility of supporting t=infinity timescale research with accountability checks on the collective at finite timescales.

Now, if we start to deviate from my assumption of integrity within a field and my scenario of fields as idealized firms, there are all sorts of interesting game-theoretic questions that pop up. For example, what sort of internal rewards and punishments within a field are needed to keep things honest? I don't know, but I think it is interesting to think about.


>assumes that there is internal accountability within that field

Yeah, that is a huge, and completely wrong assumption about the sciences.

I could rattle off a list of names just in my field (you should google them to find out exactly what they did): Dalibor Sames/Bengu Sezen. Leo Paquette. Homme Hellinga. Peter Schultz/Jonathan Zhang. Geoffrey Chang. Armando Cordova. To varying degrees of fraud or negligence (e.g. Chang wasn't fraudulent, just almost criminally[0] negligent). And those are just the people who got caught, and the ones I know about. Guess how many of them still have faculty positions?

[0]I feel safe saying this (with the "almost" qualification) because misappropriation of federal funds is a crime.

Irrespective of your assumption, you still need some sort of justification for giving taxpayer money to science. Presumably, because it's good for society. Well, fine. If it's so good for society, why can't you find people to give money to science as a free-will donation?


>How do you make sure that the academic that is selling their project as 'basic research' isn't just pulling a fast one on you and sucking your funds to tickle their intellectual jollies?

Spoken like a true engineer


I'm actually in the sciences. I've seen the stupid ideas that PIs have proposed in the name of "basic science" and the grad students and postdocs they have burned in the process.


The only accountability you can have when dealing with 'research with seemingly no value' is the peer-review and/or peer-valuation process.

Otherwise, studies on ancient history, dead languages, etc. should be completely eliminated.

Yes, academia in that sense is a set of elitists, but then again, only the elite can properly evaluate one another's work.


I had a different but related reaction to point 8 specifically. He seems to imply (although it's not totally clear, since he treats the academy as a single unit) that people who have come up with theory should turn their individual attention to practical applications after it's well developed. But it may be that people who are good at theory simply aren't good at applications. It could be better if, once something was ready for practical applications, those theory-people went and worked on developing theory in a different area, rather than trying to become applied. If we aren't seeing enough practical application, perhaps it's because we aren't rewarding application sufficiently (seems unlikely), or people aren't being rewarded sufficiently for practical applications of other people's theories (seems possible?), so we have fewer appliers than is optimal.


I have worked at EPFL for a short time as a post-doc, and it really is a great school. Its Ph.D. students are excellent, and so what he says has to be taken seriously.

Personally, I have come to the conclusion that much of how Academia works today is inevitable. How many people are truly excellent in any job? Always only a small percentage. Let's say that in Academia the percentage is especially high, as high as 20%. What are the other 80% supposed to do? They have to represent something that they are not, because people outside of Academia must believe that 100% of Academia are excellent (otherwise they would revolt over where all their tax money goes). Everything bad in Academia follows from this.


"Let's say that in Academia the percentage is especially high, as high as 20%. What are the other 80% supposed to do?"

Collaborate with the excellent 20%? What is wrong with encouraging more collaboration between researchers? Why should the mediocre 80% be pressured into publishing low-quality, CV-padding papers when they could be working effectively with geniuses?


What happens to that percentage when a country decides that it doesn't have enough STEM people, forces people down an educational pipeline, and compromises its educational integrity in the process?


I guess you know the answer to that.


This student makes some excellent points, and it's a bit scary how some things reinforce each other.

Some journals, Nature being the most notable, chase impact factor above all else. The only thing they truly care about is that the papers they publish get cited. This means highly original work, which frequently doesn't get cited much for a long time, is less desirable than a turn-the-crank paper from a respected name.

Researchers have been taught to prize Nature publications (thanks to the same flawed metric of merit) and spend a disproportionate amount of time submitting papers to Nature. When you write a paper you choose your target journal, and writing for Nature means you need to avoid rocking the boat too much, because they receive so many submissions that one minor criticism from a referee will torpedo your paper, even if that criticism is demonstrably wrong. Sometimes invalid criticisms are made by referees who are jealous, threatened, or merely lazy, but the editor has so many other submissions to deal with that he/she won't take the time to evaluate and understand the referee recommendations. Real science invites debate, but Nature's editors run screaming from it.

Why do researchers care so much about playing this game if it compromises their writing and even their research? Even a tenured prof needs to hustle for funding if he wants to do more than teach and sit in his office. PIs are salesmen even more than they're managers! The very language itself is twisted at the core when applying for grants. The average researcher or project must be superlative in every way on paper or no money is going to come. You can't say you're doing something out of pure curiosity and the chances of it spawning another Silicon Valley are remote. It has to be imminently commercializable with potential for massive, revolutionary impact! Honesty will destroy you. If every research proposal delivered what it promised, the world would be one giant Palo Alto. We would all be better off if some better way to allocate resources were found, but that's no easy task!

As for the PhD student... There is more than meets the eye here. You don't walk away from 4 years of work in the last month without reason. Perhaps his thesis is a hopeless mess. Perhaps his relationship with his supervisor is hostile. Perhaps he's mentally ill. The last few months are ridiculously stressful after all! It's also very possible he or she is a lot more than one month from finishing. Hopefully there are people who care about this person and will help him/her get back on track and finish. The game is flawed, but the best way to ensure nobody wins is to refuse to play it.


That's interesting, because I've also seen the opposite problem in high impact journals, as in the Arsenic-based-life paper published in Science last year. The journal was keen to get a big dramatic story, and must have let it in with fairly minor peer review. Then it got roundly criticised after publication.

I hate the way we have to 'sell' research to compete for funding and prestigious publications. But my impression is that most researchers regard this as a necessary evil, not as their primary purpose. Maybe I've been lucky in my academic experiences (two UK universities, biology), but I met a lot of people who are still genuinely interested in the science. They play the salesman game, of course, because they have to, but they help their PhD students and postdocs, try to judge research from other groups objectively, and care about intellectual honesty. Some of the best are the emeritus professors, the ones who simply carried on researching instead of retiring.


I would say that publishing is a key point of research, but merely in the sense of "finding something worth conveying, and then conveying it to those who can build on it and use it". How many people cite you is not the best metric of success, but obviously you would want as many people as possible to read your paper and incorporate it in their own work; that is a very democratic measure of research value.

In many cases, the paper is not the best method of conveying information, but it works well enough for most cases that no one has managed to supplant it. Editor-reviewed journals might have some flaws in their process, but they are effective enough at picking out useful papers that scientists clearly choose to purchase them and use them as a source. So I wouldn't hate the publishing cycle per se, just the fact that in those cases where it is not exactly optimal, the tyranny of the majority is ruining the day of a small set of research types.


I'm an academic who is 13 years into his career. I see my job as consisting of three phases: 1) find important, unsolved problems. 2) find convincing answers to these problems. 3) communicate the answers to people who can use them. I usually get to organize my day around these principles. I feel very satisfied with my career.

I'm also in a field that has low financial barriers to doing research. The lab model that is predominant in the natural sciences is very very different.


There is some merit in the rant, but it suffers from myopia, probably owing to the author's inexperience and consequent lack of understanding of how science really works.

Here's the secret: it is quite possible to get a permanent position with a reasonable teaching load at a reasonable university without lots of funding and without having to produce large numbers of publications in high-profile conferences/journals. Indeed the majority of the world's scientists do just this -- almost by definition, for otherwise selective journals/conferences/funding systems could not be selective: to be selective means to deny entry to the masses. The trick is to think long-term and develop one's niche of science, until so much material of interest has been accumulated that something novel and substantial has become of it. In my experience, that takes a decade or two of an individual's labour. It's psychologically difficult to toil in solitude for that long without positive community feedback (which in science really comes in the form of grants and high-profile publications). But it is certainly possible. By the time you have created that niche, you can also create your own conferences/journals/prizes etc., and all of a sudden you are the eminent scientist in your field. That in turn makes it easy to get PhD students, grants etc.

Caveat: the above programme may not work so easily for pre-tenure academics at the top-prestige universities, or in fields that have very high start-up costs in terms of material and/or labour (e.g. high-energy physics). Fortunately, most of computer science is comparatively cheap though (needs only a laptop and an internet connection).

Given that this is a possible career path, and that clearly many scientists who became famous only late in life, or posthumously, had long periods of frustrating 'drought', one wonders if negative rants like the one in the linked article may not in part be a reflection of the author's lack of belief in his/her abilities.


I guess what stops most people from pursuing that road is the risk that, after "a decade or two", it turns out that your intuitions were wrong and you've got very little to show for the years of toiling. It's essentially a gamble on your life.


At least in the UK I know there's a fair amount of rating academic staff by research grants they have received, which hurts anyone doing any long-term work with all the costs at the beginning or anyone doing generally low-cost research.


I'm not claiming you're wrong, but why isn't it possible for the student to simply no longer place any value in getting a PhD? That is what they claim, and intelligent people often make decisions on principle that make little practical sense.


It's the same reason that people don't drop everything in their life to pursue their ideals: the odds of succeeding by following the established path may be awful, but the alternatives' odds are even worse. Even without regard to the practicality of the choices (i.e. the economic aspect), you basically have two options: go to grad school and have a ~25-50% chance of getting something out of it (and I'm talking about knowledge/skills, not the degree), or lock yourself in a room, hoping for the best?

If you're of the caliber of, say, Srinivasa Ramanujan or Galois, then I'm fairly sure you don't have to give a shit about PhDs or any educational system. Most of us aren't that type of genius, though, and we do need help and coaching from advisors, grad schools, or otherwise.


Eh...

In theory, the PhD is this fantastic credential that opens all kinds of doors. In practice it might make a small difference in your starting salary in industry. The only place where having a PhD is truly important is in a research position, which basically means academia. Even then it is not the last word -- tenure track positions might be out, but there are a number of schools that are willing to hire a good researcher whose highest degree is an MS.

Even with a PhD your prospects may not be all that much better than they are with an MS. I have seen people get a PhD with only one or two published papers, in a field where you need at least ten to even have a serious shot at a tenure track position. If you are the kind of person whose attitude is, "Screw it, I am in school to learn as much as I can and not just to publish a bunch of incremental improvements," you might not even get a shot at an academic job. Sure, someone can pad their CV as a postdoc -- but that is not all that much better than being a grad student, just moderately better pay.


I wasn't talking about PhD as a credential, I was talking about how for the normal good-but-not-Feynman aspiring scientists, going for a PhD is one of the few ways you can properly learn how to do research.


Again, I'm not saying you're wrong. I certainly would not make the decision of the article writer even if I hated my entire field. But intelligent (though not wise) people often make foolhardy decisions. Perhaps what the writer did is more unthinkable than I'm imagining; I only have a bachelor's degree, but I've seen smart, motivated people switch directions at inopportune times. Someone I know left a well-paying job that fit his (4 year STEM) degree with an amazing work environment for... well he didn't know what for. He just didn't like engineering after all and wanted to do something else in some other field.


Because there is a story behind why he no longer values a PhD. A PhD is a huge investment, both in time and in money, and you don't walk away lightly. So even assuming that the email is a comprehensive account of his reasons to quit his PhD, he still went from believing in the university system to being deeply disappointed in it, a system he spent a huge amount of time in. (And this is essentially the best case.)


If you want to do research, then you either need funding (for which you need to work within the system being criticised, get the PhD, and get a grad-student workforce...) or you need to be independently wealthy from some other source so you can afford to.


Aren't there more alternatives? I'm in that particular situation, and I don't have formal education, nor money.


Actually, the vibes I get from worldwide academia point in kind of the opposite direction: the humanities, and even applied-science studies that are not ready to produce money (that can't be turned into a short-term investment), are not funded anymore, because states (all over Europe, for sure) are cutting budgets, and research is one of the main sectors that gets cut down to a fraction. So if a company doesn't fund your research (which it won't, unless it sees a substantially imminent gain), no one does.

There are other points there that cut in the opposite direction too. Anyway, it's good to have a wide variety of opinions floating around; it makes it easier for us outsiders to get closer to 42...


"As for the PhD student... There is more than meets the eye here. You don't walk away from 4 years of work in the last month without reason"

Says who? Maybe this student has no interest in going into academia and does not see the point of completing a PhD program.


There's nothing wrong with what he said. Even you gave an alternative reason.

The student has a reason, we just don't know what it is.


3 out of 6 of Microryza's current team are graduate school dropouts. Technically, you could add 1.5 more (Ethan Perlstein left academia, I never started), so 4.5 out of 6.

We've all been there. Unless you're purposely oblivious or one of the extremely lucky ones blessed with a true research environment, it is impossible for the new generation of scientists not to see the writing on the wall. Every single student is suffering from the symptoms, yet very few are doing anything to change it.

One note, echoing the original author - for me personally, the letters 'PhD' no longer carry the same meaning. If anything, they've become an additional signal that you once bought into a system that is selling you short. I still respect those who have PhDs, but only for what the degree once meant, not for what it means now.


While a PhD doesn't carry that much cachet anymore, doesn't that make 'PhD dropout' even worse? Isn't the impression then that 'you couldn't finish even the lowered standards for the PhD', or 'you couldn't finish what you started', or 'you had the poor judgement to bite off something you couldn't handle'? I suppose the last isn't so bad, given that most Americans start grad school at just over 22.


Actually, I see a lot of value in people who can recognize sunk costs and quit wasting resources, no matter how much they have happened to already waste. We all make mistakes, it's whether or not we choose to continue to make mistakes that is important.


I know this sounds like rationalization, but I think we see it more like, "If we're doomed to suffer in a system where everyone is suffering, why not work to try to improve it and fix the problem instead?"

In other words, if we weren't doing this, we would all be in grad school/academia getting PhDs and doing research anyway. This is more a reflection that we're giving up that opportunity in an effort to pay it forward. I would rather have 'PhD' mean something again than have something that is meaningless.


Depending on your point of view, 'PhD dropout' could also carry connotations of having sound judgement and the audacity to act accordingly and quit despite doing so being somewhat controversial.


Is 'high school dropout' the highest honor of them all, then?


That depends entirely on what you do instead of finishing high school.


It could just as easily be, "Good thing you were smart enough to get out when you saw a bad thing," "Glad to see someone who sticks to their principles," or, "Let's put those skills you picked up over that 4 year period to use."


I have my PhD, and this is the standard angst that many aspiring students face. There's a reason we have the term "ABD" for All But Dissertation. If he got that far into grad school without realizing that a Professor is an entrepreneur - hiring employees (grad students) to produce his product (knowledge) which he sells to customers (government and industry) through sales and marketing (conferences and papers) - then he was just naïve or kept the blinders on. My solution was to graduate and go into industry.


Exactly. My brother cut short his PhD and got a M.S. when he realized the life of a professor was basically being a manager with employees that were harder to fire. If he wanted to be a manager, he'd have gotten an MBA.


Yes; it leads to unexpected consequences. The worst students are the ones that the professor has the most incentive to graduate. So, paradoxically, the better you are, the less likely you are to be done quickly.


I can't help but feel that the MBA is the scourge of the modern world. People in leadership positions need to be experts in the field they lead.


The author of the letter has earned at least a doctorate honoris causa. (S)he is a true civis reipublicae litterarum.

Universities as we dream them, like Humboldt did, are alas a relic of the 19th century. They’re a lost romantic desire, like the inspiring anecdote (regardless of its historical truth) of a student Niels Bohr and a professor Ernest Rutherford debating over truth as equals.

Universities as we know them since the 1960s evolved into cash-burning bureaucracies. Soon, academia may evolve again — open sourced learning, who knows?

Until the age of enlightenment the bigger part of scientific endeavors was done by dilettanti, unpaid, un-institutionalized “hobbyists” and other philosophes. We will return to those days — perhaps, and need be, at the expense of “democratic access” to learning.

But the ideal of a community of intellectuals devoted to truth and science will live on in the honest hearts of young scientists (and self-employed hackers). Kudos to the courageous heart of that young PhD-student for reminding the academic managerial caste of our ideal!


I wonder why we impart onto young people such an idealism about science only to have to crush it when they get older and must be informed that science, like everything else, is a business.


It may just be an artifact of the expanding population. It wasn't too long ago that many young people also harbored an idealism about working for big companies: go to college, get a nearly guaranteed job at a corporation, work there for 25-30 years, retire with a pension. But we all know that each of these steps is less guaranteed today than it was for the previous generation, sometimes significantly so. It's the same for academia.

As we cram more and more people into the same cities, the same companies, the same institutions of learning, they're going to experience some failures to scale. This is one of those failures. Academia becomes hyper-competitive. There aren't enough jobs. School becomes too expensive. Home prices outpace inflation and wage increases by a factor of three or more. The world is just much more competitive today, and it will continue to become even more competitive.

I sometimes think about my grandfather, now dead, who was a sales executive for a major clothing company in the 60s and 70s. He was making $50k in 1960. According to an inflation calculator, that would be nearly $400k today in base salary. It astounds me to reflect on this and the fact that he didn't even have a college degree. A similar job today would likely only go to a graduate of Harvard Business School, Wharton, or something similar. There were many others like him; you didn't have to be a business whiz extraordinaire to land such a job fifty years ago. You just had to be dedicated, hard-working, personable, and passionate, and everything else would follow. But the world was just not nearly as competitive back then.
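As a rough sanity check of that conversion, here is a back-of-the-envelope CPI adjustment. The CPI values are assumptions on my part (annual-average CPI-U was roughly 29.6 in 1960 and roughly 233.0 in 2013, per published BLS tables), so take the result as approximate:

```python
# Back-of-the-envelope inflation adjustment for the $50k figure above.
# CPI values are assumptions: annual-average CPI-U was roughly 29.6 in
# 1960 and roughly 233.0 in 2013 (approximate BLS figures).
cpi_1960 = 29.6
cpi_2013 = 233.0

salary_1960 = 50_000
adjusted = salary_1960 * (cpi_2013 / cpi_1960)

print(f"${adjusted:,.0f}")  # roughly $390,000 -- "nearly $400k" holds up
```

Any inflation calculator using the same index gives the same ratio; the exact figure shifts a bit depending on which month's CPI you pick.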


A similar job today would likely only go to a graduate of Harvard Business School, Wharton, or something similar.

Actually that's not true at all. High-performing salespeople in tech (and other industries) are extremely well-paid, and tend not to have fancy degrees.

Sales is still a career in which hustle and results are valued over book smarts, and there is no branch of a company that is "closer to revenue" than sales.


Do you have any statistics to support this? And what do you mean by "extremely well-paid"? Anything under $400k doesn't count. Do you honestly think that's a typical salary these days for a salesperson in most industries? I assure you it is not. Most will never make that in their life. Also, how many of these extremely well-paid salespeople that you know didn't even attend college?

Finally, the upper ranks of many Silicon Valley companies like Facebook are filled to the brim with graduates of HBS and similar. There are lots of ex-McKinsey consultants in VP positions in the industry.


The discussion you introduced was about salespeople - guys (mostly) who have a quota, who "carry a bag." It's a highly-paid profession in which Harvard and Wharton MBAs are underrepresented. Enterprise hardware and software salespeople can make well over $400K.

Whether or not the upper ranks of Facebook used to work for Rajat Gupta is hardly relevant to the question of how much salespeople make!


Do you have any support for these figures? I've never heard of so many salespeople making so much money, and all salary reports I can find most certainly do not support your claim of this being even remotely typical.


Given that GDP/capita has expanded plenty, I don't think that's a very good explanation. An economy in which actual growth can't keep up with population should see a falling GDP/capita.

Can we explain why we seem to see an increasingly competitive society despite there being more actual stuff to go around?


Easy guesses: 1) Most income growth went to top brackets 2) More women in the workforce meant both less power to workers as worker supply went up, and also two incomes bidding on homes/products.


There has been a considerable amount of growth at the high end. But it's not being distributed down nearly as much as it once was. How many companies offer defined-benefit pension plans anymore, for instance? Nearly zero in the software industry; Garmin is the only one I know of that comes to mind. It's also well-known that executive compensation has skyrocketed while other wages have failed to keep up with inflation.

I don't think this is a complete explanation, though, and your point is a good one.

I also believe that the transition from one-income to dual-income households has had a pretty deleterious effect on the buying power of the average household. Rather than providing financial security, the second income led to a bidding war on homes, cars, and schools. It's a shocking fact that today's dual-income family has less disposable income after necessary bills than yesterday's single-income family.


I don't think $50k back in 1960 bought you as much as $400k buys you today. Keep in mind how cheap many products and services have gotten due to automation, globalization, and technological advancements. Comparing cost of living in 1960 to cost of living in 2013 as if it's comparing apples to apples is a mistake.


Well, have you watched Mad Men? That's approximately what $50k in 1960 got you. Most people would probably describe Donald Draper as wealthy in the show. He started at $45k in season one if I remember correctly.


The optimistic answer is that we pass on to our children what we wish were true for us, hoping that this time it will be different.

The pessimistic answer is that idealists make very hard workers, such that by the time they become cynics they've already locked themselves into a career path and it's hard to get out of it.

It seems the problem we have with science being a business is the belief that money is fundamentally corrupting. I think that only applies to mismanaged money; it can also be seen as fundamentally enabling.


The pessimistic answer is that idealists make very hard workers, such that by the time they become cynics they've already locked themselves into a career path and it's hard to get out of it.

This precisely describes my experience in gamedev. Luckily, I can drop out and go into any of a dozen alternative computer-related career paths. I'm not sure any scientist is so well-off.


My mother talked me out of a computer science path many years ago. My biggest regrets are letting her talk me out of it and then getting myself locked into another career path, mortgage, etc. One (but not the only) reason why is the current differences in job markets between my field and comp science. The other big reason is that I feel like I'm nearly illiterate. (I'm not illiterate. I can hack on code, but I'm very much a coding 'immigrant'. I'm certainly too much of an immigrant to pursue a job in the field at this time. And I'm too bogged down in other responsibilities to truly pursue the experience that I need to gain enough literacy to get a programming job.)


The only requirement for a computer science education is that you believe in yourself, and that you work on it at least a little each week.

A third (rather modern) requirement might be to ignore what most people consider good practices and simply try to get as many things done as you can. The actual process of building things is mostly what increases your skill, not solely your study of how to build things.

The fourth requirement for a computer science career is for someone to give you your initial break (hire you at a job). The only things necessary for this are (a) finding an open-minded employer who cares mostly about whether you can build things rather than about what credentials you have, and (b) showing a portfolio demonstrating that you can build things. Build a portfolio of personal projects -- the more visual, the better. Mine was to show demos of 3D game engines I made. They were simplistic, but they were pretty, and they got my foot in the door. If I wanted to become a webdev, I would put together a portfolio of websites I made. Note that you can make whatever you feel like making -- it doesn't need to have a purpose; just something like "here's an example checkout process I made for an example merchant website." Stuff like that.

There's always hope as long as you believe in yourself.


I don't have a problem with science being a business. I have a problem with it being presented to kids as some idealistic pure thing.


Oh, I didn't mean you by "we", I meant the culture at large, specifically the scientific education system which never talks about money, outside of philosophy / policy / ethics classes.


Same reasons we ignore how capitalism messes with everything else.


I wonder why we impart onto grown adults such a cynicism about everything when deep down what we really want is to serve our ideals.


Science is not a business.


Running a research group is a lot like running a small business from the professor's point of view. You have employees (grad students, lab managers) and equipment/supply/rent(overhead) costs that you use to produce a product (research, in the form of papers). You produce these products on a kind of account basis (not a per-paper basis), but if you aren't producing enough when it's time to re-up with your funding agency you lose that grant revenue. When the revenue situation gets bleak due to either mismanagement or the funding climate (sequester, anybody?), you can even "go out of business" and have to close your lab. Even though professors are paid by the university and not out of grants, it's doom for your tenure case if your lab goes into the red.


At least in Germany you get the problem that the basic idea of research groups is rather feudal. (That is, the original idea when universities were invented in the late Middle Ages was that professors get a personal fiefdom.) And this makes, imho, a lot more sense, at least for basic research. But recently universities have been trying to organize it more like a business. That means professors have to write more and more reports for funding agencies, and consequently have to pursue research that can be sold easily. That is, assuming they actually have time to do research instead of fighting bureaucracy.


Agreed, but having worked in both research labs and industry, they are very different animals despite the similarities.


Modern science has become a business, complete with advertising, trade secrets (the old "be sure you don't tell Dr. So-and-so anything about your work; his group is doing something similar and you might get scooped!"), and various kinds of corruption. The scientific method is still hiding somewhere in there, but it ends where publish-or-perish begins.


I sympathize a lot with the frustration here and can agree on some of the points (especially the rankings/h-index/etc mania), however...

Much of what is bemoaned here is simply a consequence of the fact that science is done by humans and generally costs money, so there is a necessary component of business-type activities (networking, fundraising, marketing, management). Scientists also have human qualities such as egos, desire for money, power, respect and the rest. Why should we be different, or held to a higher standard?

Additionally, it's really hard to do original research. First off, really original good ideas are hard to come by; this problem is also not restricted to science or scientists. Then when new ideas come, they are often difficult or impossible to translate into scientific output, because of other constraints, such as lacking good enough data to test hypotheses. And then they typically fail, despite being great ideas. Really good, original, CORRECT ideas are far more rare.

Furthermore, science would be much worse off if researchers worked the way the author would prefer. If everyone spent 10 years without publishing, working on a completely original problem, there would be zero replication. Additionally, if the experiment or approach failed, as they usually do, little would be left. So do you publish negative results, and then move on to the next decade's problem?

Instead, it's better that the system works the way it does. People do 'bandwagon research', which really is replication. They take the same idea, and apply somewhat different methods, or different datasets, or whatever. And they publish this as often as they can get away with.

Then, one can look at the aggregate body of work and evaluate the ideas and methods better. There is more redundancy in the system, and because the papers are smaller, the setbacks are much smaller in the case of failure, and there is a paper trail (so to speak) marking progress along the way. These smaller, more digestible, incremental papers are also more useful to other researchers, who may only be interested in a small part of the work--the methods, or some of the data, or whatever. It would slow everyone down if the only publications were these giant monographs that came out once or twice a decade.

It may feel like people are only working on small, unoriginal ideas because many papers look like this, but it's really important to zoom out and see how things fit together. Science is basically a Monte Carlo simulation, and having lots and lots of small and fast iterations covers the space much more efficiently.


"These smaller, more digestible, incremental papers are also more useful to other researchers"

Survey papers are useful to other researchers. Minimum publishable units are a waste of research time, and the pressure to pad one's CV and get "results" (which actually means more papers) worsens the situation. It is even worse when a grad student is told that they were "scooped" because their half-finished MPU was published by someone else.


MPU papers are useful for training purposes. They let students and junior researchers participate in the scientific conversation. It is important for the larger scientific community to have minor roles that can be played.


Personally I hate the entire format of publishing a paper. It's more spinning a story than actually detailing useful information, and the single longest exercise of my PhD program (which I dropped down to a Masters from) was the result of a group publishing many papers on a subject without ever actually detailing their method properly (it turns out the whole thing is dependent on partial oxygen pressures, so the detail of "lightly placing the lid of the vial on top of it" - mentioned nowhere - is spectacularly important).


Real scientific progress does not happen overnight. It takes years of hard work. Sadly, modern-day scientists are expected to constantly produce amazing results or face losing funding. And what do you think happens?


Let me try a guess:

They draw radical conclusions from their research which cause alarm/concern/interest among those who have no idea about the body of research but who do grant funding, in order to get more funding?


Having been in grad school for a while, I can totally relate on using grants to churn out papers and egos running research.

Want to slightly disagree with this though: "Apart from feeling the gross unfairness of the whole thing – the students, who do the real work, are paid/rewarded amazingly little, while those who manage it, however superficially, are paid/rewarded amazingly much – the PhD student is often left wondering if they are only doing science now so that they may themselves manage later. The worst is when a PhD who wants to stay in academia accepts this and begins to play on the other side of the table."

The same thing happens in industry, except that you're paid more, and your work is directly relevant to the world you see (and not just a journal/conference where it is never read again.)


True. In an organization with enough people, it's often more efficient to specialize (e.g., into workers and managers).

I would say that the quoted statement is not an entirely fair characterization of managers (PIs) in academia, though. It's always easy to discount the contribution of others and to believe you did all of the work, and believe me, you do a lot of it - but it's not a trivial matter that these "managers" already thought up and won money for the general research direction, and ultimately accept responsibility for the success of your projects; not to mention having to worry about funding you for the duration of your PhD. And do recall that they did have to be extremely patient with you while you were still learning the ropes - so you can't extrapolate your productivity in your last year back to the first year when assessing your cumulative value. ("you" used here in the generic sense)

I won't deny that many managers are asshats as bosses, but their job is not so simple.


I’m also going to suppose that in order to find truth, the basic prerequisite is that you, as a researcher, have to be brutally honest – first and foremost, with yourself and about the quality of your own work. Here one immediately encounters a contradiction, as such honesty appears to have a very minor role in many people’s agendas. Very quickly after your initiation in the academic world, you learn that being “too honest” about your work is a bad thing and that stating your research’s shortcomings “too openly” is a big faux pas. Instead, you are taught to “sell” your work, to worry about your “image”, and to be strategic in your vocabulary and where you use it.

contrast with the article and discussion at:

https://news.ycombinator.com/item?id=6353957


The problem with being "too honest" about your work is that you would be the only one doing it.

When you read a "scientific" paper, you read it expecting it to be a sweetened account of reality. The tests have obviously been performed using only a few samples; out of the ten described features of a system, only the two described in depth are currently implemented; the source is not available because it does not work on the famously hard cases of the class of problems it should solve.

The system is so polluted that if you start being as honest as you should be, the reviewers will think that you have done even less.


It's interesting to think, though, that government funding has produced nearly every single form of medicine we have, it funded Einstein, and it was indirectly responsible for the internet.

The problem is that you have to pay a bunch of parasites to support the few who are actually advancing the human race.


I am about to finish my PhD, and you clearly state the difference between my view of academia and the author's. While I agree with most of the things the author mentioned, and they are a large part of why I will not be going into academia once I graduate, I also recognize that academics acknowledge these problems and try to mitigate them.

It is useless to complain about problems without giving a solution. Maybe, as you imply, the current situation is the least bad possible. I think that there is at least room for improvement when it comes to sharing of data and reproducibility, and I plan to work on reducing the technical barriers to both of these things. However, I don't have a solution to the social problem of how to rank academics and how to prevent academics from colluding in order to artificially raise their ranking.


It's inefficient and creates negative externalities.


Inefficient as compared to what?


Private funding, industrial innovation, etc.

Subsidized infrastructure does not practically promote risky endeavors. It's created the side effects mentioned in the original article.


It may be more inefficient, but it funds basic research without a short-term payoff, or even a suggestion of long-term payoff.


I would not dismiss the author's complaints out of hand with the ad hominem remark that the author is burnt out. It may even be true, but the author's points remain.

Academia has become a business, in which the priorities for research institutions are grants first, publications second and teaching a distant third. But this development does not imply that the way the university conducts business is optimal or desirable. (One sees this smug attitude in the comments in the form, "academia is a business, get over it.")

Graduate students, postdocs and research staff are exploited. One can legitimately ask whether the institutionalized scientific method must necessarily exploit graduate students, postdocs and research staff to maintain standards of scientific rigor. Cannot the university act as a gatekeeper without requiring inexpensive contingent laborers to subsidize faculty and administrator salaries? Why must limiting access to networks of scholars and scholarship involve such oppressive opportunity costs?


Seems to me (I'm no expert in science history) that most of the huge breakthroughs came before the age of Official Science, i.e. Science(TM) being run through bureaucratic grant administrations in publicly-funded research universities (or 'private' ones with public money). Everything from the steam engine to Tesla's heroics happened while tinkering and playing. Granted, the game may have changed due to external circumstances, but we may yet want to look at whether the huge budgets of these departments are justified JUST BECAUSE SCIENCEEE. Do we want to support a bureaucratic process called 'Science' or actual production of interesting and/or useful results about the world?


I think that the critiques of academic science are spot-on, although I am not sure that I agree with some of his causes, for example that "science is a business". A lot of the raw pettiness that happens in science would totally sink all but the largest businesses. I am not sure what the solution to the problem is, but of course, I'm launching a science nonprofit startup to try and do things without having to deal with as much of this silliness as I can get away with. I fully expect that if it should live past 30 years, my nonprofit will begin to succumb to the very things I established it to address.


They should have never let business schools into universities. This is the result, academia is now censored, for-profit, tepid corporatism. A factory of managers crushing creativity with the jackboot of workplace professionalism.


What evidence do you have that business schools are the source of the problem?


Graeber's essay "Of Flying Cars and the Declining Rate of Profit" describes the "tyranny of managerialism":

"The growth of administrative work has directly resulted from introducing corporate management techniques. Invariably, these are justified as ways of increasing efficiency and introducing competition at every level. What they end up meaning in practice is that everyone winds up spending most of their time trying to sell things: grant proposals; book proposals; assessments of students’ jobs and grant applications [...]" (http://thebaffler.com/past/of_flying_cars)

Doesn't answer your question, because it's about deeper institutional forces than the mere existence of business schools. But I found it an interesting read.


A marxist critique of community college.


Or someone speaking from experience? I can testify[1] to the nefarious influence of MBA-like groupthink in academia: gigantic counterproductive bureaucratic overhead, and an obsession with the pseudo-measurability of everything, from impact factors over “quality assessment” to “standardization” of examinations and “competence matrices”. Top-down bogus coming from people who failed in the corporate world and met their Peter principle in the academia they never left.

[1] http://www.veto.be/jg39/veto3914/ku-leuven-voert-machtsgreep...


While your characterization rings true for academia, I hardly believe that those attributes are the province of MBAs. The irony is that we are in an age of scientism (as Feynman would say), and it's precisely the popularization of science itself that has led to the 'pseudo-measurability' you speak of. I remember chuckling that my humanities classes graded essays via mathematical rubric at a time when the math classes (which were proof-based) would just mark errors and give an 'overall grade'. Maybe they were just covering themselves in case someone claimed unfairness, which would not be the fault of the business school, but rather of the law school's litigious mentality.


You probably hit the nail on the head with the popularization of science as the culprit for our captains-of-society’s obsession with measurability. And even more so with the institutionalized greed of law school sycophants. I don’t know about the role of the business schools; maybe I’m too harsh on my colleagues. But I noted that at least at my alma mater and its acquired satellite colleges, the managerial caste consists solely of civil engineers with an MBA, and it’s a recurring pattern.

The noble idea of a res publica litterarum, or a universitas scientiarum, has been lost since the post-war baby boom.


On reflection, I don't think the problem is popularization so much as "authoritization". For whatever reason, we like to vest authoritative power in numbers. Although I disagree with his overall thesis and politics, David Graeber, in "Debt: The First 5000 Years", really does identify this psychological tendency. (I think the solution is to free people from those shackles, instead of giving in and trying to craft a solution that avoids them.)


The irony is that we are in an age of scientism (as Feynman would say) and it's precisely the popularization of science itself that has led to 'pseudo-measurability' that you speak of.

Metrics, metrics, metrics... even if your metrics mean nothing at all, they're still very convincing!


I hate to trivialize a thoughtful article with a typographical point, but boy do I like my paragraph breaks when I can get them.



I'm trying to find a way to actually make a readability link open up with nice formatting but can't work it out... any tip to avoid having to have people then click on the "switch to Readability view"?


Try this:

  javascript:{v=document.createElement('style');v.innerHTML='blockquote{font-size:14px;width:500px}blockquote div{padding-bottom:8px}';document.body.appendChild(v)};void(0);


I read it in 5 minutes with spreeder:

http://www.spreeder.com/app.php


I used Spreeder to read "War and Peace." It's about Russia.


I see your point :) but for short pieces like this I find I can store most of it in memory. I did jump into the full text a few times to refer back to things.

I probably couldn't tell you much about the article now, but I felt I got it at the time.


There are lots of theories about how we absorb printed information, but few of them are backed up by reliable science. Hopefully this will change in the future. But speed-reading schemes have a particularly spotty record in the cause-effect and repeatable evidence department.


I'm interested - did you actually do this or was it a rhetorical statement?

At 587,287 words and 400 wpm, War and Peace would take just over 24 hours to read.
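The back-of-the-envelope arithmetic, as a quick sketch (the word count and reading speed are the figures quoted above, not authoritative):

```python
# Estimate reading time for a book at a fixed words-per-minute pace.
words = 587_287   # commonly cited word count for War and Peace
wpm = 400         # a fast "speed reading" pace

minutes = words / wpm
hours = minutes / 60
print(f"{hours:.1f} hours")  # → 24.5 hours
```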


Well, while we're there, the HN link title is currently wrong also ("wity" vs "with").


I think this student misunderstood what a PhD is about. They sound disenchanted with the reality of doing a PhD.

The purpose of a PhD is to provide training in how to do research; most PhD students do not change the world or even make a large contribution to a research area - that's not the point. There are standard texts about PhDs that explain this, and most supervisors recommend students read a few of those books at the beginning of their PhD.

As someone who has worked in both academia and industry, I can assure you that academia has not literally become a business, that academics still have great freedom, and that almost every academic I have ever met works hard and is motivated primarily by the goal of doing great work.

Contrary to the article, most significant research is not done by PhD students - hardly surprising - it is done by Research Associates and Research Fellows, under the supervision of professors and lecturers. That's in the UK at least, and job titles vary in other countries.

I find the anti-academia slant in tech circles to be quite strange. Many people think academia is "out of touch", but there is a lot of collaborative work between academia and industry, involving nearly all the large tech companies. I've never heard academics in Computer Science expressing any animosity towards those working in industry. It's strange, because academia has historically been and continues to be incredibly important in the tech industry; I'd like to see more collaboration and less of a schism.


In my three years of operation, I have unfortunately witnessed cases where CERN duties and educational training became contradictory and even conflicting. This has particularly been the case when the requirements of the CERN supervisor conflict with the expected time dedicated to a doctoral student’s thesis. Some students would become hostages, torn between their CERN supervisor and their thesis advisor, usually located in his/her remote Institute. This is a very uncomfortable situation for students, which can even endanger his/her thesis.

https://cds.cern.ch/journal/CERNBulletin/2013/27/News%20Arti...

armies of graduate students and postdocs to do the nuts and bolts work, Asaadi says. That's fine, he says, so long as everybody understands the situation from the beginning. "When you're starting graduate school, is your advisor telling you, 'Look, you get this great skill set that will be transferable to other things outside of academic physics'?" Asaadi says. "Or are you being told, 'Just work hard and there will be something or other [in physics] in the end'? It seems like it's more of the latter." He adds, "This is where we got some pushback from advisors—it was seen as whining."

http://sciencecareers.sciencemag.org/career_magazine/previou...

"How should we make it attractive for them [young people] to spend [their prime, most productive first] 5,6,7 years in our field, be satisfied, learn about excitement, but finally be qualified to find other possibilities?" -- H. Schopper


I would like to point to Dijkstra's excellent article (The strengths of academic enterprise) -- http://www.cs.utexas.edu/users/EWD/transcriptions/EWD11xx/EW...

The OP talks about a bunch of problems in academia, many of which stem from an attempt to run universities like corporations. In his article, Edsger Dijkstra explains why that's not a good idea.

PS: If you'd like to discuss this, I just submitted this article at https://news.ycombinator.com/item?id=6358342


Just a thought, but aren't hackers way better at the whole pushing-the-boundaries-of-knowledge thing? I mean, is there any community more collaborative, cooperative, and downright generous than ours? Tons of amazing people around the globe make tons of amazing stuff, and a lot of them just open source it. By doing so they open the floor for debate, for improvement, and for guidance so others can learn from it and advance upon it.

Not to mention sites like Stack Overflow, where the entire community helps each other with tons of problems, no matter how trivial or how complicated.

The whole world really could learn a lot from us.


In an admittedly fast read of the parent article, I don't recall seeing what field the PhD was in. FWIW, in chemistry, there is a very large difference in starting salary, job requirements and career potential between PhD and non-PhD. This is true in academia as well as industry. Non-PhD's tend to be technicians, pairs of hands, or sales. PhD's tend to be researchers and have a far better chance of going into middle management.

Is it relevant? Not sure any more given the loss of jobs to Asia and when the worker crosses 40. But it's been this way for as long as I've been aware.


It is sad to see promising (usually young) people dazzled by the myth of an ivory tower in the sky.

Most of academia simply sets fire to money. The more given to one bureaucracy, the bigger the flame. Fusion research (DOE). Or in this case, not. It would be rather nice to see Skunk Works and/or Polywell reach Q > 1; it would be one of the greatest engineering feats ever.

Also there needs to be some support for communities of art and philosophy, where those may get by better.

Finally, all profit or all conjecture makes Jane a dull girl. (Jack was out sick today.)


If universities have simply become huge, bureaucratic, grant-obsessed sweatshops exploiting grad students whose churned-out PhDs don't promise them decent careers... why don't people hire some of those poor excess PhDs/PhD-dropouts and put together a grant-obsessed research startup?

Beat research universities at their own game, and perhaps they'll go back to being true universities.


Belief in credentialism doesn't magically result in talent.

Thought experiment. 1K have talent, another 1K are promising, and another 1K are basically non-productive.

If you have a 3K student body and chop away 1K, you'd like to think that that 1K leaving are those with the least talent, so they'll never be able to compete with the superior upper 2/3.

Also, if you assume a constant income of R&D money, all separation really means is spending a larger fraction of that limited money on fixed costs (great, now we need two $10M/yr CEOs, two $5M/yr CFOs, two labs, two HR depts, etc.). Much like a divorce, which usually doesn't result in a higher aggregate standard of living once the same total income has to support two households instead of one.


Yes, why can't research be done in startups? Well... maybe it's the money. Research doesn't exactly bring in money, so wouldn't that be a failed startup?

But I do like the idea of telling these senior supervisors to f-off and taking away the talent to do some real work for which they get credit.


Similar thoughts : http://unqualified-reservations.blogspot.com/2007/07/my-navr...

CTRL+F for "computer science" and start reading from there. Till then it's filler.


"The art world got smaller, once more a somewhat parochial little world known mostly to its initiates. You can suffocate in these little worlds if you get them confused with the real world, so the smaller they are, as far as I'm concerned, the less confusion they generate."

--Gary Indiana


High-level scientific journals, when it comes down to it, really aren't that different from gossip magazines or HuffPo. Yes, the subject matter is worthier for sure, but in the end their publishing decisions come down to virality and eyeballs.


There is another issue: people studying for a PhD often work alone and fail to develop interpersonal connections. Certainly the damage to mental and physical health is often irreversible.


Sad reality check: Genome Technology magazine, March 2011 cover story, page 38, "The Crowd of PhDs - ...whether they will have somewhere to go"


I'd be interested to get a sense of how these issues vary from country to country and field to field.


I agree. (source: at similar stage of PhD to original author. Trying to get to the end myself though).


Try not to agree too vigorously... regardless of how valid the author's points are, the end of grad school is usually the most stressful and depressing part (possibly of your career), especially until you get a job lined up.

You're almost done.


I am also months away from submission, so I wasn't entirely sure on what level I was sympathising with the author's arguments!

Almost there...


Academia is run like a business by people with no business or management experience.

