So Many Research Scientists, So Few Openings as Professors (nytimes.com)
89 points by hvo on July 14, 2016 | 113 comments



I run a small industrial applied research lab (<100 researchers). The truth is that, except in fairly specific cases, we don't need PhD level staff either. Just like in a university lab, the bulk of the hard work is done by people with Graduate (or in some cases Undergraduate) degrees. Practical experience in the field is also about 50% of what we need (since we're ultimately figuring out things that can go into engineering and deployed operationally, knowing what the end-state environment looks and runs like is critical). The role of PI is more often filled in our case by the corporate leadership.

I honestly think that the lack of large industrial research labs and the drying up of good academic research positions is very much a matter of money that would otherwise be R&D funding going to other things that provide better near-term ROI: VC, the stock market, etc.

R&D is very much an investment like anything else, and for whatever reason (patents, existing IP portfolios, etc.) it's just not a place that gets much attention at the moment; the money handlers have found they have better ROI opportunities elsewhere.

I think the modern substitute for old-fashioned R&D is the modern tech startup. The siren call of these ridiculously inflated "valuations" is simply too much for investment managers, and it locks up more and more money that could go towards other investments. There are more parallels as well: R&D and startup investment are both high-failure games. But the outcome of a successful startup is likely to be much larger than the outcome of a successful R&D venture (which still has very long tails of IP capture, product development, marketing, etc.)

I think it's a shame in the sense that, while we end up with dozens of photo sharing sites and social network startups, there are very few actual world-changing ones out there. Most of what we get are not "hard" technical areas, and their value derives from their faddish popularity - they're kind of the Kardashians of technology.

But I guess that's the way of the world.


The goal of research (at least for basic science) isn't to make money, it is to increase knowledge about the universe. This knowledge is a public good, so it makes sense that private industries motivated by profit do not support fundamental research. Scientific progress is a rising tide that lifts all boats.


For most of the sources of funding for most research, there's some expectation of ROI at some point. This isn't true, obviously, for very basic research, but even then there's a hope that the process of exploring the deep unknown produces some practical output. For example, the process of building and designing the LHC produced advances in fields ranging from management theory to precision machining and CAD design.

For the organizations and countries involved, this helps provide for an advanced industrial base that continues to build and drive an ecosystem that supports powerful economic engines.


If you just want ROI, you are better off spending your money elsewhere. This is evidenced by the lack of money most companies put into scientific research. Further, the gains of science are in general hard to profit from. (http://www.therichest.com/business/they-could-have-been-bill...)

Another major reason governments fund research is to produce people with PhDs. This is why the apprenticeship structure of academia is so sticky, and why in order to get tenure professors have to graduate students.


Price signals are how we tell people what research careers are worth considering.

Unless they don't mind working for peanuts, in which case there's no problem.


Well .. it is a tournament model so price is misleading. Top researchers at industrial labs get paid a decent amount. The problem is (a) getting a coveted job, and (b) mobility when things eventually go south at your employer and they decide research is expendable.


I think both these points are true:

1. Society uses price signals to drive activity to areas "we" want to develop

2. In research, the tournament model prevails so a few winners get most of the gains

However, the question arises:

The tournament model prevails in many other domains including startups, the movie industry and so forth. However, in those areas most people don't feel that the enterprise is severely underfunded. (I am assuming this so if people have contrary data, I will re-assess).

The question stands: in spite of the tournament model, why does society fail to deliver enough reward for basic research when we as a society believe that there should be more of it?

(We could ask the same of teachers, and so on.)

It could also be the case that "society", whatever that is, doesn't really feel that science is underfunded.


The problem is that there are plenty of graduate students willing to work for peanuts.


You also don't know how you stack up in the tournament. The people entering graduate school have usually done really well in undergrad. However, being in the top 1% (or whatever) of that small pool provides you with a lot less information than you'd think about your prospects in a pool of people who were also in the top 1% of their own undergrad classes.

Repeat for grad school->postdoc, postdoc->tenure track, and so on.
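
A toy simulation of that selection effect (my own illustration, nothing more; the numbers are made up, only the shape matters): performance is ability plus noise, the top 1% get admitted, and within that already-filtered pool the old signal says very little about the next round.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    ability = rng.normal(size=n)              # latent "research ability" (unobserved)
    undergrad = ability + rng.normal(size=n)  # noisy undergrad performance

    # Admit the top 1% by undergrad performance.
    admitted = undergrad >= np.quantile(undergrad, 0.99)

    # A later outcome (say, grad-school research output): same ability, fresh noise.
    outcome = ability + rng.normal(size=n)

    corr_all = np.corrcoef(undergrad, outcome)[0, 1]
    corr_top = np.corrcoef(undergrad[admitted], outcome[admitted])[0, 1]

    print(f"predictiveness in the full population: {corr_all:.2f}")  # roughly 0.5
    print(f"predictiveness within the top 1% pool: {corr_top:.2f}")  # much closer to 0

Statisticians call this restriction of range: once everyone around you was also at the top of their pool, past rank becomes a surprisingly weak guide to what happens next.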


I agree completely. I also think there is way more luck involved than people want to admit - a lot of interesting results are unexpected, and there are so few jobs that the timing has to work out for you. I have known plenty of great postdocs who couldn't get a professor job, and plenty of professors who seem pretty mediocre.


> Now, as a new crop of graduate students receives Ph.D.s in science, researchers worry over the future of some of these dedicated people; they’re trained to be academics and are often led to believe that anything else is an admission of failure.

As someone finishing up grad school, this doesn't seem quite accurate. To me, it seems like a lot of the interesting research is being done in industry. I remember scrolling through jobs online toward the end of my undergraduate and seeing "PhD required" for all of the positions I was interested in. Most of my friends working on their PhD are aiming either for industry or a research scientist position at a government lab (like ORNL). So I'm not sure who exactly constitutes this group that believes "not professor" = failure.


If you don't advance fast enough in your academic career, grant reviewers can read it as a failure indicator, which will reduce your chances of staying employed. If you're too old for your career stage, you will also find it hard to find someone to employ you in academia.


What field are you in, though? I believe that attitude is tremendously pervasive the more "theoretical" the field is, e.g. the closer it is to math & physics, and the further it is from engineering.


Well, it's kind of funny. My degree is chemical engineering, my advisor is materials science, yet all of my work is HPC/computational. A lot of this stuff has become very interdisciplinary (I studied more computer science papers than anything over the last year).


I studied chemical engineering: all of our PhD grads chose industry over academia, and most of our young professors had come out of physics, chemistry, and biology departments. It was an interesting culture mix, partly created by the fact that ChemE is either disappearing as a field or changing to the point of being unrecognizable (depending on who you ask).


Yeah, the field has changed a ton. When my dad was working on his ChemE degree, they had a class strictly on galvanization (which was barely covered at all in my undergrad), and they were doing McCabe-Thiele diagrams by hand.


I can confirm that this idea is pervasive in physics (both experimental and theoretical)


The attitude of going into industry is more prevalent in physics? Or did I get that backward?


Sorry I wasn't clear - the attitude of going into industry being seen as a failure is common, especially among older professors. This attitude is changing somewhat, but is still definitely there.


Three big problems:

1) Emphasis on funding translational research using government grants, which industry would otherwise be doing on its own. Hence, industry jobs have evaporated while basic science has languished.

2) Institutional salary caps on grad students/postdocs, so training funds stretch too far, resulting in an oversupply of entry-level positions. Better to pay the best people competitive wages.

3) Growing divide between industry and academia, making job hopping even harder. Everybody is left worse off.


A huge part of the issue, which you hint at in #2, is the "training" part.

Most research is actually performed by trainees: undergrads, grad students, and some postdocs. This is silly because

1) the "training" is often pretty minimal (my PhD involved less coursework than an undergrad masters) and

2) most of the people doing most of the work have no idea what they are doing.

Nevertheless, this still happens because there are TONS of established funding mechanisms for trainees: REU (and similar per-university programs for undergrads), and individual (NRSA) and institutional training grants (T32) for postdocs and grad students.

In contrast, there are very few ways to fund more experienced individuals who are not running their own lab. This is bad for those people, and also bad for the institutions.


There's more to training than coursework.


Sure, I don't want to sell my PhD program too short. There were lots of visiting speakers (which needs $$$), journal clubs, workshops and things like that. There was a terrific library, and an environment where many smart people were around to talk to.

That said, actual training on doing research was pretty damn thin on the ground.


4) Universities accepting more students into programs than could possibly get an academic job.


Wouldn't be a problem if universities competed with each other to nab the students with the highest scores. They would have to spend more money per student, leaving them unable to take on as many students.


> Wouldn't be a problem if universities competed with each other to nab students with the highest scores

Why do you think they don't now?


What the article curiously doesn't talk about is the fact that the non-academic job market at the PhD level collapsed in 2007. In the news yesterday: even more cutbacks at Merck. The degree of offshoring and outsourcing that has happened since 2003 in chemistry and the life sciences is incredible to anyone not in the field.


More on the cutbacks at Merck: http://endpts.com/merck-triggers-a-new-round-of-layoffs-in-r...

> Merck’s move follows a major trend in biopharma R&D, as the biggest companies concentrate more and more of their work in the big hubs. And virtually all of the major players have downsized at one time or another.

> Close to three years ago, Merck triggered a major reorganization in its R&D ranks, as the then new R&D chief Roger Perlmutter set in motion a plan that involved 8,500 layoffs, all of which were piled on a restructuring effort that was announced earlier.

> Those layoffs followed a years-long gap in significant new drug approvals and a string of clinical setbacks. Since then, though, Merck landed a landmark approval of Keytruda, now the number two checkpoint inhibitor on the blockbuster cancer market, along with an OK early this year for its hep C combo, Zepatier, which is being sold in a rival-infested field.


It's a real shame how the recession has hit STEM funding. There are a lot of really exciting new technologies that are simply not funded.

The private sector cannot innovate until monetization is in sight and the public sector is increasingly squeezed to publish papers with limited resources.

The overall effect is less innovation and bad science.


> The private sector cannot innovate until monetization is in sight

That feels like a sweeping condemnation of R&D in the private sector in general. I know very little about the subject; is there a strong basis for this position?


While "cannot" might be a little strong, it can be true, depend on the field and company of course.

I used to work at a large company that in the past performed a large amount of fundamental research into physics, since it affected their products. I took a tour of the labs recently and they are a shadow of their former selves. It was depressing.

I think the reduction in private sector STEM comes from several sources. One is the private sector not being willing to take the risk of something not working, a lack of long-term thinking, etc.

Another might be the relative difficulty in measuring the impact of your R&D department. In an era where every cost has to be justified and everything has to be measured by some metric, R&D begins to look like an expense rather than a profit generator, at least in the short-medium term.


It's a side effect of changes in the marketplace and the world.

When big conglomerates ran big horizontal and vertical businesses, they did general R&D work that benefited their broader corporate mission. Bell Labs wasn't there because AT&T was some sort of benevolent charity.

Now companies are mostly brokers between outsourcers and have a narrow focus. We haven't "caught up" with the capability that we already have, so startups have replaced corporate R&D, mostly using off the shelf stuff to make minor incremental changes.


I don't know if this was legally enforceable, but I have heard that the existence of Bell Labs was essentially a quid pro quo with the federal government in exchange for being granted a regulated monopoly on phone service. More generally, I have heard that corporate executives in the golden age of American industry had more of a sense that they owed a debt to the public for all the free stuff they had been given (cost-plus defense contracts, government R&D, radio and TV frequencies, etc.). I think there are some rose-tinted glasses involved in that analysis, but it was probably part of it.


This story again. The oversupply of aspiring scholars is really really not news.

What would be news is any sign that things are changing, with applications to grad school dropping, or maybe the expansion of graduate school degrees that are explicitly not aimed at professorships.


I had always heard that this was the case in philosophy, literature and so on.

Then, a few years ago, I heard it was the case in the sciences.

Now it is in the technical disciplines, and it is egregious (4x oversupply). The magnitude and the discipline are the news here (at least for me).


The lack of scientists has been a myth for a long time, and probably comes from the lumping of very different fields under the umbrella of "STEM".

Lots of good info from the IEEE Spectrum (2013): "The STEM Crisis is a Myth"

http://spectrum.ieee.org/static/the-stem-crisis-is-a-myth-an...


Getting your graduate degree solely to gain a professorship is a bad bet for anyone, even in engineering. The supply of would-be professors across the board has far exceeded openings for at least 5 years.


I disagree with the bulk of this article. It certainly varies a lot by field, but in biophotonics I've seen a lot of extremely poor candidates, people who say things like "I don't do math", "I don't want to teach", "MATLAB is for other people", "I'm not a biologist so I can't explain my work".

When I first started I thought these people sucked, and I was going to be great. What I learned is that graduate development opportunities are rare, and many of the reasons people suck are beyond their control: for example, working on tedious or bullshit projects due to your PI's conflicts of interest, or, simply put, research that rests on a foundation of lies.

So, on one hand we have many people; on the other hand, we don't have many qualified people.


Serious analysis irony here - your anecdote doesn't reverse the mountain of data in the other direction.


Saying "MATLAB is for other people" isn't the sign of a poor candidate. Saying "you will need to use MATLAB for this job" on the other hand is overwhelmingly a sign of a bad job, in the same way that any usage of Excel VBA whatsoever is a sign to run, run, run for the hills -- that firm is absolutely doing it wrong.

I've also heard the opposite of your first example used as a criticism of candidates too. After completing my undergrad degree in math and then grad degrees in statistics, I was astounded at how, in industry, describing yourself as an engineer or scientist who does not work on the abstract math stuff and describing yourself as someone who is very interested in abstract math will both cause you to get rejected.

The only way to win is to happen to have done a lot of difficult abstract math in the past and remember it all well enough to pass tricky interviews, but then to be overwhelmingly happy and satisfied with a job that will not ever ask you to use it and will instead burn you out on dumb shit like fitting a regression to KPI data and using t-stats to directly do (fallacious) model comparison.

The biggest career risk I've seen from having done very computer science-heavy statistics is underemployment. Your math chops will be pure credential, sometimes used by your manager to try to win arguments from authority about e.g. that dumb shit KPI regression. But you will 100% never be given open-ended modeling work that could actually have a positive impact on your business's bottom line.

Essentially, you are hired to be some more senior person's sycophant political darling. You function internally much the same way that a big consulting company functions externally -- people already know what they want to hear, they just want you to tell them what they want to hear, slap together some plausible-sounding rationalizations for it, dress it up with buzzwords about big data and "insights", and be a show pony for talking about it.

They emphatically do not want you for doing anything that would be called "real" work.

There are occasional exceptions, especially in certain teams at certain established tech companies. But then, landing a position inside one of those teams is nothing but the same lottery as winning a professorship all over again.


Welcome to the real world of being an employee. It's common in every field for new employees to be academically overqualified and think their bosses are incompetent and irrational. If you don't like it then try starting your own company. I'm not being flippant; that's a serious suggestion and would give you a more realistic perspective.


Same tired, cliché replies as usual ... you're probably also assuming that my opinions aren't based on job experience, yet they come from several years of experience in many positions, where I did the 40-hour grind, delivered business solutions for months-long and years-long projects, etc.

I'm not talking about academic overqualification. I'm talking about someone hiring you, talking at length about how you are being hired specifically to do X because it matters to the company's bottom line, and then after you're hired they switch it and say actually you're going to do Y but you're going to be a political mouthpiece for X.

One common set of values is

Y = statistically invalid model fitting that actively causes the business to lose money but which is easier to reduce to pliable metrics for political jockeying

X = (deep) machine learning and/or Bayesian stats

It's not at all about academic overqualification. The actual business need, for reals, can benefit from the pragmatic and cost-effective use of the tools, and the person is actually skilled at using them to do exactly that.

Yet, they are prevented politically.

As for starting a company, I'd guess that's one of the least plausible ways of doing important or useful work. You'll only get funding if it is a trite variation on consumer bullshit -- even though consumers themselves don't want that and would rather your labor be allocated to solving more fundamentally important social problems that there just isn't money for. Plus, you'll be so burnt out over all the auxiliary stuff like HR, marketing, and sales that you won't actually do any of the underlying quant work that was the whole reason for starting the company in the first place.

It kills me how people here seem to think that "start your own company" is some kind of "put up or shut up" gauntlet to throw down to challenge someone who is lamenting the shitty state that things are in.

"Start your own company" is not some venerated challenge-call for those brave few who want to change the world. Starting your own company is just a different format of the same bullshit phenomenon.

Fixing what's broken inside of companies and organizations that already have huge leverage and capital to positively impact fundamentally important problems -- that is perhaps worthwhile, if you can manage to deal with the political fighting without getting too burned out.


You seem to be missing the parent's point. He's saying that it's very common to be told that you are hired to do X and made to do Y (for the record, it happened to me once, with X being stochastic calculus for derivatives pricing and Y being Excel jockeying), and if the reason for that is the politics of a big structure, then move to a smaller structure. You can create a company that does consulting in a quantitative field, so you don't have to seek funding to hire a team and build a product.


I disagree. I think the parent is trying to say that businesses hire overqualified people and then give them less engaging work for valid business reasons.

What I'm trying to say is that you're told you'll be hired for X, and, crucially, that it's easily verifiable that X actually would help solve the business problem better than Y. Failing to do X actively hurts the business.

Yet you're still forced to do Y. It's not because in the real world you only needed simple, trustworthy Y to get the job done. No, you're failing to get the job done, need X to get it done, are told you're the one to bring X to the table, and then you're made to do Y for destructive political reasons.

The point I'm trying to make is that there's no defensible "real world" pragmatism to support the focus on Y nor the bait-and-switch to hire someone who knows X. Whatever the reasons for that, they are not about improving the firm nor making money for the firm. They are about optimizing a bonus or promotion or whatever for a single individual or some small faction, even at the expense of the organization's overall progress.


What's with the Matlab hate?

Sure, it's closed source, expensive, and can be clunky, but it is not a horrible tool for many jobs that involve a mix of signal processing and other data analysis.

Scipy is often a viable alternative, but if the company already has a bunch of Matlab code, rewriting things in Python probably wouldn't be worth it...


>Sure, it's closed source, expensive, and can be clunky

And slow. Don't forget slow. Closed-source, expensive, clunky, difficult to read, and slow.

(The first time I took machine-learning in grad-school, our assignments were in Matlab, and testing them required me to stay up until 04:00 in the morning at least once a month. A validation run just would not take anything less than five to six hours. The second time I took the class, I took it in the computer-science faculty, and the professor gave the assignments in scipy/numpy. Validating and debugging that was easy.)

(Just this year, I actually went and rewrote a probabilistic program in Haskell from a probprog package based on Python. Interpreted languages are fucking slow for numerical jobs.)


I'm a little skeptical. Yeah, interpreted languages are slow, but under the hood numpy / Matlab call out to BLAS/LAPACK. There's a constant overhead per function call, but if you "vectorize" your computations to work on arrays instead of single values, the difference is not that great.
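
To make that concrete, here's a rough, illustrative comparison in NumPy (my own toy example, not anyone's production code; timings will obviously vary by machine). The same idea applies to MATLAB's built-in array operations, which also hand the work to compiled BLAS:

    import time
    import numpy as np

    a = np.random.rand(2_000_000)
    b = np.random.rand(2_000_000)

    # Element-by-element loop: pays interpreter overhead on every iteration.
    t0 = time.perf_counter()
    dot_loop = 0.0
    for x, y in zip(a, b):
        dot_loop += x * y
    t_loop = time.perf_counter() - t0

    # Vectorized: one call, the heavy lifting happens in compiled BLAS code.
    t0 = time.perf_counter()
    dot_vec = float(a @ b)
    t_vec = time.perf_counter() - t0

    print(f"loop: {t_loop:.2f}s  vectorized: {t_vec:.4f}s  "
          f"same answer: {np.isclose(dot_loop, dot_vec)}")

The interpreted loop is typically orders of magnitude slower than the single vectorized call; whether the backend is MATLAB or NumPy matters far less than whether you vectorized at all.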

Seriously, Matlab = MATrix LABoratory. It's a DSL for matrix math, and it's really pleasant to use for that.

Also, my Matlab code is a joy to read. It's not Matlab, it's that most people who write Matlab code aren't professional programmers / don't care about making it pretty.


Yeah, I'm not sure I believe that either--it's calling the same libraries python calls. It is possible to write matlab code that runs incredibly slowly--using eval to make anonymous function calls to (badly-strided) individual elements of an array--but decent matlab and numpy code should be pretty comparable.

> It's not Matlab, it's that most people who write Matlab code aren't professional programmers

This, 1000x, this. There's also the time-honored tradition of a quick one-off script "to see if it works" that somehow becomes permanent.


Eh. It's more that Matlab code can either call out to fast linear-algebra libraries or go through an interpreter, and you have to memorize which language constructs do which. I'm definitely a professional programmer, but I still can't write fast Matlab code for that reason.


The slowness and the pan-everything global namespace for the primary libraries are the two biggest problems. The interface for performance-targeted optimizations (mex files) is awful and doesn't hold a candle to Cython. These are downsides that would prevent me from using an open source and free tool ... so when you add that it's closed source and expensive, it just simply does not ever make sense.

If a place finds themselves having a lot of MATLAB code, it means that earlier they didn't refactor and retool when the choice was cost effective to do so. That's strong evidence that for whatever they are working on right now they also will not refactor it to at least try to avoid the future costs of current poor designs. That's a huge red flag.

I guess if they paid an insanely high salary or gave you some other type of assurance that you personally valued, you could trade it off against the red flag evidence. But generally these places also tend to just hire maintenance engineers, since they know they can't offer interesting work. It's best just to avoid and work for places that bend over backwards to refactor and evolve their tooling over time specifically to prevent this problem.


> If a place finds themselves having a lot of MATLAB code....That's a huge red flag.

Serious question: do you feel that way about other older languages too? Would you balk at a C++ shop (which isn't doing high performance or low level stuff?)


I would only balk at a couple of things, MATLAB, VBA, and extensive server-side Javascript being the main red flags of poor tech culture.

I have some additional skepticism about (a) shops that very quickly adopted Go and then display a cult-like dogma about how super perfect and awesome at all things networking that Go is and everything else isn't; and (b) shops that use Scala or Clojure purely as "better Java" and often have thin, entirely unfunctional layers wrapping bad legacy Java code -- in such places, the Scala and Clojure often exists specifically because they had a hard time recruiting people to be legacy Java maintainers, and having them do it through one layer of indirection, such as Scala, let them tap into a more vibrant labor market with a lot of people who just naively believed that e.g. usage of Scala == adherence to good quality standards.

The age of the language has nothing to do with it, and I actually quite enjoy C programming. I'm not a big fan of C++ code bases in which some of the more esoteric language features are greatly abused, but generally I feel like C++ is a great tool and would in fact enjoy the chance to dig into it more in a future job.

I also quite like code maintenance and legacy code extension, because when the company takes it seriously and values that kind of work, it involves a lot of cool abstraction, working with interfaces, and designing tests à la the Feathers book on working with legacy code. I think that stuff is quite fun, and a healthy culture of refactoring makes it interesting to work on legacy systems.

While it's not an iron-clad rule, certain kinds of legacy choices, however, are not really compatible with a spirit of quality, legitimately valuing maintenance or refactoring, etc. MATLAB and Excel VBA are by far the strongest indicators that it's not a good situation.

The items above are just language choices, too. There are also many other things to worry about. For example, when a place uses MATLAB, it often means some of the work they do is scientific / quantitative prototyping. Do they enforce good code standards even at the prototype stage so as to minimize the distance between research prototypes and production? Failing to do this is a seriously gigantic red flag. It can mean that the management chain values the domain science more than the quality of the implementation, and often this means that domain scientists are allowed some of the following

- to manage their own working environments (leading to lots of awful "but it works on my machine" errors)

- to write everything as giant, messy, linear scripts that start off with 200 lines of boilerplate data loading and model setup code that should be factored into a library but is instead copy/pasted, and finish off with 200 more lines of custom plotting code that should also be factored into its own internal library for standardizing reports and charts

- to turn things into short-term fire drills for production programmers solely by virtue of them being "someone who uses MATLAB, not <some real language> that we use in production", and get management support for this.

- to never bother learning object-oriented or functional programming principles -- google a design pattern and then just paste some ill-conceived bastardization of it wherever they feel like, and then argue with production engineers that you can't refactor it "because it's a design pattern"

- ... I could go on.

When I see places that heavily use MATLAB or R for research systems, I break out in a cold sweat, since those languages are entirely unsuitable for professional software design. They are good for ad hoc linear algebra, optimization, model fitting, plotting, and statistics. But the software design underlying how those things are carried out in MATLAB and R does not translate at all to a system where the scientific code is maybe 1% and the business reporting code, customer-facing services code, etc., are the 99%. And putting the scientific code at 1% is generous even for a company that is solely about scientific computing or quantitative services.

Anyway, there is just a large difference between organizations that work from a systems engineering perspective first, build that way, and mercilessly require non-programmer PhDs to get up to speed on actual, principled software development even for ad hoc domain-specific work, and organizations that don't.


> management chain values the domain science more than the quality of the implementation

That is the crux of the issue. Many Matlab and R users don't see themselves as producing software per se. They are producing "recommendations" or models or papers; the code is just a means to that end, so who cares if it is terrible? No one will do this exact thing again anyway... Obviously, that is pretty myopic. If you have good "infrastructure" code, it makes writing the one-off parts easier, faster, and less buggy--which in turn makes it easier to turn the one-off code into infrastructure.
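
As a purely illustrative sketch of what that looks like (invented names, toy data, Python rather than Matlab/R): the shared loading and plotting boilerplate lives in one small internal module, and each analysis script shrinks to the part that is actually new.

    import pandas as pd
    import matplotlib.pyplot as plt

    # --- tiny internal "library" (e.g. a hypothetical labtools.py) ---

    def load_run(path):
        """Shared loading/cleaning that every analysis script used to copy/paste."""
        return pd.read_csv(path).dropna().rename(columns=str.lower)

    def standard_report_plot(df, x, y, title):
        """One place to keep the house style for report figures."""
        ax = df.plot(x=x, y=y, marker="o", grid=True, legend=False)
        ax.set_title(title)
        return ax

    # --- an individual analysis script: only the experiment-specific part ---

    if __name__ == "__main__":
        # Toy in-memory data standing in for load_run("runs/some_experiment.csv").
        data = pd.DataFrame({"dose": [1, 2, 4, 8], "response": [0.10, 0.32, 0.55, 0.81]})
        standard_report_plot(data, x="dose", y="response", title="Dose response (toy data)")
        plt.savefig("dose_response.png")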

Out of curiosity, how could I convince you that joining our group (which does have a ton of matlab code) wouldn't be awful? Or what would convince you to hire someone whose last job was predominately matlab?


Oh really. Is this Haskell program anywhere online?


If the resultant paper gets published, the repo on github will go public and the link will be in the paper.


I ask since I work on probabilistic programming in Haskell and am always looking for more use cases.

Good luck and I hope the work is published.


Hey, can you tell me what probprog system for Haskell you work on? I might have been meaning to write a sampler for you.


I don't see any particular reason to tie research generally, and scientific research specifically, to institutions primarily organized around undergraduate education.


Well, I'm at a research institute as a postdoc. The problem is that there is no career path except for trying to get tenure, and since there are no openings and the competition in the academic middle field is insane, this is so unlikely that it has become unrealistic to plan for it even when you're doing good work and have adequate publications.

So at least in my area there is no research career. It's a dead end.


Two things:

1.) I am sorry that you are going through this.

2.) Your experience perfectly matches that of a good friend of mine. She has been doing excellent work, she has had some great pubs and yet she considers getting tenure to be about as likely as winning a lottery.

I wish I had something tangible to offer, but the best I can offer is my good wishes. Seriously, best of luck!


As someone in the same situation, thanks for the good wishes!

For biomed stuff, it's frustrating that the NIH seems like it could address this problem (they control a huge proportion of the available funding) but has no interest in actually doing so.

If I were king for a day, I'd seriously consider shrinking the training grant pool and redirecting that money to developing viable career paths for "staff scientists" or other experienced-but-non-PI employees.


They don't have an interest in fixing the problem. They want to get as much research done as possible---and so they benefit from exploiting cheap labor. (One could perhaps argue that they are harmed in the long run, but this isn't clear.) They also don't see PhDs leaving academia as a bad thing at all. To the contrary, they advertise it as a positive externality; the research money isn't just staying inside an insular academic community, but is giving companies across the economy access to skilled labor.

Giving money to staff scientists would work against both of these benefits. It would cost more and could reduce the number of trained scientists entering industry. (Although that isn't clear. Creating more NTT positions might just give hope to more of those eternal postdocs.)


> They want to get as much research done as possible

I agree, but it certainly does not feel like the NIH has thought about this carefully either. My impression is that the NIH does what it does because this is how it has always been done (and that worked well enough for the PIs they consult). I would love to see data suggesting that the current arrangement is optimal--or even close to it. The only thing that comes to mind is a recent paper arguing that peer review scores for grants were mildly predictive of the number of resulting publications.

As for staff scientists, one good staff scientist could certainly be more productive than a gaggle of newbie masters' students on their own. Right now, everyone is incentivized to do just enough to get the next paper. Permanent employees could build up infrastructure (protocols, code, etc.) that is actually reusable and ultimately more efficient.


My understanding is that it's the other way around. A research university teaches undergraduates almost by accident, to help pay the bills when funding is short and to provide able bodies to run experiments.

Look at UCSF, they don't even have an undergrad program.


When you are working at the coalface of research you are intimately aware of which concepts are relevant and which are not. That directly feeds into undergraduate teaching. You can't teach today what you taught 30 or 50 years ago, but at the worst undergrad-only colleges you can get away with it.


Even if this is true, and it seems much more aspirational than something that actually occurs with any sort of frequency, that's merely one benefit to be weighed against all the disadvantages. To name just a few such: worse average teaching ability than a selection method that focuses on teaching ability, reduced research output due to splitting researchers' time between teaching and research, and of course the issue pointed out in the article that the needs of undergraduate education in large measure limit the number of research scientists.

Were it not for the historical accident that we do things this way, I strongly doubt anyone would suggest we adopt it for whatever tiny educational benefit it may impart to undergraduates -- most of whom won't even go into the academic fields they are studying.


You mean undergrad courses don't teach the same thing they taught 50 years ago? When did that change?


It all feels like a market signaling problem. When people are choosing majors and how far to go in school, they don't seem to have the right information. Lack of understanding of the marketability of different degrees surely harms the students/future workers, but would seem to be in the interest of certain employers who benefit from the glut of overqualified workers. Of course, oversupply in one sector is likely accompanied by undersupply in others.


I don't think the colleges are keen for students to have that information. College budgets depend on people getting degrees using debt they will have severe difficulty repaying.


This has nothing to do with marketing or prospects for different degrees. It doesn't matter what degree you choose -- if you go to graduate school, you'll be trained for a job that you will likely never get. That's as true in Computer Science as it is in Literature.


Every year the market grows tighter, and federal money for research grants, which support most of this research, remains flat.

This makes me think we (as a country, a people), having trained all these scientists, have a market opportunity for inexpensive research, and we aren't taking advantage of it. Sure, the best and the brightest might fight their way to the top, but we can get perfectly good science out of the middle of the bell curve too.


You can't have a million researchers do a 4000 man-year project in a day.

There are decreasing marginal returns from throwing more people at a problem. Especially if they are publishing, because now you need more people to curate their publications. And if you throw enough researchers at something, they are going to find a bunch of spurious results simply because of the luck of large numbers.

I don't know if we are at that point. But adding a bunch of average people to a project can easily slow it down.

If you want to help the existing researchers, mow their lawns and buy their groceries and take care of their house maintenance problems. Let them concentrate.


Not faster research, more scope. Things that we neglect today. Hell, form a paper review branch whose sole purpose is review/reproduction of results.

We're not out of work to be done


This is a good idea.


But you need to find these researchers the equipment and infrastructure to do research, thus it's no longer inexpensive research. There's no money for this.


The thing is, we could double the science budget in most industrialised nations without really noticing. Compared to all the other very low-ROI work being done through government spending, and given the almost free cost of borrowing, it's probably something we should do.

Throw enough tenure against the wall and eventually something will stick


Any chance you are running for Congress? :-)


Running from Congress, yes :-)


http://www.ncbi.nlm.nih.gov/core/lw/2.0/html/tileshop_pmc/ti...

Right,

What is the difference between Civil/Environmental and Environmental?

It seems the best bet is to study something like mining since the R_0 is so low there.


To oversimplify by a lot, Civil/Environmental = urban/industrial (e.g. industrial/municipal wastewater treatment, air pollution), Environmental = agricultural/resources (e.g. soil management, biofuels, soil decontamination for mining/oil).

That's roughly how it was divided at my college, anyway.


Petroleum engineering regularly makes the list of degrees with highest pay...


The petroleum industry goes through boom / bust cycles every few years. Make sure to graduate during a boom.


Please choose 2 among the 2000 problems that you need to be solved as soon as possible. Wait patiently while the other 1998 urgent problems inexplicably rot and spawn 50 new problems each. Learn how to live with the new situation. Talk about the good old times when life was much easier and you had plenty of money to spend. Repeat.


It would be wise to do an MD/PhD. The MDs have all the data.

Also Business schools need professors.


I agree, but broadly stated, the path doesn't generally train good scientists. The MD/PhD is more like an MD/MS. They frequently are pushed through the 'PhD' more rapidly under the eye of med school deans and to optimize MD/PhD training grant slots. A not insignificant number of MD/PhD trainees simply do it for the free med school, with little interest in or intent to pursue research.


Or you can do what I've done, get a PhD, and be friends with the MDs.


... yet tuition has increased dramatically in the past 25 years.

Where is the money going?


Where is the money going?

Buildings and administration. The "undergraduate experience" that involves 24-hour gyms and subsidized restaurant-quality cafeteria food.

The truth is that professorial salaries are low and tuitions are high (for full-price payers) for the same fundamental reason: universities set the levels where they are, because they can.


Administration and facilities.

http://www.huffingtonpost.com/2014/02/06/higher-ed-administr...

The interesting thing about this is that nobody complains about it.


This is one of the reasons I founded asimuv. You are better off founding your own project.


This is a good problem. The solution is to raise the bar: in order to be a professor you have to make a tangible advancement in your field, something that advances the state of the art by an order of magnitude.


It's easy to say that, but there's no way that can be reality. The amount of money to advance your field by an order of magnitude can be quite large.

Research is expensive and funding is hard to come by. Therefore, if you want to have a job next year, you end up doing 'safe' experiments that only incrementally advance your field.

In order to get funding for more radical ideas, you need to have a history of good results (and past funding). Which usually means you have to be a professor already.

(Also, many people who aren't professors, such as postdocs, cannot directly get funding from many funding agencies. They usually require you to be a professor already...)


Except right now, in order to be a professor, you're mostly writing grants, not "advancing the state of the art by an order of magnitude."


Even the greatest geniuses do not advance their field by an order of magnitude all by themselves within seven years of receiving their PhD and their first permanent position (a tenure-track evaluation timeline). Einstein took ten years after publishing on special relativity and the photoelectric effect to discover and publish general relativity. Hell, in 1905, he only had five papers published and had just finished his PhD thesis.

You just denied tenure to Albert Einstein.

(Luckily, in real life, he was appointed a lecturer in 1908, three years after finishing his degree, and became an assistant professor in 1909 and a full professor in 1911.)


Also, pay them more.


Where is the money going to come from? Undergraduate tuition has already reached ridiculous levels.


You can get administrative head count back to 1980s levels. You can also ban 7 figure salaries for University administrators. That would pretty much do it. It is a simple problem with a simple solution.


You can start by disbanding the football teams.


That's a great idea, but at many schools it wouldn't actually raise much money. It's not as though the idiot boosters will be just as happy to see their money going toward lab equipment as toward e.g. barbells.


While I'm not a huge fan of college football, at many schools the football teams are effectively profit centers due to media rights and merchandising.


or just pay them.


The cream of the crop will still rise to the top, so it's simply a case of the bar for professorship rising. Which is a good thing.


I strongly doubt that is true, actually.

I'm a few years out from my PhD. The classmates who stayed in academia are neither much smarter nor harder working than those who have left. It does select for certain personality traits, like risk tolerance and willingness to self-promote, but I'm not sure that's really what we want.


The counterpoint to that is that universities are moving more and more teaching positions to short-term contracts with terrible pay and benefits. So skilled people might instead choose to go into industry or pursue other opportunities, and the students find themselves with instructors who end up doing what they do because they can't go anywhere else.


I find that the people that are best at research (and thus most likely to get tenure) are usually very poor teachers.

Usually they are so good at their subject that they tend to skip steps in their explanations and forget how difficult a certain concept is and expect everyone to breeze through it.

Unfortunately, universities optimize for research rather than teaching.


And that's ok. If the Professorships have value to anyone they'll get paid. If some positions are being removed then maybe they just weren't that useful.


That ignores a huge amount of evidence that we routinely shoot ourselves in the foot by making flawed assessments of the relative benefits of short vs. long-term rewards.


Goodhart's Law: "When a measure becomes a target, it ceases to be a good measure." (https://en.wikipedia.org/wiki/Goodhart%27s_law)

When you optimize for, say, the number of publications, you encourage aspiring professors to vomit out more minimal publishable units and self-promote more loudly rather than thinking interesting thoughts and writing them down after some rumination.

The property you actually care about, the interestingness and quality of the research, is a latent property. You cannot optimize for it by optimizing any of its indicators; they are only noisily related, and intervening on the indicators will only break the connection between the indicator and the latent interestingness.
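
A toy numerical illustration of that (my own made-up numbers, nothing more): once the indicator can be inflated directly, it stops tracking the latent quality it used to proxy.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    quality = rng.normal(size=n)                      # latent research quality

    # Before: publication count is a noisy but honest reflection of quality.
    pubs_before = quality + 0.5 * rng.normal(size=n)

    # After the count becomes the target: effort also flows into salami-slicing
    # and self-promotion, which inflates the indicator but not the quality.
    gaming = 2.0 * rng.normal(size=n)
    pubs_after = quality + 0.5 * rng.normal(size=n) + gaming

    print(f"corr(pubs, quality) before targeting: {np.corrcoef(pubs_before, quality)[0, 1]:.2f}")
    print(f"corr(pubs, quality) after targeting:  {np.corrcoef(pubs_after, quality)[0, 1]:.2f}")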


In the ideal world of the massless pulley and frictionless plane, that's what would happen. The reality is, the professoriate we have now is better at playing the game. Smartest people I know are in industry. While tenure was supposed to encourage risk taking, the way it's given out these days it seems to have the opposite effect.


The world is about people selling crap they don't need to each other. If you can be honest with yourself enough to accept that, you realize there is no place for learning in this worldview.

Getting money today (GMT) is all that matters, so until the govt finds a way to take everyone's not hard-earned money and redirect it toward more noble pursuits, I'm confident the trend will worsen.


This seems to be a sarcastic offshoot of the discussion, but I gave you an upvote. Your point is completely valid ... a stable research environment needs government funding. I think this might be a US statistic, but I recall reading that we spend more on any one of the top 3 sports than we spend on all forms of cancer research. This should be a no-brainer but it isn't. There was a Communications of the ACM article a few years ago by Moshe Vardi where he pointed out that computation has radically altered both theory and experimentation - the two pillars of science. As a professional scientist, I firmly believe we are on the cusp of great discoveries that will benefit all of humanity - except we are squandering time and the resources of the planet on stupid bickering.


Thanks. Somebody appreciated my point. The world is consumed by trivia. People with influence and money, who sadly are often the ones most consumed with trivia, need to die, undergo brain transplants, or some other miracle in order to change the direction we are headed.


Don't worry, I'm sure Trump has a solution in mind.



