Why I left my tenured academic job (reyammer.io)
385 points by adamnemecek on Oct 4, 2020 | 229 comments



And this is it. A reward system based on paper publishing and conference speeches is fundamentally broken.

There are really no built-in checks and balances, unless a commercial interest cares to do a 'check' that benefits that commercial interest.

Adding to that the now well-known, built-in corruption of student acceptance and grading, it is impressive that we get any forward movement at all in the applied sciences.

> "... Now, in general, negative results may have value, and it may be worth sharing them. You could write a negative result paper (I think they are valuable). But wait, most conferences don't like those! Shit. So what do you do? You could write a blog post — as detailed as you want, no page limits, etc. you are the boss! — and move on with life with something better? But wait, blog posts "don't count" (more on this later). And in many environments, "no paper" == "I completely lost my time from the eyes of the public"..."


Peter Higgs, of the boson and Nobel: “It's difficult to imagine how I would ever have enough peace and quiet in the present sort of climate to do what I did in 1964. […] Today I wouldn't get an academic job. It's as simple as that. I don't think I would be regarded as productive enough.” — https://www.theguardian.com/science/2013/dec/06/peter-higgs-...


What a tragically sad inefficiency of our current society that people who are way smarter than me are toiling away in the dark for pennies an hour on something extremely hard and useful to society, only to have their results denied by the system.

Meanwhile I'm being paid obscene amounts of money to center divs and type @Autowired and @Component for projects that never end up launching anyway. It's all so badly badly flawed.


In the same vein, the fact that teachers are paid orders of magnitude less than lawyers boggles my mind. I don't think lawyers are overpaid considering the supply and demand but given the amount of value that teachers provide to society, it's surprising that teachers are so undervalued and that the people who would make good teachers never end up teaching.


Most lawyers are actually paid like shit though, if they can even get a job in law. They routinely have like 200k in student debt, as well. It's not the ivory tower people seem to believe it is (and hasn't been since well before 2008)


> I don't think lawyers are overpaid considering the supply and demand but given the amount of value that teachers provide to society, it's surprising that teachers are so undervalued

I’m a teacher. I’ve never seen any evidence teachers are undervalued. Teachers don’t get paid more when they leave teaching for other jobs. Difficulties in hiring can be fully explained by having the same pay scale for those teaching subjects in demand in the outside world like Computer Science or Mathematics and those with either less demand or greater supply like Physical Education or English. Paying primary school teachers and secondary school teachers on the same scale is distortionary in the same manner.

Teachers in the government sector get very, very good job security and benefits for doing a hard job but plenty of people do hard jobs for much poorer compensation. And teachers aren’t magic. They come in a very distant third in their effects on student achievement behind student characteristics and family characteristics.

> By almost all objective measures, teachers don’t actually seem to be underpaid in the traditional sense. “Nationwide, the average teacher salary was $60,477 during the 2017–18 school year.” And teachers work around 2 hours less a week than the average profession (“ 40.6 hours during the work week, compared to 42.4 hours for private-sector professionals”). BLS Occupational Information Network studies found that teaching isn’t a particularly stressful job relative to other professions, and teachers typically have pretty solid relative job security. In addition to all of this, teachers don’t make a lot when they leave teaching and education majors have the lowest standardized test scores of any major. Although teachers have extraordinarily high social prestige, it looks like we are paid close to what we are worth on the actual job market.

https://medium.com/@coreykeyser/why-conventional-wisdom-on-e...


What do you think would be the solution for, for example, CS teachers being incredibly unqualified?

I'm asking because, based on my country's experience, I think there literally is no solution and we just have to live with the fact that CS education cannot be good (the same might happen to some other fields over time).

Paying CS teachers more than PE or English teachers is not politically viable as teachers and their unions wouldn't accept it, and paying all teachers a salary based on the market value of the most in-demand field is too expensive, bound to be unacceptable to taxpayers (who are already jealous of the incredible benefits and job security teachers have), and creates unacceptable incentives in education choices: even more people will try to become English teachers because the demand is the same but the salaries are higher, so the oversupply of labor in those fields would increase even further, while the labor supply in in-demand fields would actually go down, because even though those fields also now pay more for teaching positions, their relative attractiveness has actually gone down.

So is it even possible to solve this problem?


> Paying CS teachers more than PE or English teachers is not politically viable as teachers and their unions wouldn't accept it,

It looks like you have already identified your problem. You could give CS teachers double appointments with the IT department if you have absolutely no wiggle room on money/conditions. Or maybe you could gather some programmers who are eager to reach out to kids and let them work part-time as teachers after they obtain the necessary qualifications.


> Teachers don’t get paid more when they leave teaching for other jobs.

I think this depends highly on the subject of the teacher and private v. public schools. My SO was a public school teacher in a STEM discipline and at the very top of the district payscale. My SO now makes three times as much in industry. And we have much better healthcare.


Should have specified on average. Total compensation for public school teachers is higher than for private school teachers. Pensions that generous are no longer available in the private sector for rank and file employees.


> Teachers don’t get paid more when they leave teaching for other jobs.

This is untrue for the teachers I know who left teaching. They all get paid more.

In all seriousness, if teaching is such a great job, why does the average teacher leave so quickly instead of staying for many years?


From what I hear from my teacher friends:

* Administration overload (government compliance type stuff)

* Developing brains aren't always fun to be around

* Parents expect to outsource child rearing to the school and want to be catered to in their unique sensibilities while the teacher wants to focus on what's best for the kid and the class

* Even here in Europe, an increasingly litigious attitude, where any school decision you disagree with is escalated and finally prosecuted. This increases the box-ticking, cover-your-ass sort of administration

This has some of the classic ingredients for cooking up a burnout:

* Investing emotionally in your work

* Feeling like you're letting people down constantly

* Being expected to do things that go against your personal ethics

* Spending a lot of time on things you consider a waste of your time


Is there any difference between teachers who leave being men or women? We often hear about women leaving tech, I am curious if there is any difference in who leaves teaching.


The question is, how much do we value the impact that teachers have on student performance. Your answer is "whatever the market will pay them". I think that's a little too slavishly trusting of "the market".


Thanks for the Medium article. It was an incredibly insightful read that summarized a lot of research on teaching and student outcomes.


Comments like these are why I'm excited about the no-code movement. So much of the past decade of tech hasn't resulted in anything tangibly better or different. It's the same mundane ideas rewritten with ever-more-complex toolchains while legions of programmers argue why the latest blend of frameworks offer things completely impossible before them. Once we automate the ability to build these CRUD apps, we'll find out whether corporations still feel a $200k/yr UBI is appropriate.


HAHA a $200,000-per-year UBI is definitely how I would describe most of these jobs...

Most of the interesting things in our field can be done by passing JSON objects around a network and persisting them in whatever way is durable, reasonably secure, and easy to develop.

What we really need to see is the advancement of computer vision, robotics, and dynamic, goal-based software decision making capabilities. And I suppose battery technology and radio networking too, but that's more about electricity and physics than software.

When that comes, you will see many mundane human tasks go straight to the robots, and hopefully every family will be permitted to own one (politics is all about limiting people from doing things for sane prices, right?)

If a family's robot could till the land, grow food, cook it, synthesize medicines, build a home, serve as physical security and defence, play games for entertainment, transport the family from place to place, etc etc... then quickly the only world industry still in existence would be semiconductor manufacturers :)


What if the robots decide they have different priorities?


I've lost count of how many "no-code" movements we've had since computers were invented.

The "no-code" environments from yesteryear were really, really good, and motivated "non-coders" accomplished amazing things with them. HyperCard, for example.

But even with HyperCard, at some point you needed to write code if you wanted to go beyond the happy path envisioned by the tool developer. And I imagine the new "no-code" tools will have similar limitations. And then, when the "no-code coders" get bored of playing with those toys, they will hand it off to the programmers to extend and maintain.


Heh, from where I stand "the no-code movement" and "the same mundane ideas rewritten with ever-more-complex toolchains while legions of programmers argue why the latest blend of frameworks offer things completely impossible before them" are much the same thing. What the industry needs is the boring grunt work of replacing frameworks with libraries. Rather than generating a 55-file "project skeleton", we should be able to call standard libraries written in plain old code and get the same behaviour, adding the extra stuff as and when we need it.

Why don't we have a library stack where I can define some datatypes and get database migrations, CRUD REST endpoints, and a basic editing web UI, in a maintainable language where I can understand where all that's coming from and incrementally start customizing? I'm pretty sure I've implemented all the pieces you'd need, scattered between the codebases of my last three or four employers: in a language with a decent record system it should be a one-liner to build a set of HTTP routes for a given datatype, not through invisible magic but through a function call that works by plain code.
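
Here's a rough, hypothetical sketch in Python/Flask of what that could look like (not an existing library; Article, crud_routes, and the in-memory store are names made up purely for illustration):

    # Sketch only: one plain function call registers CRUD routes for a
    # datatype, so you can read exactly where every route comes from.
    from dataclasses import dataclass, fields, asdict
    from flask import Flask, request, jsonify

    @dataclass
    class Article:
        id: int
        title: str
        body: str

    def crud_routes(app, model, store, prefix):
        """Register list/get/create endpoints for `model` under `prefix`."""
        name = model.__name__.lower()

        def list_items():
            return jsonify([asdict(item) for item in store.values()])

        def get_item(item_id):
            item = store.get(item_id)
            return (jsonify(asdict(item)), 200) if item else ("not found", 404)

        def create_item():
            data = request.get_json()
            item = model(**{f.name: data[f.name] for f in fields(model)})
            store[item.id] = item
            return jsonify(asdict(item)), 201

        app.add_url_rule(prefix, f"{name}_list", list_items, methods=["GET"])
        app.add_url_rule(f"{prefix}/<int:item_id>", f"{name}_get", get_item, methods=["GET"])
        app.add_url_rule(prefix, f"{name}_create", create_item, methods=["POST"])

    app = Flask(__name__)
    crud_routes(app, Article, store={}, prefix="/articles")

Swap the dict for a real persistence layer and generate migrations from the same dataclass, and you're most of the way to the stack described above, with nothing hidden behind a framework.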

But it's in no-one's interest to package that up to release it. A traditional business doesn't produce new systems often enough to make a general toolkit. A consultancy does new systems but has no need to make them maintainable (thus Rails, which does all the automatic spinning up but doesn't have the comprehensibility to be reliable). And there's no money in selling libraries to developers, partly because developers would rather do it themselves but mostly because a library you have to buy will never be popular enough to get talented developers using it. Occasionally a huge corporation decides it's worth making a framework for their in-house applications, and even more occasionally they find it worth publishing to the outside world, which is an astonishingly inefficient process for our whole industry to depend on.


> Why don't we have a library stack where I can define some datatypes and get database migrations, CRUD REST endpoints, and a basic editing web UI, in a maintainable language where I can understand where all that's coming from and incrementally start customizing?

This sounds like Django (models+migrations, views for CRUD/REST (maybe with another library for automatic generation from models), admin pages for editing, and in python).


Django is very much like Rails with the same upsides and downsides - it's easy to get started, but hard to maintain and upgrade, partly because of not having a type system. So companies making things for long-term internal use tend to prefer these Java frameworks, because while it's horribly cumbersome to get started with them, you can be pretty confident that upgrades will be safe because they put a lot of effort into backward compatibility.


Strongly agree about the difficulty of maintaining a Django project due to its lack of type system -- though tbf, it's imposed by Python, the underlying language.

And, one can annotate Python3 to make it typed, so things might improve.
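
A toy example of what those annotations buy you (find_user is made up; the errors are reported by a separate checker such as mypy, while Python itself still ignores the hints at runtime):

    from typing import Optional

    def find_user(user_id: int) -> Optional[str]:
        users = {1: "alice", 2: "bob"}
        return users.get(user_id)

    name = find_user("1")     # mypy: argument has type "str", expected "int"
    size = len(find_user(1))  # mypy: the return value may be None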


Actually, we have: it's called standard software. In SAP and Salesforce, you have a standard that fulfils what you describe, and you just customize the rest.


Those cases are notorious for ending up with "configuration" that's more complicated and harder to maintain than an actual program would be.


That's exactly the point :)

This is why no-code solutions have never become popular.


Except that there's absolutely no evidence that the so called "no-code" movement will achieve anything beyond what we already have today.


Definitely. But I don’t see the point of paying someone $200k to write a simple web app, even if it’s using react and kubernetes. VC has killed innovation and instead created a cyclical bullshit engine.


It's not only VCs distorting software labour values (though that's the nature of their so-called "investment"); it's also software developer communities. If you hang around those Slack channels and forums long enough, you start to see that there is very little discussion of developing anything real or valuable. Most people spend their time just "playing" with technology/tools.

It's win-win BS. That's why it keeps going. The economic reality arrives with a long enough delay (e.g. five years with no profit, but the funding keeps coming) that everybody is detached from economic principles. With all the money printed lately... oh, the collapse might not come at all; you can stay alive in a coma for a decade as long as oxygen keeps pumping into your dead body.


You're paying that $200K because otherwise there is no way you're getting the business off the ground. And "simple" is relative, every customer is going to need a customized app.

This is not a trivial field, contrary to what some are trying to claim.


> You're paying that $200K because otherwise there is no way you're getting the business off the ground

This is a completely absurd comment and notion. There are talented people all over the world who don't need $200k salaries and who simply get things accomplished without whining about not being able to use whatever tech is popular on HN at the moment.


Then don't. If you can do it without the 200k, then you have a market niche you can exploit and make lots of money that way. Lead the way.


and “no-code” will be used to hype bullshit research and scam products


"Code" isn't the difficult part. (OK, it is initially when you are learning, but not in the real world).


That's why most of us are in industry ;)


Money is an important motivator, but one wonders if it's possible to make amazing societal contributions and also make $2 million in a decade :)


I think those people end up in research work at large companies. Essentially doing academic work in the private sector.


Now all that's left to do is democratize that and you have a perfect world, no? That is to say increase the pool of such jobs from 10,000 to 10,000,000


> That is to say increase the pool of such jobs

Umm, no.

The reality is that research type jobs are not that numerous for a reason.

That reason being we just don't need that many of them.

The reason why industry pays more is because that is what society needs.


I think need has less to do with it than you suggest. We don’t ‘need’ most industry jobs any more than academic jobs. It’s just industry is better at generating wealth than academia, and so it has more money to pay employees.


> It’s just industry is better at generating wealth than academia

Ok... And almost by definition, that means that industry (and therefore industry jobs) is more useful to society. Because it is generating more wealth.


> ...that means that industry (and therefore industry jobs) is more useful to society. Because it is generating more wealth.

I feel that this is a rather narrow view, since there are a lot of companies who generate wealth by being detrimental to society at large (usually those who are just rent-seeking instead of continuously creating value).


You've confused utility with wealth.


They generate money which people can spend, so if that's the only criterion for wealth (it isn't), then mindlessly printing money is also useful to society (it isn't).


As long as we all agree that Facebook is one of the most meaningful and useful contributions to society, then yes, capitalism determines ideal value.


> The reality is that research type jobs are not that numerous for a reason... The reason why industry pays more is because that is what society needs.

That's clearly false. Industry/capitalism/profit-seeking by definition makes it incredibly difficult to do things where there isn't a large expected monetary return. But profitability is a poor proxy for whether a thing is useful to do or not.

For example, there are quite a few diseases/conditions that affect a decent number of people, but we don't find cures because the return on that investment would be too low, or often negative.

Capitalism suggests that the only thing that matters is having more money; I believe what truly matters is having better lives for everyone.


> Industry/capitalism/profit-seeking by definition makes it incredibly difficult to do things where there isn't a large expected monetary return. But profitability is a poor proxy for whether a thing is useful to do or not.

I don't think that's unique to capitalism. The Soviets were building spacecraft and nuclear weapons while people went without food.

Curing an extremely rare disease is valuable; we all value human life. But say there are tens of thousands of uncured diseases, and more coming every year. Then let's say there is a finite number of people capable of working on them: how do you decide which cure we're going to target? The tradeoffs are a consequence of limited resources, not the system.

Market based health care does make me uncomfortable though, mainly because the incentives aren't in curing but in perpetual treatment to generate an income stream.


> Curing an extremely rare disease is valuable, we all value human life.

I think this is incorrect.

I think economic (and political) behaviour suggests rather strongly that we don't much value most other human life, nor value curing most extremely rare diseases.

> Then let's say there is a finite number of people capable of working on them, how do you decide which cure we're going to target. The tradeoffs are a consequence of limited resources, not the system.

That comes next, after there's a collective commitment to resource that finite number of people to work on uncured diseases of some kind.

At the moment, I think we're showing that the collective commitment is not in that direction.

We're doing other things instead, that you can certainly argue are of value, but given the large proportion of "earned income" most people and businesses have to spend on somebody else's "unearned income", I have to wonder if money flow is a good indicator of value.


I posit you make a greater contribution to society by earning as much as you can by utilizing your talents, then donating the excess when you don’t need it anymore.


Earning as much as you can often means intentionally blinding yourself to the negative effects of how you're getting there.


If publish-or-perish didn't exist in his time, then what criteria were used back then for academic assessment? He doesn't say that in the article.


Social inequalities and limited access to higher education created a high barrier to entry. The economy was strong and positivist, the status of doctors was still high, and they had more agency over how they spent their money; overall, assessment of productivity was less of an issue because there was enough money for science.


I'm curious whether the meritocratic system that we have now is in a way counterproductive. It assumes that only the smartest and best can advance scientific knowledge, so we have a system that tries to filter for them, but obviously it fails, or ends up being a time sink even for those it selects, rendering them less productive.

But maybe this is not what scientific advancement takes. All you might need is to randomly choose someone passionate about it, and just give them free rein without conditions, no time limits, no stress, to just express their passion.


But the problem is that system is not meritocratic. Otherwise there would be no need for “publish or perish”. PhD is no longer a credible signal, that’s the problem.


The system is meritocratic, you just don't agree with its definition of merit.


I am pretty sure every scientist would agree. But the issue arises because there are too many scientists/passionate/smart people for limited resources. What you are describing does happen for a select few: the extremely gifted, especially in maths and physics, live their research careers in this bubble.


I'm enthusiastic about this idea, with the caveat that I worry filtering for passion will be much harder (given that we must assume candidates will actively seek to hack the filtering process) than filtering for ability.


One basic problem is that science (and related disciplines) is really a slow incremental process where no one knows what is going on, and many people make small contributions, but this doesn't sell well. So every little thing ends up being promoted as if it were a breakthrough. It's the classic cargo cult problem: no one is actually a supergenius -- if there ever was such a thing really -- so we'll just act as if everyone is, because that's the corner we've painted ourselves into.

Another problem is that academics is full of dead ends, and it has to be if it's being done right. But no one likes that either. So rather than working on changing expectations, we do the opposite and bean count.

Yet another problem is the classic fan fiction / fan community issue: if you get into enough expertise, the experts have to become the audience, because no one else understands it, but then that leads to increasingly narrow fields of view because of increasingly narrow interests. You could broaden the audience, which is increasingly a demand, but that has downsides too, because often what's really the way forward is incomprehensible and boring to the average person. People love their iphones; not so much all the incremental computer science, physics, and engineering that went into every little part.

Something I think this essay maybe misses is that there are a lot of senior academics who would say publishing doesn't matter, that a paper is a dime a dozen, and what does matter are grants, and lots of them. This is how this dynamic has shifted in a lot of places, from the science -> publishing -> money. You might argue that this is better but it has its own set of incentive problems.


There is such anti-individualism these days, and I find it bonkers. It's as if people crave collectivism.

There have been quite obviously individually remarkable people who stand head and shoulders above everyone else, and I don't think that's changed.

Nikola Tesla, Carl Friedrich Gauss, Isaac Newton, and many more.

Feats of greatness may come from collaborating but there are often great leaders and minds at the centre of it.


It’s funny because I relate this to the “too many meetings” of some parts of the private sector. The pressure comes from the challenge of finding time to do actual work.


But is it really possible to establish such a system, one that ensures correctness, innovation, public trust, and so on? It seems to me that science was an impossible endeavor from the beginning. This kind of thinking reminds me of the scientific anarchism championed by Paul Feyerabend, as in "anything goes". He strongly criticized corporate thinking in the scientific community, but what else can we do? Maximizing the (ultimately) monetary benefit and its efficiency is the only way we can run any business, and everything else looks fishy to most people (including myself).


>>And this is it. A reward system based on paper publishing and conference speeches is fundamentally broken.

What should it be based on? Should academics be judged on performance? If so, how?


I wish academics were challenged to bring a technology to life. A working gizmo cannot be argued away. We need some mechanism by which the products of research are exposed to reality, a little bit of a market force to help us evaluate what works and what doesn't. Already there is a replication crisis. What we need is to make talk less cheap, ideally give the academics skin in the game. This would be a system where people toil away for years before producing something, and some may never succeed, but when they do they get all the upside from creating a new technology, and we as a society benefit from having this new invention.

Essentially, I would break apart the monolithic idea of academia. There are too many inefficiencies: we waste the talent of very smart people working on incomprehensible details that only a handful of people care about, and they write papers no one will read, only to then go to finance or tech where they'll wonder what the point of grad school was, since only a small percentage of their skills and specialized knowledge is necessary to make a living. We're creating all these PhDs and we don't know what to do with them. We need to foster a more entrepreneurial path in these people; they're stuck thinking that they only have two choices: academia or industry.


Bringing gizmos to life immediately disqualifies all fundamental theoretical research. And the humanities.


No, theoretical research is needed to understand why a gizmo works, we often use phenomenological models to progress, and the fundamental reason that ties everything together is figured out later. We didn't understand electricity and magnetism for a long time but we were creating all kinds of devices and materials via trial and error. The arrow of discovery has a strong record with experiment first, theory later. Metallurgy was pretty much figured out by trying shit out first, later came the formalization and explanations for why strain hardening is a thing, or doping and microstructure for example.

For the humanities, yeah I have no idea. I didn't specify but I meant the science and engineering parts of academia. I think math doesn't have a reproducibility crisis since it should be all logical proofs that don't require infrastructure, so I guess math is fine as it is.


In Physics theory in many areas is a lot further along than experiment.

Math does have a problem because proofs are very complex to actually check and few people have the time to do so. Subtle errors can creep in very easily.


> What should it be based on? Should academics be judged on performance? If so, how?

Thanks for the question. It is, indeed, easier to criticize than to offer constructive suggestions.

Here are my thoughts on the reward system:

- I would like to see published conclusions challenged (if conclusions are based on data observations or empirical evidence). And if the challenge is clean and results are fair, then:

   (1) The challenger gets credit (regardless of whether the paper they challenged withstood the challenge or not)

   (2) The original authors of the paper get credit if the paper withstood the challenge(s).

- I would like to see a different credit system for papers in applied sciences, based on whether the result is accompanied by an implementation, where an implementation is either a working inference system (if this is all based on data) or a prototype. Discoveries that are accompanied by working, publicly consumable implementations deserve, in my view, a higher credit score.

In other words, I would much prefer that our academic discoveries are encouraged to build consumable implementations (unlike US Patent law that does not require an implementation of an invention).

- If a published paper or a conference presentation was incorporated by the same team into a patent, I would remove any academic credits received by the team's researchers. In other words, I would prevent double-dipping, if I may call it that. If the given paper is incorporated into a utility patent by completely different people (and it withstood the USPTO scrutiny), then that's ok.

- If a specific field of study is theoretical in nature (e.g. mathematics, maybe theoretical physics), I would specifically reward joint publications and discoveries where the theoretical subject matter is incorporated into an applied field.

- I would include in the reward system books, blogs, and articles that further public understanding of complex science topics (I would apply this to STEM subjects only), such that online video lectures and other forms of free/unpaid public education are specifically encouraged. I am not sure exactly how to measure the effectiveness of this, but I am sure minds brighter than mine could come up with a fair, automatically correcting model.


Yes, it's basically a popularity contest these days.


What's even more broken is that impact factors and h-indices -- all the data analytics we use to measure science -- are fundamentally broken. They are worse than useless.


>> A reward system based on paper publishing and conference speeches is fundamentally broken.

After reading that I'm wondering, what is the reward?


From a computer vision perspective. If you get a paper into CVPR, or SIGGRAPH, or NeurIPS, it has the potential to make your career. It's that important to some employers (academic and not) and that's why everyone struggles to publish in those "top" venues. It's not like you won't have a career without a paper in these places (I really want to stress this), but it's enough of a boost to your reputation that everyone wants to.

Think of it like getting a Nature or Science paper, or something in Phys Rev, or Cell. It's about prestige.

The real issue is conferences taking the place of academic journals. In most fields conferences are laid back places to share your WIP. In CS, conferences require full papers which means many ML researchers (for example) structure their whole year around meeting the NeurIPS and ICML/ICLR deadlines. It's a bit broken because those conferences have limited capacity and it can be a lottery if you get in. Journals don't have the same restriction and don't have deadlines unless you're going for a special issue, so you can actually take the time to polish before submitting.


I did not work on the hottest topics in machine learning (and not in computer vision), as I could not find room for innovation. I did publish several papers at second-tier conferences and submitted papers to journals. Near the end of my studies, I started looking for jobs in industry. I was shocked when I saw that some employers target authors who published at these big conferences, even for non-research positions. (It sounds like discrimination.) I always wonder whether my research profile will meet their requirements.


When I was hiring in industry for computer vision, by the end I honestly didn't even look at people's publications. GitHub, any skills they list, and examples of projects they worked on. What got published at the end seemed to have no impact on what was important for industry.


> it has the potential to make your career.

This is true, but I think for many researchers the main motivation is self-esteem. It's the way they are ranked among their peers. In other circles, it may be your bonus, or how fast you run a marathon. It may seem silly, but this is what drives many researchers.


You aren't fired - sorry, you can't get fired from a tenured job - you don't get paid peanuts to fund research.


A 6 figure job with excellent benefits that you can hold until you die, along with discount tuition for your family. Not a bad gig even with all the administrative bullshit professors are saddled with these days.


Post-tenure? Peer-recognition & Funding.


Having been through this whole cycle, I can certainly sympathize with the author. Everything he sees wrong with academia is correct.

Sadly, after a few honeymoon years he will discover almost the exact same problems in industry. Sure, it will not be about papers, but about useless client projects, spinning bad tech choices because of commitments, making your boss/company save face when things fail, spending massive amounts of time on proposals and reporting that will never be read by anyone. Insane administration practices. Pointless HR 'events' in evenings and weekends that chip away at team morale...

Truth: we tend to run from one pigsty to the next, because the grass looks greener, only to realize we were running towards the same shit we left behind, until we tire of running.

I would advise in hindsight not running, but grabbing a shovel and cleaning a small corner of the barn.


> because the grass looks greener

I also left academia to work for a company. I agree it's not all black and white. But one important thing to take into account is salary. I earn now twice as much. I'm not materialistic but it makes a big difference. It's something I didn't care about when I chose to work in academia but I'm increasingly worried about future pensions, health system and social safety net in my country. It seems to go downward and my salary as a professor wasn't enough for me to plan adequately.


Meh... even if you disregard the second half of your post, if you have a shitty job A or a shitty job B, it's still better to have a shitty job that pays 2x the money than the other one.

Plus, in industry, if you're good, you can always go higher, earn more, move to other industries, and earn even more. In academia, you can pretty much just choose the (almost) same job in a different institution, which probably pays nearly the same salary (especially in europe... even if you change countries, in the developed ones, pay (adjusted to cost of living) is nearly the same everywhere).


Very important point, academia is zero sum in a way that industry isn’t, however if you’ve got tenure you’re one of the winners.


I left academia to work for a company. I am not on a "rolling" 3 year contract anymore. I get paid more. Getting a professorship was nearly impossible.


A rolling 3-year contract renewed each year would be a huge improvement compared to the constant nagging feeling of being thrown out any time the contract ends.

I worked as a research engineer for 2 years at a university, but since I was not a real full-time employee, I left. They would have needed way more full-time engineers than zero. You can't run serious labs with grad students or short-term engineers; the churn is way too big when you work on a per-project basis.

The advantage state officials usually have is a safe paycheck in exchange for worse pay, but universities offer the same numbers without the safety...


In Europe (he was in France, no?) I would expect a professor to earn something similar to a senior engineer.


No, a junior researcher / assistant professor would earn about 2000 euros per month. After 10-15 years, they can expect 3000 euros but it'll stop there unless they get a full professor position (which are hard to get). I'd say a qualified engineer should be able to earn twice as much.


I find that hard to believe as I got more than that working as a technician ten years ago in an academic institute in Spain, and it isn't well paid here.


https://sgenplus.cfdt.fr/article/remunerations-et-grilles-in...

Sorry, it's in French: the starting salary for an assistant professor is 1727.49 euros. It's similar for a CNRS/INRIA researcher. In practice it's a little higher. After about 15 years, you'll get 3000 euros. Many people get stuck there, but it's possible to get a full professor / director of research position.


> Sadly, after a few honeymoon years he will discover almost the exact same problems with industry.

Pretty much what I was thinking. I am in demand just now, but my job hasn't been inspiring for years; it's just keeping other people's crappy code running rather than anything creative any more.

My theory with industry is that if a company has been around long enough to be making money, chances are their codebase will be a mess.


I've been in industry for 12 years now, and none of your second paragraph rings true for me. "Useless client projects" and "boss/company save face when things fail" sound like they might describe some kind of consultancy? Evening/weekend HR events sound like a terrible HR team—if anyone should know the importance of being inclusive of parents and others with external commitments, it should be HR.

You recommend not running, but if you're not in a position to change some of these core practices, moving to a better situation may be the best option. I think one huge quality-of-life factor is working on a team with strong technical leadership. That could be a tech-led company, or it could be an in-house engineering team at any company where the team's director has clout outside the engineering team, generally based on a history of delivering value to the broader org.


You probably work for a non-huge software product company or on an in-house team? In Europe the VAST majority of 'industry' is bespoke B2B projects/engagement shops. I can assure you all the 'terrible' things described above are very much standard across that segment.


I would imagine so. But you generalized a universally bad bespoke software industry experience to say, “after a few honeymoon years he will discover almost the exact same problems with industry.” The OP didn’t even get that kind of job: he’s a malware researcher at Cisco Talos.

You then said, “the grass looks greener, only to realize it was towards the same shit we left behind.” My whole point was that the grass is greener than you described in different parts of the industry than bespoke software.

In other words, it sounds like we’re mostly in agreement, but I chose to get out of bespoke work. You seem to be advising to suck it up and sweep the barn floor.


As someone who came from industry and now works with mostly academics at a scientific non-profit, I feel this.

Especially the parts about publishing - there are so many people who are so obsessed by publishing, and most honestly don't have anything useful to say at all. I swear most of these papers aren't even read, even if they are cited. There are whole layers of publication politics that force people to cite work that is almost completely unrelated, if not unrelated and saying so, just to inflate citation counts. It's basically a follow-for-follow system. Same for author lists.

Also totally agree that the papers are written for these referees instead of the people reading them (of whom, I often suspect, there are none). Many of these papers are also written in order to give presentations or talks at conferences where papers are expected, since it is publishing and not speaking that gives you a line on your CV.

Also how tenure doesn't save you, I feel this is the same as in tech, where they won't fire you but they will try to manage you out or make life so unpalatable that you will just leave.


The comment about papers not being read resonated with me. At one point I published a paper at a large IEEE conference where the only audience in the session were the other paper presenters. And it was clear nobody wanted to be there. I didn't last long in that line of work after that :-)


In the humanities, it's even worse. Academia is stuck in this negative equilibrium, and even though many individuals recognize this, few or no institutions have managed to get out of it. Tenure and publish-or-perish are both big problems. Publish-or-perish also wasn't the norm in most schools until relatively recently; today, so many schools want to be "research" schools, even when they have relatively little business doing so.

https://jakeseliger.com/2012/05/22/what-you-should-know-befo...


the only thing propping the system up is unaccountable $billions in gov spending


meanwhile people in industry who might want to read your paper have to pay outrageous prices, even if they are IEEE members (you need to be society members and conference attendees to download the papers)


What’s funny is that every time I ask authors questions about their papers, they can never answer. There always seems to be a single person who really did the work.


It's not really that. This happens even when you're the one who worked on that paper. A paper requires intense focus on details. Mere months after that paper, you forget some of the details. It's just natural, considering that you may be working on something altogether different at that time.


I don't remember anything about my old papers and I was the one writing them and doing the bulk of the work. I don't remember what I did last week or even this morning, how can I possibly remember experimental details from memory?


> Also totally agree that the papers are written for these referees, instead of the people reading them

Of course papers are written with the reviewer in mind. What is your suggestion? Abolish peer review? It's up to the AE to find referees who are qualified to review your paper.


Yes. Abolish peer review. Bring back editorial review. If it was good enough for Planck and Einstein it’s good enough for everyone.

> For instance, the Annalen der Physik, in which Einstein published his four famous papers in 1905, did not subject those papers to the same review process. The journal had a remarkably high acceptance rate (of about 90-95%). The identifiable editors were making the final decisions about what to publish. It is the storied editor Max Planck who described his editorial philosophy as:

> To shun much more the reproach of having suppressed strange opinions than that of having been too gentle in evaluating them.


90-95% publication rate might have worked in 1905 but it would not be feasible given the huge volume of papers now. A journal must provide curation because the majority of submissions it gets are terrible, and it’s too much work for an editor to do. Hence peer review.


It’s only too much work for an editor to do if you insist they don’t get paid and maintain the same standards as at present. Full-time editors who accept and reject, without revise-and-resubmit, would be much faster and able to deal with many more papers. If twice as many useless papers get published as now, that’s a very small price to pay. There are other possible models of editorial review, but that’s one.


I suppose this full-time editing job would be in addition to their other full-time(-plus) job of actually being a scientist, which is of course necessary to ensure that they are qualified to review all these submissions ... ?


I wish the best to the author and I hope he finds a great environment in Talos to do great things (afaict he probably will).

However, and this is mostly for the sake of others reading this, be aware that the security industry has its own versions of the problems mentioned as affecting the academic environment, and these versions are in most cases way worse than in academia. With the exception of some enlightened cases (which most often coincide with a financial situation good enough to allow for teams that do cool stuff without too-pressing metrics on the immediate business), in private companies literally everything is driven by money, and to that one must add the incentives of the people who are part of those companies (related to this: in private companies the turnover is way higher than in academia). Some examples:

- Papers that don't deserve to be written? wait for that endless stream of tasks that does not deserve to be done that way (but needs to be done that way all the same, because <reason-you-cannot-argue-against>).

- Optimizing for the wrong things? wait for words like "cost-effective" to pop up.

- Move away from very cool stuff? yeah, that does not happen, because the cool stuff is not even on the table unless it can bring in money fast.

- Non peer-review things dismissed? wait for not-approved-by-manager-X or not-in-the-agenda-of-key-person-Y or subject-to-approval-of-Z-who-cannot-possibly-even-understand-the-value-of-that.

- No time for tech stuff? sorry, that doesn't change much unless your job is highly operational (which you won't enjoy much because it will have to be "cost-effective", thus most likely repetitive and metrics-based).

I'm not saying that every private company in the security industry is a circle of hell, but unfortunately most have problems that are far harder to deal with than those listed, at least for people looking for the things the author is looking for. Afaik, the common big difference is that you get paid more and you see real-world cases as they happen.

(edit: formatting)


Agree with all your points. The author has been in academia all his life so after 20 years or so, I can see how he can feel frustrated by it.

I think the general summary of what you're pointing out is that in industry it's easy to do things that the company wants to do (make money, make your boss look good, make life easier for the execs), and hard to do things that you want to do (cool stuff, disseminate ideas, and explore).

When you start at a company, you're optimistic and campaign to do a balance of both, but over time it wears you out, so almost everyone defaults to doing the company work and fades out of public view. And yes, MSR is generally the exception, but even there notice how many researchers now do something related to "optimizing productivity".


> I think the general summary of what you're pointing out is that in industry it's easy to do things that the company wants to do (make money, make your boss look good, make life easier for the execs), and hard to do things that you want to do (cool stuff, disseminate ideas, and explore).

Based on my experience, I agree with the first half of this sentence but I'd argue with the second bit. Sometimes the interests align. The one thing that has made my PhD worthwhile has been that it gives me the credibility to work on cool stuff that's going to make the company money. That's not to say that I would necessarily be working on exactly the same things if I had been free to choose, but that's largely because I'm solving problems I didn't know existed before I took this job!


In industry, you have "paper that did deserve to be written, but didn't because that's giving away knowledge to competitors, and also time wasted." Patents are respected, but that's not really something too relevant to software.


Most large companies have research divisions that do a variety of tasks which are not directly tied to short term money. The work varies from almost-pure to almost-applied but generally has different KPIs than product teams (patents for example). They're usually run closer to an academic institution in terms of structure and management. Granted there's generally some incentive to get other teams in the company to actually implement whatever you came up with.


That's true, but I think on one hand these are fairly rare cases compared to the whole industry while on the other there is also the point that these divisions might not be as stable and long-lasting as one might expect.

Related, discussed on HN a while ago: https://blog.dshr.org/2020/05/the-death-of-corporate-researc... Previous discussions https://news.ycombinator.com/item?id=24200764 and https://news.ycombinator.com/item?id=23246672


To the first point, they're rare but industry is massive so their total absolute size isn't that small. And if you're a tenured professor in a semi-related area then it's not hard to land in one of them.

To the second point, it's true that the research focuses shift over time but if your research area fizzles out you can either shift to a more applied position or move back to academia.


I’m glad this was posted. I’m writing my personal statement for applying to PhD programs right now, after spending a few years in big tech and both the concerns as well as the benefits resonate with how I’ve modeled my expectations.

I think I’ll end up in a similar place - hyper-focusing as a graduate student and then transitioning out of academia after. One understated part of academia is that for a lot of fields (Such as the bio-adjacent fields I’m interested in), academia is one of the few places willing to take on trainee risk. I hope we can keep academia as a place where we’re willing to take risks to educate, which I think starts with reforming the processes informed by the critiques this post levies.

One thing I would advocate for is forcing pre-registration of experiments: publishing a design for the experiments with an intent, a hypothesis, and afterwards a quick summary of the results. That process for FDA trials has naturally led to a large buildup of both positive and negative results, while also inhibiting post-hoc statistical baboonery.


I'm for pre-registration in some fields with big experiments. For other fields (robotics is what I have in mind), I think the experiments can be so rapid that pre-registration would be a real hassle and slow down progress.


This seems to be a separable point, though. The issue here is not the act of pre-registration, but the friction it imposes. I know most of the experiments and research I have done could have been summarized at the time in 10 minutes - to the effect of "I will manipulate variable X over time ranges Y, this is supported by <source or professional explanation> and I expect Z", or even "We performed a gradient over variables X and Y expecting to modulate Z and looked for phenotypes not previously described for further characterization".

My experience in research is that there is very little pre-commitment on this level of granularity. Grant applications should be at the level of "We hope to study X because it is known to be correlated with Y and are hoping to elucidate a mechanistic relationship between these two". The experiments which go into proving such should more closely resemble mathematical proofs, is a viewpoint I currently hold.


why can’t you gpg-sign a text file and then get a bitcoin timestamp receipt? There is no need for bureaucracy.
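
Something like this sketch would do it (the file name is a placeholder, and the final anchoring step assumes a timestamping service such as OpenTimestamps for the Bitcoin receipt):

    # Sign the pre-registration, then hash it so the digest can be anchored
    # in a public timestamp as proof the plan predates the results.
    import hashlib
    import subprocess

    PLAN = "preregistration.txt"

    # 1. Detached GPG signature (creates preregistration.txt.asc).
    subprocess.run(["gpg", "--armor", "--detach-sign", PLAN], check=True)

    # 2. Hash the plan plus its signature.
    digest = hashlib.sha256()
    for path in (PLAN, PLAN + ".asc"):
        with open(path, "rb") as f:
            digest.update(f.read())

    print("digest to timestamp:", digest.hexdigest())
    # 3. Anchor the digest somewhere public and append-only,
    #    e.g. `ots stamp preregistration.txt.asc` with OpenTimestamps.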


Pre-registration is what funding committees and proposals are for. You get the funding, the experiment is approved, now go do it and publish the results.


In my experience [0], the grant funding tends to be something more akin to "We will attempt to further characterize area X, using tools Y and Z, which we expect to be involved in important area A, but has not been proven so, yet". The granularity I'm speaking to is at the lab-notebook level, give or take. If I'm performing an assay, formally documenting the premise, evidence supporting that premise, and result under that grant heading would mean we could easily publish both negative and positive results for a given study.

One of the areas I helped write a paper for as an undergraduate was later proven to have a completely different mechanistic understanding than we proposed. I think having this more detailed set of observations would have helped both us and any people citing our paper come to this conclusion earlier. Physics is great at this [1] - seeing "dark matter" or something unexplainable is an opportunity, but many fields treat presenting something as 'unexplainable' as a weakness.

[0] I don't have experience in particle physics or anything of that nature where you're working on constructing something like the LHC

[1] Once again, outside observer bias


Nice write-up! I would like to insist more on the false "industry vs. academia" dichotomy, that was mentioned at the end of the post. I was asked the same question at the end of my PhD: "Do you want a one-way ticket to academia, or a one-way ticket to industry?"

After 8 years of back-and-forth, I settled on both. I think this should be the norm. Professors should take a break from paper writing to work on a startup and/or be in industrial board meetings to know what to research next.

I have so many friends that kept a foot in both industry and academia. I find the model very valuable.


100% agree with you, students are fed the trap that only academia and industry are possible, no one talks about the entrepreneurial path which is the only one that offers the opportunity of making it big. And it's the only scenario where I feel it's justified working 60-80 hr weeks: you're building your company/product. It's a position with a risk/reward ratio that you can't find in academia or industry, and I think most of us here know that you learn a lot when you're free to deep dive into ideas you love.


> I settled on both. I think this should be the norm.

As I understand it, law and medicine are run this way.


I think you are right! In Sweden, clinical work is excluded from academic age when applying for positions or grants in medicine.

In CS, "that year in industry" counts as a big hole in your publication list.


I can certainly feel some of the points the author makes. I am in a different field (astrophysics) and I have recently moved from a professor job in the US to the UK. And one thing I particularly feel is burn-out. After going through undergrad + PhD + getting tenure, which all involved pretty much round-the-clock work, I just can't work 60 hours a week anymore. And somehow in academia, it often is required. Basically you have so much stuff that you must do that takes a large fraction of your 40h/week (telecons, students, lectures, seminars, refereeing, etc.), so anything interesting -- the actual science that you do yourself -- has to be done in the extra time (i.e. weekends/evenings). I am currently hoping that in the UK I'll be at least a bit better placed than in the US, as the pressure to get grants is lower and people are more mindful of working hours, but I am still not 100% sure that I'll stay. (throwaway account)


The things you enjoy at the early stages of academe are not the things you do as you stay longer. Academic promotion exhausts the older/wiser people with teaching/supervising/admin work instead of using their accumulated knowledge to create new things. Of course, some students might be less naive and are actually aiming for positions per se. But I was one of those types who just wants to solve technical problems and get technical work done.


I am also one of those techy types and beginning to realize this. It was hard for me to understand why so many of my peer academics /want/ to get promoted into administrative positions. But maybe they just realized they were already doing a full-time administrative job, just with a teaching load on top of it.

If you do stay involved in research, you spend more time hassling over a grant proposal that's a 1-in-10 shot than you would actually doing the technical work if you won the grant.


That seems true of many jobs, like programming. Become good enough and eventually you get promoted to manager, director, VP, C-level. You might have loved the joy of programming and making something yourself at some point, but sooner or later most career-successful programmers will become administrators or operators.



I think he's a little too focused on the paper-publishing aspect, but that may be just his specific corner of the world, and his experience. It might have been that in his sub-field, that seemed to be the main way to make impact.

I think the point is, if you want to be successful in academia, you need to be producing something theoretical or practical that others can consume and find useful. The medium need not be papers. It could be:

-- standards

-- methodologies

-- prototypes

-- libraries of code, manuals of protocols

even more practical things:

-- convening people on a topic with authority

-- writing popular works to publicize the field

-- building a scientific facility

-- even TV shows or blog posts

You will find professors at universities who do a whole variety of these kinds of things, not just publish research papers. (and as they get more senior, they often find more interesting outlets for their work, if they're innovative)

But you have to make something that others (in the scientific or industrial or popular community) can consume. All the paper counts in the world will not help, or make you satisfied, if the papers are just churned out to increase your H-index and don't actually contain something others want. It usually doesn't work as an approach.

If you can't do any of these, then you are probably better off taking a normal job. The incentives of academia will not align with your talents.

Personally, for example, I found that the things I was good at publishing or sharing with others wasn't groundbreaking novel research. So I went where that was valued, and have been happier since.


Are you in academia? Papers are the standard metric of success for tenure in science. At research universities, it's most of the basis for tenure.


Theatre professor here! My professional productions are expected to be my entire body of research toward tenure.


The author is already tenured in France and is mostly arguing that they need to "publish or perish" to get their PhD students to graduate. My experience in academic computer science in France (though not in the author's specific subfield) is that many many very good PhD students graduate with one or two conference papers published, and nobody minds, and they go on to good positions after that. I agree with the grandparent that the author might be overstating the problem a bit, in their very specific context. In other places, sure, students have some dumb quota to fulfill. But putting this front and center as the argument for leaving a tenured job in France... doesn't agree with what I've seen.


It depends on the field, but yes. Many optics researchers really only do presentations and they 'count' as paper publications. The pressure is still intense though.


Some areas focus on conference publications rather than journal publications—in optics is there no conference publication, just the presentation?


Oh man, yes, you are correct. My brain was somewhere else. My bad.


Agreed with this. One big thing that is missing here is teaching. If you are an amazing teacher, things will be much better for you longer term. You will also enjoy the job a lot more.


Q: "Why is academia so cut-throat?"

A: "Because the stakes are so low."

My dad was a professor and researcher (in ecology) in the 60s and 70s. He had a cartoon on his office wall, later relocated to his office at home, which depicted two priests having a conversation beneath the image of a very famous prophet being tortured (honestly, it doesn't matter which prophet).

One priest is saying to the other, "He was a great teacher, but he didn't publish."

My dad had a stress-related heart attack at 50, which he survived, and he semi-retired to a part-time professorship at a different university so that he could continue to work with grad students, which he loved doing.


> Problem 3: We move away from (potentially) very cool stuff

I have bad news about the private sector, pal...


Lol. Totally agreed.


How does “publish or perish” relate to “my job at a public company evaluates my performance every 6 to 12 months?”


Original research is much riskier than a job at your typical public company. Or, it should be. "Publish or perish" encourages breaking research into "minimal publishable units" which produces gobs of churn for everybody involved.

I've known a few professors who publish paper after paper with similar results and similar methods. I suspect each of them has a much deeper theorem, either under their hat or (more generously) at the tip of their tongue. If they'd take a bit of a break from the churn and focus on the deeper theorem, their actual impact on the field would be significantly greater. We'd all get to read, write, or review a small handful of papers instead of dozens.


> Original research is much riskier than a job at your typical public company.

The typical job at the typical public company is also a lot less innovative than original research.

I have a feeling that the author will discover that the grass is always greener on the other side. The private sector has no shortage of authoritarian big-men[1], dumb vanity-driven projects, asinine performance review metrics, PhDs doing the equivalent of licking envelope stamps, and your work being thrown in the dumpster (with no ability to share it with the outside world, because trade secrets) because someone in upper management lost a political battle and your division has been canned or re-purposed.

[1] Of either gender.


That makes sense. I suppose it’s easier to blend in on a team than to be the lead name with a team of students behind you.


The expectation for a performance review is that, well, there's some attempt to look at the actual performance and evaluate it. For example, imagine that the job at the public company would "evaluate" the performance every 12 months by solely counting the number of projects launched. So if you're doing great in an 18-month project, your score is 0 (since it's not launched yet), and if you've been part of launching 3 small projects, then your score is 3, no matter how irrelevant or poor quality the projects were.


The stack ranking is turned upside-down in academia. Only a small sliver at the top will make it to a secure position.



Both require evaluation against some standard of good performance. I guess the difference is the definition of "good performance." If it's a function of # of papers published, and you don't care about publishing papers for the sake of publishing papers, I can see why someone would want to jump ship.


It’s not just top-down pressure but bottom-up too. Your students, undergrads, grads, postdocs are also incentivized to publish as much as possible for their own careers. So you are supporting them.


The key point I learned from this that I didn't really realize before: even if you already have tenure, you're surrounded by smart people that you want to work with, and they're often earlier in their careers and don't have tenure. So the need for tenure affects everyone in the system.


This was a long read, but very enlightening. Thanks for posting. It really resonated with me, since I’m going through a similar period of self-reflection, and searching for the true meaning of life and what makes me happy. For the longest time, I have been telling myself why my job is good, but that “not right” feeling has never gone away.

Since I’ve been remote due to COVID, my life has been remarkably better. I don’t work any less (perhaps more), but I am much happier. It is far more compatible with my lifestyle. I love being at home with my wife and my pets, I love being able to cook all my own food. Like the author, my physical and mental health have both improved; at the start of the pandemic, I could barely hang from a bar, and now I can do multiple pull-ups in a row. Same with squats, bench and everything else. I was never able to build a workout routine, but now I have. And I haven’t eaten any fast food or low-quality food in this time; it’s all been homemade from scratch, and way more delicious and nutritious.

I feel the same about my company and my team, I really have made friends there and have built a great career, but they have made it very clear that remote work is not an option going forward. This is just not compatible with the way I want to live.

If you’re reading this and are in need of a technically skilled, motivated, hardworking individual with many years of experience, have an interesting project to work on, and support remote work, please reach out: https://lukep.dev for more info on me. I am experienced at building teams, delivering products, and leading others, and am looking for a management level role.

Again, thanks so much for the post. It’s difficult to articulate this feeling when things just change for you and there’s no going back, but I feel you and I know that you’re making the right move. Best of luck at CISCO!


> Instead of just writing papers for our "real" target audience (the community), we go all-in with "defensive writing" and we start iper-optimizing every single sentence to make sure it can't possibly be misunderstood even by the dumbest reader. Papers can't just be "clear" for a benign reader: they need to be bulletproof against capricious people who are ready to reject things for random reasons.

Considering how many papers I read that still are easily misunderstood even by smart readers... I think "defensive writing" is better understood as "good writing".

Of course papers are expected to be "bulletproof." It's called a standard of excellence. Peer review is supposed to be meaningful, not a rubber stamp. But the sneaky secret is, the bar isn't even that high. I read so many papers where I wonder why an obvious alternative interpretation isn't even mentioned, or an opposing viewpoint isn't cited, or where the author subtly misrepresents another author.

("Capricious people" are a separate subject, but they certainly exist just as much in industry as in academia...)

The rest of the post aside, this particular bit of the "rant" just feels like a defense of lazy writing. An academic paper isn't just a blog post, and it shouldn't be.


You are missing the point, by quoting out-of-context from the section with this big header: We optimize papers for the referee, not for the (intended) audience.

The author's issue is not with defensive writing. It's with writing to defend against the wrong audience.

Perhaps you haven't been in academia, and haven't experienced it yourself, but the author's problem is very real. Your intended audience will understand a point, and want you to go further. But the referees are often not the intended audience.

Imagine trying to explain the value of a new type of N95 mask to an anti-masker. You'll be spending your time writing defensively to try to argue for the most basic aspects of wearing a mask, when you would really just like to go into details on what's cool about the new type of mask, for people who actually care about better masks.


> Your intended audience will understand a point, and want you to go further. But the referees are often not the intended audience.

How do you know this? Like both you and OP, I've had my writing misinterpreted by reviewers. These reviewers may not be experts in my precise niche, but reading research is part of their profession. If even they misinterpret it then it is likely I was not clear enough. I have no reason to believe that experts in my niche would not also misinterpret it, and there's no way to verify this once the work is published.

Furthermore, vocabularies and terminology in a field can change. I find it very confusing when a paper from today and 15 years ago are using differing definitions for the same term.

As such I think defensive writing is a good practice, and I'm not sure why making your work more readable is ever a bad thing.

> when you would really just like to go into details on what's cool about the new type of mask, for people who actually care about better masks.

Defensive writing wouldn't stop you from doing this. It would ensure that a wider audience (across fields and into the future) can benefit from your proposals.


I have a very recent example of this: we are a very research-based shop and recently submitted a paper about part of our open source software. One of the two reviewers literally answered with 10 points, every single one of which is a variation on it not being a study.


The reviewer is saying your paper isn't a "proper" study?

I'm wondering about the nature of your shop. You said it's heavy on research, but it also doesn't sound like you're in academia.


No, the joke is that this isn't a study at all. We develop/offer a database for health insurance data (specifically the German market), for non-technical people to analyse the effectiveness of certain treatments etc., or to create cohorts for studies. The paper was an overview/introduction to it. While I'm proud to have written it, it's basically marketing/legitimation for our field. Another division of our company does actual data science, research and publication on these topics. We're in a weird middle ground: our owners are all health insurers, which makes us technically public service, but our work environment is far more tech oriented (and 20 years young on average).


Why are you trying to publish it in a journal?


This is what demo papers are for. Submit to a demo track, and don't take negative reviews personally :)


> and don't take negative reviews personally

I didn't, as it's obvious the reviewer didn't read the paper :)


> Of course papers are expected to be "bulletproof." It's called a standard of excellence.

And sometimes it's just having to add verbiage in anticipation of a lazy or uninformed reviewer, who is not necessarily the audience that you actually care about. The net result is that there is a lot of work and fluff that interested readers often have to put up with that exists solely to get the paper through review and serves little other purpose.

> The rest of the post aside, this particular bit of the "rant" just feels like a defense of lazy writing. An academic paper isn't just a blog post, and it shouldn't be.

If your point here is that blog posts are "lazy writing", then I assure you I have read many blog posts that are much more thorough, well-researched, and worked-over than academic papers. I do agree with you that the two are separate things, but blog posts have a number of desirable qualities that academic papers have difficulty optimizing for–either because they are not prioritized, or because they would actively hurt the chances of acceptance.


> who is not necessarily the audience that you actually care about

But see, that hits the nail on the head.

There's a reason papers are full of long intros, backgrounds, and even mini-literature surveys.

Because the audience isn't just other tenured professors in the same subfield. It's the rest of the field, it's grad students, it's undergrads, it's researchers in other fields it might be valuable for.

It's also for people trying to figure out even which subfield the author is approaching it from, to contextualize the paper -- is this political science paper being written by a structuralist or a culturalist, even when they never explicitly say so, but it makes all the difference for judging it?

It's easy but short-sighted to assume you're writing solely for your peers. Thankfully, the "verbiage" and "fluff" you disdain is deeply appreciated by many, many other readers.

To reiterate: there are very good reasons for these standards. What's fluff to you, is extremely valuable to many others.


I’m not actually really in academia, so when I say these papers have fluff I mean they actually have fluff, not content useful to professors or students or casual readers or whatever. These are things added specifically to get through review and only make the paper worse.


I guess we read different papers. The ones I read are mainly in psychology, sociology, political science, and political philosophy. And in having read well over 1,000 papers (just by counting in Zotero), I can't ever recall reading part of a paper and thinking "this is fluff" or "this must have been added specifically to get through review".

Not once.

Maybe it's a problem in other fields, or in very specific journals? I honestly don't even know what fluff looks like, which is why I'm so confused by this whole thread.


It sounds like repeating "state-of-the-art", "novel", never before seen, revolutionary new research that will change the entire world, you wouldn't believe how my method beats other baseline methods by huge margins, and you wouldn't believe how this paper will revolutionize the entire field, etc, etc, multiple times in the paper so the bored reviewers don't reject your paper. I'm exaggerating but it's close to the truth. You kind of have to turn your paper into an advertisement of your work to get past the reviewers.


Got it, that helps me understand it more.

But again, your paper ought to be an advertisement, no? That makes it clear why it matters? That's why there's an abstract, an intro, a conclusion.

I have read a lot of writing where I wonder, "yes but what's the point? why does this matter? what's the actual impact here?" So to have that clearly stated is incredibly helpful for all readers -- not just what it does but how much it matters, why and for whom.

Writing for a public audience isn't a place for false modesty -- it confuses rather than helps.


You have a good point but I think it’s a matter of degrees than a yes/no situation.

There are also other things a researcher is incentivized to do to get past reviewers such as emphasizing only the good points of your algorithm, deemphasizing its downsides, comparing only against current so-so algorithms or baseline algorithms or algorithms a couple of years old instead of the latest state of the art so that your algorithm can look good. As in, even if you propose an algorithm that has new ideas in it that could bear fruit in the future with more refinements, you as a researcher would only hype it up to get past the reviewers.


I took this to be more about writing for the express purpose of being accepted (or not rejected) instead of a defense of lazy writing. When the outcome of the work can only be “accept” or “reject” it incentivizes writing from a defensive perspective rather than one that is more speculative or questioning.


There is also the problem of "defensive writing" as in writing that is not supposed to be understood. Math is very often used in this way - math is impressive, math gives a vibe of rigor and formalism, and math requires a lot of effort from a reviewer to dispute, but it is very often bullshit meant to sell a simple idea (sometimes even a wrong idea) as a big contribution that is simply hard to understand for the uneducated. Nobody wants to look uneducated, so it's unlikely you will be criticized.


I think the specifics of "defensive writing" depend strongly on the subject. In the author's subject (infosec), you probably need to defend against the "it won't work" and "the assumptions are unrealistic" schools of criticism, so you end up being overprecise and including unnecessary complexity just to preempt the critiques. In other subjects (the mathy side of TCS), you mostly need to defend against the "it's probably well-known" and "what is this good for?" attacks, so you end up providing lots of context that confuses more than it elucidates and speculating about possible applications you don't believe in yourself.

I myself am a bit skeptical of how far this "defensive writing" is to be blamed for the unreadability of papers, and even if it's such a bad thing at all when done in measure. In maths, reading papers is about swearing and asking the author just as much as it is about actually reading; yet the reasons for the unreadability are often more natural than "we need to get this past the referees" (all else being equal, referees still reward readability). New ideas rarely are born easy to explain. Ugly ducklings need their time to grow. You can wait for 10 years hoping that by then your own work has taken the proper shape to explain to undergrads, or you can write it up half-baked in half a year and let others play with it. What is better for science? The answer is far from obvious, particularly because releasing your ideas into the wild will often get them properly explained faster than trying to do it yourself.


> Considering how many papers I read that still are easily misunderstood even by smart readers... I think "defensive writing" is better understood as "good writing".

I agree 100%.

One thing I HATE HATE HATE is when research articles use unnecessary synonyms. Ie, plasma membrane in one sentence and then phospholipid membrane in another and then cell membrane in another. For someone just trying to get a grasp on a topic, it is REALLY fucking confusing. And even if you're an expert on the topic, it's still annoying. Just pick one and stick with it.

OR uses stupidly complex words just to appear smart. Ie, ameliorates, a term I've ONLY ever seen in biology research articles and yet is a completely unscientific/unnecessary word. This isn't your creative writing class. Just fucking say that you found that your drug decreases lipid synthesis, NOT that it ameliorates it you big dunce.

ALSO one more thing that annoys me - when people only use a term 3 or 4 times in a paper, but still INSIST on using the acronym. Ok, I get it - if it's something WAY more commonly referred to by its acronym like DNA or COPD, it's acceptable and probably preferable. But if it's central pontine myelinolysis, just fucking type it out instead of writing CPM so I don't have to waste mental energy trying to decipher what it means.

If anything, scientific writing should be dumbed down more.

PS. While I'm on my biomed-academia soapbox, I'd like to point out that the correct pronunciation of constitutive is just as it's written - similar to constitution. It's TUTIVE not TUITIVE.


As someone who has written academic papers and sometimes used such "fancy" words: consider that not everyone is a native English speaker. As a native Catalan and Spanish speaker, there are times when that "fancy" word is easier to come up with than the "standard" one (because "fancy" English words tend to be those that come from Latin).

Your "ameliorate" example is a good one. In catalan, "millorar" is an extremely common verb meaning "to improve". Hence, I can easily see myself using it at some point, as rare as it may sound to you.

I'm not saying it is better to do it this way, what I'm saying is that instead of being so annoyed at the author trying to seem smart, maybe you can shrug it off as a quirk of those who are writing in a foreign language.


> This isn't your creative writing class

I think it is less this and more just code-usage to facilitate showing that they're part of the in-group. If I use the appropriate words ("Elucidate", "Ameliorate", "in situ"), I'm showing that I have learned the subject from a similar heritage.

Partially an aside, but an interesting talk on academic writing, which I've enjoyed [0].

[0] https://youtu.be/vtIzMaLkCaM


You all assume some grand reasoning and evil intent behind the word.

Simply put, most people writing scientific articles are bad at writing and are generally happy if what comes out is somewhat passable English. They don't obsess over single words, because their focus is elsewhere and they have no clue about writing anyway.


It is exactly because no one obsesses over single words in practice that these words gain usage, though. Academic writing is taught and word choice is a part of what is taught. I started using those words in my academic writing because that’s how I was taught to write from lab courses to journal contributions. Early in my studies professors and teaching assistants did correct single words where a more normative word was possible, so you just start writing like that to avoid any trouble.

There isn’t anything intrinsically evil about norms and signaling, I feel that assuming someone recognizing a structure means that structure is inherently evil is a bit presumptive. In-group jargon usage is common and useful for some, and a hurdle to overcome for entrants, that’s it, absent any moral judgement.


In situ has an actual meaning--and it would look decidedly weird if you called it (say) "in-place hybridization."

`Elucidate` is basically a fancy weasel word: I read it as "I don't actually have a strong prediction about what will happen--but I expect something obviously cool will happen when I do this experiment."


It is a synonym for “in-place”, though. If academic writing was optimizing for reach, the more commonplace word instead of the latin variant would likely be used. To your point, though, if someone submitted a paper using “in-place” someone would probably point it out [0].

I think that’s the original point, that a lot of the word-level complexity is just how the writing is taught and subsequent reinforcement by mentors and peers to use the appropriate language.

Yeah, elucidate is just a fancy word for “figure out” - but I think it is part of the dialect because of the connotation you mention (Even if a casual reader might not use it).

[0] Not because it is strictly wrong, but because it violates how the in-group expects to converse. I think this holds even if we go outside of phrases (one could argue in-situ hybridization is one item), as if I discover something new I’ll likely use the term “in-situ” if it involves similar locality properties.


But in situ actually MEANS something.

Elucidate is unnecessary but not terrible - at least it's used in everyday writing. If I had my druthers, I would ask for a less cumbersome word to be used. Still, I think most reasonably educated people would know what it means without missing a beat.

Words like ameliorate are just wrong. Just absolutely wrong.


I guess I'm having a hard time understanding the distinction you're working to draw between the three words. All three mean something, and could be argued to have value because of their pithiness [0]. If you're inclined, I'd be interested in a bit more of your framing. Especially in terms of bio-medical texts, I'm a bit confused about which characteristics make 'ameliorate' worse than 'in situ'.

[0] Like this word. I could have said "Because of their precise meaning", but I picked a word that encapsulates that directly.


I mean, the point of writing a paper is to educate - and if you're using obscure words, you are also probably succeeding in obscuring your message.

Take this example:

> Chemical and/or biological therapeutic strategies to ameliorate protein misfolding diseases.

Why not just "to target protein misfolding diseases"?


As someone who recently used `ameliorate` in an abstract(!), let me make the case for it.

`Target` sounds active and vigorous, but it is quite vague in that context. You can target a diseased brain region (for stimulation, surgical removal, etc) or a mutated gene for deletion, but it's not clear to me what "targeting" something abstract like a disease would mean.

`Ameliorate` means "to make something better". It has the added implication that the situation is quite bad, but the intervention won't totally restore things to normality. This is exactly what I meant: we don't think this intervention is a silver bullet that will reverse the disease, but it seems like it should help--and our proposal is cheap and easy, so...consider it.

This also keeps Reviewer #2 from busting your chops over how effective the proposed strategies might be, which you'd be inviting with a stronger word like "cure."

Here, you could also use `treat`, which feels like an intermediate-strength claim. "Manage" might work too though that comes with implications of its own too: to me, managing a disease suggests that it is temporarily being held at bay, or the negative consequences are averted without fixing the underlying problem.

I learned `ameliorate` in junior high--and was writing for people with PhDs--so I went for it.


The target audience for most academic papers is not the general public and these words or phrases are known to the target audience. If the idea is that academic papers should have a more general audience, then I feel that’s a different statement, because I agree that academic papers are often less accessible to non-target audience members due to word choice.

In the case you mentioned there is a bit of redundancy which makes the word “ameliorate” easy to substitute. Terms like “therapeutic” and “misfolding” make it easy to derive that we’re going from a bad to good state.

“Activated protein X ameliorates biofilm formation” has more content than if we swapped to “target” in this case, because it has the added effect of indicating that the interaction was “good”.

As a continued aside, I think you’d enjoy the talk I originally posted. It dives into the difference between using writing to think vs persuade or educate, and how not recognizing that difference can cause poor scientific communication.


> As a continued aside, I think you’d enjoy the talk I originally posted. It dives into the difference between using writing to think vs persuade or educate, and how not recognizing that difference can cause poor scientific communication.

Thanks, I'll check it out!


It's funny that so much of what we do as humans is signalling.


Those fancy words don't look fancy if everybody is using them.

Especially if you are not a native speaker, the fancy word is just as new as the supposedly common word, because both are new to you and you learned both from other papers. So you just use the one you remember better, and "ameliorate" sounds like a common French word. Or maybe it is from Latin, but if you deal with Latin often it sounds common.

It is like using POJO in tech. You kind of forget the word is not normal after a while.


> One thing I HATE HATE HATE is when research articles use unnecessary synonyms. Ie, plasma membrane in one sentence and then phospholipid membrane in another and then cell membrane in another. For someone just trying to get a grasp on a topic, it is REALLY fucking confusing

I think that this is generally frowned upon in any technical writing, including scientific. So it is not an expected convention in scientific writing.


Because peer-reviewed is often also friend-reviewed. If you attend conferences, most of the time you know who is pushing which topic in your field, and you have already seen the draft version at the conference.


These same thoughts often cross my mind. But the grass is always greener on the other side. I do really love how unstructured being a prof is though.


Nice write up.

I would just like to point out that the variability across academic jobs is as large as variability across startups, or across corporate jobs.

So the important part is not so much whether you're in industry or academia (there are toxic environments in both), but that you find an environment where you work on problems you care about and with people that make you better. Those places are available in both academia and industry.

Although I admit the worst environment probably is found in academia because of the lack of checks and balances (being in the wrong lab can really be a nightmare), I would argue the best environment may also be in academia.


> Although I admit the worst environment probably is found in academia because of the lack of checks and balances (being in the wrong lab can really be a nightmare), I would argue the best environment may also be in academia.

Isn’t this equivalent to saying that academia has the highest variability?


I'm saying something about both the spread and the center of the distribution. You can have a distribution with higher variability than a reference but still a higher lower bound, because the means of the two distributions differ.
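To make that concrete, here is a minimal sketch with made-up numbers (not a claim about actual jobs; "reference" and "wider" are purely hypothetical scores): a distribution with three times the spread of a reference can still have a higher 5th percentile if its mean is shifted up enough.

    from statistics import NormalDist

    # Purely illustrative numbers for a hypothetical "environment quality" score.
    reference = NormalDist(mu=5, sigma=1)    # lower variability
    wider = NormalDist(mu=10, sigma=3)       # higher variability, higher mean

    for p in (0.05, 0.50, 0.95):
        print(p, round(reference.inv_cdf(p), 2), round(wider.inv_cdf(p), 2))
    # 0.05 -> 3.36 vs 5.07  (the wider distribution's lower tail is still higher)
    # 0.5  -> 5.0  vs 10.0
    # 0.95 -> 6.64 vs 14.93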


I wish academics were challenged to bring a technology to life. A working gizmo cannot be argued away. We need some mechanism by which the products of research are exposed to reality, a little bit of a market force to help us evaluate what works and what doesn't. Already there is a replication crisis. What we need is to make talk less cheap, ideally give the academics skin in the game. This would be a system where people toil away for years before producing something, and some may never succeed, but when they do they get all the upside from creating a new technology, and we as a society benefit from having this new invention.

Essentially, I would break apart the monolithic idea of academia. There's too many inefficiencies, we waste away the talent of very smart people working on incomprehensible details that only a handful of people care about, and they write papers no one will read, only to then go to finance or tech where they'll wonder what was the point of grad school since a small percentage of their skills/specialized knowledge is necessary to make a living. We're creating all these PhDs and we don't know what to do with them. We need to foster a more entrepreneurial path in these people, they're stuck thinking that they only have two choices: academia or industry.


This would be terrible for research. You would be throwing away so-called pure endeavours. Where would pure math fit in your vision? Homotopy Type Theory would be a no-go for instance.

Market forces are not and can't be the start and end of how resources are allocated, especially in academia/research. You would be stifling that which does not seem to yield immediate, visible results (profit). Except that's what research does. It's why it's called research.


Math can continue its current structure because as far as I know there is no reproducibility crisis in it, it's all supposed to be logical proofs that should stand by themselves.

My idea would be for the experimentalists, give them the opportunity to tinker and trial and error without the bs pressures of publish or perish, and/or crotchety advisors.

I just want the academic to be exposed a little bit to something from reality that can give them feedback. Market forces came to mind because the introductions of so many papers talk about the potential applications of their niche research, but you never see anyone actually do the application. So how about they give that a shot? Let's put the models to the test and see if they can build something. That is, in my opinion, a much better approach to evaluating what can be reproduced. If they build something and did so without their academic models, well, then we need theoreticians to figure out what's going on there. To me this seems a more organic approach, where the direction of research is led by stochastic discoveries fueled by an acceptance of the risks of trial and error. The top-down, directed research approach we have is hard to stomach with inherently biased people at the top, and this type of system will only change its preferences on research directions one funeral at a time. That's a lot of time to waste for many, many 20-30 year olds who have ideas but don't have a fighting chance to try.


I find it weird that people who live abroad complain about life when they chose not to learn the local language. If you’re planning to stay for some time, learn the damn language.


Yes, it is weird. Now imagine. You learn English as a foreign language at school. You do a PhD and move for a postdoc to another country, say, Austria, because that's how academics do it. You work there, though you are not very motivated to learn German, as you know that you are unlikely to get tenure there. In the end, you learn German.

You apply for tenure-track and tenured positions everywhere, because that is how academics do it. In the end, you get tenure in France -- they have plenty of permanent jobs. Now you start learning French. You are motivated to learn the language. But having learned two foreign languages already, you see that it will take you 3-5 years to learn it. All public services in France require French beyond advanced level, as you have to remind the officials about every single application ten times before you get anything. Want to have fun? Try to rent an apartment and open a bank account in France without being a French resident.

I am not the blog author, but I had a similar academic trajectory. Before judging people, try it yourself.


Although I sympathize with the author, I believe most academics are aware of/have experienced all these issues after their first year of a PhD. Becoming a professor these days is hyper-competitive; you don't accidentally become a prof, or become one by default. So it's a pity that this person would quit over issues they were aware of from early on.

Regarding papers that shouldn't be papers because it's a waste of resources, I think it's unkind/elitist for someone who rose to a prof position via these resources to then propagate such a sentiment. Only experts and geniuses should attempt to publish? Only people from certain institutions? Why should an author self-censor their own work because one person who chose to be a PC member, to pad their CV (which OP also criticises), doesn't think it's worthy?

In general if people are making poor contributions, their careers in academia will be short-lived, and their individual draining of resources will be minimal. If they get published then it's likely on merit.


Most research papers are just a forced ceremonial sacrifice of time and resources. There is so much junk for every good article.

A PhD should be judged on their work rather than on some semi-random publishing process.

E.g., the norm should be one or zero papers published during a technical doctoral thesis, with the effort put into the thesis itself.


Hmm, in experimental particle physics I don't feel like there is as much of a pressure to publish. Probably helps that most work is done as part of collaborations, so there are always papers coming out with your name on them even if you only write a couple a year.

But you really have to love what you do, otherwise there's no point.


I could not be happier that I decided not to go back for a PhD. There's no way I would have made it in that system. And anyway, I make great money making Ruby on Rails apps with my degree from YouTube (although I do have an unrelated bachelor's).

We are wasting the brightest minds of our generation with this nonsense.


I don't know. I think people overblow this publish-or-perish thing. Shipping something, and shipping regularly, is important to practice, whether in industry or academia.

The reason it's important is that in both cases, at some level, one needs to justify their use of resources. Places without some real-world feedback are usually not a great thing long term.

Same goes for publishing in glamour journals/conferences. In the end it's a competition for attention, and things often have to be judged by presentation and storytelling as well. Prioritizing your time on your best projects is likely a good thing. If you feel strongly about something that no one else sees, you will likely have fewer resources to accomplish it. That's life everywhere.


Journal referees are about as far from "real world feedback" as you can get.

There's an interesting clash of incentives -- the grant-funding agencies optimize for stuff that is low-risk, but they claim to want highly novel work, which is also what gets published in better journals. So often what you must do is dress up incremental progress as revolutionary progress. It's soul-sucking.


What would you propose as an alternative to peer feedback? I really enjoyed academia and didn't find it soul-sucking. Does it suck to get rejected sometimes? Yes. Does the work eventually get out? Yes. With pre-prints these days, most of the people in the field have already read the paper before acceptance if it's good. So the journal/conference name itself is largely a symbolic judgement of your peers.


I don't think we should go without some form of review -- just that other academics are not representative of the real world. Perhaps a system where the 'relevance' claims are evaluated by industry reviewers, and the scientific claims are evaluated by academics?

Post-publication (I'm counting arxiv here as publication) review is an improvement. My branch of science doesn't use arxiv to any real extent, unfortunately.

Edit: I left a word out.


However, the other big product of universities is education (and I think it’s fair to say that most people who would think of attending one see the value in universities through how they can educate, rather than whatever research is going on). And “shipping” good lessons, close connections with students, and great results is rarely recognised, and doesn’t help you get a job in the future. (It might help you look better compared to someone publishing equally as many or impactful papers, but it’s the papers first).

The other thing that this publish-regularly mentality can prevent is very long-term forward-looking projects - for some problems it is not clear what an answer should even look like, let alone how to get there. The closest thing to compare it to might be starting a new company - perhaps it takes 6 to 12 months to work on a product enough to start selling it. This is time that people simply don’t have in some academic fields, where they will be seen as underperforming, so it discourages these long-term projects entirely.


Totally agreed with you re: teaching. I think universities systematically don't value teaching enough. That said, at my university (UCLA), if you are a great teacher teaching large, popular and impactful classes, you will get recognized and promoted for it. But when people bring up the publish-or-perish trope, it's not about spending time building better classes and teaching material.

In terms of 6-12 months, if you don't have that much time, it's likely your past bets haven't paid off and people aren't that willing to give you more resources. I definitely had lots of projects that still haven't paid off 7 years into them; but I get to keep playing on them because some of my other bets have. That's the way it works, but I don't think that is that bad. There are many professors with terrible ideas that they think are great and are indignant that no one will support them unconditionally.


> Too many tasks, no time for tech stuff

It's possible that this part will not change.


Interesting read, although I am not a part of this world. The thing that caught my attention was the part about many very dumb professors. How true can this be? I have met some who were not the best, and maybe arrogant because of it, but it sounds a bit too harsh, or like the bar is being set too high.

The part about life being short and spending it on things we enjoy hit me hard because I feel the same way and can't hide it anymore. Yet, I can't really do that.


Some very valid points. As a PhD student I had similar considerations, and on top of that my biggest reason was that I did not want to go into a painful postdoc career only to start on the arduous path of obtaining tenure. That said, despite the limitations, academia is still by far the best place to do research (on average). Especially as a professor, you really do enjoy freedoms that are rarely if ever found in other realms.


Publishing for the sake of publishing is one of the reasons that a portion of the papers in every field is extremely low quality.


I understand the motivation (or rather...demotivation) underlying this. But I'm wondering: is this becoming more common? And if so, what is it going to take to fix this warped incentive structure?


Industry doesn't follow publish or perish, but industry research labs also publish a lot of garbage for the sake of publishing. And there are a lot of BS incentives in industry too.


I am a master's student, and I think a lot about academia vs industry for "research". Academia has a great advantage over industry: freedom. Never dismiss it.


Care to share some concrete examples of how academia is more free and how you expect that to make your daily life better overall? The author touched on that point a bit, especially when he described that tenure wasn’t as freeing as he expected. From his writing, he didn’t seem especially free to me.


I read his useful blog post. I think he will not find more freedom in Cisco Talos, but I hope he does!


I commented because I think academics may underestimate how much freedom there can be in industry. My experience has been that after establishing a baseline of trust (by delivering on goals, making good technical decisions, behaving honestly and ethically, etc.), my company and manager are happy to give me a lot of latitude to choose what I work on.

Being one of our founding engineers, I have several times identified a need and implemented a solution without "seeking permission" first. When I do want more formal company backing to pursue a project, a conversation with my manager or an email or a one-page Google Doc all sound easier than writing and submitting a grant proposal.

As our company has grown, I have seen my role change, and I have collaborated with my manager to determine a new direction and title for myself. Notably, I tried management for a while, and I ultimately decided to return to an individual contributor role (no direct reports). It didn't sound like Dr. Fratantonio had the option to stop mentoring other PhD candidates.

Freedom is not exclusive to founding engineers. Our company has a prioritized list of features available for development, and devs have a big role in choosing the next feature they work on, even when it's not a current area of expertise. Established engineers and new hires alike have a lot of latitude to pick technologies, architect a solution to a problem, implement a novel algorithm, etc. One relatively junior new hire is embarking on a project to rewrite much of our frontend code, with support from all of us.

Again, that's why I commented. Depending on what type of freedom you're seeking, industry may offer it and pay better and have better working conditions to boot.


Thanks!

I didn’t explain my opinion as strongly as I could have. In academia, your biggest obstacle to doing great research is "publish or perish", but in industry, as an employee, your biggest problem is your company's short-term goals. "They buy your freedom". You can't focus on a hard problem for ten years or do research on interesting theoretical problems in industry.


>You can't focus on a hard problem for ten years or do research on interesting theoretical problems in industry.

Sure you can. You simply spend a bit of time farming out practical applications to other teams, filing patents, or writing publications. It comes out to less overhead than in academia with grants, managing students, writing papers, etc. Sure, 99.99% of industry is applied; however, 0.01% of a massive thing is still very large, so there's plenty of non-applied work being done.

Industry is massive and painting every little piece of it with the same brush is a very unjust way of looking at it. I'd recommend you actually learn about things and talk to people before making broad generalizations about them.


There are numerous counterexamples. AI/ML is huge in industry right now, but that’s just the most visible field. Hardware manufacturers certainly have employees pushing forward the state of the art. Google regularly publishes cited research papers. Microsoft and Oracle fund a lot of academic research—I have to assume they also employ internal researchers. Industry is on the forefront of the software engineering specialization of CS (my grad school focus). I’m sure you can find plenty more examples.

10 years of focus on the same problem is definitely possible in industry, and your salary will scale with your expertise. It sounds like you’re expected to produce results along the way even in academia, so there’s not a notable difference in that regard.

One other thing worth comparing is the administrative burden. Good engineering teams have a variety of support systems in place to keep high-value engineers as productive as possible (people managers, engineering coordinators, project managers, etc.). It sounds to me like profs end up personally doing a lot of legwork.


From my reading on the subject, industry seems a lot more free in practice than academia, which is a sad state of affairs in my view. No overbearing need to worry about keeping grant committees happy, or paper review boards, or tenured advisors, or well-connected members of the community. If you don't play by the rules in academia, then you won't get another position or future grants, much less tenure. Large companies will pay for people to do research that is only tangentially relevant to their business. And if your research does make them money, then they'll put up with a lot.


If you want to build something, write code, or make money, industry is a better place. I'm talking about doing science for science's sake.


You can actually do much of the exact same research at industry labs as at a university. You can even compete for the same government grants. In many fields the highest-cited authors with the most publications actually work in industry.

You will have managers in industry of course, where in academia you just have a dept chair and dean who only have a vague sense of what you're up to. On the other hand, in industry you can actually devote most of your time to doing the research yourself. Often they break the labor up into those that write grants and the team that does the work. Whereas in academia the incentives really drive you to spend all your time dealing with funding sources and students. When you do have free time, you then have a backlog of writing you want to get done.


This is certainly the case when you're a master's student. It can change as you climb the greasy pole. You will have a lot of freedom if you are extremely successful in your chosen field. Otherwise, it's not such a rosy picture. In academia only the crème de la crème can shop around for a good deal in terms of pay, location and equipment. If you are merely successful then you'll have little choice over where you live and little negotiating power when it comes to pay.


My impression is that it is, in most industries, easier to switch employers than it is in academia, and that gives a certain freedom as well. The knowledge that, if treated poorly, you could just leave, is a powerful motivator to your employer. Also, if things become overly political or just poorly run, you have a ready solution: go somewhere better. At least some in jobs in academia do not feel they are able to do that (as easily, anyway).


Why does a comment like that get downvoted? He is just sharing his opinion.


> I finally managed to establish many new healthy habits: I started eating in a healthier way, I lost 10kg+ ...

Damn, that's the opposite from what I've seen around here.


> Confession: I have secretly (?) envied my industry friends for many years, and not for their money. I envied them because they could seemingly have some work/life balance and at the same time spend their time on something technical and used "for real".

Work/life balance is not necessarily better outside of academia. In fact, from my observation it might generally be worse.


The ugly at universities is much worse than what was posted.

Athletes getting easy courses. People in power positions typically don’t deserve it but think they do. So, there are ridiculous projects, bias, and people getting paid in “non-standard” ways. Backroom deals with private industry and government. Things that would make no one want to go there.

Universities should work more with private industry like they used to many years ago, government money should be poured into research more openly, and tuition needs to rise, then they can hire the best from private industry that are also excellent teachers or researchers.

To do that, administrations and staff will need to be gutted.


I think your post comes from a non-academic viewpoint.

A research prof’s concerns are about research, their scientific progeny (phd students and postdocs) and pressures to succeed in the related publish or perish/grant writing games. In this context the blog makes perfect sense and I agree with it. However, you write:

>Athletes getting easy courses

A prof doesn’t care about athletic admissions (those are undergrads anyways?).

>there are ridiculous projects, bias, and people getting paid in “non-standard” ways.

I am not sure what you mean about “ridiculous projects”? The university generally doesn’t control or fund a PI. Funding is nearly always external (in the US).

>Backroom deals with private industry and government.

I am not exactly sure whose research or publications this affects but I bet the number is tiny. This is not a concern I’ve actually ever seen play out in real life (again, I am sure you can find news where it happened, but it just doesn’t reflect the lived life of most PIs).

>Universities should work more with private industry like they used to many years ago, government money should be poured into research more openly,

I mean, I don’t think PIs are against working with industry, but industry usually wants IP, doesn’t want to publish, and wants RoI. The government funding process is already pretty open (at least NSF/NIH) but of course can be improved.

>tuition needs to rise, then they can hire the best from private industry that are also excellent teachers or researchers.

Most PIs are funded effectively from research grant overhead. Tuition is usually a smaller budget line. Increasing tuition will both not overall increase budget by much and add hardship to students. I do agree that more gov funding would be nice.

>To do that, administrations and staff will need to be gutted

I don’t think this follows from your previous statements but I do agree there is an explosion in administrative overhead that should be curtailed.


> A prof doesn’t care about athletic admissions (those are undergrads anyways?).

This gets into a tangential issue but I wouldn't be so dismissive of the consequences.

Where I went to undergrad the university had invested a lot of money in bringing their basketball team to NCAA Division I.

The program ended up doing a lot of shady stuff to recruit good basketball players, many of whom had been discarded by other universities due to their inability to meet academic standards at those institutions.

Guess what happened?

Softball courses created for the athletes (someone has to teach them).

This was followed by rumoured intimidation of instructors by coaches who had the immediate backing of the university's president (who has the power to make your job/life miserable).

In practice, most of those who took the brunt were lecturers, but this wasn't even a household name for sports, just some ego-driven project for the president of a public university.

Having been a student during that team and paying attention to the reports that came out, I wouldn't discard the possibility of a professor being caught up in major politics as a result of the athletic admissions, so I wouldn't be so quick to dismiss it as a downside.

For the record, and not that it even matters, I was an academic.


Do note that the author is living in Europe.


I've been part of European academia, and the amount of corruption I've seen there is just staggering, way beyond that of most private-sector firms. I'm happy that the author of the blog article ended up in one of the organizations where politics, horse trading, faking of research results for grant audits, and outright bribery aren't endemic.



