Hacker News
Science needs more research software engineers (nature.com)
532 points by sohkamyung on June 1, 2022 | 347 comments



This looks great, and I would have loved to see this when I was in the lab.

When the software industry says to you: "We will nearly triple your salary, you don't have to work weekends, and you also don't have to feed the mice on a Sunday night."

You will 100% take this deal.

I was a 10yr+ academic tool maker in biochemistry, built cutting-edge microscopes, hardware, and image analysis software. My lab was successful in our field. I got some papers out of the deal. I also saw things in a microscope that no human had seen before. I worked with very interesting people from around the world. The work in academia is great. You're moving the needle: new data, new failures. These are the perks. It is also highly possible that you have complete creative control of your project. I did, and it was amazing. Custom-designed UIs to streamline our experiments, derived from watching students use the system to do their work. A decathlon of software design.

Some reality: Your PI and organization will never compensate you the way the software industry will, in pay, expectations, or benefits. When you're over 30 and you still don't technically have a real 401k, are still paying the student loans you needed to get into this field, and are still in the same shitty apartment, something has to give.

Comparison is the thief of joy, and when you see your cohort of computer science graduates, your joy will be stolen :). It's good, honest work. A short tour of duty would be useful and can teach you the difference between splitting the atom and splitting the check.

Academia, at least in bioscience, is still very much an ivory tower. You don't have enough letters after your name to matter, and you will likely be a pet instead of a peer.

Don't stay underwater for too long. Life is short. :D


Since this thread is turning into yet another complaints-about-academia thread:

One of the serious downsides of working in academia is that you are basically doing industry's work for them, for less pay, and they will one day turn around, pat you on the back, then sell your work for millions of dollars. Honestly, it gets worse the closer you are to applied fields. There, you already straddle the line between what your more "pure" (and less well paid) peers consider "science" and actually making things that will make people's lives better, so you have less room to be idealistic about why you are doing what you're doing: whether it is for "moving the needle" or "adding to the corpus of humanity's knowledge," or whether you really are just doing work for someone else that they aren't willing to fund given the risks. And since the latter is closer to what you're actually doing, and you're closer to the place where you'll see your work enable someone else's riches, it's hard not to want to jump ship and become one of those people on the other side, making money hand over fist.

It's an upsetting situation honestly.


I think this is only a half-truth. There certainly are examples of academic research being translated into lucrative products by industry (there are even prominent examples in software/systems engineering) but I think that many times the translation of academic research into a useable product is also a massive endeavor that deserves recognition in its own right.

I see this scenario described in medical research all the time, with people saying that industry just leeches off of academic research. What people conveniently leave out is the vast amount of money and research that goes into translating research into a real drug (billions spent on clinical trials to meet regulation; millions to billions spent on scaling manufacturing and synthesis of the drug to industrial volumes; drug delivery like pill design or injection methods).

Additionally, many industries do have well-paid research positions that "move the needle" on science and basic research. While they're more targeted at producing and supporting products rather than offering full liberty to explore just for the sake of knowledge, it's not a complete black-and-white split between poorly compensated academic research and industry.


Fundamental: the patents produced by taxpayer-financed academic research have no business being exclusively licensed to some pharmaceutical corporation. As far as the cost of clinical trials being borne by those companies, well, let's get the FDA involved in the clinical trials.

Then the competition can come in, i.e. whoever can produce pure preparations of those drugs at the lowest cost will win the most market share. This means investing in top-of-the line manufacturing platforms (much of this is now outsourced to India, Mexico, etc. for drugs being sold in the USA) instead of squatting on the patents, blocking competition, and using monopoly status to jack up prices.

Yes, this would greatly reduce the profit margins and perhaps the stock prices of Big Pharma outfits, but the overall benefits would greatly outweigh this. As a practical example, look how the best Covid vaccines (mRNA types) have been monopolized, leading to low rates of vaccination in Africa etc., even though that was technology developed with taxpayer funding at public universities.


>manufacturing platforms (much of this is now outsourced to India, Mexico, etc. for drugs being sold in the USA)

This completely trivializes and misses the fact that the manufacturing process itself can be patented.

>As a practical example, look how the best Covid vaccines (mRNA types) have been monopolized, leading to low rates of vaccination in Africa etc., even though that was technology developed with taxpayer funding at public universities.

That's just patently false. Moderna, for example, has waived enforcement of its patents related to covid vaccines. The reason the developing world does not have high rates of vaccination is not patents but primarily infrastructure.


No, if a Uni has developed some patents and wants to 'exclusively license' them to a Pharma, that's probably a good application of those patents; they become much less worthwhile otherwise.

It's a misunderstanding of the market to suggest that somehow 'the FDA will lead the trials'. This is about as likely as a manned mission to Venus: it won't happen, and it shouldn't, for good reason (the cost vastly outweighs the benefits).

It's also a misunderstanding to suggest 'whoever can produce pure preparations of those drugs at the lowest cost will win the most market share'. The 'cost of manufacture' in most cases is not a material or relevant issue.

Your example of 'COVID' monopolization is completely upside down - companies didn't maximize their profit potential there, and may not have even developed such vaccines in a normal case, they were giving very special prices to places like 'Africa' - and none of this has anything to do with 'low uptake' in Africa.

Africa has 'low uptake' for the very same set of reasons they don't have electricity, or consistent electricity in many places.


"Companies didn't maximize their profit potential there" - said pharma companies are enjoying the highest profits they ever have... The argument "well, I didn't kill you," made after striking my face, isn't a valid claim to mercy. I'm not one to go full pinko here; I'm just pointing out the obvious logical flaw.


Non-exclusive licensing is the far better option. This prevents monopolization and ridiculous price increases. As far as clinical trials by exclusive license holders, those have a rather poor record of producing reliable results over the long term in many cases (Vioxx of course, there are many others). The trials should really be independently run, not controlled by the very corporations that have a vested interest in seeing positive results so they can go to market.

As far as Covid-19 vaccines, there are actually many companies ready to go right to production if those patents are released to the public at this moment, and that would greatly increase supply, and that would benefit the whole world, instead of a handful of pharma CEOs and affiliated shareholders.


- 'Non-exclusive' is a non-starter for most companies, there just won't be a license.

- Some type of independent trials might be possible, but there already is a lot of oversight. That's complicated.

- 'Releasing the patents'. I'm sure everyone in the world could release all of their patents for everything, and things would be good for about 2 years, but we'd likely never see another drug produced again, ever.


>As far as Covid-19 vaccines, there are actually many companies ready to go right to production if those patents are released to the public at this moment, and that would greatly increase supply, and that would benefit the whole world, instead of a handful of pharma CEOs and affiliated shareholders.

That's just a conspiracy theory. Many companies have not been enforcing any patents related to covid vaccines. The developing world does not have high vaccine adoption because of its infrastructure, not because of patent blocking.


It's a valid issue and dismissing it as 'conspiracy' only weakens your argument. For example:

https://www.theguardian.com/world/2022/may/03/covid-vaccine-...


It really is not. Nowhere in that article does it even hint at patents being a primary limiting factor for the availability of vaccines in developing countries.

Of course not all companies are going to release their patents, but the fact that several already have means it's not the main problem. mRNA vaccines require extreme temperature control during transportation, and that incurs a far more prohibitive cost in those countries than any amount of patent royalties would, even if no patents had been waived.

There are cases of huge shipments of vaccine donations going to developing countries that then go unused or underutilized because they do not have the resources to transport and distribute them effectively. Look at the COVAX initiative. The countries that failed to get vaccine rates up even with huge donations of vaccines lacked the infrastructure to distribute them whether it was the temperature controls or not having enough syringes.

"in Benin, only 267 shots were being given each day, a pace so slow that 110,000 of the program’s AstraZeneca doses expired...The vaccine pileup illustrates one of the most serious but largely unrecognized problems facing the immunization program as it tries to recover from months of missteps and disappointments: difficulty getting doses from airport tarmacs into people’s arms."[1]

[1] https://www.nytimes.com/2021/08/02/world/europe/covax-covid-...


Less than 1% of 'research' ends up being commercially viable in any way.

Almost zero research is commercialized directly, in a manner that equates tech to 'product'.

There are usually enormous costs in applying research to markets - just because something 'makes a million' doesn't mean there were no costs.

As for software:

We probably need cleaner, simpler tools, better SaaS for many things.

We just can't afford to have a lot of devs doing research.

Think about the zillions in lost man hours due to Python weirdness of various kinds. It's a giant productivity sink.

Also, I hope tooling for many researchers starts to improve.

I think the target should be, in most cases, that researchers themselves have the tools available to 'do their work' without having to hire devs.


The next thought should be: why doesn't neo-liberal capitalism fix this problem? And: is my characterization of the problem correct? Why not start a new firm that better compensates researchers (and tool makers) for their valuable work? It seems like big tech (especially Google, and perhaps Microsoft) comes in from the commercial side and invests in R&D at reasonable rates for just this purpose! But surely if workers are systematically undercompensated, there is room for a disruptive firm to come in and take the best talent and still make a profit.

Perhaps the characterization is wrong and the EV (expected value) of this work is far lower than you think (this seems likely), and/or there are externalities like regulation, or the leverage of prestige that traditional orgs (e.g. universities and publishers) wield, that warp the profit incentive. Or (and this is my cynical view) pure science was always best left to the hobbyists. Historically the most important discoveries have come only rarely and to those who loved doing science in their free time or, more rarely, when a talented individual found a patron. Building a science factory and hiring science factory workers not only sounds distasteful, but it doesn't seem to work very well. (The exceptions being those very capital intensive projects like the LHC which require a large pool of professional scientists and engineers to do the experiment.)


"If it always worked, it would be business. Let's go to the pub." -- Me, consoling a grad student after experiment failure #24.

More seriously, if you're in basic science, your skills are valuable in transforming the work into a more useful thing to be used later. Using your science factory model, you have created a reusable widget that other people can use. The science factory model does work, you can see its results in things like MIAME: https://www.nature.com/articles/ng1201-365 Where large pooled datasets are used to get insights otherwise impossible.

There's not a ton of low-hanging fruit in some fields; as time has gone on, the edges have become harder and more expensive to reach at the cutting edge. Ex: you spend $2M on a microscope that does a cool thing, and two years later the new model is all that, a bag of chips, and a soda for the low price of $750k. You hope you have a good enough relationship with the vendor that they will either mod or upgrade your system, or that those two years were enough for you to get ahead. It probably wasn't. And you now have a not-as-fast Ferrari that cost more than the fast Ferrari.

There is a massive glut of international students willing to work for basically nothing, beholden to your PI by their visas. I say this not as xenophobia, but I was the only working class American (my parents do not have degrees) in the department. All students/postdocs that I worked with were from other countries, or if they were American, their families were doctors, or a faculty member. More generally, the kind of people that might own horses :D.

No firm would take this work on, as the profits are not clear, and the time scales for success range from two years to never. In this case success is "great job publishing, we'll give your lab another 2-3y of funding." After which, you better get good at writing books and eating pasta.


I would also say, and I'm surprised this needs to be said in a community that is so connected to the Open Source and startup cultures, that just because something is valuable doesn't mean it's possible to make a business out of it.

Imagine research into a technique for getting better blood pressure readings from people who are so nervous around medical settings that their blood pressure spikes (or more basic research into the mechanisms of blood pressure and anxiety). This is a valuable thing to society (more accurate data informing treatment decisions for individuals, screening for physically demanding jobs, life insurance, forecasting medical spending for Medicare and the like), but it's not worth a lot to anyone in particular.

For the field you described originally, complex imaging devices, there are only so many users of that research so it's conceivable that work could be taken up by a corporate R&D department.

There are all kinds of other very useful research topics that are very valuable to humanity as a whole but it's not clear exactly who should pay for it (I'm not saying you aren't aware of this BTW, hopefully I'm adding support to your argument). In those cases it makes a lot of sense to take a fraction of a cent from everyone and pay for it that way, as we currently do.


It's very difficult to tell what will become valuable in the basic research world and what will remain a curiosity. A classic example in biotech is the study of sex in bacteria - at the time, it seemed about as useful as studying the sexual reproduction of ferns. Bacteria generally replicate themselves clonally, but the discovery that they were also exchanging genetic material via plasmids (essentially, mating with each other) eventually opened the doors to things like cloning the human insulin gene, inserting it into a plasmid, getting bacteria to take up the plasmid, and then, voila, growing human insulin in vats in bulk. That was the first real biotech business that I know of, and from there it just exploded.

The problem with universities pushing research that clearly has some short-term financial reward (due solely to patents and exclusive licenses under the 1980 Bayh-Dole Act) is that they neglect basic research and so close the door to the potential of truly fundamental discoveries like that. This is generally known as the corporatization of the American academic system, and it's really been a disaster for basic technological advances.


Do you think the decline of large corporate R&D efforts is cause or effect here (or is this a false premise)?

I am wondering whether we've seen the reverse of the idea I was originally challenging (if research was valuable it would be a business), where universities captured a profitable business because it was being neglected by the business community (and were distracted from basic research).


The original concept was that universities were places of basic research, and more translational (read: monetizable) research was thought to be done at corporations.

That theme changed after ~2008, when the NIH was flat-funded and most universities were gazed upon by the Eye of Sauron for funding. A lot of places that were basic science focused, let's say at the level of studying a set of proteins in mitochondria, had to figure out how to connect the dots to disease or therapeutics. Not everyone made it.

Also, universities got into the game of stacking patents to license. I don't know the arc of that, but I know for sure after 2008 my Office of Technology Transfer was really into it.

Ex before: "We study apoptosis signalling in mitochondria, to understand how mitochondria are related to cell death." After: "We study apoptosis during heart attacks, and how mitochondria contribute to cell death in ischemic conditions."

Something along those lines.


Totally! Most of our best equipment was stolen and modded from materials science imaging or manufacturing automation. There was a budding industry for automated fluorescence imaging, but they were still finding their legs.

We had a couple electron microscopes that we modernized from film, and the companies we contracted with mostly dealt with materials people.


> surely if workers are systematically undercompensated, there is room for a disruptive firm to come in and take the best talent and still make a profit.

Other good replies here, but this part of the comment reveals some assumptions that need better definition. Having been both, I can comfortably say that academics aren’t “workers” in the same way that industry programmers are “workers”. The parent comment is not correct about the norm; programming for research projects is not usually sold for profit later to industry. It happens occasionally, but most academic work stays academic. Sometimes when it does happen, it’s in the form of a spinoff company that brings the original authors of the software, and so they end up getting some of the proceeds… when the business survives.

Also, the top comment didn't say 'undercompensated' - in business this has a clinical meaning: that someone is being paid lower than market rates. We know that academia pays lower, but we do not know that it's lower than market rates for academics. It's plenty true in industry alone that you can make a lot of money at Facebook or very little money at a small indie game dev shop. That doesn't mean the indie game devs are undercompensated; it means they're in a different market.

Starting firms to compensate researchers better is what pharmaceuticals (for example) are. The problem with your suggestion is that the need for income can undermine the ability to do research that is unbiased, risky, controversial, critical, or just free of agenda. If you pay researchers in line with what Bay Area programmers get, it will put an enormous burden on the PIs to make many multiples more money than their peers, and compete with them using a small fraction of the number of people of peer groups.


I'd guess that the expected commercial value being low would be the norm, and discoveries making millions relatively rare, just as this is in every other context. However, the second half of your second paragraph is where my mind went to first, because what gp says happens does happen, albeit at a normal (low) rate. The motivation of people working in science is different, as it is in say the games business. Game developers have historically been paid lower except at a tiny handful of companies. Not 33 cents on the dollar, but maybe 50 to 70 (bearing in mind that FAANG/unicorn salaries are not the norm either)


> The next thought should be: why doesn't neo-liberal capitalism fix this problem?

You are the vehicle by which neo-liberal capitalism fixes the problem. By leaving academia to work for a firm directly, you are responding to a price signal sent by the industry, relaying that price signal to the academic labs.

You might object, this is slower than most price signals! That's because the university environment is somewhat insulated from the ordinary pressures of capitalism (and thus better able to exploit young research programmers).


> you are responding to a price signal sent by the industry, relaying that price signal to the academic labs.

Which means absolutely nothing unless a ton of other people do it as well. A handful of people here and there can be replaced.


The expected value theory is very plausible. There are a lot of R&D projects that basically produce zero output for decades. High risk, high reward.


>why doesn't neo-liberal capitalism fix this problem?

The whole point of academia is to subsidize research before it gets to an application phase. How can a private firm compete with academia, which benefits from government funding and is tax exempt? Trying to pin this problem on "capitalism" is just lazy.


No, lazy would be straw-manning a stranger's argument for no good reason to elicit an emotional reaction. It's a style of communication that seeks conflict rather than understanding, and there is plenty of it on twitter and reddit, but not here.


There are plenty of firms that sell software to academia, and many of them make a ton of money. I bet there are great opportunities in that space. I guess the issue is that most business-educated/oriented people are too disjoint from both engineering and science, so competition is rare.


>The next thought should be: why doesn't neo-liberal capitalism fix this problem?

Neo-liberal capitalism fixes problems?!


why should anyone pay when the government is keeping it all alive today?


This.

I have worked for almost 15 years in academic research, but in very close collaboration with the steel industry. The code we write can help steel companies to save millions when developing new products. This is quite complex software, which combines materials science, mechanical engineering and advanced mathematical concepts while requiring high performance and reliability.

I found a nice tenure-track position in France, in a top research centre. Besides designing and writing software, I would have to design and implement experimental plans, teach, deal with students and administration, keep an excellent publication record, and find funding for future projects. Remote work would not be a possibility (but I would work many unpaid extra hours at home). And the number of published papers and recommendation letters required just to be considered for the job was overwhelming. My salary would be lower than $30k/year. They do not even know what an RSE is.

I am searching for a remote job in the software industry now.


> But I would work many unpaid extra hours at home

I think that's incorrect. You would work the number of hours you wish to work (considering you produce reasonable value, but the bar is low). Research engineer (or researcher for that matter) in a public French research center is a civil servant position. They are difficult to get but you don't get fired unless something is blatantly wrong.

Source: I worked 10+ years in such a position. I now work for a FAANG and the pressure is considerably higher: evaluations every 6 months, a lot of peer pressure (engineers are on average better and more ambitious than those in academia and you need to keep up - some of them seem to work 24/7), extremely stressful oncalls. My gross salary is 5 times my previous salary and has the potential to increase much more.

Of course, this is certainly not representative of all cases, but most of the time there's a price to pay for a higher salary. Another thing to think about is ageism: as a research engineer in academia, you're all set until retirement. In the software industry, it gets harder after 50.


Seriously? In what kind of alternative reality does academia live to offer 30k?


European countries have lower salaries in general? Although their social safety nets are better.

Also, assistant professors (or the equivalent there) generally make less, but they probably make more once they get tenure. I'm assuming they meant the tenure-track position itself pays ~30K USD; making tenure usually does mean a pay increase.


In the Netherlands, 30k is the starting salary for a PhD student.

30k for a tenure track position sounds insane to me.


In Ireland PhD stipends are closer to 13-17k. It's not a perfect comparison because the PhD stipends are tax free, so your comparative salary would be in the 17-20k mark. That said, Postdoc research positions are much closer to 40k than 30k.


That's still pretty miserable. In Germany, an assistant professor or postdoc makes 60k Euro after a few years, even on the pay scale that only requires a master's degree (TVL-13).


I have spoken to quite a few that made minimum wage (which is not even close to 30k).


The situation is a bit weird in The Netherlands. Some PhD students are paid employees (AiO), their gross salary is ~31000 to 40000 Euro per year (I think this is excluding vacation money, but including 13th month).

Then there are PhDs that get a scholarship (bursaal), that is only around 24000-25000 gross per year.

Not too long ago, there were only employee PhDs, but some universities really love the scholarship system, because they have to pay less tax, so it's a lot cheaper for the universities.

My wife had a PhD scholarship in NL and it really had some large negative effects after finishing her PhD:

- She contributed 4 years less into a pension fund, since bursary PhDs do not build up pension outside the state pension;

- In her next academic position, they didn't consider her four years of PhD work as working experience, while they did do that for me as an employee PhD. So, she was set back 4 years in salary growth.

- She finished her PhD in August and started a full time job after her PhD. Because she had an income that went over some threshold, she had to pay the taxes that the university dodged by using the scholarship system. She worked the rest of the year at a loss (the taxes were higher than the income from September-December).
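To make that last point concrete, here's a toy sketch with entirely made-up numbers (this is not the actual Dutch tax system; the function, threshold, and rate are all hypothetical) showing how a few months of post-PhD salary can push a previously tax-free year over a threshold, so the resulting tax bill exceeds the late-year income:

```python
# Hypothetical illustration only - made-up numbers, not actual Dutch tax rules.
# A tax-free stipend (Jan-Aug) plus a taxed salary (Sep-Dec) can cross an
# income threshold that makes the whole year's income taxable.

def net_for_year(stipend: float, salary: float, threshold: float, rate: float) -> float:
    """Everything is taxed once total income crosses the threshold;
    otherwise only the salary is (the stipend itself is tax-free)."""
    total = stipend + salary
    taxable = total if total > threshold else salary
    return total - taxable * rate

# Stipend only: under the threshold, nothing to tax.
stipend_only = net_for_year(24_000, 0, 25_000, 0.4)      # 24000.0
# Stipend + 4 months of salary: over the threshold, everything is taxed.
with_salary = net_for_year(24_000, 10_000, 25_000, 0.4)  # 20400.0
# Earning 10k extra leaves you 3.6k worse off for the year.
```

A cartoon of a real tax system, of course, but it shows how "worked the rest of the year at a loss" can happen.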

The worst part of it is that many foreign PhD students do not realize that there is a two-class system.


Update: today's news is that the minister of education requires that all students on a PhD scholarship get regular employment from 2024 onwards:

https://ukrant.nl/minister-zet-definitief-streep-door-experi...


That's pretty tragic, aside from the pension - which I don't expect to receive anyway (like most everyone under 40).


That is a pretty disappointing situation for a country that is known for being "progressive."


In the US, a few years ago, my program offered a stipend of 22,000 USD per year, provided I taught a few classes, graded homework, tests, etc., while doing research and taking my own classes.

That was very lucky, many programs do not offer stipends and require people to take out loans.


For context: you have 7 weeks of holiday, you can't get fired, and your working hours are quite flexible. In France, medical care and education are free. And outside of Paris and a few other big cities, rents are rather affordable. So all things considered, it's not a bad deal (which is why they do attract good candidates). A typical SWE position in the private sector in France would be $50-60K (of course there's variance there, but in academia there are also ways to make extra money).


I've seen people get fired from academia on several occasions... When they couldn't fire someone, they beat them down so regularly and buried them so deep that people left or had a mental breakdown.


I'm aware of several academics that are banking 1M+ annual providing consulting services.


Only possible if you are a late-career academic with a lab and a publication record. They are paying for the prestige and the pipeline to new hires more than the technical advice (ok, sometimes they pay for the technical advice, but I've never seen that get a good ROI). I can think of a few instances of professors consulting with companies where I worked, and they all had grey hair and tenure. The rich get richer, but these examples are a variant on 'lottery winners can make good money'.


That's way too low even for the private sector. Flexible working hours are pretty much the new standard in SWE, so that's no longer even an argument.


> you have 7 weeks holiday

Paid?


Paid.


A university lecturer in the UK will start at around £30-35k.


come to Denmark! you can make about 60k a year as a phd student!


But then you need to learn Danish. Brains can only fit that much stuff ;-)


> You don't have enough letters after your name to matter, and you will likely be a pet instead of a peer.

This is an underrated point. This is the case for programmers in finance as well, and requires a hefty salary premium to put up with.


It certainly echoes my experience having just left a professional dev job in academia after an 11 year stretch. Anybody without academic credentials relevant to the subject matter is "the help" no matter how much you contribute, and it's flat-out demoralizing.

I worked on a tech-heavy project large enough to get an NYT feature article covering its launch. For it, I collaborated heavily on the service design and logistics, and singlehandedly designed, built, administered, documented, supported, and provided training for the technical infrastructure and more than a dozen related interfaces and tools. In lines of code, it probably landed somewhere in the low 5 figures, but that was certainly way more than it needed to be. It was hackish but durable and performant. It was an exercise in pure generalism— no individual accomplishment was close to technically innovative enough to warrant a novel white paper, but I was invited to speak at a few related conferences about it.

But the professor overseeing the project didn't even mention me or my role in his launch party speech for the folks in our building, let alone anywhere that would have provided career visibility. He thanked and spoke about the contributions of every other major contributor - even the temp worker who ran the machines (he wouldn't want to appear classist, after all) - but I got a handshake and a quiet thank you after his speech for my 5 year effort. I was at every related manager's meeting and largely seen as one of three "go-to" people for the project in general, not just tech stuff.

This sort of gatekeeping is a part of academic culture I just don't get. At least in business there's some predictability to people stepping on each other to get to the top, but what's the purpose of this?


> This sort of gatekeeping is a part of academic culture I just don't get

This is just a hypothesis, but I'd predict a high correlation between becoming an academia lifer, and having certain preexisting personality disorders, stemming from having never derived a sense of self worth from anything other than academic achievement since they learned to speak. Or maybe I'm just speaking for myself :)

Similar to the top tier of tech companies being destructive and amoral in their own ways, not only because they're corporations, but also because programmers see technical challenges waiting to be solved like a moth sees a porch light, but see ethical problems dimly. (still probably speaking for myself...)


> Anybody without academic credentials relevant to the subject matter is "the help" no matter how much you contribute, and it's flat-out demoralizing.

That's my number one piece of advice regarding academia: unless there's a path toward a valuable visa, or it's paid work while getting a valuable degree (read: something with the prestige to open doors), or you're a co-author at a good university, you're much better off building something for yourself somewhere else.

> no individual accomplishment was close to technically innovative enough to warrant a novel white paper [...] But the professor overseeing the project didn't even mention me or my role in his launch party speech for the folks in our building, let alone anywhere that would have provided career visibility. He thanked and spoke about the contributions of every other major contributor

That's because papers are the metric by which visibility is measured. Pretty much the only way to move forward is getting your name as author on the main papers.


This is different, though. I was a professional developer working in a non-academic lab doing work the academic world really cared about. Public recognition for big accomplishments is what distinguishes me from the 'web guy' at the help desk who knows how to customize Wordpress themes, and will lead to progressively interesting roles that pay well in exciting organizations. Just having X number of publications under my belt wouldn't budge the needle for my career. It's a weird sort of in-between spot without obvious career trajectories but you can get a decent salary while working on cool stuff.


What you did doesn't matter, and they gave you the right amount of recognition. There are thousands of imported indentured servants that will happily do your job the moment you leave. Of course, we don't call them indentured servants any more; we use terms like 'academic visa' or such.

I'm sure they also didn't thank the electricity company for keeping the lights on, or Microsoft for creating the Windows they wrote their speeches on, or the guy that emptied the waste baskets in the office so the PhD guy didn't have to.

Don't carry water for someone else. Enrich yourself. That's all anyone else is doing, all the 'research' is for personal enrichment and prestige. Don't prop up the broken academic industry with less than market wages, let them fail.


Nothing anybody does matters. It's all a big scam, man. ::hits birthday cake flavored vape and sips energy drink:: I'm looking out for number one from now on, bro.


> no individual accomplishment was close to technically innovative enough to warrant a novel white paper

There are so many academic journals, from scammy, to bad (yet honest), to average, to good, to the top. You can publish almost anything, if you select an appropriate, less prestigious journal.


Yeah— wouldn't have helped in this situation. I was a professional and (deliberately) not in an academic career path, and at this very prestige-conscious institution, publishing in a scuzzy journal probably would have made me look worse.


Yeah but am I going to get the job if all I have is garbage published in no name journals?


Are you talking about the letters P, H and D? As in, if you don't have a PhD they don't see you as a peer?


While informal culture and individuals' self-importance do play a role, it's also down to strict old-fashioned salary scales that many universities have in place (even if your day-to-day colleagues see you as a peer, the administrative systems defining your salary range can't/won't). Salaries are often strictly attached to letters behind your name, at a high level, and largely immovable by individual research departments.

And secondly, while your PhD peers may earn more than you, they also often earn much less than software industry averages.


I don’t think this is true in the slightest. At UC, research assistants typically make more than Grad Students or post-docs (of course the overhead and mentorship are also different and allegedly there is some possibility for greater career advancement). The snobbery is just plain snobbery. In industry there are plenty of people who make substantially more than me and I have never once felt the levels of condescension that I got from mediocre academics. There are maybe rationalizations related to scarcity and all that but jerk behavior is still jerk behavior.


The intent of my comment wasn't to make out snobbery doesn't exist (or isn't rampant - it is & I've experienced plenty of snobbery from academics myself). Just that there are additional factors.

> At UC, research assistants typically make more than Grad Students or post-docs

That's cool but I didn't say every university; I don't think one counterexample makes my comment "[not] true in the slightest".


I've honestly yet to meet a research software engineer without a PhD because of the academic bias you will get in, well, academia.


Depending on the definition of RSE, I may or may not have been one. The company I worked for was a Synchrotron Light Source; I worked on software for data collection on X-Ray beamlines. I would say that only about half of those in the same role as me had a PhD.

Moving away from data collection to analysis, the fraction of PhDs went up, but only reached 1.0 when considering the sub-group specialising in structural biology.


For many years I worked in a high profile research institute (neuroscience) as an RSE without a PhD. Still don't have one, and that's okay (for the path I'm on). Quite a few of the other RSEs in the institute don't have one either. In total I'd say maybe 50% didn't have a PhD.


I'm one who started with only a BS, and I'm at a top-20 public university in the US. It depends on your PIs, but I've definitely been appreciated on many of the projects I've worked on (e.g., listed with 2nd most ownership percentage on invention disclosures, which also won a campus-wide yearly award).

Admittedly, my path was convoluted; I started as an engineer to help with non-research software at a large lab, and got pulled onto projects via reputation. But I was replacing a Master's student who was essentially at the same academic level as me anyway. It does pay less, but I made the tradeoff for the quality of projects, which was worth more to me at this point in my career. It's still much more than I need, just not at industry levels.


I found a role like this. I love it, with the caveat that doing research, software dev, and some lead-type stuff is a lot of work. Though my hours are capped at 40, I'm probably thinking about it on some level for at least fifty.

Pay is quite good, though, so I can't complain.


I work with two of them at the moment. One is planning to apply for a PhD studentship soon, but the other does not intend to do so.


Ok well I can fix that. Hi, I'm a research software engineer and I don't have a PhD.

I'm in Europe. My salary is definitely better than the PhD students' salaries, and I have a proper adult pension, as that's a legal requirement here. My salary is approximately equal to what a graduate might earn 1-2 years after graduating in the local market, so it doesn't match my actual experience, but I accepted the post for pandemic-induced reasons. Certainly the salary does not, and never will, compare to levels.fyi/FAANG-type jobs or a large corp in-country.

However it is true that my position officially is very much a curiosity. We don't have a defined RSE type role, so the slot I fit in is "staying on to help out on project after graduating". My job is a fixed term contract that can only be renewed a certain number of times and I'm approaching that limit soon. There isn't any viable track to joining the ranks of researchers - I would have to do a masters first, and this ironically would require doing an internship, in spite of the fact I have more actual industry (non-university) experience than the entire lab combined.

I'm also not sure if my lab head bent the rules or not on hiring me - it might be the case that I am supposed to have a PhD or at least a masters.

I would agree with the top-level post on most points. It is interesting work, but I don't "belong" anywhere in "the system". This might change in 10-20 years. Artefact evaluation is very much becoming a thing in systems research, because being able to reproduce other people's work is quite important, and very occasionally you will stumble upon papers whose claims are, ah, more expansive than the associated GitHub project can fulfil. As more research relies on software that graduate students are simply ill-equipped to write (by virtue of having no experience in anything, and by being taught by professors, most of whom have no experience writing production code), the role of an RSE might become more important in time, but like anything it'll be a slow change.


> My salary is approximately equal to what a graduate might earn 1-2 years after graduating in the local market

When talking about the "local market" in Europe one needs to take into account the large number of "dark matter devs" that are working remotely for SV companies, at SV salaries. They simply won't ever show up for interviews at local companies.


In this case then I mean local local market, not devs working remotely for SV. I am aware. One of my friends does this and earns 2x what I do, in cash.


I’m surprised they even let you in without one.

Every “scientific programmer” position I’ve seen wants you to have a PhD and be a domain expert.


Even if you have a PhD it sucks. Everyone without a PhD is trying to one up you, and everyone with one has invented ten other arbitrary things that ensure you are human trash on arrival.

Also, imagine all the people who failed out of masters or PhD programs who end up in management and are resentful. It's a surprisingly common thing.


Agreed- you might appreciate a story that happened recently. I worked at a finance algo trading startup, right before and into the financial crisis. The first CEO/founder gets ousted; the new guy is an old school "phones and traders" kind of guy, and didn't know, or even seem to care, about tech at all. It was a strange choice, since we were built as a tech-first company, but seeing as we were having difficulty getting traction, I think the hope was that by getting the old-guard type in there, we would have an easier time selling the new thing...

Anyway, I give this a go for a few months but eventually leave, as I just could not stand his hardly contained contempt for technology- you could just see it on his face that he longed for the days of dropping F bombs on the floor and then going out for expensive steak dinners at night. As I give my resignation, I get screamed at; he is red in the face dropping F bombs on me- "You are F---ing us!" etc... Long story short, I offered them a counter, with a 10x'ing of my equity stake, and even to extend my notice period at my newly offered salary, but they declined all of it, though practically begged me to stay on for 3 extra months at my current salary.

On the week I left, this guy tries to get me to sign all kinds of nasty non-disparagement agreements, which I had not signed previously, and with no additional consideration ($) in exchange. I just refused, and he literally threw the stack of papers at me at one point. I guess I took all of this because I had literally been there since day 1 and just had a sense of ownership over everything- I also didn't think this guy would last very long.

Anyway, fast forward about 10 years to a few months ago: I get a generic "cold call" type message on LinkedIn from a unicorn data tools company, from the same CEO guy- he bounced around and somehow landed a sales role there. I ignored the first one... he sent a followup, and I was incredulous- did he not remember me? Did he not care? It was something along the lines of "Hey how are you? I am working with xxxx and think you would be interested- can we set up a chat..." and I just replied saying "I am great, haven't been screamed at or had anyone throw something at me in ten years..." and he still had the balls to write back something like "lol, great. Let me know when we can set up a call..." I wasn't really sure how to respond, but after about a week just wrote "If your next message isn't a very specific apology for your past behavior, do not contact me. I am surprised with your past attitude you would even work at a place like xxxx." If he was a dick, I was going to reach out to their head of sales and possibly CEO, explaining his past and how I was disappointed that a firm with their reputation would even hire someone like that. He gave me just a half-hearted-enough, non-specific apology to not do that- followed up immediately with an offer to buy me a beer (so he could pitch me)- so I opted not to escalate any further.

I have a few other stories similar to this, where Karma really bit those that were hostile and condescending towards technology and technologists, but this is the most direct. I kept in touch with a few of the old "traders" I used to work with, and used to go out for drinks with them from time to time, and they would invite a larger group of people, and I actually stopped because they were all kind of depressing- they all lost their old jobs, a few pivoted into different decent roles, but mostly they just got drunk pining for the good old days to come back.

While there was a tiny bit of schadenfreude, in the end it's just sad.


As a counterpoint, I know someone who went back to academia after getting disillusioned with tech (academia > tech > academia). The key difference may be that they live in Europe, have no student loans and the pay disparity between a developer in bioscience and in tech is not as large as I imagine it is in the US. They are paid significantly more than the scientists they work alongside but not much less than they were paid in a good tech job. For them the bioscience work is much more interesting than anything they did in tech (they have a maths-based PhD, so were working on quite complex problems but in a relatively boring field).

Software development is important to more and more industries and the pay disparity caused by insanely large funding and little requirement to produce profits means that other sectors are being priced out of in-house development, especially niche use cases. The ongoing rise of no-code development will be increasingly useful across all sectors but will fail to deliver a lot of these niche applications.


I did this as well, but started my own company. The key for a lot of people making the switch is to make the money in your first endeavor - in tech - then transition into a lower-paying but more pleasing industry, with the money buffer you built up making it possible. I've seen it a few times lately.


Salaries in Europe are catching up...


A lot of the smarter people I know have been recruited into Europe. People say "the salaries are so much lower lol", but the reality is, you often have employment laws that rule out terrible occurrences that are commonplace in America, you have access to healthcare, being a home owner is actually possible, and if you don't want that, renting is better overall. European culture is usually way less cut-throat, and managers typically know their stuff, rather than failing upwards to half-million-dollar salaries where using the word "digital" and being a brute is the main requirement.

Salary isn't everything. European engineering is a pretty different culture.

The way the US tries to prevent this is by crippling its people with student debt.


I don't have any student debt pressure, but I'm debating trying to do the same. I have a lot of friends in Denmark and no strong ties to the US. I'm about to hit my forties and it's probably now or never.


> at least in bioscience...

BINGO! That field is notoriously horrible and interacts extremely badly with a 'when not discovered here, not important' syndrome. Biology is brutal toward biophysicists, mathematicians, and people who code, whom they are forced to work with rather than seek out for help.

I still hold up as an example nonsense discussions around p-values in bio vs actual work going on in statistics in maths departments. It shows how far detached they've become.

Not to criticize too strongly, but given the above, combined with its reproducibility crisis and the existential problem of being in the back pocket of big pharma, I seriously doubt the professional integrity of a lot of people in the field.

Move toward mathematics, physics, and chemistry. There is (some) serious money and a good atmosphere around areas such as finite-element modelling or Wolfram-like tools, for example. There is a lack of direct funding for decent posts, but you get recognised and paid as an equivalent peer; I know from working with some of these people. It's not to say it's 100% always without friction, but no job is, I'd argue.


The reproducibility crisis in chemistry is just as bad as biology's, if not worse. Anyone with a pen can reproduce a math proof. If you work on a big project (a physics experiment) where every paper has a dozen eyes on it, you can't slip crap work by your peers, because that's their livelihood on the line. In between you have bio/chem fields where each project is too expensive to trivially reproduce but still small enough to have only one career on the line for each project.


Most of the reproducibility issues in chemistry happen in biochem, in my experience (meanwhile it gets the most funding). That said, synthetic chemistry is also a problem area. Usually in synthetic chemistry it's not that the work can't be reproduced at all, but rather that yields are fudged. That's mostly because PIs say "you can't graduate until this reaction yields 99%". So after someone has written four papers and taught classes at minimum wage for 7 years, they fudge a 95% to a 99%. It's not okay, but neither is the way academia is structured. Super glad my discipline was elsewhere, but I saw colleagues suffer from this stuff...


No, there isn't good money in physics, chemistry, or pure math. PhD chemists almost never make six figures, even serving as specialists in high-cost-of-living areas. I made less as a senior scientist or project manager in chemistry than I do as an entry-level software engineer. I don't know how many physicists I've met who work minimum wage jobs, usually in call centers, after their PhD/post-doc (even finding a PhD position is difficult, let alone completing one in 6 years).

FEM can offer money, but you are competing against engineers who have been doing exactly that for years.

If you interviewed software engineers and data scientists right now, I bet a third of them were once physical scientists/mathematicians who mostly regret their degrees or the fact they can't find survivable work using them.


>"I bet a third of them once were physical scientists/mathematicians who mostly regret their degrees"

Would mathematicians truly regret their degrees if they decide to work in software? I read that mathematics is one of the best degrees for a career in software engineering, as computer science is very closely related to mathematics (to the point where the study of algorithms is largely the same for mathematics and computer science students).


The theoretical parts of computer science are connected to discrete mathematics, sure. But that is only a subfield of mathematics, and it mostly happens in CS departments already, so you'd get a CS degree anyway.

It is also possible that aptitude for math is related to aptitude in software engineering.

However: the mathematics content of 90%+ of mathematics degrees awarded is fully irrelevant to 95%+ of software development tasks. And when that 5% task needs some kind of special mathematical insight, the people who want it done are going to get the top professional they can find. Maybe the prospective math student is going to be that professional, but I don't recommend planning a career around it.

I am not saying there isn't work where some math is useful, but the most commonly used applied stuff ... say, linear algebra ... is typically covered in a respectable engineering program; a degree in mathematics would be superfluous. Proving theoretical properties of Hilbert spaces or measurable sets or bifurcations of dynamical systems or advances in differential topology or fascinating behavior of cellular automata or whatever is going to be a gigantic waste of your time if you won't use it later in your career or don't find intrinsic motivation in it.


And after five years of gluing together APIs that help get more people to click advertisements, you'd be surprised how much math you forget. Machine learning can be better for exercising math, but most companies do not want anyone doing anything new. Same goes for physical sciences in my experience. You basically get a PhD to do associate's-level work. Even if you know a better way, that comes after you get ten years' experience and authority over projects. See the first sentence of this post for a catch-22. Bleh.


> "good money"

Apparently a very relative term. I think I'm in it for job satisfaction then, at 5 figures; shame I'm a qualified expert.


I would say that is a very strong criticism and very warranted! For note, I witnessed the immolation of two careers over retractions of papers that could not be replicated. You could say that the system worked. That was a while ago, and I'm sure the paper mill phenomenon is in full swing. You get echo chambers of PIs that rubber stamp each others work.

In my case, I was in basic science which hit a crisis near 2008 when the NIH was flat funded. This caused a come to Jesus moment, where suddenly all basic science labs were rebranded as translational medicine. My department was absolutely gutted, down from 15 or so PIs to maybe 8ish in the span of a year. Our field was bioenergetics which at the time was pretty competitive, and easy to link to diseases/metabolic disorders. We didn't work with pharma, some labs received contracts for small work. NIH was by far the biggest funder, followed by DARPA and other smaller health organizations.


I will say, IMO (and in my experience) in professional math, that while there is perhaps more of a chance for an outsider to have an impact, mathematics is hardly free from bias towards insiders: it can manifest as subtly as notation used as a shibboleth (e.g. it’s somewhat easy to tell which community an author comes from through their notation and terminology, and equally easy to harbor resentment towards those outside your field), all the way to active “prove I’m the most clever in the room” syndrome during seminars. I’d like to think that a more collaborative atmosphere is prevailing now due to the rise of interdisciplinary and applied math, but people are people everywhere, and as Sayre stated, “Academic politics is the most vicious and bitter form of politics, because the stakes are so low.”


> Not to criticize too strongly, but given the above, combined with its reproducibility crisis and the existential problem of being in the back pocket of big pharma, I seriously doubt the professional integrity of a lot of people in the field.

Lack of professional integrity is a very real problem.

Over the past two years I wrote fairly frequently about some of the nonsensical / pseudo-scientific COVID papers that got published, especially the quality problems in epidemiology. Epidemiology isn't bioscience (actually that's one of the problems with it - total lack of biology), but it's adjacent. After that I got contacted by a former research software engineer who worked with top epidemiology teams in the UK. I also got contacted by a member of the SAGE committee.

Both of them told me some absolutely mind-blowing stories of ethical malpractice. I wasn't totally surprised because it was obvious that those sorts of things must have been going on behind the scenes just from reading their model source code, reports, watching their behavior etc. The RSE had become so disgusted at what he'd seen that he actually left the country and switched from working at Oxford to some US university I'd never heard of, switching fields along the way too. Quite the downgrade in prestige but after years of trying to help epidemiologists he concluded the entire field was utterly morally corrupt and he wanted nothing to do with it.

Here are some of the more memorable things I was told by those two scientists:

- The RSE at one point found a bug in a FORTRAN model being used to model malaria outbreaks. It had been used as the basis for hundreds of papers but at critical points was using pointer values as variables instead of dereferencing them. Obviously, a typical pointer has a very different value to most organic things (some FFI bug). He reported this bug to the authors and got a reply back within 30 minutes saying they'd checked the papers (all of them) and it didn't affect the results. This claim was very obviously a lie: not only could they not possibly have checked even one paper in 30 minutes but he already knew fixing the bug did indeed change results! They didn't care and he was shocked that his "colleagues" would bullshit him so directly, especially as they must have known that he would know.

- Same guy flagged code quality issues to some of the scientists and proposed introducing some rules designed to improve quality. He was dismissed with the words: "oh <name>, we're scientists, we don't write bugs".

- The SAGE member told me about some of the internal discussions they had. Criticisms of the methodological validity and accuracies of their models were dismissed with reasoning like this: "that person reads the Spectator so it doesn't matter what they think". Relatedly, he made clear that the supposedly scientific SAGE predictions were sometimes being altered to reduce criticism of the group by left wing media and journalists. The changes were presented as "the science changed" but that wasn't what was going on behind the scenes.

- Malaria research is (supposedly) being badly distorted by the Gates Foundation. Gates only cares about eradication which leads to lots of problems. There are some smaller ones, like many researchers don't genuinely believe that's possible but lie on their grant applications to make mitigation efforts sound like eradication efforts. And then there were unethical experiments on entire populations where e.g. whole areas are blanketed in anti-malarial drugs. If it works, great, you eradicated malaria in that area. If it doesn't you just selected for drug-resistant mosquitos and now the drugs that were being used only to treat the serious cases don't work for anyone. He told me this has actually happened more than once.

- The RSE told me they'd at one point tried to recruit an RSE working with climatologists to help them with their modelling (a belief that climatologists are more rigorous than they are seems to be common in epidemiology). The RSE they interviewed refused to take the job. His reason was he was quitting academia entirely, as he was so disturbed by the practices he'd seen.

A few years ago if you'd told me that a whole research field could be unethical I'd have thought you were crazy because, well, that's a whole lot of people being painted by a very broad brush. Now I've seen it for myself and heard from other former insiders, it's easy to see what happens - the honest ones discover what's happening and leave. Because academia hardly ever penalizes intellectual dishonesty, the pool of people who remain are the ones who are OK with it and have learned that it works / has no consequences. Things steadily become more and more toxic.


I probably shouldn't go too public with what I know of Report 9 that isn't on the record, but frankly next to no code from biologists has gone through peer review, and people put "experts" on a pedestal because of what they claim their tools can do.

What I can and will say (and is on record) is that reproducibility was not a concern from the Imperial College virology dept.


The converse is something akin to Tableau, which came out of the graphics department of Stanford and Pat Hanrahan's lab. Tableau was acquired by Salesforce for around $15B. No doubt, proximity to SV was key to their growth. But the original research from 20+ years ago on DataCubes visualization was a product of academia. It just so happened to coincide with our era of "democratizing data science" ;)

Multiscale Visualization Using Data Cubes

https://graphics.stanford.edu/papers/pan_zoom/


This is pretty much my story as well. I work less and get paid much more after leaving academia. Idealism can last so long before one gives in.


I have some similar feelings about working for a game company (that was fairly successful and on the “better half” of the distribution of game companies).

Loved the creative control and influence I could have (even as a mid-20s tech lead on a title), loved my colleagues, loved the work, and even enjoyed the satisfaction from shipping a golden master after a multi-week crunch period.

Ultimately, a hedge fund was willing to pay me a multiple of what I was making in games and I decided I’d rather have a house than work on games.


Having worked in academia before the age of 40 (though not in computer science), I can agree with this view. To enjoy the benefits of status, compensation and, to a certain extent, flexibility, one has to reach the level of professor. After failing to reach this level at 40, I switched to software development, without ever regretting it.

A while ago I saw a position for image recognition in astronomy advertised by the university of my town. It sounded all very exciting for someone who enjoys figuring out solutions for a complex task -- until I looked up the pay grade: Less than half of my current hourly rate, without the flexibility I have (working from everywhere I want, even pre-Covid). Well, the problem seems to be that a competitive salary for a software engineer would probably have to be higher than for the professor leading the group.

As an aside, I recently had a discussion with a friend in my country's military about the cyber defence forces wanting to recruit software engineers. There is a similar problem here: if they do not use contractors (whom they can pay what they ask for), they have difficulty finding an appropriate pay grade, since a well-qualified software engineer would have to be paid better than a general.


totally this. I had a boss once who took me aside in her office to probe my interests and direction. Somewhere she quipped about work in academia being paid peanuts. It was a pretty shocking statement and attitude to hear so bluntly, especially so close in time to my own graduation, years at the university, and admiration for that world. But in all honesty I never heard it challenged much.

Later, as I started to hear more about how economies function and how revenues really build up and stem from consumers in volume, I came to realize that things that ultimately benefit a lot of people generate a lot of money. I saw an instance of this working at Apple, seeing how much money they had to spend vs biomedical companies that were more conservative with their funds. Consumer electronics and consumer products in general have a lot of customers, benefit a lot of people, and ultimately earn a lot of money.

Academia is much more limited in its scope and immediate benefit. That delay in benefit shapes the money involved in all sorts of surprising ways that aren't immediately apparent while you're still under the wings of the academic world and the "currencies" it operates with, be it notoriety, prizes, grants, etc. Ultimately the results and products of academia are suspect and risky, since they're often still in the unvetted prototype phase of being birthed into existence. The thoughts Elon Musk shares while touring Starbase, about design vs. manufacturing in the gauntlet of tests against reality where the two forms get vetted side by side, come to mind here. His statement that "design is overrated" probably has a close analogue in academia. Products of the mind are essentially untested and may not stand up to reality along whatever dimensions one needs to evaluate them against, or, as is probably more often the case, simply don't scale to the degree needed to impact a large number of people in a short time frame and translate into paying customers.


I dropped graduate research into adversarial algorithms and generative adversarial networks when I realized that instead of being paid beans to do something genuinely interesting, I could get paid six figures to make business software and do whatever I want with my free time. Like so many other promising potential academic software engineers, I had a family to raise and a life to live. No kidding science needs more research software engineers, but that isn't going to change until science can pay software engineers at least a basic decent income. When that changes I'll consider picking up where I left off.


Tbh the more I hear about academia the worse it sounds.

Look, I’m willing to take a massive pay cut, I was starting to come around to the time cost, and hell, I’ll even work in Matlab. But:

> you don't have to work weekends, and you also don't have to feed the mice on a Sunday night

This is just wild, and if anything reminds me of low-skilled, near-minimum-wage jobs.


I heard the NIH might be a decent place for a permanent position other than PI in the bio fields.


What's Nico Stuurman up to these days? https://valelab.ucsf.edu/nico_stuurman/


Ah, the µManager guys! Great software; it sits on top of ImageJ (open-source image analysis).

Our software was a custom C+Win32 app that was ported from CodeWarrior on MacOS 7/8. Windows timers were so crap that I ended up using Ryan Geiss' timer from Milkdrop: http://www.geisswerks.com/ryan/FAQS/timing.html

Yes, that Ryan Geiss, the Winamp one. He now works for Nvidia I think.
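For anyone who hasn't read that timing FAQ: the core trick is to sleep for most of the interval (cheap but imprecise, since OS sleeps are only accurate to a few milliseconds) and then busy-wait the last stretch on a high-resolution counter. A rough Python sketch of the idea (function and parameter names are mine, not from the original Win32 code):

```python
import time

def precise_sleep(seconds, spin_margin=0.002):
    """Wait `seconds` with better accuracy than a bare sleep.

    Hybrid approach: hand most of the interval to the OS scheduler,
    then spin on the high-resolution clock for the last spin_margin.
    """
    deadline = time.perf_counter() + seconds
    # Coarse phase: let the OS do the bulk of the waiting.
    coarse = deadline - time.perf_counter() - spin_margin
    if coarse > 0:
        time.sleep(coarse)
    # Fine phase: busy-wait on the high-resolution counter.
    while time.perf_counter() < deadline:
        pass

start = time.perf_counter()
precise_sleep(0.02)  # one 50 fps frame
elapsed = time.perf_counter() - start
```

The spin phase burns CPU, which is why the margin is kept to a couple of milliseconds; Geiss' original additionally adapts to how badly the OS oversleeps.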

Our machines were bristling with serial ports, driving high speed filter wheels, lasers of every color we could get our hands on, special shutters coated with teflon, fast expensive cameras, and more! Their work is very much in my old field, I was in bioenergetics, specifically mitochondria and their dysfunction.

Thanks for the link down memory lane!


Thank you for your comment, it was a joy to read and get a peek.


As usual, xkcd is relevant: https://xkcd.com/664/


Well, then science needs to do more to create an environment where software engineers would want to be RSEs.

From my experience "science" does not appreciate RSEs, the compensation is bad, the freedom is bad, and you get all of the big company bureaucracy from the University.

It's just a bad environment and to top it off, even if you have a PhD many will see you as "less than."


There is simply no incentive for science to do proper software engineering. It does not directly produce papers or money, so nobody does it. I've seen it several times in my own career in several countries.

PhDs having their own version of the code, incompatible with that of the postdoc sitting opposite them. Each slaving away at their niche project, but nobody there to bring it all together. This particular university had really great code and thought they could sell it. But they did not even use git, and when I gave a presentation there about it, I was met with absolute rejection. They have no idea how much effort it takes to maintain a commercial codebase: unit tests, customer support, etc.

I'm now working at a company that produces simulation software spun out from university. Fantastic job and we do all of those things mentioned above. But obviously my paper output has been near zero even though we sometimes do cutting edge research.

I wish I had a solution to the problem, as so much grant money is wasted producing a paper or two while the corresponding software just vanishes into digital nirvana. It's a real shame.


This isn't completely true. My wife is a postdoc at a medical lab at Stanford, and there's a big drive to push good software practices - containerisation, tests, documentation, etc. - and a good friend who's a postdoc in medical imaging at Oxford runs training courses for their lab's software (along with following all the standard engineering practices you'd see at a company). This will depend on where you're working and whether you're driven to write good software. With the reproducibility issues in science, writing good, clear software is becoming more important.


Of course my statement wasn't meant to be absolute. I come from an engineering background. I am not surprised that in a medical setting the situation is somewhat different; after all, I guess the stakes there are higher. If your CFD code does not perform well, who cares? If your analysis of some medication is crap, the chance that somebody cares is going to be much higher.

Also from experience, it helps if universities have closer ties to industry. As an industrial player you can't work with software that does not deliver reproducible results.


I'll agree that there is increasing emphasis on reproducibility and _usable_ software in academia. Writing documentation, unit tests, etc. is still not really rewarded properly, but at least within the current paradigm such efforts are often rewarded with more users (and therefore citations), which does count. Soon, hopefully, it'll be recognised more directly.

Also, I'm currently a postdoc in medical imaging at UCL, super interested in learning a little more about the group in Oxford you mentioned if you're OK with sharing a link/group name? I may be able to guess but just want to check!


Sounds exactly like industry.


> good software practices

> containerisation

Pick one.


?


RSEs should not be working on, or assisting with, research at the PhD level. By that I mean new/skunkworks stuff. They should be taking the output from a PhD and turning it into core software for the group. That doesn't solve the collaboration issue between PhDs/postdocs, but there is a particular point in a research project's lifecycle where it makes sense to bring in an RSE.

A bigger challenge is that most PIs are not project managers and have no experience as such, so they don't know how to express what they need in a structured way, or how to steer their group to collaborate properly. Outside computer science, many would struggle to budget software development or compute properly on a grant application (and the assessors have zero idea either).


RSEs can and often absolutely should be involved at the PhD level. In my experience, collaboration between the scientist and the engineer across research iterations almost always produces better results. Each has insights the other may not, likely leading to better outcomes for the research, the final product/tool, and the time taken.

The scientist just wants to focus on their research and, once they have a barely working proof of concept, hand it over to the engineer to figure out the rest. The engineer wants a well-specified design and a prototype that they can lightly refactor to clean up, scale up, and turn into a product/tool.

The reality is that this approach makes it way harder for both, though most often for the engineer, as they are generally at the end of the chain in academia and have little power. For example, the code or spec from the scientist is often terrible, so the engineer needs to start from scratch and keep going back to the scientist to spec out the design, since they were not involved at any earlier stage. They may even find edge cases or flaws the scientist had not considered that are fundamentally problematic for turning it into a viable product/tool.

This is why the big corporate/industry research labs often have high-level RSEs who are involved in the research process and get their names on papers (they sometimes have PhDs themselves). They are not optimising for the scientist's time, but for the company's resources.


Yeah let me be clear. PhD students absolutely should get guidance from experienced engineers (so I was a bit over-zealous with "assist" in my parent post). But this should be more like understanding best practices, and they should feel free to ask questions and figure out how to write better code. There are initiatives to do this called Software Carpentry.[0] However, RSEs should not be writing code for students doing PhD level projects in my opinion, for exactly the reasons you mention.

I know some of the big research councils do this in the UK. For example STFC has a program where they'll work with universities and companies to production-ise research code.

> The scientist just wants to focus on their research and once they have a barely working proof of concept, hand it over to the engineer to figure the rest out. The engineer wants a well specified design and prototype that they can lightly refactor to clean up, scale up and turn into a product/tool.

As you say, this is a great idea in principle. In reality I think that it's really difficult to make it work.

[0] https://www.software.ac.uk/programmes-events/carpentries/sof...


I don't think it's about best practices; it's about good design and communication. Even if we are just talking about PhD students, the majority of them are fresh graduates. They are no different from fresh grads at a company. Those grads work with experienced senior software engineers who guide them and provide design advice (not just best practices). Those engineers are often the ones writing the complex/difficult areas of code.

> RSEs should not be writing code for students doing PhD level projects in my opinion

So should a mechanical engineering PhD be designing and making all their own robot parts? Or should the shop engineer help them? The few mechanical engineering PhDs in robotics I know made a few early prototype test parts themselves with help from the shop engineer, but the shop engineer made, and even helped design, most of it, especially the final prototype.

> As you say, this is a great idea in principle. In reality I think that it's really difficult to make it work.

The point I'm making is that it does work, and it's proven to work very well (which is why the major industry labs do it). In my experience it's academia that doesn't like it. Anything that appears to take power/freedom away from scientists and gets in the way of their research is rejected. Though I think the core reason is (as other comments have mentioned) that there is no incentive for academia to make it work. The funny thing is that having an RSE working with them would actually help the scientists in the long run and allow them to focus more on the research, because they wouldn't have to do everything themselves.


> I don't think its about best practices, its about good design and communication.

I would argue these should be included in best practices for software engineering.

> So should a mechanical engineer PhD be designing and making all their own robot parts? Or should the shop engineer help them? The few mechanical engineer PhD's in robotics I know made a few early prototype test parts themselves with help from the shop engineer, but the shop engineer made and even helped design most of it, especially the final prototype.

This is an interesting example. Every mechanical engineer I know has huge respect for their in-house machine shop. Everyone has a story about some design they submitted for fabrication, only to be told by the machinist that it was terrible and they should do it another way. Machining jobs are generally very well defined, though: you have to submit CAD documents, tolerances, etc.

The shops in universities I've worked in have a strong incentive to help people optimise designs because they're the ones doing the manufacturing, and they know what sort of things will work and what won't. But by and large this is informal. Usually this comes in the form of "have you thought about designing this another way, because this is really difficult/expensive/time-consuming to machine". Maybe this is just a cultural thing for machinists?

The PhD question - if your project is to design a new type of part then you should probably do the design. Should you make it though? It depends if the project is specifically looking at fabrication. Otherwise it's normal to dispatch this to a workshop.

In my opinion, it comes down to what your PhD is training you for or what you're hired to do as a postdoc. If your job is data analysis, then I think you should be writing code, but you should be able to get guidance and support. If you're a field biologist with no coding experience and you want to develop an app to take measurements, then that's a case where contracting it out to an in-house development team makes sense. I'm not saying it can't work, but how you make it work is important.

If you incentivize RSEs properly then their time will become expensive, and we need ways of figuring out how to maximise their impact.


Well, I think the best way would be for RSEs to maintain a project, and PhDs should then contribute to that project via pull requests. RSEs can then point them towards proper coding styles, test development, etc.

That would ensure that the contributions of the PhDs do not get lost, and they learn how to properly contribute to a project.
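A concrete version of that flow, sketched as git commands (repo and branch names are made up, and everything is simulated locally rather than through a hosting service; in practice the last step would be opening a pull request for the RSE to review):

```shell
set -e
cd "$(mktemp -d)"

# The RSE maintains the central repo with the group's core code.
git init -q lab-core
cd lab-core
git config user.email phd@example.org
git config user.name "PhD Student"
echo "core analysis code" > analysis.py
git add . && git commit -qm "Initial core code (RSE)"
git branch -M main

# The PhD student works on an isolated feature branch.
git checkout -q -b feature/new-analysis
echo "new routine" >> analysis.py
git commit -qam "Add new analysis routine"

# After review, the RSE merges the branch (a merge commit keeps
# the student's contribution visible in the history).
git checkout -q main
git merge -q --no-ff -m "Merge reviewed PR" feature/new-analysis
```

The `--no-ff` merge is a deliberate choice here: each student contribution stays grouped as one reviewable unit in the history instead of disappearing into a linear stream of commits.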


See my sibling comment on this. I think this is an ideal case for an RSE: if you have a shared codebase that ends up being contributed to by multiple members. That avoids legacy problems where someone contributes, leaves, someone else modifies, none of it is in source control etc. However, this assumes that you have a group that is structured around some common IP or library - and sure, there are lots of places where this applies. This is generally more mature research, not something that a PhD student has just come up with.

There are of course scenarios where someone comes up with some very high impact work, and there's an obvious need to make it robust or user-friendly, spin-out, etc.

It works less well for groups where everyone works on different or loosely related projects. That's not an efficient use of an RSE's time, in my opinion. Though of course you can have a situation where lots of people do random projects using the lab's core code. In both cases, there is a use-case where RSEs embedded in a university can train students on good coding practices.


> They should be taking the output from a PhD and turning it into core software for the group.

And what if the PhD’s output relies on software to even exist?


The challenge is that PhDs are usually not writing software that's designed for production; it's very specific (for a single conference or journal paper) and often the utility of the code is nebulous until the end of the project. So what you don't want is for RSEs to spend ages writing code for a PhD project (which could be done by the student) only to have it thrown in the trash when the student leaves, or when they pivot to a new avenue of research.

I'm saying this as someone who did a PhD and who wrote a lot of code, including refactoring legacy codebases in my group.

Where the utility in having an RSE lies is where the group is all using some shared codebase that gets touched by everyone. This is the sort of cruft that I had to work with: legacy frankencode that generations of students and postdocs had added to. It would have made a ton of sense to pay someone to spend a year optimising it (which is ultimately what I did). But you want to make sure that RSEs maximise their utility to the group. Having them work on individual student projects is not an effective use of their time, IMO.


I'd say it depends a lot on the environment you land in. I was hired to clean up some clean-room software, and boy did it need cleanup. Simply by implementing some best practices and pretty much everything that today goes by "devops" (this was 15 years ago in Switzerland), the thing got so speedy that a company got interested in buying it, so my contract was extended to add the final touches needed to close the sale. I refused the offer to join the buyer, and that was it for me. Ah, and my name landed second on a paper, the only non-PhD on the list. All in all it was an exciting stunt, while the pay was in line with what others here said - 30% under the market average (they were even surprised that I wanted the job).


Is there any solution to be found through analogy with established fields' relationship between engineering and research disciplines? How, for instance, do physicists share formal models which rocket engineers might apply? Certainly the physicists aren't out developing unit tests for engines which apply their theory, but nor are the engineers exasperated by physicists rejecting their standardized tools (I imagine?).


Well, from experience, the interface is conferences and papers. At conferences you get to know the latest stuff that is going on, and the implementation details can be found in papers.

What needs to be said, though, is that reimplementing something from a paper is near impossible. I had to do that a couple of times, and I was only fully successful when the paper was accompanied by some open source code, as there are always tiny little edge cases or initialization details that won't be covered in the paper.


One could argue that bad software engineering practices are rewarded by academia. If no one can follow your work, you drive away competition, so why write comments? If you regularly refactor your work, no one can use your API but you. Not unit testing ensures that the code stays cryptic, and people will have a hard time refuting your claims due to errors. The list goes on and on.


The goal of research scientists is rarely to produce a finished and polished product. Instead, they aim to prove some concept/algorithm/technique/etc.

They are almost always time and resource strapped, so it becomes a near necessity to deprioritize factors like readability and maintainability.

Still, many RSEs could immensely benefit from applying basic best practices.


This last point, and the replies to it, all seem to talk about the research industry. But it has been my experience that this same dynamic of being "less than" exists in any industry that is not primarily software.

I am a mechanical engineer by degree, but a software engineer by 30 years of practice. That has allowed me to straddle the divide three times, working on software that was not just software for software's sake, but rather a value-add addendum/enablement for an institution whose roots were in more real-world devices. I also spent time on the other side of the divide, working twice for companies whose primary product was software.

This “less than” is a very real thing. Software is still a very new thing relatively and it’s taken over the world in just 30 years time. The power dynamics, entrenched in generations haven’t had time to rebalance.

I’ve seen this run the other way as well, where companies that are primarily software look down on the other disciplines that participate: a necessary evil, but resented.

I have yet to find a company/institution where the trifecta of mechanical, electrical, and software is balanced in mutual respect. If one such exists and you are looking for someone who would love to work in that environment, drop me a line. I’m skeptical that there are any. It seems one discipline or another always triumphs and tramples the others down. An off-axis variant of Pournelle’s iron law of bureaucracy.


Just curious, have you sampled many robotics companies? I have worked with dozens and it seems like there are some with well-balanced power dynamics between mechanical-electrical-software.


I have to agree that in some environments you can see comp sci majors picking on anyone who isn’t part of traditional “software”.

I switched from engineering into comp sci at my university and I can safely say that the elitism for software was stoked by the elitism of engineering.

While I was in engineering, everyone bragged about how they were going to get “the iron ring” (a thing we give only to engineers in Canada). When I switched over to comp sci, all my old engineering buddies would say to me, “yeah, but you don’t get the ring”, like it meant everything.

After I graduated, I noticed the comp sci “software” folks didn’t like the engineering folks much (especially new grads). When I asked why, they shared similar stories to mine. The engineers and their pumped-up pride at my local university had hurt their relations with other fields by being arrogant.

I’m not trying to justify this kind of prejudice, but the reasons it happens are fairly obvious to me. It’s sad, because all of us are so similar in our trades and will likely end up doing similar work as well.


I sent you an email too, but my employer REV Robotics definitely fits this description in my opinion, especially in regards to the software team's collaboration with the electrical team.


I believe this has less to do with software in particular and just generally reflects the typical dynamics of the line/staff distinction.

https://en.m.wikipedia.org/wiki/Staff_and_line


The last point is important. In my experience it's mostly about status. RSEs are always seen as inferior to researchers (research scientists) who supposedly come up with the "big ideas" while RSEs merely implement the stuff they're told to do.

In reality the line is much blurrier. There can be no innovation and iteration without implementation of ideas and the RSE work is just as important.

But unless this view changes, nobody wants to be an RSE.


Going by a quick search on researcher pay, and the stereotype of most staff in academia being underpaid across the board, this would mean academic researchers have even fewer incentives/perks/rewards, because now they'd have one less beneficial status differential.

Is there some way to make human respect feel like it's not a zero sum game? The world may never know.


> Is there some way to make human respect feel like it's not a zero sum game?

Respect is non zero sum but status is definitionally a positional good. If you’re number 1 someone else isn’t. There can be uncertainty about status but ambiguity always collapses eventually. Everyone can be treated with respect but there will be a prestige or dominance hierarchy in any group of humans, subtle as it may be.


I actually wouldn't mind being lower status than the scientists, so long as my particular expertise was respected and I had a reasonable degree of autonomy within my domain.

I did tech support at a university for a bit. I certainly wasn't as high status as the professors, but they mostly respected me and the value I provided (especially when you do things like help them recover that next book they were working on, or whatever).


Yes, I think RSE is often viewed as a laboratory assistant.


Not just viewed, but literally called an assistant in the official title.

I am a licensed professional engineer (mechanical) and I work in academia (though my work is varied and does involve some code), my official title (in french) is "Professionnel de recherche", which translates to Research Professional. It is my understanding that in most of the english-speaking world, this position (someone that works for a research lab, who is qualified above the technician level, but not a PI nor a postdoc or student) is called "Research Assistant".

I write "Research Engineer" on my résumé/linkedin, because TBH for most people who don't have much experience in academia, "Research Assistant" sounds like an admin assistant (secretary).


I'd argue the position of software engineers/programmers as those who implement the thinking of the business/science thinkers isn't unique to academia.


Apart from the compensation and freedom issues, you also get micromanaged by someone who might know the science part, but is really bad at the software architecture and engineering part.


> even if you have a PhD many will see you as "less than."

Inside academia and outside of it.

Inside academia you are not on a tenure track or similar, and will have to put up with a lot.

Outside academia, your peers will be making 3x or more what you make, working fewer hours, with less stress, etc.

The reason RSE jobs are hard to fill, and often aren't even opened, is that they don't make sense. If you are good enough for an RSE job, you are good enough for a research position at a FAANG. Those pay 10x more, so you need someone willing to turn down that 10x pay, and also willing to work double the hours.

RSEs making reasonable pay for the skills required makes no sense either, because that would put their pay at 2x that of professors, etc.


I'm curious which company pays an RS 10x more than an SE/RSE. I've found it's usually SEs/RSEs that make a bit more than RSs, but I've never seen an RSE's comp significantly outweigh an SE's unless it's in ML/crypto.

On that note I wish levels.fyi had RS and RSE roles...


This isn't totally crazy, but only if you stretch the facts enough. An RSE at Oxford will earn 32k GBP, like a postdoc. In theory, that person could be so good they could get the highest starting salary possible at a tier-1-paying company - say, a research engineer at Hudson River Trading, which can be > 320k TC. So it's possible, but only for a very, very small number of people. 3x-5x is much more likely.


> highest starting salary possible at a Tier 1 paying company like a research engineer at Hudson River Trading, which can be > 320k TC.

You don't have to be at a "tier 1 paying company" to exceed 320k TC; that's easily achievable at any tech company that has gone public in the NYC metro or Bay Area. TC at a FAANG (MANGA or whatever they're calling it now) for a more senior IC role can easily cross the $500k mark. If you've been there a while and have been accruing shares that have increased dramatically in value, crossing the seven-figure mark is not unheard of.

So you don't have to stretch the facts too much to realistically achieve the 10x. Take two equally qualified engineers graduating with a PhD: one goes to Google and stays there, the other goes to a university and works as an RSE. Fast forward a decade, and you'll definitely be seeing a 10x difference in total comp.


Oh after a decade sure. I was just interviewing with Meta for > $500k TC so totally believe it. I meant immediately after a PhD.


10x more than academia. I’m a PhD in mineral physics turned Big Four MLE via startups and I earn about 10x what I would be on if I had stayed on the academic track.


10x is possible, but maybe 3-5x is more realistic (even for non-FAANG).


I certainly agree. It was about 10 years ago now, but I did a physics degree and ended up pivoting to computer science. I offered to help a couple of friends in the department with their analysis, and ended up writing the majority of the code actually analysing their experimental results. In both cases, their supervisor decided not to put my name on the resulting papers, and treated me as less than a lab tech.

Why would anyone want to do that, when a six-figure tech company salary is just next door?


While academia has a lot of issues, software engineers should also adjust their expectations.

First, if you choose to work for a non-profit, you should expect a huge pay cut. That's the way society works. Or actually, it's the opposite: if you work in a for-profit role, society allows your employer to keep most of the value you create. If you create a lot of value, that gives you the leverage to negotiate much higher compensation than your peers in less profitable roles.

Second, like in any business, you should focus on what brings the money in if you want to advance your career. You should be the PI instead of the lab technician. Instead of building the software other people ask for, you should become an expert in the field and build the software other people will need in the future. For example, bioinformatics as a whole used to be more of a support role, but today there are many high-profile bioinformatics PIs.

Third, many people in industry fail to realize that in academia you don't really work for your employer. You work to build a reputation for yourself, which makes you valuable to the employer. If you want to continue working for other people, the upward career path from support roles goes through administration.


>First, if you choose to work for a non-profit, you should expect a huge pay cut.

It may be the culture in some ecosystems, but it's entirely false. A nonprofit can pay competitively with organizations of its size. I've worked at nonprofits, and I've seen this line toed far too many times as an excuse to lower labor costs for the people producing value for the organization. If your nonprofit is service-oriented in any way and requires people, then appropriately investing in those people through their compensation and work-life balance is an appropriate investment.

In theory, a nonprofit should be able to outperform and outpay a for-profit entity of the same size, largely because there shouldn't be an expectation that the organization cut a big slice out for investors; it can instead invest back into itself. This could mean increased hiring, grabbing top talent, investing back in the cause, and so on.


While some non-profits may be able to pay competitive wages, especially for people whose skills are not in high demand, that's definitely not true for most of them. In general, non-profits don't compete directly with for-profit businesses. They are far more likely to serve niches where for-profit businesses are not viable for one reason or another.

Very often, the level of funding is what it is and the non-profit has very little control over it. If you choose to pay higher salaries, you get less work done. This is common in academia, which is usually funded by taxes and tuition fees.

Sometimes there is even an inverse correlation between funding and salaries. If a charity chooses to pay higher salaries, its "administrative costs" increase. Donations may then dry up, because people consider the charity inefficient.

Many non-profits rely extensively on volunteer labor. Salaried professionals often contribute their time and expertise to a worthy cause for free. That puts paid employees in an awkward position. It's hard to argue that you deserve a higher salary when the market rate for your services is 0.


My apologies for not being familiar with acronyms, but what is a PI?


Principal Investigator, the feudal lord who has near total power over the post docs and total power over the grad students in their lab. They can ruin your career at will and will almost certainly suffer no consequences whatsoever from doing so.


> Third, many people in the industry fail to realize that you don't really work for your employer in the academia. You work to build a reputation for yourself, which makes you valuable to the employer.

That feels like how it works outside of academia as well.


There is a huge difference: universities don't do research, and they don't care about the research their employees do. If you switch jobs, your old employer doesn't hire anyone to continue your research. Instead, you usually take your project with you to your new job.

In the industry, your employer tends to own the projects you work on.


That was my experience as well.

The RSE role doesn't seem to fit into their current model. It doesn't work from a career path, nor from a competitive compensation standpoint.

I suspect that the housing crisis is also going to push more people out of these positions.

If you have a PhD in material science, physics, chemistry, biology, etc... and are reasonably knowledgeable in Python (or similar), perhaps spend a year as a postdoc. After that year, seriously consider moving to a tech company.


Your premise is that it's not being sold right, but maybe you just need more people to give their perspective. I might be considered a research software engineer in the US. In my job, I get paid (well) to develop open source software for data visualization, and I work remotely. I get to maintain existing tools and develop new ones. There is an interesting global community that asks questions daily, and I am constantly learning. I'm not working under the thumb of overbearing people, just delivering results and contributing to new grant goals to continue the funding. These positions are not necessarily common, but are interesting and cool.


I'll offer another positive perspective. I've worked as a software engineer within the Department of Energy's National Laboratory System for 15 years, and I really enjoy it. Software is a major element of much of the laboratory's work, and in some cases such as mine software is the main product. We enjoy autonomy, lead projects as PIs, and develop mostly open source software. We are also hiring https://nrel.wd5.myworkdayjobs.com/NREL/2/refreshFacet/318c8....


Hi FlyingRobot --- I'd be interested in hearing more about your career path (and openings at NREL). What I've personally heard and experienced aligns with the overwhelmingly negative accounts elsewhere in the thread, which (I feel) have also correctly identified the misalignment of incentives that's responsible.

On the other hand, I've never worked in the national lab system. If you're still monitoring this thread, I'd appreciate it if you contacted me at the email in my profile!


I will second this. The national labs are definitely some of the places that know what to do with research software engineers and, for the most part, treat them right. The Computer Systems Engineers and Software Engineers I had the chance to meet at LBNL had a decent amount of autonomy and were very good.


If RSEs are actually valuable but under-compensated, can someone please disrupt this industry?

Would love to see more startup medicine development companies ...

Alternatively, would it make sense to provide RSE services as a company, for very high hourly rates, separately from the rest of the university, so status and rules are less of a problem?


The reason they're under-compensated is that they aren't valued, unfortunately. A lot of scientists are self-taught programmers and don't know what they don't know. The other issue is that if you say you should spend half your department's budget on hiring much better paid software developers, it looks really bad:

a. You're implying your colleagues are incompetent.

b. You can't use that money for "science".

c. It throws into painful relief that the software guys are much better paid than you are, implying that their work is more valuable than yours. But a lot of scientists put up with low pay because they believe their work is really valuable.


There are a few like these:

1. Benchling

2. Enable Medicine

3. Radix.bio


Absolutely, but I will grant them that achieving a good software dev experience is extremely hard inside a uni. There are a number of ways in which they need to match the industry, and they are having a tough time doing so due to the institutionalization of all the adjacent processes.


IME that's all true. Whether they need to do more to create a positive environment for research engineers, though, depends on the perspective.

From the perspective of the university and the PI, they can just get a graduate student to do the work for 1/5 the cost.


> From the perspective of the university and the PI, they can just get a graduate student to do the work for 1/5 the cost.

That is what they do. However, some projects grow in scale where a grad student cannot handle it well, or if they try, it will be a detriment to their research. To give you an example, only one of the PhD students (non-CS engineering) in my group and the ones around me had taken data structures or algorithms - yet everyone's thesis was "computational" (i.e. numerical computation, etc). None of the advisors would appreciate their students going off and taking serious CS courses.

When a research team wants to go to the next level, they need to hire someone with better SW skills.


I've worked at a top research university for the past 15 years as a research software engineer. Many of these comments have some truth but a lot are definitely not universally true.

Poor compensation: Yup, I'll cop to that. I could be making way more in "industry" but I make enough to live and I don't build ad-ware and do take pride in my work.

Being a PhD but little respect: I actually have the opposite experience. I'm not a PhD; I don't even have a Masters. But I can write decent software, and I get respect and recognition from my PIs and other top-tier faculty that we've collaborated with. I feel like I've gotten just as much respect with my humble BS in Comp Sci from a state school as I would have with an MS/PhD.

Micromanagement: I have the total opposite experience. I've worked with the same group for a long time; maybe that changes things. We apply for and receive grants to work on projects. We make high-level assertions for what we will do during the course of the grant, and our PI gives us lots of leeway to meet the goals of the grant in whatever technical way gets results, is FAIR, and open. Granted, I may have just lucked out here.


I also had a good experience as a research software engineer. Tons of autonomy and fun work. Reasonable expectations and no overwork. But it was barely a job, I got paid like a tenth of what I make in industry. Once I saw the kinds of offers friends were getting, I was out.

(My research institute didn’t have a concept of “research software engineer” so I was paid as a generic research assistant.)


I kind of alluded to this, but my experience was slightly similar.

B.S. in Physics (w/ years of significant undergrad research), but worked with PhDs+ in a national lab scenario. At some point everybody assumes you have a PhD.

Little to no micromanagement. It was a more professional environment.

IMO an RSE at a mid-ranked state school, hired by a PI and embedded into a research group, and an RSE at a top-tier research university, at a lab or institute that employs many engineers, will have vastly different experiences and probably pay as well. There are exceptions to the rule, but this is generally my experience.

There's a good chunk of both kinds of jobs out there, but I would not take the former.


Let's be honest. Searching for https://hn.algolia.com/?q=burnout on HN shows so many people suffering from burnout working in industry.

My SO is a research software engineer.

1. As you say, she doesn't build ad-ware and does take pride in her work.

2. Relaxed and convenient job timings.

3. In the EU: close to 31 days paid leave, and professors are usually very generous about going home early, etc.

4. Pension, unions, unlimited contracts.

5. No one gives a shit when github/gitlab/heroku loses credentials on an Easter weekend. (We were on a road trip.) Chill life.


Obviously "research" and "industry" are not monolithic entities. It's possible to have a relaxing and fulfilling and balanced work environment in industry too.


I was also very surprised by the micromanagement comments. I have seen much more the opposite end of things (in particular with PhD students), which is not enough management guidance. I know that none of the academics I know would have time (and interest) to micromanage their PhD students, let alone a software engineer working on some software for the group.


I no longer do so, but I was also an RSE (Senior Faculty Research Assistant) for 13 years with a large university, and loved it. Pay was not great, but just fine all things considered. I respected the PIs I worked for and they respected me. I was constantly learning new, interesting scientific things as I'd work with new and evolving projects. I even got my name on papers, though I hadn't known I could be a co-author.


I also did a couple years at a university as a research software engineer and my experience matches yours.


I was a "research software engineer" for a major research organisation in Australia. There just weren't the resources to achieve what was desired. It would have taken me at least 24 months to modify the existing ecological modelling framework to work in the way they were asking for, they wanted it done yesterday, and there was no money to bring on another engineer to spread the load.

Instead I got given an intern from the internship program. While I tried my best to give them a good experience, let's just say that wasn't the boon to productivity my manager thought it was going to be.

To make matters worse, one of the researchers in the PhD program had been told these changes would be ready shortly before I even started, and the PhD candidate had made a series of choices about their studies relying on those changes being available. Felt bad to have to break the bad news to him after the lab manager had been blowing smoke up his ass for so long.


It's already been said several times over but I'm going to throw it out anyway. I'd love to work at a university as a research software engineer. I find things like bioinformatics fascinating and even interviewed at a research university on the east coast. But, you will get paid SIGNIFICANTLY less than you would doing almost anything else, have zero autonomy, and get no respect. Those 3 things are basically what most people look for in a job.


>you will get paid SIGNIFICANTLY less than you would doing almost anything else

Definitely true.

>have zero autonomy

Might be true? But in the right research group, this is definitely not true. I actually find that I had far more autonomy in academia than in industry. There are far fewer deadlines, less time pressure, and more curiosity-driven projects that are entirely owned by you.

>and get no respect

Might be true, but I find a lot of people in industry feel they get no respect as well. As an academic, I actually found that I respected academics more than I did people in industry. Though this is entirely anecdotal.

I was an academic until my 30s before going to industry, and actually find that the autonomy is the number one reason people like academia. YOU drive the projects because only YOU know the research that deeply. YOU get to be at the forefront, and YOU know things nobody else might ever know, until you get to share them through publications and presentations. Academic projects have essentially no deadline (projects are on multi-year long timescales rather than quarters or months in industry).


> Those 3 things are basically what most people look for in a job.

When you put it that way, it makes it so clear why nobody wants to sign up

(BTW I'm the same way, would love to work on research, especially if it is something which could improve people's lives, but I'd have a hard time switching for these reasons)


> But, you will get paid SIGNIFICANTLY less than you would doing almost anything else

Yes definitely true, a junior software engineer in industry earns about as much as a Professor in academia. How would you justify such a salary at the university?

> have zero autonomy,

I've written that before. I would argue it's the complete opposite: autonomy is the one big advantage of academia. If you are a software engineer you likely have even more autonomy than most others, because your work is not directly linked to conference/grant deadlines etc. Typically you would be given very rough guidelines on what you should try to achieve and then left to your own devices for long periods of time. Most PIs would have neither the time nor the expertise to micromanage a software engineer.

> and get no respect.

Again not my experience at all. I know that in my area people would respect a software engineer. When my partner was working as a research assistant in a medical lab, I wrote a small script to process some huge spreadsheet that they had always processed by hand (work that sometimes took several weeks). They treated me like a god and even wanted to include me on the paper. The thing to remember, though, is that you will not be considered a researcher, so you will likely not get grant funding and your job progression will always be in more technical roles, not on the academic track. I still don't see any difference in respect.
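
A minimal sketch of that kind of script, using only the standard library. The column names and sample data are invented for illustration; the real spreadsheet and its columns are unknown:

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample of the kind of spreadsheet export a lab might
# tally by hand (column names invented for illustration).
RAW = """sample_id,measurement
A1,0.5
A1,0.25
B2,1.5
"""

# Sum measurements per sample -- the whole "several weeks by hand" job.
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(RAW)):
    totals[row["sample_id"]] += float(row["measurement"])

print(sorted(totals.items()))  # [('A1', 0.75), ('B2', 1.5)]
```

A real version would read the exported CSV from disk; the inline string just keeps the sketch self-contained.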


>> and get no respect.

> not my experience at all

Respect could mean different things to different people, but I find the things you described not so much respectful as merely nice.

I have worked in a research lab. While the folks I worked with respected my work, what I would actually consider respectful is being included in the design of the work, not being handed my part of it after they had designed the thing they wanted to work on. And while most of industry is not good in this respect, working in a research lab is worse. Almost all research labs treat SWEs like a cost center[0].

[0]: https://www.kalzumeus.com/2011/10/28/dont-call-yourself-a-pr...


> Respect could mean different things to different people, but I find the things you wrote not to be very respectful, but being nice in my evaluation.

> I have worked in a research lab. While the folks I worked with respected my work, what I would actually consider respectful is including me in the design of the work, not giving me my part of work after they designed the thing they want to work on. And while most job industry is not good in this respect, working in research lab is worse than that. Almost all research labs treats SWE like cost center[0].

I don't quite understand your point. Do you mean that you as an RSE want to be involved in the grant being written? That might be appropriate if the grant is directly on the topic of software, but completely inappropriate if the RSE writes the lab automation software to be used in the grant. So if a researcher asks you to write a GUI for some lab instrument, for example, is that being disrespectful?

Regarding your citation, I also don't get it. The text says that MBAs consider SWEs cost centers, which implies it's worse in industry (there are very few MBAs running research labs). I can also tell you that academics are largely considered cost centers by university admin.


The pay and autonomy will never improve, I can tell you that now. That’s the reality of the current system.


I’m certain the respect will not improve either. There may be steps made to improve it but they won’t stick.


You can do every flavour of bioinformatics in industry as well; in fact, the profession is so undersaturated that it is really easy for a qualified individual to land a nice R&D position in big or small pharma or biotech.


So much to say. I went through the whole academic cycle and moved to industry... about 15 years ago. I can't put it any other way than: the professors and PIs are abusing their employees. It has worked for some time because there were always enough people ultra-motivated to work in science for the prestige and other perks, but now, the economy is such that people can no longer do that (science job + family + mortgage = unhappy).

The right way to think about this is in terms of long-term politics and the success of nations. If you disincentivize national research, your country will fare worse than its competitors. For me, that's the US- if the US doesn't turn around the incentive systems for doing high quality academic research in the national interest, we will eventually be at an economic disadvantage to China.


100% correct. The professors and PIs abuse their employees, and the institution completely supports it. It's the business plan.


I don't think scientists want research software engineers. PIs typically like to assume any and every role on a project, including software design. They don't like handing things off and letting people build modern, best-practices software. You start talking about source code control, package managers, a language used in industry that they've never even heard of and don't care enough to learn about, etc., and you just get asked "why can't you just write it in Python or C because those are the only two things I'm remotely familiar with and zip it up and put it on a USB drive?". Heaven forbid you actually want to do something novel, no matter how enabling it could be. I have been continually inspired by Bret Victor's Seeing Spaces talk and have always wanted to build the system it describes for a lab, but unfortunately, it was really hard to get anyone interested, much less excited, by such an idea. Scientists are far more institutionalized, conservative, and myopic than they admit. Physicists don't even use the proper word for code. They call it "codes". (I've seen it first-hand, in research papers, and in published books. It's endemic.)

I've literally sat in meetings trying to get access to scientific projects I'm interested in, and applied for jobs in similar capacities even at the university I already worked at, and yet it's filled with PIs just wanting grunt labor (i.e., what they're used to from graduate students) to build things that they already understand; or you absolutely never hear back; or the pay is a third of what you already make. And then there are pockets of software people in this niche world that only want to do things their way, because that's all they've ever done, and they can really be quite obstinate. They're like this massive pillar you must somehow work around.

I really, really, really like writing software in scientific contexts, but it's such a quagmire. I'd like to return to that world some day with the lessons I've learned outside of it. Software in support of an interesting domain or context is just really fun. I've also tried getting a job at an architecture firm that had some very neat ideas for building out a tool, but the response was basically the same as described above.


You seem to attribute certain motivations to the PIs based on some misconceptions of the incentives and constraints for doing research. It is true that scientific software engineering is undervalued, and this is one of the many issues in modern science. However, attributing this to PIs wanting to do everything themselves is completely off the mark. One of the biggest issues in science today is that PIs can't do anything themselves; after having been trained as scientists for years, they become glorified managers and administrators.

The reason why they can't and don't want to hire software engineers to build well designed software is because they don't have any money for it and it is very mismatched with their incentives. If a PI gets a grant for a project, they typically get a constrained budget to achieve the scientific goals of the project. There is usually no money (and it would not be approved) to engineer well designed software, because that doesn't produce scientific outcomes.

Now, in the situations where there is budget for this (the super large institutional research initiatives, like CERN), they build incredibly well designed software. The SKA, for example, will have the largest storage cloud in the world.

If one wants better designed scientific software one needs to seriously overhaul the scientific funding process and quite significantly increase funding.


PI here (Research Professor of AI, former industry R&D director, so I have seen both sides). The money will never be the same in academia, and rewards are outside of the control of the PI in most cases (e.g. in Germany, it's a government-level decision and everyone is paid the same), so there are two options: work with average people or find smart & idealistic ones. So...

...please excuse this shameless plug: I'm interested in people who would like to work with me to build research software (both on-premise and AWS cloud) using best practices, in particular using Rust. If you are interested in machine learning, natural language processing and information retrieval and would like to work with me as a research software engineer, please do get in touch (leidner at acm dot org). Our new 2 TB RAM dev research group server running Ubuntu 22.04 LTS is waiting for you (CPU/GPU cluster resources also available) to set up (a VM for) your CI infrastructure. Your software will be used to push forward the frontier of research in machine learning and its applications to language (summarization, question answering) and information retrieval (vertical search engines, learning to rank, unsupervised topic modeling).

Edit: academia can be poorly funded and bureaucratic. The former can be fixed by grant applications, the latter must be tolerated until politicians/activists fix it.


The fact that you included a job posting in your comment might be the best evidence in this thread that it's hard to hire RSEs. Good luck of course :)


I recently quit a job as a software engineer on a research project for all the reasons mentioned. They want grunts who don't care about project management or good engineering practices and they will belittle you as not worth the money even when you took a (theoretical, as I am overemployed) massive paycut to go and work for them. I made 1/3 there compared to the jobs I am interviewing for to replace it.

These people desperately need software help and have worthwhile projects that need it. They also need to learn to collaborate rather than just parceling out obscure goals and being upset when what they get matches their three-sentence email and not the vision in their head.


Yea, it's frustrating because there are indeed some super cool projects just waiting to have interesting software written for them, especially in the more hardware/physical oriented R&D areas. There are some areas that take software seriously, but in general, I have found it really difficult to get scientists and research engineers excited about anything that they don't know about. I don't think it's malice. I think it's part of their education and institutionalization, coupled a little with arrogance and a somewhat blind view towards areas outside their expertise. I have given presentations before, and I usually only got interesting feedback from non-PI level workers. I always tried to paint a picture that we were doing state-of-the-art technology development, so why shouldn't state-of-the-art software be developed to support that?


It's very interesting that you say you find it difficult to get scientists excited about anything, because I am told pretty much the opposite. To give you some perspective: I'm a scientist and my partner is a GP. I obviously have many friends who also work in science, and one thing my partner noticed quite early after we met is that our scientist friends (and I) would ask a lot of questions when she talked about her work, and really tried to understand reasons, mechanisms, etc. She noted that this was very different from almost everyone else.

I can definitely tell you that if I talk about my work non-scientists become quickly uninterested.


In my experience, scientists do not like anything that they perceive as "soft". I like to say that the hard sciences are actually the easy ones and the soft sciences are actually the hard ones. (As it turns out, Herbert Simon also said this.) The point is that when you discuss things like design, complexity, architecture, systems thinking, etc., I've found that this does not stick, so to speak. Scientists do not see beauty in form equaling function. In my experience, scientists like learning about things they consider science-y enough for their tastes.

This is all painting with broad strokes, but in my experience, the most interesting intellectual conversations I've had have been with people with the least amount of degrees. I always feel when I talk to someone with a Ph.D., they're afraid to let it be known someone might know more than them and if that happens, then it's suddenly something that's not important.

The book Disciplined Minds gets at this some. It's part institutionalization and other things. My gut feeling is that it's exacerbated by publish or perish culture and the push to always be right. So people that are brought up through that are intellectually risk averse.


> In my experience, scientists do not like anything that they perceive as "soft". I like to say that the hard sciences are actually the easy ones and the soft sciences are actually the hard ones. (As it turns out, Herbert Simon also said this.) The point is that when you discuss things like design, complexity, architecture, systems thinking, etc., I've found that this does not stick, so to speak. Scientists do not see beauty in form equaling function. In my experience, scientists like learning about things they consider science-y enough for their tastes.

In my experience what you say is true for engineers (and software engineers are amongst the worst in this respect), but much less so for scientists from the hard sciences (physicists, chemists, biologists). You can also see that from many famous physicists being involved in philosophy, while many engineers consider philosophy "irrelevant talking about nothing".

> This is all painting with broad strokes, but in my experience, the most interesting intellectual conversations I've had have been with people with the least amount of degrees. I always feel when I talk to someone with a Ph.D., they're afraid to let it be known someone might know more than them and if that happens, then it's suddenly something that's not important.

Maybe we move in very different circles, but generally, for the people I know, the "higher up the food chain" they are, the more they ask even very basic questions. In my experience you can often distinguish who is the most accomplished scientist in the room by listening for who is asking the most basic questions.

All this is obviously anecdotal evidence, but it is also reflected in famous accounts like Feynman's account of the Challenger commission.


On the flip side of that, many experiments run for decades or longer, and "state of the art" software is often a poor impedance match for that.


State of the art doesn't necessarily mean React. Just something with unit tests and more than one giant 5000 line file.

If the requirement is that it lasts 30 years, you can choose more timeless languages like C, over say Rust.
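
To make that bar concrete: "unit tests" can be as lightweight as the sketch below. The helper function is hypothetical, invented here purely for illustration; the tests are plain pytest-style functions, no framework setup required.

```python
# Hypothetical helper of the kind a small research code might contain:
# chi-squared per degree of freedom for a least-squares fit.
def reduced_chi_squared(residuals, sigmas, n_params):
    dof = len(residuals) - n_params
    if dof <= 0:
        raise ValueError("need more data points than fit parameters")
    return sum((r / s) ** 2 for r, s in zip(residuals, sigmas)) / dof

# A perfect fit gives zero...
def test_perfect_fit():
    assert reduced_chi_squared([0.0, 0.0, 0.0], [1.0, 1.0, 1.0], 1) == 0.0

# ...and an over-parameterized fit is rejected instead of dividing by zero.
def test_overparameterized_fit_rejected():
    try:
        reduced_chi_squared([0.1], [1.0], 2)
    except ValueError:
        pass
    else:
        assert False, "expected ValueError"
```

Running `pytest` on a file like this is the entire workflow, which is a far lower bar than adopting any particular framework or language.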


I guess a better word for that might be "not terrible" :).

And yes, way too much research code reads like somebody's first coding project.


I'm not convinced of that, because what I've seen (several times, on a somewhat small scale perhaps) is someone's code, sometimes written decades ago, never maintained, upgraded, or properly developed, used for years on a system until eventually it gives out, forcing basically a rewrite. And state of the art can mean different things for differently scaled projects.


I've certainly had to resurrect research code written 10+ years before I got to it. It's usually not too hard as long as it's in C/C++/Fortran and doesn't have too many dependencies. Something like Matlab, on the other hand, is often a nightmare...


You have to look at the big picture though. How much time would have been needed to train the PhD student (it might even have been the PI) to write more maintainable code 10 years ago, and to maintain that code to "best practices" over that time, compared to writing it so that it works once and rewriting it once after 10 years?

Moreover, people tend to forget that scientific software evolves much more slowly than commercial software and that far fewer people work on it, so why always adopt the newest method if the code is still working? Show me commercial projects that have had the stability of something like LAPACK or BLAS; those have had an essentially stable API for over 40 years now, with very few bugs.


It's okay for things to start off rough, especially depending on the context. But there's often not much excuse for things to be taken seriously and evolved.

But to be honest, I've seen this in industry as well from software developers. So maybe it's just a law of software development that's amplified in scientific software.


Are you saying that you hold multiple full-time jobs?


> They call it "codes"

They/We have probably been calling it that since before you were alive. I wouldn’t be surprised if it comes from the 60s or 70s.

Every group of people has their own terminology for stuff. There is no right or wrong.


Yeah. Considering how much time we spend developing and using the damn things, we’ll call them the way we like, thank you very much. We don’t really need condescending software types to just come and explain to us how we’re saying it wrong. That is particularly rich, considering the tendency of CS people to cargo cult and borrow physical terms without understanding them.


Always annoying to hear machine learning people speak of "tensors"...


I'm still waiting for std::pseudovector<T>


It aggravates me too, however, considering that the definition of a tensor according to mathematicians is an element of the tensor product of two vector spaces (or whatever other objects you can tensor together), and according to physicists would be an object which transforms like a tensor, I’m somewhat sympathetic. Neither definition sheds any light on what a tensor is to anyone who doesn’t already understand what a tensor is, and I’m convinced that the moment one understands what tensors are they lose the ability to explain what tensors are.
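
For anyone outside both camps, the two definitions being contrasted can at least be written down (standard textbook notation, nothing specific to this thread):

```latex
% Mathematician: a tensor is an element of a tensor product of vector
% spaces, e.g. for vector spaces V and W:
T \in V \otimes W

% Physicist: a (1,1)-tensor is a set of components that, under a change
% of coordinates x \to x', transforms with Jacobian factors:
T'^{\,i}{}_{j} = \frac{\partial x'^{\,i}}{\partial x^{k}}
                 \frac{\partial x^{l}}{\partial x'^{\,j}} \, T^{k}{}_{l}
```

Neither formula, of course, conveys the intuition; that's the parent's point.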


I have an academic mathematics background and I'm perfectly OK with this.


Yes, please stop naming your software projects after elementary particles :).


Sure, there's no absolute right or wrong on terminology, but it is a sign that they're cut off from the software engineering world, which leads to a whole host of dumb practices, described elsewhere in the comments here.

As someone with a mathematics background, whenever I encounter someone in another community that uses mathematics (like economics or accounting or finance) using mathematical terminology in a way that seems silly to me, it's a bad sign.


I have never heard or read anyone, besides physicists, say codes for software code. It doesn't even make sense. Where have you seen that used?


> I have never heard or read anyone, besides physicists, say codes for software code. It doesn't even make sense.

Why not? "A code" is a software package which solves some kind of problem computationally. Plural, codes. I'm sure it's been in use almost as long as programmable computers have. Here's[1] a link to an article from 1968 which uses the term in the singular ("The MENE Neutron Transport Code").

[1] https://www.osti.gov/biblio/4819611-mene-neutron-transport-c...


That article appears to be written by a non-native English speaker. I don't think I've ever heard anyone refer to a software package as "a code."


> I don't think I've ever heard anyone refer to a software package as "a code."

Then you probably haven't been around physicists, or read enough physics-related material. A non-scholarly reference which can't be easily checked online is George Dyson's "Project Orion," where Freeman Dyson (the author's father) is quoted as mentioning this or that "code" as significant for the design of various Orion parts. (That should be a native enough English speaker.) The time frame is late 1950s to early 1960s.


Actually I have been around physicists. In my experience they use “codes” for any amount of code. If A wrote some software, B might say “I’d like to use your codes.”

That reference you give doesn’t appear to be available online but I’ll take your word for it.


There are plenty of words in English that have the same singular and plural form.


I've seen it a lot outside of physics.

Obviously, what word they use for it is arbitrary but it's indicative of an actual problem - they're all totally cut off from the software engineering / CS community. Lots of them think programming is like learning to juggle or ride a bike or something, a skill that anyone just learns and then they're done. Then, obviously, they make major mistakes that invalidate their conclusions and nobody notices or cares because it's all student code, all the time.


It’s used fairly often in the US national lab community (Not just among physicists).

I think the point is that the physicists were among the very early adopters of computing systems. I’m not sure about the history of the term “code” used this way, but it might literally predate compiled software since Ulam and von Neumann invented the Monte Carlo method in 1948 during the Manhattan Project.

Personally I think it just sounds outdated now, but it’s pretty common to hear people call their software “a hydrodynamics code” or “a molecular dynamics code”


Now I do wonder about the history of the term.

The tendency is to call computational science software “codes”, but I’ve never heard a scientist call, ie a web browser, a code.

There’s a different connotation. Like how there’s a different connotation between a program, an application, and an app. They are largely synonymous, but have slightly different uses.


It's pretty widely used in "traditional" engineering, at least in mechanical/civil/nuclear/aerospace. My impression is that it's probably a holdover from the good old days of FORTRAN IV.

See for example Code_Aster (as in, "a code"), which is one of the largest open source FEA projects (and is mostly rooted in academia).


All mechanical engineers. All of them. Even that guy, who corrects himself when you're in the room - as soon as you're gone, they're back to the codes.


I'd encourage you to have a look at a well-known European physics research laboratory's (rhymes with Bern) library and engineering departments (not IT, unless they've started using version control and tests since I worked there). The salary is not as bad as academia, and there's plenty of interesting stuff going on. Most of it is accessible to anyone interested in a way I've not seen outside of academia.


Certainly cool stuff but out of the question due to location. I've worked for or met with people from similar size (at least in ambition) projects. It's a tough world to break into and especially difficult to bring a different outlook to.


I can understand not wanting to move very far, but the location is in many ways great: French alps within driving distance, Jura within walking distance, the lake, old architecture, lots of fields and forests. World-leading public transport. Geneva itself isn't very exciting, to be fair, but you're in the heart of Europe with easy access to all the rest.

As for "breaking in," I dunno. The hiring process seems very fair and international (very few locals work there in a technical capacity AFAICT), and there are plenty of people with flexible ideas. Not everyone, for sure, but then there are so many different groups there you couldn't possibly blanket everyone with that statement.


Have you considered US's national labs if you are in the US?


Hard to get into, have to move to one of five places, usually requires security clearances (huge ordeal), extremely clean lifestyle, if your PI is evil expect zero protection, post doc can be considered entry level qualifications, can be asked to work 7 days a week. Ultimately, usually still pays less than entry level software positions offering full time remote work...


> They call it "codes"

This is not really the core issue here, but yes, I've encountered this many times, and I have to admit that in some professional contexts I use "codes" as a shibboleth to identify people who aren't really in touch with the software engineering world.


I always assumed it came from big international physics experiments where English is most people's second language. People who learn to program in an expat community are going to pick up some jargon that will sound weird to native English speakers in California.


I wonder if this used to be the more common term, and at some point we switched to “code”. Physicists were among the first serious computer users, but their software culture may have bifurcated from ours at some point.


I think you accidentally revealed the problem when you said "best practices": asserting such a thing presumes a total order on expertise, and that introduces conflict. Thus ends the discussion between a PI and someone telling them they know better?


It's pretty easy for me to acquiesce all design decisions to their expertise and/or role in addition to being excited by what they do. I'd just like that to be reciprocated.


See this infamous discussion: https://www.reddit.com/r/MachineLearning/comments/6l2esd/d_w...

Tldr: They view software as just a tool and don't see how they'll benefit from following best practices.


Oh wow, looking at that Python code they linked to brings back memories of my ML course in college. We had skeleton code given to us to fill in for the assignments, and it often took me longer to understand what the given code was even doing, so I'd know what to fill in, than it took to actually write the code. Occasionally there would be random semicolons at the ends of lines, but the author seemed to understand those weren't necessary in Python, since they weren't on most lines, so I guess they were just careless.

They also used a very amusing "OO" pattern where they would pass every input needed into the constructor of an object, then have one method pass all of those instance variables directly into other methods, which ignored the fact that they had a `self` parameter and just used the copies of the instance variables (with the same names) passed in as parameters.

The biggest confusion I ever had was in one method that used the typical `n, d = matrix.dimensions` (I can't remember the exact incantation, but it was used in all the code to store the dimensions in `n` and `d`), immediately followed by `d = d + 1`. I took that to mean that the leftmost column of the matrix was just filled with 1s and wasn't needed (which happened in some of the algorithms), but then, down in the implementation of the algorithm, the only time `d` was used was as `d - 1`. Because I had mentally skipped over the boilerplate at the top of the function, I spent a decent amount of time trying to figure out why they had subtracted 1 from `d` until I finally saw the increment above. To this day, I still struggle to decide whether the author legitimately did not notice this and just used `d - 1` to try to fix a problem whose source they didn't know, or whether they knew and thought it was a reasonable way to implement things. I'm also still conflicted about which would be worse.
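To make that concrete, the pattern described might look roughly like this (a hypothetical reconstruction; the class, names, and shapes are all invented, not taken from the actual assignment code):

```python
import numpy as np

class Learner:
    """Hypothetical sketch of the anti-pattern described above."""

    def __init__(self, X, y, lr):
        # Every input is stored on the instance...
        self.X = X
        self.y = y
        self.lr = lr

    def fit(self):
        # ...then passed straight back in as parameters, shadowing the
        # instance attributes of the same names (self goes unused below).
        return self._fit(self.X, self.y, self.lr)

    def _fit(self, X, y, lr):
        n, d = X.shape       # the usual "store the dimensions" boilerplate
        d = d + 1            # silently bump d (say, for a bias column)...
        w = np.zeros(d - 1)  # ...then undo it again at every use site
        return w

learner = Learner(np.ones((4, 3)), np.zeros(4), 0.1)
print(learner.fit().shape)  # (3,)
```

The `d + 1` in the setup and the `d - 1` at the use site cancel out, which is exactly why the subtraction looks inexplicable to anyone who skimmed past the boilerplate at the top.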


You can write FORTRAN code in many languages


Good craftsmen care for their "just tools"; this is delinquency.


I'm an RSE and the president of the Society of Research Software Engineering (https://society-rse.org), a small grass-roots registered charity in the UK that acts as a home for RSEs in the UK and internationally. We came about from the organisers of the RSE conference (https://rsecon2022.society-rse.org) and from the early pioneers in the naming and recognition of RSEs.

The other comments in this thread indeed point out the issue of poor salary compared with industry; this is a problem that I feel will persist. Historically, the benefit of working at a University over a company has been greater freedom, better benefits and (at least for me) the knowledge that you're working for the public benefit. Unfortunately, the benefits are being reduced as time goes on as Universities come under greater financial stress.

My role as an RSE at the University of Bristol has been in a primarily teaching position for the last few years, where I have been training PhD students, postdocs and research staff (all the way up to and including professors) in both the basics of programming but also testing, version control, profiling etc. My role is funded indirectly by the government as it is recognised by the funding agencies that RSE skills are essential for modern research and they are putting their money where their mouth is by funding more software projects, fellowships and training programmes in this area.

If you think that you'd like to know more about getting involved as an RSE, or if you work as one and would like to meet like-minded people, then you can join the Society as a member, join our Slack channel or our mailing list (https://society-rse.org/join-us/) or find out how you can get involved (https://society-rse.org/community/get-involved/).


So, I was an RSE? Know why I left?

Money. It’s all very well being an RSE but there are two ways of being funded (at least here in the UK). One is centrally funded by the University - universally this means being shoved into the IT hierarchy which is not a good fit. The other is on an ongoing basis by doing 1-2 year contracts. So no job security.

The salaries for both of these are very poor. On leaving I took a 35% pay rise and do basically the same work but in industry. The only difference is that instead of working for engineers in academia, I do work for engineers in companies. Since I jumped, I negotiated another 20% pay rise. If I’d stayed, I’d have got a 3% pay rise. That’s not including the fact I get bonuses, private health insurance, don’t have to pay to park at my own office…

The issue is that the skill set demanded to be an RSE is highly desirable in industry - able to write code well, often mathematical, familiarity with much of devops type work.

On top of that, researchers don’t truly value your contributions. They don’t want to put you on papers that couldn’t have happened without your work. On top of that, they’re hugely demanding about timescales that are unrealistic. I got put onto one project which had contractual commitments to deliver to a major governmental agency to run a live service, and the guy running the project handed me a 5000 line Python 2.7 script that wouldn’t even run and basically said “get it working”. He couldn’t even supply the input files needed to run it for 2 months after that, just ignored emails. I quit before it got any further.

Edit: forgot to add, my boss was a grade higher than me on the national academic salaries scale. He had technical responsibility and also had to line manage 10 people. Even if I could have gotten promoted (impossible - no budget), I didn’t want the line management responsibility. I earn more than the top of that salary band now, and line manage nobody.


Research software engineers need to be more recognized.

In my workplace, I am the rare one who can write C++ to work with drivers for real-time stuff and develop C# GUIs for convenience and automation. Everyone else is happy with Python or Matlab scripts that are only usable by one person and always need to be edited even for the simplest parameter changes. I can do analysis in real time on a camera feed and adjust the experiment as it runs, while others can only analyze an image after it has been saved. The difference in productivity is literally a hundredfold. In most cases the job would have been impossible otherwise, or at least painfully inefficient.

Unfortunately, research leaders cannot tell the difference (or pretend not to), and just label everyone the same. They are not software engineers themselves, and are probably even insecure about it. I feel my opportunities and potential are not being recognized either. I also feel that they don't respect that software development needs more time, as they will dump as many tasks on me as on the non-software engineers.

I have to eventually leave such groups as I don't see their willingness to step up in their coding skills despite being heavily reliant on my output. Draconian intellectual ownership rules don't help either.


I’d been active-ish in US-RSE and worked at a national lab for 10+ years. I left about a half year ago for a position at a FANG-ish company.

My pay was almost doubled (after counting bonus) and with RSUs it’s double+.

I could maybe have squeezed another $XXk out of the national lab system, had I stayed.

The biggest reasons I left were as follows:

1. Too much legacy responsibility. Legacy responsibility is usually underfunded or unfunded but “important”. It’s a lot of work to keep the lights on. NASA usually handles this with contractors which are usually paid even less than a National Lab person.

2. Bifurcation at the lab toward very large projects or very small projects. Not a lot of stuff in the $250MM to $750MM range, which is large enough to have an effective team and go full time, but small enough not to be anonymous/matrixed out to subprojects. The high end is a very funny place.

3. Related to (2): not enough new blood or turnover. This may be a good thing, but you can be hired as a junior engineer and end up as a senior engineer without anybody below you for years. Getting an intern is a dark art if you're not a staff scientist.

4. Institutional BS. Looking at you, security drones with professional certificates running qualys/nessus scans. and you too, data center people afraid of the cloud.

5. Some to zero agility in terms of compute - everything is a nail and Slurm is your hammer.

The university-based RSEs complain a lot about not being respected/publishing. I never really had that issue much specifically, but that may be because there’s a stronger understanding of value in the national lab system. On the other hand, poor leadership/project management means you may still be at the whim of an idiot, servicing their desire for some crappy web app project. In those cases, it’s best to ignore.

I will probably go back after some time when I’m throughly annoyed by BigCorp. In the mean time, it’s fine on the other side and at least I don’t have to patch Jenkins every two weeks.


For this to work academic science would have to start paying significantly more, which many labs simply cannot do, and other labs will refuse to do.

My PI flat out said when we were trying to hire a python developer that they simply could not be paid 2x more than anyone else because it would cause major issues with team cohesion. 2x is the bare minimum to even get close to market rate for a full time developer who won't dance out the door a year later.

By this logic, either everyone in science needs to get a massive raise (an end to the PhD slave labor?), or RSEs really have to justify their salary, and when everyone else works 60-80 hours a week for half the pay and is also de facto on call, that justification looks like zero work-life balance.

Talk to a funder near you, let them know that congress needs to provide more money for basic research. Be sure to sell it as training future employees for big tech too since they will all eventually leave after you finish teaching them.


let them know that congress needs to provide more money for basic research

That won't work. Academia isn't under-funded, especially not in the USA. The NSF budget last year was $7.7 billion, so the money is there.

The reason salaries are so low is that academia will generally choose to expand the empire, by creating new projects and hiring more postdocs, over raising salaries of the existing team. And they can, because lots of people have stars in their eyes about science. They see it as more virtuous than mere product building and are willing to put up with a lot for the associated prestige. Also, they may have picked their vocation early in life, when their understanding of real market rates was poor, and then gotten stuck in it.


Yes, I don't understand the complaints about the salary a software engineer would be paid in academia. A software engineer straight out of undergrad typically earns as much as (and often more than) a full professor at the university, who is >20 years past their undergrad. Should they pay the software engineer more than everyone else, even though they don't even work on the core mission (i.e. the science)? What is even more ironic is that this comes from the same crowd who complain about high taxes; research is primarily paid for out of tax money (private foundations/donations are a minuscule part of overall research funding), so the money to pay the software engineers would need to come from taxes.


The university doesn't operate in a vacuum, and the rest of the economy doesn't really care about the fairness of internal pay structures at the universities. The engineers aren't telling you it's fair that they get paid more than a professor; they're just relaying the reality of the market, and the university still gets its labor from the market.

This does play well into my argument that if software salaries in general came down then the rest of the world could utilize software development more broadly across industries, rather than concentrating so much talent into one inflated industry niche (the tech industry is that niche).


> Should they pay the software engineer more than everyone else, even though they don't even work on the core mission (i.e. the science)?

They do not have to pay the software engineer anything if they can find one who is happy with getting nothing.

Salary is always going to be a factor in who they can attract, especially when their competitors can offer much better compensation. If they are satisfied with the software engineers they are currently getting, then they can keep doing what they are doing now.

What the professor gets paid matters very little compared to what the candidate can get at a competitor.


But that's my point: the software developer is not a crucial role in the research process (unlike, arguably, most professors). They are nice to have and definitely helpful, but getting a grant to do research is difficult enough, and paying all the funds to a software developer means no research gets done. In other words, in most cases a software developer does not add enough value to justify their cost if they are paid industry rates.

I was not complaining about developers being too expensive, I was responding to people who said that they would like to work as a RSE but salaries are too low and explaining why the salaries are so low.


If the developers don't play a crucial role, then why did I just read an article claiming we need more of them?


I'm a R&D scientist, working in industry. My work site has a full blown software department. Yet I do all of my own coding. Pay and status are not issues -- if I had a programmer working for me, they'd get paid according to industry standards.

Some other issues include:

1. Knowing that software development can be a black hole. Nearly half a century after The Mythical Man Month, management of software projects remains an unsolved problem. And we're not gifted managers to begin with. When I do it myself, if nothing else, I have a pretty good sense of when it will be done.

2. What to do if there's not enough programming work for a full time programmer, and that person doesn't want to work on other things.

3. It seems like programmers with domain skills related to R&D, such as math, are in particular demand.

4. It takes a certain temperament to hang in an R&D setting. A lot of engineers just hate it: The rapidly shifting requirements, and the knowledge that something they make will only be used once, or even wrecked.

The dirty secret is that the same programming work that's a step down for a commercial developer, might be a step up for a grad student or researcher who wants to learn a marketable skill. Also, a multi-disciplinary team, including people who can program as needed but also do other things, can be quite agile.

My team has a software engineer, but he doesn't do very much coding for us. He has actual R&D projects of his own. Yet he can help us at a higher level, for instance giving guidance on how we can write better code, and make better use of tools. This brings us a lot of benefit without wasting his brain cells on mundane coding tasks.


Would love to work at a University as a Research Engineer. Would not love to do it for less than I made as an undergrad out of college. Universities may get away with not paying market, but they pay more like 40-50% of market.


The first thing I did for my PhD in experimental quantum computing was to rewrite the lab control equipment in Python. It was written in LabView and while it worked well it was an absolute mess of (literally) tangled spaghetti code. I designed and implemented one of the first MVC frameworks for quantum computing in Python. It had an interactive IDE where you could execute scripts and store snippets (e.g. for qubit calibration), an instrument manager that allowed initializing and managing all lab equipment (locally and over the network), a data manager that allowed capturing and exploring measurement data in real-time and instrument frontpanels that allowed interacting with instruments (e.g. microwave sources or waveform generators).

Initially my supervisors were very skeptical and thought I'd waste my time, as they did not see any value in software development. Having a programmable and well organized software framework was absolutely instrumental for my work though, and I think I couldn't have succeeded at my experiments (demonstrating quantum speed-up for a simple quantum algorithm) without it. Today the software is still in use in several labs and as far as I know they also hired a research software engineer to keep developing it.


This sounds like a good outcome. From my experience, the usual reason small lab setups use things like LabVIEW is that they are built by one PhD student without experience, and this is the only way they can get something working in a reasonable time. But then the student moves on, and the result is something that may work but that nobody can build upon.

Still, the situation you describe is unstable, because as soon as the budget starts to look uncertain, that engineer is likely the first to go. I really wish institutions would take this problem seriously and set up e.g. a department-wide development group, which could help build and maintain stuff like that and ensure continuity. Unfortunately this seems to run against the current trend of cutting costs. For example, at the university I used to be at, they cut the small common electronics shop, which was a great resource since the guys there knew all about who had what equipment, kept spare parts from broken stuff, etc. But in the budget it looked like pure cost, I suppose. It's hard to measure the value of common resources, and nobody wants to pay.


I think the software spread to multiple labs and was even used in a commercial startup, so I think it will live on. Also, Python skills seem to be increasingly taught at universities, so many PhDs have had some exposure to the language already.


The principal incentive of science is publish or perish. Novelty trumps replication of the previous results (see replication crisis). Use of the citation metrics to evaluate performance in science is similar to relying on likes in the social media to measure importance. The number of citations matter more than the effort (cost) of producing the result. This environment favors those who can deliver something that's good enough, and then move on (and publish often).

Software engineering is a continuous effort. Maintenance often requires more time and resources than the original development. It's more like curating a library or a database. Their product is perceived to stay the same. Incremental improvements do not advance the career of the maintainer at all. Researchers who sink their time into software development mostly hurt their academic careers. Even if they publish a paper about ResearchTool 1.0.0, they won't be able to publish another one about ResearchTool 1.0.1 and another one about ResearchTool 1.0.2. Meanwhile, their peers will probably publish "A", "B", and "C".

Science needs a career path which does not depend on the number of publications.


I can only speak for how it appears in my country (Australia), but it seems like university academic staffing models in general are broken. I have 15+ years experience in service providers in Cybersecurity and SE roles. I have 2 Masters degrees with GPA7/HD averages, and love to teach. There is nothing I'd love more than to teach university level topics with a primary teaching focus, and research as a secondary. It's not about the money - they can pay me less than the Australian average wage and I'll be happy.

Over the last 2 years I've interviewed at multiple universities for Level A (Associate Lecturer) and they all had similar requirements that I've summarized below:

> You must have completed a PhD.
> You must propose and perform original research, assist upper-level academics with their research, and publish X times per year.
> You must participate in fundraising, complete your own grant applications, and assist upper-level academics with their fundraising & applications.
> You must supervise and advise X postgraduate students' theses/dissertations.
> You must teach both undergraduate- and graduate-level classes across 3 trimesters.
> The position is fixed-term for 12 months, rather than ongoing/permanent, with no job security, and you are required to reapply each year.

Most current academics I know, from Level A to Level E, say they don't particularly want to teach, and just wish they could focus on their research and the fundraising to support it. At networking and alumni events I met other applicants who, like myself, would prefer a teaching focus rather than a research focus. There's a supply of "specialists", but an insistence on "generalists". As the fixed-term aspect tends to apply to research fellows as well, the risk from the lack of security combined with the extremely low pay makes it very difficult to enter, and remain in, academia. I can afford to live on $65k a year, but the possibility that every year I may face periods of unemployment and a requirement to relocate thousands of km away makes it a challenging prospect.


It comes down to teaching not bringing in money: enrolments (either domestic via the government, or paying international students) and grants bring in money (those requirements are basically "can bring in grants/other funding"). Even with ongoing positions, things like COVID (or even a change of focus) can cause positions to end, and the ability to bring in grants plays a key role in who stays and who goes. Any substantial change to the model is a matter for government, not the unis.


The other side of "We need software engineers in research" is surely the botched covid model from Imperial College way back at the beginning of the pandemic. The group that produced it had the audacity to complain that the software industry didn't make C++ foolproof enough, and it was our fault that the modelling was wrong.

This shows the attitude that some folks in these places have - what you do is easy, and rather than your industry being exceptional as a profession that makes its tools open and shares knowledge openly, you are at fault for releasing things which I can get wrong.

"Scientists" in this view are smart people, and anything you lesser mortals do should be obvious to them immediately, or it must be your fault. Training and experience be damned.

I'd hate to work anywhere where that attitude was prevalent, or even really existed at all. And yes I realise that non-software folks often hate to work with us because of the exact same thing!


I skimmed that code at the start of the pandemic and it was pretty tragic. E.g., how accurate could the death figures be when there were no nursing homes in their models?


Having worked in both, software is much easier than producing quality scientific discoveries.


OK great, but being good at one doesn’t mean you’re automatically good at the other, or that it’s some ‘lesser’ field’s fault when you screw it up.

In this case, due apparently to arrogance, neither was produced.


Any reference for that incident, i havent heard of it and am curious.


Here's the open letter criticising our entire profession for producing something as terrible as C++ and allowing scientists to get into trouble with it -

http://blog.khinsen.net/posts/2020/05/18/an-open-letter-to-s...

There are links back to other parts of the story in the text there. Effectively, they built a model that predicted some real worst-case stuff, and used it as the basis for advice they sent out to try to affect government policy in a bunch of places. When the code was finally released and examined, it was in a bad state: 15k lines in one source file, race conditions, single-name globals reused all over the place. The results were trash. Then the blame started to fly.



Science needs more "technicians" of all types, and less PhD students. Good luck making that transformation though.


SOX for the head of research. There are minimal consequences for producing poor-quality software that creates erroneous results, and often quite a few incentives for producing inaccurate results that can be blamed on a non-human entity (the software). Even SOX with an escape clause for "good faith effort" would be a game changer, since what goes on now would never be considered a good-faith effort by any court. SOX hasn't been anywhere near as successful at modifying behavior as it should have been, but it has still caused a good bit of change. Those at the top, who set the priorities and incentives for those below them, would now know there's a possibility of some hefty consequences for reckless or negligent behavior.


> Less PhD students

PhD students are the backbone of any research project. They interact with postdocs and with undergrads in the research ecosystem. If anything, more of them will be required in the years to come, not fewer.


It depends on the amount of technical debt in the project.

When there's no technical debt, sure, grad students are the backbone. They can happily go about their projects and supervise undergrads, while the postdocs supervise them and apply for jobs.

In a place where there's a lot of technical debt, a postdoc is going to be far more productive than a grad student. In some experiments many grad students spend years floundering around before they learn enough to cut through the cruft and do real research. It's pretty silly to fire all the postdocs just when they've managed to understand everything and replace them with new grad students.

The tricky thing is keeping enough of the technical staff around, and coming up with a system where they are rewarded for eliminating debt, rather than penalized for making their own hard-earned skills obsolete.


Could you elaborate on what that would look like?


I think they’re describing professional grad students.


I wonder whether this just creates another "underclass" of scientific laborers... I think we need tool-builders as academics. The only people academics recognize intellectually are peers. The rise of RSEs is a kind of intellectual outsourcing... implementation of your science should be part and parcel of its creation, not deferred to second-class non-academic "technicians." The friends and former students I've talked to who've entered RSE careers have largely been treated as second-class citizens in research environments they are integral to!

I think, like many things, the buck stops with academia itself, its metrics, demands, and incentives. We need more research engineering/science about science academics within compute-intensive science departments themselves. Things like JoSS [1] or Scientific Data [2] are awesome first steps at addressing this.

[1]: https://joss.theoj.org/

[2]: https://www.nature.com/sdata/


The pay gap is real, and a huge reason why perfectly good software folk (and scientists, data specialists, etc.) move out of academia and into industry.

I have an industry job with good pay with a side of academic work stemming from my PhD. My industry income is literally more than 2x the salary of my academic PI, who holds a senior academic post. My pay, which is probably around the median of most HN readers, is slightly more than the top peg of the academic pay scale in my area - i.e. University senior leadership team. So I do the academic work because I love doing the academic work. I cannot afford to take a full-time academic job, although it's been offered.

I am baffled as to why academic institutions aren't failing to recruit. If they were businesses offering half the going rate, they'd get third-rate engineers, if they had any applications at all. Instead, there seems to be a long queue for academic tenure - and even for non-tenured positions. The only competitive advantage I can see is the sense of academic 'freedom' that these positions confer.

But - it's not freedom. My academic colleagues spend hours each week chasing funding, filling in grant requests, attending meetings. Their teaching load is allocated to them without choice. They're in the office 9-5 same as everyone else. Academic freedom is anything but.

I am lucky that for the research I do, mostly behind a laptop, my industry pay lets me afford the things I need to do it - cloud services, international travel, software licenses, article processing fees. I can 'do science' the way I want to without begging anyone for cash or going to grants committees. And I'm hoping this brand of 'citizen science' becomes the norm, because University-led research is going to get harder and harder the more the gap between industry and academic pay widens.


> I am baffled as to why academic institutions aren't failing to recruit

Because academics (hi!) aren't in it for the money. If you value making lots of money, or having work-life balance, then academia isn't for you (nor is social work, teaching, nursing, fine art, etc.). But this probably makes it hard to hire RSEs who aren't academics masquerading as software engineers...


I agree, but there has to be a balance between doing it for love and getting a living wage. One of my colleagues, an Associate Professor, works in Halfords (shop/garage chain in the UK) on the weekends. That isn't right. And many of these folk are incredibly intelligent. They are Ricks amongst Mortys. They don't deserve to be paid not-quite-enough to afford a house and kids.


> What do you envision for the profession over the next 10 years?

> I want to see RSEs as equals in the academic environment. Software runs through the entire research process, but professors tend to get most of the recognition and prestige. Pieces of software can have just as much impact as certain research papers, some of them much more so. If RSEs can get the recognition and rewards that they deserve, then the career path will be that much more visible and attractive.

As a former RSE I think this is unrealistic. Academia just isn't structured to value non-professors (it's far from just RSEs that are undervalued.) For me it wasn't so much the money but the refusal to sponsor permanent residence. When you see your amazing colleagues leave due to dumb departmental policies you realize you need to plan your own exit. I miss working in a team with a mix of scientists and engineers but I don't miss working for a university.


Responding to title only, and a little rant:

No, scientists need to stop hand-waving away everything that's not their area as "trivial and not worth talking about", because yeah, I've heard that exact line from three separate individuals. When I shrugged it off and still produced tools for them that shaved a literal 1-2 hours off their day, giving them more time to do the actually creative things... they were so overjoyed they forgot to say thanks and immediately proceeded to act as if the tools had always been there. They were extremely careful not to show any attribution or even basic human gratitude. Would it kill them?

Yeah, frak that crap.

A lot of scientists become severely tunnel-visioned, arrogant, conservative, downright disrespectful, and very hard to work with. Pile on the extremely subpar pay and you really have to be a bored college kid (or an adult with a huge financial safety net and a passion for the area) to engage with those people in a professional setting at all.

I won't go into details about why I think many are like that because frankly, even if I were 100% correct (and my analysis is not at all flattering), it still doesn't matter one bit. Lasting change comes only from within. Scientists should start holding each other to higher behavioral and collegial standards and expelling those who don't comply. Peer pressure works, but many are so independent and without any oversight that they gradually fall under the illusion that their insufferable quirks are a personality, or a personal touch that helps them excel in their area (and it's often the case that they don't excel in it).

Looking down on people because they don't know your extremely niche physics research -- that has zero experiments attached to it, by the way -- is not helping your cause with people who usually get a lot of money AND respect AND a better working environment for actually improving your workday. In most cases these people will of course go where they are at least not looked down upon.


> A lot of scientists become severely tunnel-visioned, arrogant, conservative, downright disrespectful, and very hard to work with.

Tbf I can replace scientist with developer and it holds just as true.


And you would be correct (and I am saying this as a programmer). Truth is, it's extremely easy for us the Homo Sapiens to become very set in our ways. :|

I'm 42, and I've had 28-year-old colleagues who already acted like they belonged to a council of elders. Made me super sad... they are way too young for that!


I haven't seen a single competitive job posting from a research institution in 10 years of software development, despite having the desire and acumen to seek them out. I don't expect that to change. If I care enough about a thorny research problem, I'll start my own company, do my own research, and maybe hire some academics as bargain-barrel contributors to do grunt analytics work... the last thing I'll do is join a research mill as a cost-center tech hire.

"Most RSEs have a PhD"... talk about selecting aggressively for people with no experience building functional software. Hiring only PhDs to write research software smacks of hiring only white males who can drink and play golf to function as business executives. You're selecting for ability to posture, not deliver value.


Getting a PhD (ie, passing an apprenticeship in how to do research in an academic setting) is a lot more strongly related to being a RSE than playing golf is to running a business.


Having a PhD sounds like an ANTI-qualification for this kind of position. A PhD would be a diversion from the kind of practical experience that would make a software developer with a sci/tech background & scientific interests useful in an environment full of smart people who possibly can't code worth a damn.

More important than having one's own PhD laser-focus would be an ability to communicate with a variety of people doing their own research, and understand their domains just well enough to write/fix/improve their damn software and maybe even publish it (semi-productised) as open source too.

A couple of application languages, some shell skills, maybe some data wrangling, and some software development common sense borne of experience.


Possibly, but at least some experience in academia is terribly helpful, speaking from experience. There are a lot of particularities in funding, organisation, terminology, special stacks, ... You can certainly learn them "on the job", but if you already know them, even better.


I found this to be true. I was totally blown away by just how different a world academia was, and how differently it operated. Software as a cost centre was a very unique experience on its own, and software practices melded with institutional processes were genuinely shocking to me. Being able to take that stuff on the chin and still want to work there is probably not something you'll get from people who've been enjoying a cruisey experience in industry.


Just wondering then, how much of that kind of knowledge is portable among academic research environments, and how much is peculiar to every individual environment? Is there something that is dying to be written up for people entering the field?


Different aspects will be differently portable (e.g. funding will be region-specific, modulo field-specific schemes).

Having a PhD (in a relevant field) gives you domain knowledge as well as knowing where the pitfalls are, and a PhD in general will mean you're at least familiar with what questions to ask/who to ask those questions.


You need to be very good, knowledgeable, always willing to learn and research, and hard-working, and you make 50% of what a guy who did a 3-month bootcamp makes working on a Rails CRUD app.

Where do we sign up?


I work in an academic research software development team. We develop software that helps researchers execute neuroscientific models on HPC. Funding is our biggest headache. Currently we're funded through a (very) large EU grant, which gave our project a runway, but we find that nobody in general is funding such supportive (but essential) projects. Research funding is tied to scientific outputs (i.e. papers), and not anything else. We're trying to get some researchers interested in getting funding together, but it's not going very well, and we don't expect to be able to keep our current strength.

If anyone has ideas, I'm all ears!


Computer Engineering PhD here. I’m seriously considering transitioning out of my program because I’ve reached a point where I’m just writing tons of code on a graduate student stipend and quite frankly, it seems doors are opening to me which I had previously thought would only be open with the PhD.

I feel like my experience would be very different if I had one full-time research engineer who was paid a respectable sum to just... help me write code. I like working with undergraduate students, but I can't expect them to commit to the task like I would, or like a full-time employee would. They just don't have the time.

I’m doing my work extending / improving open-source CPU simulators and it’s just such a huge amount of code. I feel like my work would be so much more useful to others if I just had another pair of hands to work on docs, do ops work, go back and forth on decisions. My advisor doesn’t have the time for such things.

- - -

On the topic of academic pay: as a PhD student in Boston, I get paid $2000 a month pre-tax. Studios and 1BR apartments renting for under $1500/mo in a 30-minute commute radius of my campus are basically non-existent, so roommates and/or long commutes are a must. I’m realizing now why so few US citizens participate in graduate CS/CE programs: if you aren’t looking for a path to citizenship, it just doesn’t seem like a great way to spend your mid-to-late 20s.


This inadvertently details why the immigration path to citizenship is almost modern-day labor exploitation. Don't get me started on how the labor certification process for the green card furthers that, on a timeline measured in decades.


Thank you, this is great information


Research software engineers should be PIs themselves so they can choose what they want to work on, who to work with, and hire trainees to mentor the next generation of research software engineers.

RSEs are a different breed from SEs from industry, the work is not about scaling to a billion people but being nimble and adaptive to new results and ideas.

If one wants to see change, then hire senior PI positions who do research software engineering. This will help create a hub and prevent the feeling of being a “pet” in an experimental lab.


I think it's also worth clarifying that there is more than one style of RSE. The name came about to describe those researchers/postdocs working in research groups whose contribution was more to the software than to the writing of research papers. For them, the incentive structure wasn't there, as they were being measured on papers written, not "research improved". Giving them the name RSE recognises that they are playing a different role in the group and perhaps need different metrics. At this time, they are still in research groups, experts in their field, having likely done a PhD in the particular domain. These are still RSEs and there are lots of them around (probably more than there are in central groups).

The recent trend is for universities to provide central RSE groups (either by core-funding or being costed onto grants in parts) which operate more like consultants. This allows a route in for people who do not have a PhD or a background in a particular field, and can support many smaller projects across the university.

For example at Bristol, we have a combination. We have a loose federation of "embedded" RSEs and some "RSE group" RSEs (funded through various grants) who work together to learn from each other, teach around the uni and provide a support network. This group was able to be formed due to a fellowship from the government back in 2015 (the same fellowship that allowed Paul to start his group at Sheffield).

It's important to remember that anyone who is doing "research software engineering" is an RSE, even if it's not their job title.


Just wondering what you think about the progression routes for RSEs?

A big problem as I see it is that someone not in the first wave of RSEs (who are now mostly in leadership positions across the various research-intensive universities' groups) basically gets stuck with no appreciable career growth. You can probably get to a permanent job in the 42-50k salary band as a "Senior RSE", a title which seems to be used even for people without that much experience to try and bump up the salary, but beyond that it would be very difficult to jump to the band above without staying in the same role for ~5+ years and hoping that someone left. At that point you're obviously at the top of the automatic increment band and don't get a meaningful pay rise for potentially a number of years. I thought it was very unlikely that there would be more jobs at that level, because when bidding for time on grants, academics don't want to pay someone external to their team more than they pay their own postdocs, and I heard comments to that effect from academics themselves. For centrally funded positions it's even harder, having seen my old group lead battling to try and get even 32-40k positions approved.

So in my mind, there’s a big problem where someone ambitious could job hop into an easier and better paid job in industry. Why wait a really long time to get promoted when you could leave and get paid more than the top of the next salary band anyway?


The issue with money is down to the university overheads.

An RSE is paid like a standard researcher (with none of the benefits, recognition, etc. as detailed in other comments), which means that the university will take, at minimum, 50% of the grant money for that person.

This is for building work, maintenance, general staff, etc etc. But I need a laptop, and I work from home, most of the time the IT staff just get in my way and slow my computer down with the crap they install.

If the universities didn't take that cut, we could easily pay the RSEs more. There's no job security, very little recognition, but every single subject requires good code these days.

The real issue here is that without decent coders our science will fall behind. Any idiot can write bad code, and smart people can write exceptionally bad code; we've all been there, not writing simple, readable code because it made us feel clever, until we had to read it a year later. Academics are smart people who believe they can teach themselves to code, and they can, but the result is generally not good software.

There are also the ancillary benefits of just having someone around who can do things like rename your hundreds of files by writing a script in 5 minutes rather than you taking two days to do it (this actually happened).
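A minimal sketch of that kind of five-minute rename script (the file names and date prefix are hypothetical, and a temp directory stands in for the researcher's data folder):

```python
import tempfile
from pathlib import Path

# Sandbox directory standing in for the researcher's data folder.
data_dir = Path(tempfile.mkdtemp())
for name in ["trial01.csv", "trial02.csv"]:
    (data_dir / name).touch()

# Prefix every .csv with a date stamp, e.g. "trial01.csv" -> "2022-06-01_trial01.csv".
for f in sorted(data_dir.glob("*.csv")):
    f.rename(f.with_name(f"2022-06-01_{f.name}"))

renamed = sorted(p.name for p in data_dir.iterdir())
print(renamed)  # ['2022-06-01_trial01.csv', '2022-06-01_trial02.csv']
```

The point being: for a person with basic shell or Python skills, "rename hundreds of files" is a loop, not a two-day chore.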

For the sake of western science this should be addressed.


Where do you find these jobs? And are they only available at Universities? I'd be happy to take the pay cut to work on something that contributes more to society than getting people to click on ads, but not at a college or university.


We (the Society of Research Software Engineering) maintain a jobs board at https://society-rse.org/careers/vacancies/. Or if you are in the US, they have their own (https://us-rse.org/jobs/)


In the UK, https://www.jobs.ac.uk/search/?keywords=RSE&location= gets you five hits today, and I expect you would find more similar posts that aren't currently branded "research software engineer" if you looked; the term is still quite new for many academic depts. One thing you'll need is patience if you want one of these positions. I remember being offered an academic post three months after interviewing for the position, and when I queried how on earth it could have taken so long to offer the post, I was met with incredulity that there was an issue at all.


You might be able to find research institutes or national labs (I'm a research engineer at one of these). If you're in Europe, something like the Max Planck Institute might fit the bill.

Aside, can I ask why you don't want to work at a college or university?


This hit pretty close to home for me, as I started as a programming-oriented PhD and then transitioned into a full SV software engineer over 10 years. I am exactly the case they mentioned: "Many RSEs started out as PhD students or postdocs who worked on software to support their own project. They realized that they enjoyed that part of the job more than the actual research."

I think the advice in the interview is pretty sound. OSS is a great way to contribute and become in demand.

But the main issue here is that there are not enough RSE's. "If RSEs can get the recognition and rewards that they deserve, then the career path will be that much more visible and attractive."

Here are my two cents:

1) Academia will never compete in salary with big tech (except maybe in the big Ivy League schools). Universities should promote spin offs, help with patents, run regular hackathons and an incubator program. That will give engineers an incentive to at least start their career in an academic lab.

2) Some of the best science happens in startups and org x labs, not just academia. But even there I found very few companies that value an RSE. You're either an RS or an SE, and the extra R or E is in title only. They will see you as a researcher who knows a bit of Python, or as an engineer who wasted years in graduate school. You will be interviewed as one or the other. You will be paid as one or the other. I've never seen a company appreciate the value of fully grokking or contributing to the researchers' algorithms and Jupyter notebooks and then architecting an efficient implementation at scale. You will also be kept in your swim lane. Researcher? Can't touch the production systems. Engineer? Good luck getting a patent application going. If there is a hiring manager out there reading this... please consider making a true RSE interview and role.


I had the title "Research Software Engineer" well over 10 years ago, and worked as a research software engineer in another role prior to that with a random title.

Contrary to many of the posts I see here, it was a great job, except for the below-market pay. I felt like I was contributing to something valuable. It beat working on yet another boring commerce website. I had interesting coworkers and worked with interesting technology.

I didn't experience any of the politics around academia. I felt respected by everyone. I don't have a PhD. I don't even have an undergrad degree. I dropped out of high school, got a GED, did a couple years of undergrad, got hired by the university, and never finished. No one ever treated me like I was second-class.

The only reason I'm not still working as a RSE is because I took a 50% pay cut when I took the job and was lured away by money later.

I wonder how many of the negative commenters here have any actual experience as or with RSEs.


I work as a research engineer in Italy. My path was academia (Physics)->industry (Finance)->mix. The "mix" where I am now is a strange case: it is located inside a university, nearly everybody has a PhD and we also have PhD students, but we are not academia; it is a private non-profit company doing applied research. I am getting the same money I was getting in industry, but also 7+ weeks of vacation per year, more autonomy and less stress. Plus, I am not working to make rich people richer.

Most of my job is about optimizing numeric code, lots of numpy, pandas, numba but also thinking of new algorithms. I get respect and they put my name in the research papers they publish, even if I did not write a word in the paper and have no knowledge at all of the underlying science ;-) Such companies are rare but they do exist. Much better than academia because there is no bureaucracy. I got in because of a friend of a friend...
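As a rough illustration of that flavor of optimization work (hypothetical data, not this commenter's actual code): the common pattern is replacing a row-by-row loop a scientist wrote first with a single vectorized pandas operation.

```python
import pandas as pd

# Hypothetical measurements: several signal readings per sample.
df = pd.DataFrame({
    "sample": ["a", "a", "b", "b", "b"],
    "signal": [1.0, 3.0, 2.0, 4.0, 6.0],
})

# Loopy first draft: one filtered scan over the whole frame per sample value.
means_loop = {s: df.loc[df["sample"] == s, "signal"].mean()
              for s in df["sample"].unique()}

# Vectorized version: one grouped pass over the data.
means_vec = df.groupby("sample")["signal"].mean().to_dict()

assert means_loop == means_vec  # {'a': 2.0, 'b': 4.0}
```

Both give identical answers; on frames with millions of rows the grouped version is typically orders of magnitude faster.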


Sounds heavenly, are they hiring?


I used to work as a research software engineer in life sciences and I absolutely loved it. Jobs at universities allowed me to learn and experiment. However, universities cannot compete with industry when it comes to salaries.

For example, in Germany where I now live, all public universities (and almost all prestigious schools that do high-profile research are public) have strict salary rules. You can find tables (like this one [1]) according to which your pay is determined. It depends on the job, Land (state), degree, and years worked. While this kind of money allows a comfortable life (or at least did before inflation hit hard), industry offers at least 30% more (based on my perception).

[1] https://www.jobs-beim-staat.de/tarif/tv-l_e13


The easy fix is to move software devs into the administration department and pay them administrator salaries.


Software engineer looks at some scientists' code and speeds it up 14000X:

http://james.hiebert.name/blog/work/2015/09/14/CS-FTW.html
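Setting aside the linked post's actual fix, a toy sketch of the kind of algorithmic win involved: replacing a linear scan with binary search over sorted data (here via numpy's `searchsorted`; the array and queries are illustrative only).

```python
import numpy as np

# A large sorted array of "observation times" and a few lookups into it.
times = np.arange(1_000_000, dtype=np.int64)
queries = np.array([10, 500_000, 999_999])

# Naive approach: a full O(n) scan per query, the kind of loop
# that quietly dominates runtime in research scripts.
naive = [int(np.nonzero(times == q)[0][0]) for q in queries]

# Binary search: O(log n) per query, valid because the array is sorted.
fast = np.searchsorted(times, queries).tolist()

assert naive == fast  # identical answers, drastically less work per lookup
```

A CS-101 complexity observation like this, applied to a hot loop, is where those multi-thousand-fold speedups tend to come from.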


If you need them then pay them.


Pay them and treat them well and with respect. Money can only hold someone for so long if they're treated poorly and like an inferior.


I was a research engineer for ~10 years, and my experience was that my immediate colleagues were very appreciative of the work I was doing, but that the university bureaucracy system is not. In the beginning it is a very decent position (compared to a PhD); less load and better pay, and lots of autonomy. However, it is a dead end for your career, there is no promotion path. I liked working as a RSE a lot, but indeed moving to the industry will give you a huge salary rise and a career path.


Most industries outside of the actual tech industry treat software engineers as low level technicians, pay and promote them accordingly, and then complain they can’t find anyone competent. Academia is that plus some of the worst internal culture you’ve ever seen. The work can be fascinating but unless you’ve already got fuck you money and just want something to do, stay away.


I am a research software engineer, and I worked short-term contracts only, and my last contract was not prolonged due to budget cuts (with a laughable salary about four times less compared to industry — thankfully, I wasn’t there for money). The administrative personnel, on the other hand, were all working on permanent contracts.

I guess science does not need more research software engineers after all!


I'm a Research Software Engineer with a PhD working in the Harvard/MIT area for > 5 years. I skipped doing a postdoc and have been part of several high-profile academic projects in genomics. Overall I've loved the experience, and it can be a great move for certain people.

A major pain point in RSE is building and sustaining larger software teams around a single, focused long-term project with ~6-8 RSEs working together. Even the most well-funded and successful academic labs have a limit to how much RSE effort they can support.

We need new funding models where engineering-focused leaders can muster sufficient resources (autonomously) to build great software teams that maintain high quality software in the long run. Focused Research Organizations (FROs) seem like an interesting new idea (https://www.dayoneproject.org/ideas/focused-research-organiz...).


I spent a few years as a RSE in the mid-2010's. Supporting people doing actual science is a rewarding job.

The worst parts of that job are the same for everybody in research who isn't a PI pulling in lots of grant money: low pay, terrible or non-existent benefits, bureaucracy run by non-science people, making do with older equipment.

There is also a breaking-in period where you have to prove you are not a moron. Since you come from a different educational area, and most likely aren't as educated, this can take some time and effort.

The best part of the job: getting to work with highly educated people, getting to see science be discovered, being able to actually plan and execute projects because everything isn't on fire, regular business hours with no after hours support calls.


I think it is funny that Perl saved the human genome project (https://bioperl.org/articles/How_Perl_saved_human_genome.htm...). I wrote some Perl for a science grad student and it got me published in Genome Research:

    Matthew J. Lambert, Wayne O. Cochran, Kyle G. Olsen, Cynthia D. Cooper,
      "Evidence for widespread subfunctionalization of splice forms in vertebrate genomes,"
      Genome Research. 2015 May; 25(5): 624–632.


I guess I fit this category, because I'm working on a project that's based off a scientific paper and is basically a Matlab script turned webapp so that all those calculations don't have to be done locally.

The pay is indeed around 60-70% of what the industry offers, but I would have the same rate if was making a CRUD app[0] for the same large company, so it's more about the employer than the type of work.

[0] which is something I did in the past - a webapp generating documents ensuring regulatory compliance. This organisation likes to make webapps.


(Very) few research groups can fund a commercial grade software engineer. Unless they’re intending to sell the software, it might not even be sensible to do so - like hiring another nmr specialist or whatever. These would be considered departmental support staff and groups could bid for their services. Chem departments used to have staff modellers for this, but the $$ in industry sucked them all away.

Coding somebody else’s problem lacks status in academia, and the money is poor. Why would people do this as a career?


Please do not hate me or hit me; I say this with an utter interest in the field.

I worked as a research engineer for a few years. They are really needed, but unfortunately this will be a waste of time, effort and knowledge. The real problem with software development in science is that it is driven by the scientists, who are (mostly) short-sighted in their software development planning/scheduling; they do not appreciate software debt.

And given their budgets and their focus in publishing, I understand their decisions.


Interesting topic. I recently finished a role that was essentially 50% RSE, but working at a university-adjacent non-profit. I found the academic work highly interesting and enjoyable, but most of the tools / applications I built were based on personal experience / perceived gaps and then adopted, rather than built at the request of a researcher. For the most part I found the tech resources available to researchers ad hoc at best, but generally non-existent or otherwise out of reach.


Most academic code is crap, so there is that. The coworkers will be very bad programmers, making it difficult to advance (you should always work with people better than you). Computational papers are not often appreciated. Too much talk and too many pivots, too little code, and too little pleasant code-writing. Unless you have a lot of ego, you will be ignored.

I am sure a lot of academic code would be written by the open source community, but alas, researchers are very stingy about giving out their data.


No, they need to be willing to give a fair wage and benefits.

I looked into scientific programming, and I offered to work for everyone who would give me an actual position (not an assigned seat with the perpetual threat of being fired if a string of soft-money grants ran out), and I got zero takers.

The science field is full of performative narcissists who think if they're trying to "cure cancer" or whatever, the lessons from psychology, behavioral economics etc simply do not apply to them.


I think this doesn't frame the problem correctly. It promotes a separation between software engineers and real scientists which helps nobody. Anyone building software for scientific purposes really should understand the science behind it. Also, since better software allows better scientific outcomes, scientists need to understand software engineering. Both should be considered scientists and be on an equal footing.


They should probably pay them the going wage for software engineers too. I used to work in science; the salaries are often half of what entry-level SWE positions pay, and require PhDs/postdocs...

People wonder why r&d in the us is in a slump... Well it's mostly because choosing a natural science for a profession is punishment compared to software.


As someone who works as one: I would recommend NOT being a research software engineer if you value your health and haven't obtained financial freedom. Cost of living is always increasing, and the lower pay isn't worth it. The work is more stressful as well, because proper software development isn't respected. Previously worked at FANG.


Need a software engineer? Hire me, then. My last employer was charging $150 an hour for my time, and I'd say most clients felt it was a fair deal, or a good one. I quit to do other important things with my brief time on earth, and now that I'm looking for a software dev role again, it seems I'm hardly worth a third of that.


I would actually love to contribute to scientific research. I work in industry, which is lucrative and impactful but not that satisfying, and I work on pet projects which are satisfying but not that impactful. I would happily contribute some of my time to scientific projects which are legitimately impactful and also challenging.


What areas of science are you interested in, and what languages/skillsets do you have?

Maybe I can steer you towards some Github repos


I am interested in biology and especially neuroscience, and I have a pretty broad skill set. I have a background in graphics/GPGPU programming and enjoy building simulations.


Maybe look into Molecular dynamics?

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6209097/

Not an expert but this looks like the most popular repo:

https://github.com/gromacs/gromacs
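For a taste of what MD codes do under the hood, here is a toy velocity Verlet integrator, the basic integration scheme behind packages like GROMACS, applied to a 1-D harmonic oscillator (vastly simplified and purely illustrative):

```python
# Velocity Verlet: the standard symplectic integrator used in MD codes.
def velocity_verlet(x, v, force, dt, m=1.0, steps=1000):
    f = force(x)
    for _ in range(steps):
        x = x + v * dt + 0.5 * (f / m) * dt ** 2   # position update
        f_new = force(x)                            # force at new position
        v = v + 0.5 * (f + f_new) / m * dt          # velocity update
        f = f_new
    return x, v

k = 1.0                                             # spring constant
x, v = velocity_verlet(1.0, 0.0, lambda x: -k * x, dt=0.01)

# A symplectic integrator approximately conserves energy over long runs;
# the initial energy here is 0.5 * k * x0^2 = 0.5.
energy = 0.5 * v ** 2 + 0.5 * k * x ** 2
assert abs(energy - 0.5) < 1e-3
```

Real MD engines add 3-D particles, pair forces, neighbor lists, thermostats, and heavy parallelism, but the integration loop is recognizably this.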

Good luck!


I started working at a top medical center 12 years ago. They didn't have a title for me, so I picked one from the menu of current titles. A few years later, I asked to change it; two years later, I got one. I feel like my title needs to be changed again.


“I want to see RSEs as equals in the academic environment“

Not just in academia, but also in biotech/pharma, where MDs or MBAs are too often the top dog.

This largely explains why pharma and healthcare are so far behind when it comes to tech.

(Exceptions are the new, small biotechs. But the larger ones are more like dinosaurs)


When the research wins a Nobel Prize, is the software engineer one of the winners, or just the scientists in charge? My understanding is that it's the scientists.

If Science wants more research software engineers then they should start treating them like peers rather than hired help.


In a world of ever-increasing cost of living, sane people will always take the best compensation offer.

Everyone thinks the same way: we'll only be here 40-70 years, and our finite time is worth more and more.

I'm sorry for the people who don't have the option to do better, for whatever reason.


I wish there were a platform connecting research in need of some coding with volunteers. It probably wouldn't work for large-scale projects, but for smaller problems it might be worth a shot, perhaps by utilizing the Stack Overflow platform?


Research software engineer here.

This is probably not going to happen unless my employer decides to pay something competitive with industry. I'm here because I want to be. There is an opportunity cost here and not many people are willing to pay it.


There is (slowly) emerging a concept of a "tour of duty" from industry into dot-gov work (like UK digital service).

Is there room for something similar in academia - to bolster the profile of the RSE idea, to get some support funding etc?


The majority of frontend developers don't even appreciate a simple scientific fact: browser performance gets worse as JavaScript bundle size increases. We also need software engineers to appreciate science more.


I would happily stop everything that I'm doing to become an RSE if it paid well enough. I really like the field and what it is, but if it doesn't pay the bills then it doesn't pay the bills.


Pay money.


Well, it's a bit difficult when these positions require a degree.

It's a problem with research institutes that they consider themselves academics and look down on people with extensive work experience.


Like looks after like. No matter how good you are, how much value you bring, if you don't fall under the employer's NAICS code, expect the waterboy treatment.


Sure, pay me the same as I'm paid now, with the same benefits, and I will be happy to do research. Your choice.


They will have more than enough once (or if) tech valuations melt down, and software engineer salaries with them.


Then they'll have to pay market rates. But that in turn might cause the scientific staff to revolt.


Software Engineers need to be paid.


Science can pay me more.


I worked for 20 years at Dartmouth College as a "Senior Programmer" in the CS department, later reclassified as a "Senior Research Programmer". I came into the job with a BA in Physics and 15 years of industry experience (jamming systems for fighter-aircraft EW systems, lead software QA programmer, consultant).

While the pay was very low, the work was interesting, and towards the end I worked from home a lot. One difficulty was that my position was grant funded, so when a grant ran out, so did my salary. The profs tried to keep me on several grants so I was always funded, but at one point my primary boss went on sabbatical and I had to scramble to find someone to work for. This often meant I was working on several very different projects at the same time (computer security, privacy, robotics, wireless sensors, network monitoring, wearable health sensors, etc.), sometimes as many as 5 or 6 at once, though usually 2 or 3.

At one point I made an effort to collect information and build a proposal to create a pool of software+hardware engineers, similar to the already existing pool of programmers but more oriented to embedded devices, research code (the programmer pool mostly did statistics), wireless systems, drivers, etc. The College considered it, but decided not to go through with it because it would create a new cost center and they thought it would cost them money (benefits, and what to do with the engineers between grants? Put them on unemployment?). So unless a PI writes an engineer into a grant (difficult to do except for the larger $1,000,000 grants, because you can't hire 1/3 of a person and get them to move to rural New Hampshire), there is basically no skilled engineering support for research projects at Dartmouth (except in the medical school).

Most such work is assigned to students, who of course don't have a lot of experience and can't do some things (like write low-level firmware for a Bluetooth MCU; CS students are not taught embedded systems coding, though they can take a course in the engineering school if they want to, and taking on a project that uses 500,000 lines of existing custom code and hacking it to do something no one has ever done before is not something a student can jump into for a semester). So some types of research just don't get done. The work can also be somewhat frustrating for an engineer because you never finish a project; you just get it working well enough to get a paper out of it, and then it often ends. Even the longer-term projects don't reach completion; at best there is a patent (I have my name on several as co-inventor; no one has ever licensed them).

There were very few other people like me on campus. I knew all of them, and there were usually 3 to 5 at most. A year before I retired we hired a person to replace me; they stayed for a year, then quit. It is very difficult to find people willing to move to a rural area, for lower pay, to do very complex, unusual programming that requires a wide range of expertise (from financial reports, to reviewing papers, to designing hardware, to all kinds of software programming, collaborating with remote research groups, field experiments, travel to conferences, data analysis, cross-disciplinary work in medicine, sociology, and psychology, being comfortable in an international community, and more). I think there are a lot of people who would enjoy this kind of work, but few willing to put up with the low pay and pressures (it's like being in a tiny startup that never grows and is always scrambling as it continually switches projects). There was no path for advancement either (unless you call getting a PhD and leaving an advancement).

Pay raises were always fixed at about 2% per year (which actually meant losing pay, since inflation incrementally reduced your salary until the next raise), or 3% at most in a good year. No bonuses. The benefits were good and the work was interesting, but from what I've seen it's difficult to find this kind of work at any research university, and it is extremely challenging in multiple dimensions. I think it likely this lack of engineers is /part/ of what has made much of US research short-term-focused on incremental changes and resulted in a lack of radical invention. Maybe it's different at some engineering schools?


In the main FAANG labs this is already the norm: Brain, Facebook Research, MSR, and AWS. Researchers work with SWEs to deploy models, and they hire hybrids.



