

I noticed I was seeing way less information about shows in my area, the only thing I used Facebook for. I missed nearly two months of shows at my local venues due to relying on Facebook before I realized something had changed. Maybe they increased how much you have to pay to advertise or to increase reach? No idea!

Looks like I'll be writing my own scraper, fun times are here.


American fixed-rate mortgages are locked in for the life of the loan. If you get an ARM (adjustable-rate mortgage), this is not the case. A fixed-rate mortgage is a hedge against rising rents and rising interest rates. New landlords buy in at higher rates/higher prices, and rents will increase as a lagging factor.

Yes, everything that goes wrong with the house is now your responsibility. Yes, you will pay property taxes. Landlords are not in the business of renting at a loss in cities just yet. Occupancy rates are still high.



In god we trust, all others pay cash.

Take the money, leave the cannoli.


This looks great, and I would have loved to see this when I was in the lab.

When the software industry says to you: "We will nearly triple your salary, you don't have to work weekends, and you also don't have to feed the mice on a Sunday night."

You will 100% take this deal.

I was a 10yr+ academic tool maker in biochemistry; I built cutting-edge microscopes, hardware, and image analysis software. My lab was successful in our field. I got some papers out of the deal. I also saw things in a microscope that no human had seen before. I worked with very interesting people from around the world. The work in academia is great. You're moving the needle: new data, new failures. These are the perks. It is also highly possible that you have complete creative control of your project. I did, and it was amazing. Custom-designed UIs to streamline our experiments, derived from watching students use the system to do their work. A decathlon of software design.

Some reality: Your PI and organization will never compensate you the way the software industry will, in pay, expectations, or benefits. When you're over 30 and you still don't technically have a real 401k, are still paying the student loans you needed to get into this field, and are still in the same shitty apartment, something has to give.

Comparison is the thief of joy, and when you see your cohort of computer science graduates your joy will be stolen :). It's good, honest work. A short tour of duty would be useful, and can teach you the difference between splitting the atom and splitting the check.

Academia, at least in bioscience, is still very much an ivory tower. You don't have enough letters after your name to matter, and you will likely be a pet instead of a peer.

Don't stay underwater for too long. Life is short. :D


Since this thread is turning into yet another complaints-about-academia thread:

One of the serious downsides of working in academia is that you are basically doing the industry's work for them for less pay, and they will one day turn around, pat you on the back, then sell your work for millions of dollars. Honestly, it gets worse the closer you are to applied fields. There, you already straddle the line between what your more "pure" (and less well paid) peers think is "science" and actually making things that will in fact make people's lives better, so you have less room to be idealistic about why you are doing what you're doing, that is, whether it is for "moving the needle" or "adding to the corpus of humanity's knowledge," or whether you really are just doing someone else's work for them because they aren't willing to fund it given the risks. And given that the latter is basically closer to what you're doing, and you're closer to the place where you'll see your work enable someone else's riches, it's hard not to want to jump ship and just become one of those people on the other side and make money hand over fist.

It's an upsetting situation honestly.


I think this is only a half-truth. There certainly are examples of academic research being translated into lucrative products by industry (there are even prominent examples in software/systems engineering), but I think that many times the translation of academic research into a usable product is also a massive endeavor that deserves recognition in its own right.

I see this scenario described in medical research all the time, with people saying that industry just leeches off of academic research; what they conveniently leave out is the vast amount of money and research that goes into translating research into a real drug (billions spent on clinical trials to meet regulation, millions to billions spent on scaling manufacturing and synthesis of the drug to industrial volumes, drug delivery like pill design or injection methods).

Additionally, many industries also have well-paid research positions that "move the needle" on science and basic research. While they're more targeted at producing and supporting products than at full liberty to explore just for the sake of knowledge, it's not a complete black-and-white split between poorly compensated academic research and industry.


Fundamentally: the patents produced by taxpayer-financed academic research have no business being exclusively licensed to some pharmaceutical corporation. As for the cost of clinical trials being borne by those companies, well, let's get the FDA involved in the clinical trials.

Then the competition can come in, i.e. whoever can produce pure preparations of those drugs at the lowest cost will win the most market share. This means investing in top-of-the-line manufacturing platforms (much of this is now outsourced to India, Mexico, etc. for drugs sold in the USA) instead of squatting on the patents, blocking competition, and using monopoly status to jack up prices.

Yes, this would greatly reduce the profit margins and perhaps the stock prices of Big Pharma outfits, but the overall benefits would greatly outweigh this. As a practical example, look at how the best Covid vaccines (mRNA types) have been monopolized, leading to low rates of vaccination in Africa etc., even though that technology was developed with taxpayer funding at public universities.


>manufacturing platforms (much of this is now outsourced to India, Mexico, etc. for drugs being sold in the USA)

This completely trivializes and misses the fact that the manufacturing process itself can be patented.

>As a practical example, look how the best Covid vaccines (mRNA types) have been monopolized, leading to low rates of vaccination in Africa etc., even though that was technology developed with taxpayer funding at public universities.

That's just patently false. Moderna, for example, has waived enforcement of its patents related to covid vaccines. The reason the developing world does not have high rates of vaccination is not patents but primarily infrastructure.


No, if a Uni has developed some patents and wants to 'exclusively license' them to a Pharma, that's probably a good application of those patents; they become much less worthwhile otherwise.

It's a misunderstanding of the market to suggest that somehow 'the FDA will lead the trials'. This is about as likely as a manned mission to Venus: it won't happen, and it shouldn't happen, for good reason (the cost vastly outweighs the benefits).

It's also a misunderstanding to suggest 'whoever can produce pure preparations of those drugs at the lowest cost will win the most market share'. The 'cost of manufacture' in most cases is not a material or relevant issue.

Your example of 'COVID' monopolization is completely upside down - companies didn't maximize their profit potential there, and may not have even developed such vaccines under normal circumstances; they were giving very special prices to places like 'Africa' - and none of this has anything to do with 'low uptake' in Africa.

Africa has 'low uptake' for the very same set of reasons they don't have electricity, or consistent electricity in many places.


"companies didn't maximize their profit potential there" as said pharma companies enjoy the highest profits they ever have... The argument "well I didn't kill you" when you struck my face isn't a valid claim to mercy. I'm not one to go full pinko here, I'm just pointing out the obvious logical flaw.


Non-exclusive licensing is the far better option. This prevents monopolization and ridiculous price increases. As for clinical trials run by exclusive license holders, those have a rather poor record of producing reliable results over the long term in many cases (Vioxx of course, there are many others). The trials should really be independently run, not controlled by the very corporations that have a vested interest in seeing positive results so they can go to market.

As for Covid-19 vaccines, there are actually many companies ready to go right to production if those patents are released to the public at this moment, which would greatly increase supply and benefit the whole world, instead of a handful of pharma CEOs and affiliated shareholders.


- 'Non-exclusive' is a non-starter for most companies, there just won't be a license.

- Some type of independent trials might be possible, but there already is a lot of oversight. That's complicated.

- 'Releasing the patents'. I'm sure everyone in the world could release all of their patents for everything, and things would be good for about 2 years, but we'd likely never see another drug produced again, ever.


>As far as Covid-19 vaccines, there are actually many companies ready to go right to production if those patents are released to the public at this moment, and that would greatly increase supply, and that would benefit the whole world, instead of a handful of pharma CEOs and affiliated shareholders.

That's just a conspiracy theory. Many companies have not been enforcing any patents related to covid vaccines. The reason the developing world does not have high vaccine adoption is infrastructure, not patent blocking.


It's a valid issue and dismissing it as 'conspiracy' only weakens your argument. For example:

https://www.theguardian.com/world/2022/may/03/covid-vaccine-...


It really is not. Nowhere in that article does it even hint at patents being a primary limiting factor for the availability of vaccines in developing countries.

Of course not all companies are going to release their patents, but the fact that several already have means it's not the main problem. mRNA vaccines require extreme temperature controls during transportation, and that incurs much more of a prohibitive cost in those countries than any amount of patent royalties would, even if no patents had been waived.

There are cases of huge shipments of vaccine donations going to developing countries that then go unused or underutilized because they do not have the resources to transport and distribute them effectively. Look at the COVAX initiative. The countries that failed to get vaccine rates up even with huge donations of vaccines lacked the infrastructure to distribute them, whether it was the temperature controls or not having enough syringes.

"in Benin, only 267 shots were being given each day, a pace so slow that 110,000 of the program’s AstraZeneca doses expired...The vaccine pileup illustrates one of the most serious but largely unrecognized problems facing the immunization program as it tries to recover from months of missteps and disappointments: difficulty getting doses from airport tarmacs into people’s arms."[1]

[1] https://www.nytimes.com/2021/08/02/world/europe/covax-covid-...


Less than 1% of 'research' ends up being commercially viable in any way.

Almost zero research is commercialized directly, in a manner that equates tech to 'product'.

There are usually enormous costs in applying research to markets - just because something 'makes a million' doesn't mean there were no costs.

As for software:

We probably need cleaner, simpler tools, better SaaS for many things.

We just can't afford to have a lot of devs doing research.

Think about the zillions in lost man-hours due to Python weirdness of various kinds. It's a giant productivity sink.

Also, I hope tooling for many researchers starts to improve.

I think the target should be, in most cases, that researchers themselves have the tools available to 'do their work' without having to hire devs.


The next thought should be: why doesn't neo-liberal capitalism fix this problem? And: is my characterization of the problem correct? Why not start a new firm that better compensates researchers (and tool makers) for their valuable work? It seems like big tech (especially Google, and perhaps Microsoft) comes in from the commercial side and invests in R&D at reasonable rates for just this purpose! But surely if workers are systematically undercompensated, there is room for a disruptive firm to come in and take the best talent and still make a profit.

Perhaps the characterization is wrong and the EV (expected value) of this work is far lower than you think (this seems likely), and/or there are externalities like regulation, or the leverage of prestige that traditional orgs (e.g. universities and publishers) wield, that warp the profit incentive. Or (and this is my cynical view) pure science was always best left to the hobbyists. Historically, the most important discoveries have come only rarely, and to those who loved doing science in their free time or, more rarely, when a talented individual found a patron. Building a science factory and hiring science factory workers not only sounds distasteful, but it doesn't seem to work very well. (The exceptions being those very capital-intensive projects like the LHC which require a large pool of professional scientists and engineers to do the experiment.)


"If it always worked, it would be business. Let's go to the pub." -- Me, consoling a grad student after experiment failure #24.

More seriously, if you're in basic science, your skills are valuable in transforming the work into a more useful thing to be used later. Using your science factory model, you have created a reusable widget that other people can use. The science factory model does work; you can see its results in things like MIAME: https://www.nature.com/articles/ng1201-365 where large pooled datasets are used to get insights otherwise impossible.

There's not a ton of low-hanging fruit in some fields; as time has gone on, the edges are harder and more expensive to reach at the cutting edge. Ex: you spend $2M on a microscope that does a cool thing, and two years later the new model is all that, a bag of chips, and a soda for the low price of $750k. You hope you have a good enough relationship with the vendor that they will either mod or upgrade your system, or that those two years were enough for you to get ahead. It probably wasn't. And you now have a not-as-fast Ferrari that cost more than the fast Ferrari.

There is a massive glut of international students willing to work for basically nothing, beholden to your PI by their visas. I say this not out of xenophobia, but I was the only working-class American (my parents do not have degrees) in the department. All students/postdocs that I worked with were from other countries, or if they were American, their families were doctors, or a faculty member. More generally, the kind of people that might own horses :D.

No firm would take this work on, as the profits are not clear, and the time scales for success range from two years to never. In this case success is "great job publishing, we'll give your lab another 2-3y of funding." After which, you better get good at writing books and eating pasta.


I would also say, and I'm surprised this needs to be said in a community that is so connected to the Open Source and startup cultures, that just because something is valuable doesn't mean it's possible to make a business out of it.

Imagine research into a technique for getting better blood pressure readings from people who are so nervous around medical settings that their blood pressure spikes (or more basic research into the mechanisms of blood pressure and anxiety). This is a valuable thing to society (more accurate data informing treatment decisions for individuals, screening for physically demanding jobs, life insurance, forecasting medical spending for Medicare and the like), but it's not worth a lot to anyone in particular.

For the field you described originally, complex imaging devices, there are only so many users of that research so it's conceivable that work could be taken up by a corporate R&D department.

There are all kinds of other very useful research topics that are very valuable to humanity as a whole but it's not clear exactly who should pay for it (I'm not saying you aren't aware of this BTW, hopefully I'm adding support to your argument). In those cases it makes a lot of sense to take a fraction of a cent from everyone and pay for it that way, as we currently do.


It's very difficult to tell what will become valuable in the basic research world and what will remain a curiosity. A classic example in biotech is the study of sex in bacteria - it seemed about as useful as studying the sexual reproduction of ferns at the time. Bacteria generally replicate themselves clonally, but the discovery that they were also exchanging genetic material by the use of plasmids (essentially, mating with each other) eventually opened the doors to things like cloning the human insulin gene, inserting it into a plasmid, getting a bacterium to take up the plasmid, and then, voila, human insulin could be grown in vats in bulk. That was the first real biotech business that I know of, and from there it just exploded.

The problem with universities pushing research that clearly has some short-term financial reward (due solely to patents and exclusive licenses under the 1980 Bayh-Dole Act) is that they neglect basic research and so close the door to the potential of truly fundamental discoveries like that. This is generally known as the corporatization of the American academic system, and it's really been a disaster for basic technological advances.


Do you think the decline of large corporate R&D efforts is cause or effect here (or is this a false premise)?

I am wondering whether we've seen the reverse of the idea I was originally challenging (if research was valuable it would be a business), where universities captured a profitable business because it was being neglected by the business community (and were distracted from basic research).


The original concept was that universities were places of basic research, and more translational (read: monetizable) research was thought to be done at corporations.

That theme changed after 2008~ when NIH was flat funded and most universities were gazed upon by the Eye of Sauron for funding. A lot of places that were basic science focused, let's say at the level of studying a set of proteins in mitochondria, had to figure out how to connect the dots to disease or therapeutics. Not everyone made it.

Also, universities got into the game of stacking patents to license. I don't know the arc of that, but I know for sure after 2008 my Office of Technology Transfer was really into it.

Ex before: "We study apoptosis signalling in mitochondria, to understand how mitochondria are related to cell death." After: "We study apoptosis during heart attacks, and how mitochondria contribute to cell death in ischemic conditions."

Something along those lines.


Totally! Most of our best equipment was stolen and modded from materials science imaging or manufacturing automation. There was a budding industry for automated fluorescence imaging, but they were still finding their legs.

We had a couple electron microscopes that we modernized from film, and the companies we contracted with mostly dealt with materials people.


> surely if workers are systematically undercompensated, there is room for a disruptive firm to come in and take the best talent and still make a profit.

Other good replies here, but this part of the comment reveals some assumptions that need better definition. Having been both, I can comfortably say that academics aren't "workers" in the same way that industry programmers are "workers". The parent comment is not correct about the norm; programming for research projects is not usually sold for profit later to industry. It happens occasionally, but most academic work stays academic. Sometimes when it does happen, it's in the form of a spinoff company that brings along the original authors of the software, and so they end up getting some of the proceeds… when the business survives.

Also the top comment didn’t say ‘undercompensated’ - in business this has a clinical meaning that someone is being paid lower than market rates. We know that adademics pays lower, but we do not know that it’s lower than market rates for academics. It’s plenty true in industry alone that you can make a lot of money at Facebook or very little money at a small indie game dev shop. That doesn’t mean the indie game devs are undercompensated, it means they’re in a different market.

Starting firms to compensate researchers better is what pharmaceuticals (for example) are. The problem with your suggestion is that the need for income can undermine the ability to do research that is unbiased, risky, controversial, critical, or just free of agenda. If you pay researchers in line with what Bay Area programmers get, it will put an enormous burden on the PIs to make many multiples more money than their peers, and to compete with them using a small fraction of the headcount of their peer groups.


I'd guess that the expected commercial value being low would be the norm, and discoveries making millions relatively rare, just as in every other context. However, the second half of your second paragraph is where my mind went first, because what GP says happens does happen, albeit at a normal (low) rate. The motivation of people working in science is different, as it is in, say, the games business. Game developers have historically been paid less except at a tiny handful of companies. Not 33 cents on the dollar, but maybe 50 to 70 (bearing in mind that FAANG/unicorn salaries are not the norm either).


> The next thought should be: why doesn't neo-liberal capitalism fix this problem?

You are the vehicle by which neo-liberal capitalism fixes the problem. By leaving academia to work for a firm directly, you are responding to a price signal sent by the industry, relaying that price signal to the academic labs.

You might object, this is slower than most price signals! That's because the university environment is somewhat insulated from the ordinary pressures of capitalism (and thus better able to exploit young research programmers).


> you are responding to a price signal sent by the industry, relaying that price signal to the academic labs.

Which means absolutely nothing unless a ton of other people do it as well. A handful of people here and there can be replaced.


The expected value theory is very plausible. There are a lot of R&D projects that basically produce zero output for decades. High risk, high reward.


>why doesn't neo-liberal capitalism fix this problem?

The whole point of academia is to subsidize research before it gets to an application phase. How can a private firm compete with academia, which benefits from government funding and is tax exempt? Trying to pin this problem on "capitalism" is just lazy.


No, lazy would be straw-manning a stranger's argument for no good reason to elicit an emotional reaction. It's a style of communication that seeks conflict rather than understanding, and there is plenty of it on twitter and reddit, but not here.


There are plenty of firms that sell software to academia, and many of them make a ton of money. I bet there are great opportunities in that space. I guess the issue is that most business-educated/oriented people are too disconnected from both engineering and science, so competition is rare.


>The next thought should be: why doesn't neo-liberal capitalism fix this problem?

Neo-liberal capitalism fixes problems?!


Why should anyone pay when the government is keeping it all alive today?


This.

I have worked for almost 15 years in academic research, but in very close collaboration with the steel industry. The code we write can help steel companies to save millions when developing new products. This is quite complex software, which combines materials science, mechanical engineering and advanced mathematical concepts while requiring high performance and reliability.

I found a nice position for a tenure track in France, in a top research centre. Besides designing and writing software, I would have to design and implement experimental plans, teach, deal with students and administration, keep an excellent publication record, and find funding for future projects. Remote work would not be a possibility (but I would work many unpaid extra hours at home). And the number of published papers and recommendation letters required just to be considered for the job was overwhelming. My salary would be lower than $30k/year. They do not even know what an RSE is.

I am now searching for a remote job in the software industry.


> But I would work many unpaid extra hours at home

I think that's incorrect. You would work the number of hours you wish to work (considering you produce reasonable value, but the bar is low). Research engineer (or researcher for that matter) in a public French research center is a civil servant position. They are difficult to get but you don't get fired unless something is blatantly wrong.

Source: I worked 10+ years in such a position. I now work for a FAANG and the pressure is considerably higher: evaluations every 6 months, a lot of peer pressure (engineers are on average better and more ambitious than those in academia and you need to keep up - some of them seem to work 24/7), extremely stressful oncalls. Gross salary is 5 times my previous salary and has the potential to increase much more.

Of course, this is certainly not representative of all cases, but most of the time there's a price to pay for a higher salary. Another thing to think about is ageism: as a research engineer in academia, you're all set until retirement. In the software industry, it gets hard after 50.


Seriously? In what kind of alternative reality does academia live to offer 30k?


European countries have lower salaries in general? Although their social safety nets are better.

Also, assistant professors (or the equivalent there) generally make less but do probably make more once they get tenure. I'm assuming they meant the tenure-track position itself is ~30K USD, but making tenure usually does mean a pay increase.


In the Netherlands, 30k is the starting salary for a PhD student.

30k for a tenure track position sounds insane to me.


In Ireland, PhD stipends are closer to 13-17k. It's not a perfect comparison because the PhD stipends are tax free, so the comparative salary would be in the 17-20k range. That said, postdoc research positions are much closer to 40k than 30k.


That's still pretty miserable. In Germany, an assistant professor or postdoc makes 60k Euro after a few years, even when they are on the pay scale that only requires a master's degree (TVL-13).


I have spoken to quite a few that made minimum wage (which is not even close to 30k).


The situation is a bit weird in The Netherlands. Some PhD students are paid employees (AiO); their gross salary is ~31000 to 40000 Euro per year (I think this is excluding vacation money, but including the 13th month).

Then there are PhDs that get a scholarship (bursaal), which is only around 24000-25000 gross per year.

Not too long ago, there were only employee PhDs, but some universities really love the scholarship system, because they have to pay less tax, so it's a lot cheaper for the universities.

My wife had a PhD scholarship in NL and it really had some large negative effects after finishing her PhD:

- She contributed 4 years less into a pension fund, since bursary PhDs do not build up pension outside the state pension;

- In her next academic position, they didn't consider her four years of PhD work as working experience, while they did do that for me as an employee PhD. So, she was set back 4 years in salary growth.

- She finished her PhD in August and started a full time job after her PhD. Because she had an income that went over some threshold, she had to pay the taxes that the university dodged by using the scholarship system. She worked the rest of the year at a loss (the taxes were higher than the income from September-December).

The worst part of it is that many foreign PhD students do not realize that there is a two-class system.


Update: today's news is that the minister of education requires that all students on a PhD scholarship get regular employment from 2024 onwards:

https://ukrant.nl/minister-zet-definitief-streep-door-experi...


That's pretty tragic, aside from the pension - which I don't expect to receive anyway (like most everyone under 40).


That is a pretty disappointing situation for a country that is known for being "progressive."


In the US, a few years ago, my program offered a stipend of 22,000 USD per year, provided I taught a few classes, graded homework, tests, etc., while doing research and taking my own classes.

That was very lucky; many programs do not offer stipends and require people to take out loans.


For context, you have 7 weeks of holiday, you can't get fired, and your working hours are quite flexible. In France, medical expenses and education are free. And outside of Paris and a few other big cities, rents are rather affordable. So all things considered, it's not a bad deal (which is why they do attract good candidates). And a typical SWE position in the private sector in France would be $50-60K (of course there's variance there, but in academia there are also ways to make extra money).


I've seen people get fired from academia on several occasions... When they couldn't fire someone, they beat them down so regularly and buried them so deep that people left or had a mental breakdown.


I'm aware of several academics that are banking 1M+ annually providing consulting services.


Only possible if you are a late-career academic with a lab and a publication record. They are paying for the prestige and the pipeline to new hires more than the technical advice (ok, sometimes they pay for the technical advice, but I've never seen that get a good ROI). I can think of a few instances of professors consulting with companies where I worked, and they all had grey hair and tenure. The rich get richer, but these examples are a variant on 'lottery winners can make good money'.


That's way too low even for the private sector. Flexible working hours are pretty much the new standard in SWE, so that's not even an argument anymore.


> you have 7 weeks holiday

Paid?


Paid.


A university lecturer in the UK will start at around £30-35k.


Come to Denmark! You can make about 60k a year as a PhD student!


But then you need to learn Danish. Brains can only fit so much stuff ;-)


> You don't have enough letters after your name to matter, and you will likely be a pet instead of a peer.

This is an underrated point. This is the case for programmers in finance as well, and requires a hefty salary premium to put up with.


It certainly echoes my experience having just left a professional dev job in academia after an 11 year stretch. Anybody without academic credentials relevant to the subject matter is "the help" no matter how much you contribute, and it's flat-out demoralizing.

I worked on a tech-heavy project large enough to get an NYT feature article covering its launch. For it, I collaborated heavily on the service design and logistics, and singlehandedly designed, built, administered, documented, supported, and provided training for the technical infrastructure and more than a dozen related interfaces and tools. In lines of code, it probably landed somewhere in the low 5 figures, but that was certainly way more than it needed to be. It was hackish but durable and performant. It was an exercise in pure generalism— no individual accomplishment was close to technically innovative enough to warrant a novel white paper, but I was invited to speak at a few related conferences about it.

But the professor overseeing the project didn't even mention me or my role in his launch party speech for the folks in our building, let alone anywhere that would have provided career visibility. He thanked and spoke about the contributions of every other major contributor— even the temp worker who ran the machines (he wouldn't want to appear classist after all)— but I got a handshake and a quiet thank you after his speech for my 5-year effort. I was at every related manager's meeting and largely seen as one of three "go-to" people for the project in general, not just tech stuff.

This sort of gatekeeping is a part of academic culture I just don't get. At least in business there's some predictability to people stepping on each other to get to the top, but what's the purpose of this?


"This sort of gatekeeping is a part of academic culture I just don't get"

This is just a hypothesis, but I'd predict a high correlation between becoming an academia lifer and having certain preexisting personality disorders, stemming from having never derived a sense of self-worth from anything other than academic achievement since they learned to speak. Or maybe I'm just speaking for myself :)

Similar to the top tier of tech companies being destructive and amoral in their own ways, not only because they're corporations, but also because programmers see technical challenges waiting to be solved like a moth sees a porch light, but see ethical problems dimly. (still probably speaking for myself...)


> Anybody without academic credentials relevant to the subject matter is "the help" no matter how much you contribute, and it's flat-out demoralizing.

That's my number one piece of advice regarding academia: unless there's a path toward a valuable visa, or it's paid work while getting a valuable degree (read: something that will have the prestige to open doors), or you're a co-author at a good university, you're much better off building something for yourself somewhere else.

> no individual accomplishment was close to technically innovative enough to warrant a novel white paper [...] But the professor overseeing the project didn't even mention me or my role in his launch party speech for the folks in our building, let alone anywhere that would have provided career visibility. He thanked and spoke about the contributions of every other major contributor

That's because papers are the metric by which visibility is measured. Pretty much the only way to move forward is getting your name as author on the main papers.


This is different, though. I was a professional developer working in a non-academic lab doing work the academic world really cared about. Public recognition for big accomplishments is what distinguishes me from the 'web guy' at the help desk who knows how to customize Wordpress themes, and will lead to progressively interesting roles that pay well in exciting organizations. Just having X number of publications under my belt wouldn't budge the needle for my career. It's a weird sort of in-between spot without obvious career trajectories but you can get a decent salary while working on cool stuff.


What you did doesn't matter, and they gave you the right amount of recognition. There are thousands of imported indentured servants that will happily do your job the moment you leave. Of course, we don't call them indentured servants anymore; we use terms like 'academic visa' or such.

I'm sure they also didn't thank the electricity company for keeping the lights on, or Microsoft for creating the Windows used to write their speeches, or the guy that emptied the waste baskets in the office so the PhD guy didn't have to.

Don't carry water for someone else. Enrich yourself. That's all anyone else is doing, all the 'research' is for personal enrichment and prestige. Don't prop up the broken academic industry with less than market wages, let them fail.


Nothing anybody does matters. It's all a big scam, man. ::hits birthday cake flavored vape and sips energy drink:: I'm looking out for number one from now on, bro.


> no individual accomplishment was close to technically innovative enough to warrant a novel white paper

There are so many academic journals, from scammy, to bad (yet honest), to average, to good, to the top. You can publish almost anything, if you select an appropriate, less prestigious journal.


Yeah— wouldn't have helped in this situation. I was a professional and (deliberately) not in an academic career path, and at this very prestige-conscious institution, publishing in a scuzzy journal probably would have made me look worse.


Yeah, but am I going to get the job if all I have is garbage published in no-name journals?


Are you talking about the letters P, H and D? As in, if you don't have a PhD they don't see you as a peer?


While informal culture and individuals' self-importance do play a role, it's also down to strict old-fashioned salary scales that many universities have in place (even if your day-to-day colleagues see you as a peer, the administrative systems defining your salary range can't/won't). Salaries are often strictly attached to letters behind your name, at a high level, and largely immovable by individual research departments.

And secondly, while your PhD peers may earn more than you, they also often earn much less than software industry averages.


I don’t think this is true in the slightest. At UC, research assistants typically make more than Grad Students or post-docs (of course the overhead and mentorship are also different and allegedly there is some possibility for greater career advancement). The snobbery is just plain snobbery. In industry there are plenty of people who make substantially more than me and I have never once felt the levels of condescension that I got from mediocre academics. There are maybe rationalizations related to scarcity and all that but jerk behavior is still jerk behavior.


The intent of my comment wasn't to make out snobbery doesn't exist (or isn't rampant - it is & I've experienced plenty of snobbery from academics myself). Just that there are additional factors.

> At UC, research assistants typically make more than Grad Students or post-docs

That's cool but I didn't say every university; I don't think one counterexample makes my comment "[not] true in the slightest".


I've honestly yet to meet a research software engineer without a PhD because of the academic bias you will get in, well, academia.


Depending on the definition of RSE, I may or may not have been one. The company I worked for was a Synchrotron Light Source; I worked on software for data collection on X-Ray beamlines. I would say that only about half of those in the same role as me had a PhD.

Moving away from data collection to analysis, the fraction of PhDs went up, but only reached 1.0 when considering the sub-group specialising in structural biology.


For many years I worked in a high profile research institute (neuroscience) as an RSE without a PhD. Still don't have one, and that's okay (for the path I'm on). Quite a few of the other RSEs in the institute don't have one either. In total I'd say maybe 50% didn't have a PhD.


I'm one who started with only a BS, and I'm at a top-20 public university in the US. It depends on your PIs, but I've definitely been appreciated on many of the projects I've worked on (e.g., listed with 2nd most ownership percentage on invention disclosures, which also won a campus-wide yearly award).

Admittedly, my path was convoluted; I started as an engineer to help with non-research software at a large lab, and got pulled onto projects via reputation. But I was replacing a Master's student who was essentially at the same academic level as me anyway. It does pay less, but I made the tradeoff for the quality of projects, which was worth more to me at this point in my career. It's still much more than I need, just not at industry levels.


I found a role like this. I love it, with the caveat that doing research, software dev, and some lead-type stuff is a lot of work. Though my hours are capped at 40, I'm probably thinking about it on some level at least fifty hours a week.

Pay is quite good, though, so I can't complain.


I work with two of them at the moment. One is planning to apply for a PhD studentship soon, but the other does not intend to do so.


Ok well I can fix that. Hi, I'm a research software engineer and I don't have a PhD.

I'm in Europe. My salary is definitely better than the PhD students' salaries, and I have a proper adult pension, as that's a legal requirement here. My salary is approximately equal to what a graduate might earn 1-2 years after graduating in the local market, so it doesn't match my actual experience, but I accepted the post for pandemic-induced reasons. Certainly the salary does not, nor will it ever, compare to levels.fyi/FAANG-type jobs or a large corp in country.

However it is true that my position officially is very much a curiosity. We don't have a defined RSE type role, so the slot I fit in is "staying on to help out on project after graduating". My job is a fixed term contract that can only be renewed a certain number of times and I'm approaching that limit soon. There isn't any viable track to joining the ranks of researchers - I would have to do a masters first, and this ironically would require doing an internship, in spite of the fact I have more actual industry (non-university) experience than the entire lab combined.

I'm also not sure if my lab head bent the rules or not on hiring me - it might be the case that I am supposed to have a PhD or at least a masters.

I would agree with the top-level post on most points. It is interesting work, but I don't "belong" anywhere in "the system". This might change in 10-20 years. Artefact evaluation is very much becoming a thing in systems research, because being able to reproduce other people's work is quite important, and very occasionally you will stumble upon papers whose claims are, ah, more expansive than the associated GitHub project can fulfil. As more research relies on software that graduate students are simply ill-equipped to write (by virtue of having no experience in anything, and by being taught by professors most of whom have no experience writing production code), the role of an RSE might become more important in time, but like anything it'll be a slow change.


> My salary is approximately equal to what a graduate might earn 1-2 years after graduating in the local market

When talking about the "local market" in Europe one needs to take into account the large number of "dark matter devs" that are working remotely for SV companies, at SV salaries. They simply won't ever show up for interviews at local companies.


In this case then I mean local local market, not devs working remotely for SV. I am aware. One of my friends does this and earns 2x what I do, in cash.


I’m surprised they even let you in without one.

Every "scientific programmer" position I've seen wants you to have a PhD and be a domain expert.


Even if you have a PhD, it sucks. Everyone without a PhD is trying to one-up you, and everyone with one has invented ten other arbitrary things that ensure you are human trash on arrival.

Also, imagine all the people who failed out of masters or PhD programs who end up in management and are resentful. It's a surprisingly common thing.


Agreed- you might appreciate a story that happened recently. I worked at a finance algo trading startup, right before and into the financial crisis. The first CEO/founder gets ousted; the new guy is an old school "phones and traders" kind of guy, and didn't know, or even seem to care, about tech at all. It was a strange choice, since we were built as a tech-first company, but seeing as we were having difficulty getting traction, I think the hope was that by getting the old guard type in there, we would have an easier time selling the new thing.

Anyway, I give this a go for a few months but eventually leave, as I just could not stand his hardly contained contempt for technology; you could just see it on his face that he longed for the days of dropping F bombs on the floor and then going out for expensive steak dinners at night. As I give my resignation, I get screamed at; he is red in the face dropping F bombs on me: "You are F---ing us!" etc... Long story short, I gave them the chance to counter, with a 10x'ing of my equity stake, and even offered to extend my notice period at my newly offered salary, but they declined all of it, though practically begged me to stay on for 3 extra months at my current salary. On the week I left, this guy tries to get me to sign all kinds of nasty non-disparagement agreements, which I had not signed previously, with no additional consideration ($) in exchange; I just refused, and he literally threw the stack of papers at me at one point. I guess I took all of this because I had literally been there since day 1 and just had a sense of ownership over everything- I also didn't think this guy would last very long.

Anyway, fast forward about 10 years to a few months ago: I get a generic "cold call" type message on LinkedIn from a unicorn data tools company, from the same CEO guy- he bounced around and somehow landed a sales role there. I ignored the first one... he sent a followup, and I was incredulous- did he not remember me? Did he not care? It was something along the lines of "Hey how are you? I am working with xxxx and think you would be interested- can we set up a chat..." and I just replied back saying "I am great, haven't been screamed at or had anyone throw something at me in ten years..." and he still had the balls to write back something like "lol, great. Let me know when we can set up a call..." and I wasn't really sure how to respond, but after about a week just wrote "If your next message isn't a very specific apology for your past behavior, do not contact me. I am surprised with your past attitude you would even work at a place like xxxx." If he was a dick, I was going to reach out to their head of sales and possibly CEO, explaining his past and how I was disappointed that a firm with their reputation would even hire someone like that. He gave me just a half-hearted enough non-specific apology to not do that- followed up immediately with an offer to buy me a beer (so he could pitch me), so I opted not to escalate any further.

I have a few other stories similar to this, where karma really bit those that were hostile and condescending towards technology and technologists, but this is the most direct. I kept in touch with a few of the old "traders" I used to work with, and used to go out for drinks with them from time to time; they would invite a larger group of people, and I actually stopped because it was all kind of depressing- they had all lost their old jobs, a few pivoted into different decent roles, but mostly they just got drunk pining for the good old days to come back.

While there was a tiny bit of schadenfreude, in the end it's just sad.


As a counterpoint, I know someone who went back to academia after getting disillusioned with tech (academia > tech > academia). The key difference may be that they live in Europe, have no student loans and the pay disparity between a developer in bioscience and in tech is not as large as I imagine it is in the US. They are paid significantly more than the scientists they work alongside but not much less than they were paid in a good tech job. For them the bioscience work is much more interesting than anything they did in tech (they have a maths-based PhD, so were working on quite complex problems but in a relatively boring field).

Software development is important to more and more industries and the pay disparity caused by insanely large funding and little requirement to produce profits means that other sectors are being priced out of in-house development, especially niche use cases. The ongoing rise of no-code development will be increasingly useful across all sectors but will fail to deliver a lot of these niche applications.


I did this as well, but started my own company. The key for a lot of people making the switch is to make the money in your first endeavor - in tech - then transition into a lower-paying but more pleasing industry, with the money buffer you built up making it possible. I've seen it a few times lately.


Salaries in Europe are catching up...


A lot of the smarter people I know have been recruited into Europe. People say "the salaries are so much lower lol", but the reality is, you often have employment laws that rule out terrible situations that are commonplace in America, you have access to healthcare, being a home owner is actually possible, and if you don't want that, renting is better overall. European culture is usually way less cutthroat, and managers typically know their stuff, rather than failing upwards to half-million-dollar salaries where using the word "digital" and being a brute is the main requirement.

Salary isn't everything. European engineering is a pretty different culture.

The way the US tries to prevent this is by crippling its people with student debt.


I don't have any student debt pressure, but I'm debating trying to do the same. I have a lot of friends in Denmark and no strong ties to the US. I'm about to hit my forties and it's probably now or never.


> at least in bioscience...

BINGO! That field is notoriously horrible and interacts extremely badly with a 'when not discovered here, not important' syndrome. Biology is brutal toward the biophysicists, mathematicians, and people who code whom biologists are forced to work with rather than seeking out for help.

I still hold up as an example the nonsense discussions around p-values in bio versus the actual work going on in statistics in maths departments. It shows how far detached they've become.

Not to criticize too strongly, but given the above, combined with its reproducibility crisis and the existential problem of being in the back pocket of big pharma, I seriously doubt the professional integrity of a lot of people in the field.

Move toward mathematics, physics, and chemistry. There is (some) serious money and a good atmosphere around areas such as finite-element modelling, or Wolfram-like tools, as an example. There is a lack of direct funding for decent posts, but you get recognised and paid as an equivalent peer; I know from working with some of these people. It's not to say it's 100% always without friction, but no job is, I'd argue.


The reproducibility crisis in chemistry is just as bad, if not worse, than in biology. Anyone with a pen can reproduce a math proof. If you work on a big project (physics experiment) where every paper has a dozen eyes on it, you can't slip crap work by your peers, because that's their livelihood on the line. In between you have bio/chem fields where each project is too expensive to trivially reproduce but still small enough to have only one career on the line for each project.


Most of the reproducibility issues in chemistry happen in biochem, in my experience (meanwhile, it gets the most funding). That said, synthetic chemistry is also a problem area. Usually in synthetic chemistry it's not that the work can't be reproduced at all, but rather that yields are fudged. That's mostly because PIs say "you can't graduate until this reaction yields 99%." So after someone has written four papers and taught classes at minimum wage for 7 years, they fudge a 95% to a 99%. It's not okay, but neither is the way academia is structured. Super glad my discipline was elsewhere, but I saw colleagues suffer from this stuff...


No, there isn't good money in physics, chemistry, or pure math. PhD chemists almost never make 6 figures, even in high cost of living areas, serving as specialists. I made less as a senior scientist or a project manager in chemistry than I do as an entry-level software engineer. I've lost count of how many physicists I've met who work minimum wage jobs, usually call centers, after their PhD/postdoc (even finding a PhD position is difficult, let alone completing one in 6 years).

FEM can offer money, but you are competing against engineers who have been doing exactly that for years.

If you interviewed software engineers and data scientists right now I bet a third of them once were physical scientists/mathematicians who mostly regret their degrees or the fact they can't find survivable work using them.


>"I bet a third of them once were physical scientists/mathematicians who mostly regret their degrees"

Would mathematicians truly regret their degrees if they decide to work in software? I read that mathematics is one of the best degrees for a career in software engineering, as computer science is very closely related to mathematics (to the point where the study of algorithms is largely the same for mathematics and computer science students).


The theoretical parts of computer science are connected to discrete mathematics, sure. But that is only a subfield of mathematics and mostly happens at CS departments already, so you'd get a CS degree anyway.

It is also possible that aptitude for math is related to aptitude in software engineering.

However: the mathematics content of 90%+ of mathematics degrees awarded is fully irrelevant to 95%+ of software development tasks. And when that 5% task needs some kind of special mathematical insight, the people who want that task done are going to get the top professional they can find for it. Maybe the prospective math student is going to be that professional, but I don't recommend planning a career around it.

I am not saying there isn't work where some math is useful, but the most commonly used applied stuff... say, linear algebra... is typically covered in a respectable engineering program; a degree in mathematics would be superfluous. Proving theoretical properties of Hilbert spaces or measurable sets or bifurcations of dynamic systems or advances in differentiable topology or the fascinating behavior of cellular automata or whatever is going to be a gigantic waste of your time if you won't use it later in your career or don't find intrinsic motivation in it.


And after five years of gluing APIs together to help get more people to click advertisements, you'd be surprised how much math you forget. Machine learning can be better for exercising math, but most companies do not want anyone doing anything new. Same goes for the physical sciences in my experience. You basically get a PhD to do associate's-level work. Even if you know a better way, that comes after you get ten years' experience and have authority over projects. See the first sentence of this post for a catch-22. Bleh.


> "good money" apparently a very relative term, I think I'm in it for job satisfaction then at 5 figures, shame I'm a qualified expert.


I would say that is a very strong criticism and very warranted! For note, I witnessed the immolation of two careers over retractions of papers that could not be replicated. You could say that the system worked. That was a while ago, and I'm sure the paper mill phenomenon is in full swing. You get echo chambers of PIs that rubber stamp each others work.

In my case, I was in basic science, which hit a crisis near 2008 when the NIH was flat funded. This caused a come-to-Jesus moment, where suddenly all basic science labs were rebranded as translational medicine. My department was absolutely gutted, down from 15 or so PIs to maybe 8ish in the span of a year. Our field was bioenergetics, which at the time was pretty competitive and easy to link to diseases/metabolic disorders. We didn't work with pharma; some labs received contracts for small work. NIH was by far the biggest funder, followed by DARPA and other smaller health organizations.


I will say, in my opinion (and experience) in professional math, that while there is perhaps more of a chance for an outsider to have an impact, mathematics is hardly free from bias toward insiders: it can manifest as subtly as using notation as a shibboleth (e.g. it's somewhat easy to tell which community an author comes from through their notation and terminology, and equally easy to harbor resentment toward those outside your field) all the way to active "prove I'm the most clever in the room" syndrome during seminars. I'd like to think that a more collaborative atmosphere is prevailing now due to the rise of interdisciplinary and applied math, but people are people everywhere, and as Sayre stated, "Academic politics is the most vicious and bitter form of politics, because the stakes are so low."


"Not to criticize too strongly, but given the above, combined with it's reproducabilty crisis, and existential problem of being in the back-pocket of big-pharma, I seriously doubt the professional integrity of a lot of people in the field."

Lack of professional integrity is a very real problem.

Over the past two years I wrote fairly frequently about some of the nonsensical / pseudo-scientific COVID papers that got published, especially the quality problems in epidemiology. Epidemiology isn't bioscience (actually, that's one of the problems with it - a total lack of biology), but it's adjacent. After that, I was contacted by a former research software engineer (RSE) who had worked with top epidemiology teams in the UK, and also by a member of the SAGE committee.

Both of them told me some absolutely mind-blowing stories of ethical malpractice. I wasn't totally surprised because it was obvious that those sorts of things must have been going on behind the scenes just from reading their model source code, reports, watching their behavior etc. The RSE had become so disgusted at what he'd seen that he actually left the country and switched from working at Oxford to some US university I'd never heard of, switching fields along the way too. Quite the downgrade in prestige but after years of trying to help epidemiologists he concluded the entire field was utterly morally corrupt and he wanted nothing to do with it.

Here are some of the more memorable things I was told by those two scientists:

- The RSE at one point found a bug in a FORTRAN model used to model malaria outbreaks. It had been the basis for hundreds of papers, but at critical points it used raw pointer values as variables instead of dereferencing them - some FFI bug (a hypothetical sketch of the pattern follows this list). Obviously, a typical pointer has a very different value from anything biologically plausible. He reported the bug to the authors and got a reply back within 30 minutes saying they'd checked the papers (all of them) and it didn't affect the results. This claim was very obviously a lie: not only could they not possibly have checked even one paper in 30 minutes, but he already knew that fixing the bug did indeed change results! They didn't care, and he was shocked that his "colleagues" would bullshit him so directly, especially as they must have known that he would know.

- Same guy flagged code quality issues to some of the scientists and proposed introducing some rules designed to improve quality. He was dismissed with the words: "oh <name>, we're scientists, we don't write bugs".

- The SAGE member told me about some of the internal discussions they had. Criticisms of the methodological validity and accuracy of their models were dismissed with reasoning like: "that person reads the Spectator, so it doesn't matter what they think". Relatedly, he made clear that the supposedly scientific SAGE predictions were sometimes altered to reduce criticism of the group by left-wing media and journalists. The changes were presented as "the science changed", but that wasn't what was going on behind the scenes.

- Malaria research is (supposedly) being badly distorted by the Gates Foundation. Gates only cares about eradication, which leads to lots of problems. Some are smaller, like the many researchers who don't genuinely believe eradication is possible but lie on their grant applications to make mitigation efforts sound like eradication efforts. And then there were unethical experiments on entire populations, where e.g. whole areas are blanketed in anti-malarial drugs. If it works, great, you eradicated malaria in that area. If it doesn't, you just selected for drug-resistant parasites, and now the drugs that were being used only to treat the serious cases don't work for anyone. He told me this has actually happened more than once.

- The RSE told me they'd at one point tried to recruit an RSE working with climatologists to help them with their modelling (a belief that climatologists are more rigorous than epidemiologists seems to be common in epidemiology). The RSE they interviewed refused to take the job. His reason: he was quitting academia entirely, because he was so disturbed by the practices he'd seen.
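
For the curious, here is a minimal hypothetical sketch in C of the pointer-as-value bug pattern from the first item above. It is not the actual malaria model (which I never saw, and which was FORTRAN); the function name and numbers are invented. FORTRAN passes arguments by reference, so a foreign routine receives pointers, and failing to dereference one silently substitutes a memory address for the data:

    /* Hypothetical illustration only -- invented names and numbers. */
    #include <stdio.h>
    #include <stdint.h>

    /* Imagine this is called from Fortran as: call scale_rate(rate).
       Fortran passes by reference, so the C side receives a pointer. */
    double scaled_infection_rate_buggy(const double *rate) {
        /* BUG: the pointer itself is treated as the number. The cast
           through uintptr_t makes it compile, but the "rate" is now a
           memory address (~1e14) rather than a rate (~0.03). */
        return (double)(uintptr_t)rate * 1.5;
    }

    double scaled_infection_rate_fixed(const double *rate) {
        return *rate * 1.5;   /* dereference: use the pointed-to value */
    }

    int main(void) {
        double r = 0.03;
        printf("buggy: %g\n", scaled_infection_rate_buggy(&r)); /* huge garbage */
        printf("fixed: %g\n", scaled_infection_rate_fixed(&r)); /* 0.045 */
        return 0;
    }

A value many orders of magnitude too large poisons everything downstream, which is why "we checked all the papers in 30 minutes and nothing changed" was not a credible reply.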

A few years ago, if you'd told me that a whole research field could be unethical, I'd have thought you were crazy, because that's a whole lot of people being painted with a very broad brush. Now that I've seen it for myself and heard from other former insiders, it's easy to see what happens: the honest ones discover what's going on and leave. Because academia hardly ever penalizes intellectual dishonesty, the pool of people who remain are the ones who are OK with it and have learned that it works and has no consequences. Things steadily become more and more toxic.


I probably shouldn't go too public with what I know of Report 9 that isn't on the record, but frankly, next to no code from biologists has gone through peer review, and people put "experts" on a pedestal because of what they claim their tools can do.

What I can and will say (and what is on record) is that reproducibility was not a concern for the Imperial College virology dept.


The converse is something akin to Tableau, which came out of Stanford's graphics department and Pat Hanrahan's lab. Tableau was acquired by Salesforce for around $15B. No doubt proximity to SV was key to their growth, but the original research from 20+ years ago on data-cube visualization was a product of academia. It just so happened to coincide with our era of "democratizing data science" ;)

Multiscale Visualization Using Data Cubes

https://graphics.stanford.edu/papers/pan_zoom/


This is pretty much my story as well. I work less and get paid much more after leaving academia. Idealism can last so long before one gives in.


I have some similar feelings about working for a game company (that was fairly successful and on the “better half” of the distribution of game companies).

Loved the creative control and influence I could have (even as a mid-20s tech lead on a title), loved my colleagues, loved the work, and even enjoyed the satisfaction from shipping a golden master after a multi-week crunch period.

Ultimately, a hedge fund was willing to pay me a multiple of what I was making in games and I decided I’d rather have a house than work on games.


Having worked in academia until the age of 40 (though not in computer science), I can agree with this view. To enjoy the benefits of status, compensation and, to a certain extent, flexibility, one has to reach the level of professor. After failing to reach that level by 40, I switched to software development, without ever regretting it.

A while ago I saw a position for image recognition in astronomy advertised by the university in my town. It sounded very exciting for someone who enjoys figuring out solutions to complex tasks -- until I looked up the pay grade: less than half of my current hourly rate, without the flexibility I have (working from anywhere I want, even pre-Covid). The problem seems to be that a competitive salary for a software engineer would probably have to be higher than that of the professor leading the group.

As an aside, I recently had a discussion with a friend in my country's military about the cyber defence forces wanting to recruit software engineers. There is a similar problem there: if they do not use contractors (whom they can pay whatever they ask for), they have difficulty finding an appropriate pay grade, since a well-qualified software engineer would have to be paid better than a general.


Totally this. I had a boss once who took me aside in her office to probe my interests and direction. Somewhere in there she quipped that work in academia pays peanuts. It was a pretty shocking statement and attitude to hear put so bluntly, especially so close in time to my own graduation, my years at the university, and my admiration for that world. But in all honesty, I never heard it seriously challenged.

Later, as I started to learn more about how economies function and how revenues really build up from consumers in volume, I came to realize that things that ultimately benefit a lot of people generate a lot of money. I saw an instance of this working at Apple, seeing how much money they had to spend versus biomedical companies that were more conservative with their funds. Consumer electronics, and consumer products in general, have a lot of customers, benefit a lot of people, and ultimately earn a lot of money.

Academia is much more limited in its scope and immediate benefit. That delay in benefit shapes the money involved in all sorts of surprising ways that aren't apparent while you're still under the wings of the academic world and the "currencies" it operates with, be it notoriety, prizes, grants, etc. Ultimately, the results and products of academia are suspect and risky, since they're often in the unvetted prototype phase of coming into existence. Elon Musk's remarks while touring Starbase, about design versus manufacturing and the gauntlet of real-world tests where both get vetted side by side, come to mind here. His statement that "design is overrated" probably has a close analogue in academia. Products of the mind are essentially untested and may not stand up to reality along whatever dimensions you need to evaluate them against, or, as is probably more often the case, simply don't scale to the degree needed to reach a large number of people in a short time frame and translate into paying customers.


I dropped graduate research into adversarial algorithms and generative adversarial networks when I realized that instead of being paid beans to do something genuinely interesting, I could get paid six figures to make business software and do whatever I want with my free time. Like so many other promising would-be academic software engineers, I had a family to raise and a life to live. No kidding science needs more research software engineers, but that isn't going to change until science can pay software engineers at least a basic decent income. When that changes, I'll consider picking up where I left off.


Tbh, the more I hear about academia, the worse it sounds.

Look, I'm willing to take a massive pay cut, I was starting to come to terms with the time cost, and hell, I'll even work in Matlab. But:

> you don't have to work weekends, and you also don't have to feed the mice on a Sunday night

This is just wild, and if anything reminds me of low-skilled, near-minimum-wage jobs.


I've heard the NIH might be a decent place for a permanent position other than PI in bio fields.


What's Nico Stuurman up to these days? https://valelab.ucsf.edu/nico_stuurman/


Ah, the µManager guys! Great software; it sits on top of ImageJ (open-source image analysis).

Our software was a custom C+Win32 app that was ported from CodeWarrior on MacOS 7/8. Windows timers were so crap that I ended up using Ryan Geiss' timer from Milkdrop: http://www.geisswerks.com/ryan/FAQS/timing.html

Yes, that Ryan Geiss, the Winamp one. He now works for Nvidia I think.
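
For anyone curious what the workaround actually looks like: here is a minimal sketch of the typical Win32 pattern of that era (my rough recollection of the technique in Geiss' timing FAQ, not his actual code) - raise the multimedia timer resolution, then time with the high-resolution performance counter:

    /* Sketch of era-typical Win32 high-resolution timing.
       Illustrative only; link against winmm.lib for timeBeginPeriod. */
    #include <windows.h>
    #include <stdio.h>

    int main(void) {
        LARGE_INTEGER freq, t0, t1;

        timeBeginPeriod(1);               /* raise timer resolution to ~1 ms */
        QueryPerformanceFrequency(&freq); /* counter ticks per second */

        QueryPerformanceCounter(&t0);
        Sleep(5);                         /* stand-in for one acquisition step */
        QueryPerformanceCounter(&t1);

        printf("elapsed: %.3f ms\n",
               (t1.QuadPart - t0.QuadPart) * 1000.0 / (double)freq.QuadPart);

        timeEndPeriod(1);                 /* restore the timer resolution */
        return 0;
    }

If I remember right, the wrinkle his FAQ deals with is that QueryPerformanceCounter could jump or drift on some hardware back then, so robust code sanity-checked it against the coarser timeGetTime().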

Our machines were bristling with serial ports, driving high speed filter wheels, lasers of every color we could get our hands on, special shutters coated with teflon, fast expensive cameras, and more! Their work is very much in my old field, I was in bioenergetics, specifically mitochondria and their dysfunction.

Thanks for the link down memory lane!


Thank you for your comment, it was a joy to read and get a peek.


As usual, xkcd is relevant: https://xkcd.com/664/


I think the idea is that the SNR of the .edu TLD is likely higher than that of any other TLD. Probably true, for now.


Too bad .edu seems to be restricted to US-based schools. There's plenty of great stuff hosted by universities in other countries but none of them are on the .edu TLD.


There are other national domain schemes with educational subdomains. 'ac.uk' and 'edu.au' being two of which I'm aware.

A listing of top global educational institutions would only have a few thousand entries.

It's not without some irony that I note that the early, much-beloved Internet was largely edu domains, along with a handful of tech firms, government agencies, and military entities.


There is theory, and there is practice.

If you do this kind of work, sometimes you will meet people who are completely unreasonable, and there is no satisfying them.

Try working at a Toys R Us at Christmas and getting shouted down by a parent because you ran out of the hot toy that season. For a concrete example, for me it was Tickle Me Elmo. It is not your fault, or the business's fault, but you can rest assured you will be the lightning rod for it.

As for serving, you can bet the people causing the most trouble are the least likely to tip, and at best they will tip nominally. There is some divide I don't fully understand between certain diners and servers, where these diners consider their servers not to be their peers.

The customer is not always right; the entitlement of customers has gone off the charts in the past ten years. The expectations of online shopping applied to real life are extreme: "I just want to have a good experience" style reasoning, when sometimes things just don't go your way, and that is life. Deprioritizing a table is a survival strategy, to keep the plates moving.

How we solve this, I don't know, but I would say top-down thinking assumes that customers are 100% rational all the time, and I can assure you from the trenches that they are not.


Torpedoes are not bullets; I don't think this is a good analogy.


> Torpedoes are not bullets; I don't think this is a good analogy.

The point of the analogy is that the enemy platoon / nuclear submarine force would need to be located, and to behave in a way that allowed a single shooter to wipe them all out almost simultaneously. This is particularly unlikely for submarines, especially nuclear-armed ones.


> especially nuclear-armed ones.

I took "nuclear sub" to mean "nuclear powered" (as opposed to e.g. diesel-electric, which are still fairly common), which I think represents a somewhat larger set of submarines, outside the US, than just the set of all non-US "boomer" (nuclear-armed) subs. That is, attack subs without nuke-tipped missiles can still be nuclear subs because they have a nuke plant onboard for power.


The submarines in question, Seawolf class SSNs, have nuclear reactors but not nuclear weapons.


Agree - it's not actually clear which type is meant. If boomers, then this is indeed a very small number - e.g. only four boats for the UK, one of the few non-US countries (four?) that operate boomers.


> Torpedoes are not bullets; I don't think this is a good analogy.

They aren't, but I'm sure there are operational complications to using them that make the analogy work (e.g. countermeasures).


I believe my point was that a modern torpedo hitting a ship has a very high kill probability (mobility kill or otherwise), given that it is guided and explosive.

I would say a missile is more similar to a torpedo. We can look to the Falklands for how dangerous missiles are against ships. This isn't your grandfather's WWII torpedo.

There is only one other case of an anti-ship missile I can think of offhand: in Desert Storm, a Silkworm was fired at the fleet and shot down - by a Sea Dart from HMS Gloucester, I think.

I am surprised at the downvotes, but I guess also not. Either way, the information is out there. I agree with the article that military software is generally awful, but I'm guessing it is not easy to innovate there.


The trading company is separate from the foundation; the purpose is to split the educational org from the industrial org. For those that don't know, Raspberry Pis are bulk-ordered for industrial use, and this move is intended to scale that part of the business. The RP2040 is an example of spinning off Raspberry Pi tech for use in other boards.

If you look back, you can see that Upton and friends have been good stewards of the (separate) Raspberry Pi Foundation.

The HN community is so used to companies being two-faced and kicking the ladder away when they "make it", but I don't think that is happening here. Even if it is, it's fair to say we've gotten a lot from the Raspberry Pi.

