Surely you're publishing, Mr. Feynman (praveshkoirala.com)
209 points by pkoird on Dec 1, 2022 | 104 comments



Quoting the last two paragraphs of the blog post (cuz I think that's where the gist of this post lies):

> Research should never be driven by success, because the pursuit of it either disheartens, or beguiles, even the greatest of men. An ideal Research should truly ever be ignited by curiosity. It should be fueled by the desire to know. And it should be done for the pursuit of truth and truth alone, no matter what it is. Feynman investigated the curious wobbles only for the pure joy of satiation. And the rest, as they say, is history:

> "I went on to work out equations of wobbles. Then I thought about how electron orbits start to move in relativity. Then there’s the Dirac Equation in electrodynamics. And then quantum electrodynamics. And before I knew it (it was a very short time) I was “playing”–working, really –with the same old problem that I loved so much, that I had stopped working on when I went to Los Alamos: my thesis-type problems; all those old-fashioned, wonderful things. It was effortless. It was easy to play with these things. It was like uncorking a bottle: Everything flowed out effortlessly. I almost tried to resist it! There was no importance to what I was doing, but ultimately there was. The diagrams and the whole business that I got the Nobel Prize for came from that piddling around with the wobbling plate."


The implication of the quote is that anything "ignited by curiosity" is worthwhile. This is all well and good, as noble ambitions go, but it does not answer the key questions for academia: (1) Which papers to accept for publication? (2) Which projects to provide with funding? The context, of course, is that there is only limited room for papers in any publication venue (journal and/or conference) and limited funding for science.


> Which projects to provide with funding?

There's a somewhat old adage (?) that says: "Don't fund projects. Fund people."

The gist of it is that by funding projects, you're funding a somewhat myopic view in that anything discovered adjacent to the project is now out of scope of the funding. By funding people with interesting ideas, you get a much more creative and free-flowing research program.


I've had a pretty successful research career, but it's very rare that a project of mine has actually delivered what I said it would in the funding proposal. Something more exciting has always come along while we were working on the original topic, and I've always gone with it. We've ended up delivering significantly better research than the original project would have.

And in 30 years, no project officer from a funding agency has ever complained that we'd delivered really good results but they weren't the results we'd promised. They're generally happy something good came out of the funding, because frankly, most funded projects don't produce much of value. Very occasionally (mostly on EU projects) we've had to write up the results in a project report with the original section titles that no longer match the new content, to tick some boxes and keep the funder's bean counters happy.


The amount of shifty money practices I've seen so far is astonishing, all in the pursuit of more science per proposal. Ideally you use some of the funding from the previous project to do some exploratory research somewhere very different and then write up a proposal around it; and since you've already done some of the work, there'll be money left over for more research and instruments and things, and maybe for fun ideas that students have... creative accounting all around. I'd be appalled if it weren't in the name of science.


I think many people do that - it's very hard to write up a compelling proposal without having spent some time working on the idea to see if it's viable, but there's often no funded way to do preliminary work on an idea. If you don't do something like what you describe, it's nearly impossible to branch into new areas, and that makes it nearly impossible to do good research. This is another problem with funding project proposals.

But what I was describing was slightly different - once the proposal is funded, many PIs feel compelled to deliver what they said they would, whereas in the two years since you wrote the proposal, the world has moved on, you've learned things, both from your early work on the project and from others, and more promising avenues are now possible. If you feel compelled by the funder to continue in the original direction, you'll very rarely deliver good research.


You shouldn't be appalled: grants that force you to work on a precise area with no flexibility are what's appalling. Research just doesn't work like that! The current grant proposal process is the creation of administrators and politicians, not scientists.

Many of the most critical research results came out of some random hallway conversation, as a response to some new result that didn't exist when the research plan was formulated, or because someone had a brain fart. If at each point someone had put up their hand and said "sorry, scientist, that's not on your current grant roadmap" we'd probably still be dying of preventable diseases and lighting our homes with gas fixtures.


Preferring people over projects creates a situation where you are carried by your status rather than your work. If someone had a successful project, it doesn't necessarily mean their next project is going to work. A total newcomer could potentially solve a very important problem (but get their funding denied because they are still not the right "people" to invest in). This also makes it harder for new ideas to propagate.

Funding should be a mix of track record with project consideration, not either exclusively.


No, we said fund _people_, not status. Someone's status is their job title etc., which should be irrelevant in funding decisions.

Funding projects is a terrible idea that you should keep at a minimum. If you fund projects you will end up with the BS we have now - marketing in the form of grant proposals.

I think a really helpful analogy for how science should be funded is to think of how angels and VCs fund early stage startups - I mean, how they really fund them. Which is: is this a promising area to start up within? Is this a great team? And that is pretty much it.

The downside of this approach I think is that personal biases can creep in and create a lack of diversity. You can counter that with pots of funding that directly target diversity.

Fund people, your assessment of their ability, their team, and their area of interest.


The question then is, how do you select the people? In VC, like you said, you'd look for a great team. What does that mean? It means mostly one thing: track record. This is what I meant by status. It's not your title. In academia, very similarly to tech, some people are rising stars, and they can get funding for anything they like. I'm just saying, consider the project and not just the person.


I don't think that's accurate. One of the things that make incubation and choosing which startups to fund so difficult is precisely the lack of track record. According to Graham himself, when everyone involved with a startup is new to the scene they're invading, you instead end up relying on such nebulous concepts as "imagination", "naughtiness"[1], or "earnestness"[2].

[1] http://www.paulgraham.com/founders.html

[2] http://www.paulgraham.com/earnest.html


Einstein completely transformed physics with some stand-out papers - most written as a patent clerk - and then spent the rest of his career producing almost nothing of interest. He commented on other work at conferences and in letters, but there were no more huge breakthroughs.

In modern academia that would be a terrible record.

But giving him tenure was a smart bet, because he might have produced more.

It's the Sabine Hossenfelder problem. Should physics fund incremental research which is a fairly safe bet in employment terms? Or should the money go on talented and creative researchers who may waste most of it - but one or two may produce something transformative?

Academia is heavily slanted towards the former approach. Because academia is now a business and has the same bureaucratic and corporate values as other bureaucratic corporations.

This is terrible for original research, because smart people need to be given a free hand to follow their intuitions and interests.

Being able to afford to explore, play, and potentially fail would transform physics.

Because in reality academia is producing a lot of failure anyway, without the upsides of transformative new insights.


> almost nothing of interest

> terrible record

> might have produced more

You do not seem to have much knowledge of his publication record.

Here is a list of 272 journal articles by Einstein and coauthors, the majority of which would certainly be considered peer reviewed by modern standards.

https://en.wikipedia.org/wiki/List_of_scientific_publication...

From just these papers, among the things you put into the bucket of "not huge breakthroughs" were (at small length scales) Bose-Einstein statistics and Bose-Einstein condensates (notably his 1924 and 1925 papers), his theory of atomic emission and absorption (the "stimulated emission" in laSEr), and his paper with Podolsky & Rosen, and (at large length scales) General Relativity, Einstein gravitational lensing, and the Friedmann-Einstein and Einstein-de Sitter expanding-universe cosmological models.


Actually, Einstein's 1935 paper along with Boris Podolsky and Nathan Rosen basically highlighted the existence of entanglement in Quantum Mechanics. Of course Einstein was trying to show that QM was wrong via the EPR paradox but although reality has since proven him wrong, it was still a major paper written decades after his more famous work.


I don't disagree with your main point, but I've actually seen Einstein cited as someone who had a long scientific peak. He first transformed physics during the 1905 miracle year and then again in 1915 after completing the theory of general relativity. From what I understand, special relativity was built upon the work of Lorentz and Poincaré, and likely would have been discovered only a little later without Einstein, but general relativity was a staggeringly original achievement. The University of Zurich's bet paid off more than well when they granted Einstein tenure in 1909. His best work was still in the future.


I agree with you. Einstein published general relativity when he was 36, around 1915-1916, a decade after his miracle year in which he revolutionized or kickstarted several domains in relativity, Brownian motion, and quantum mechanics. Around the same time as publishing general relativity, he also basically kickstarted the principle behind lasers.

I mean, general relativity is one of the most tested and verified physical theories, and it was practically all done by Einstein himself. It's the hallmark of what a theory should be. That alone is worth everything, and yet he was also the genesis of several other fields, both before and after general relativity.

Einstein is not a great example for this discussion. Even his "failures" are useful contributions. Many here are missing the point: it is failures that we should not be afraid of funding, within reason of course. I.e., it's okay if a promising idea "fails". It is input to the broader research community and its questions.


Maybe the trick is to do the Y Combinator of research?

In the startup world it is widely believed that a bit of hardship is needed to show dedication.

1. Make the application process much simpler

2. Support with smaller amounts, individual subsistence salary

3. Provide the right placement in the right environment.


What you're describing there is called grad school


Einstein is really a great example for when your past success does not translate to future endeavors and I even considered mentioning him in another comment.

I know a few institutions where exploring and playing is standard procedure; however, there aren't many. And even this has a limit, because science is expensive. So you need a way to know which silly ideas to pursue and how to best utilize your resources and budget.


Nah. The big 4 were written in 1905. It was like in the 1920s or something that he worked on general relativity and wormholes.

Everything else I’m not going to read because it’s just bleeding heart nonsense.


The economy is always finite, sadly. Sometimes it may be just bad administration (inefficiency, low funds, etc.), but then you have to deal with politics, and that's another whole show...


There's a subtlety there. It isn't supposed to be about who you are but about what you're about.

Ceasing to care about whether projects work or not is practically the point. Research isn't about only tackling things we know will work. That's effectively engineering, not research.

A lot of the funding today is already about status. There just needs to be less funding given out to people based on status and funders' pet projects, and more funding based upon the people and proposals. Solid people and proposals should not be turned down for lack of status or for not fitting into some box of "this will work".

A lot of topics are lumped into “that will never work”, but it’s often the case that they were never tried or even explored.


Agreed.


Just before I read your comment I came up with the idea that "there should be a select few people, chosen on their academic merits (whatever that means/however well that will go), who have carte blanche."

Then I read your comment. Needless to say, I wholeheartedly agree.


It's a fair point that there must be some criteria for an individual or lab to compete for funding. It might be worth creating a stronger distinction between truly novel papers and general scientific papers, one that goes beyond an informal understanding of which journals are prestigious. For example, going through arXiv each week can be really tedious, with many papers being redundant or written in a way that makes them hard to understand.


>Which papers to accept for publication?

That may have been a problem in Feynman's day, but today academic papers take up negligible storage. The only limiting factor is the number of peer reviewers; I have no idea if there is a shortage of those or not.


Acceptance is less about storage space than about gatekeeping for "quality". Which in turn is about which papers are going to get read, because nobody has time to read all of the papers. People rely on high-reputation journals to narrow down the time they spend reading.

The number of peer reviewers may be one limitation, but I think the main one is the number of readers. The hard limit is how much time readers spend reading. Getting access to that is hard.

Anybody can publish. You don't even need a journal; just put it up on your blog. But as with all blogs, the hard part is getting anybody to care that you've done it.


Administration has no clue which projects are good and which "scientists" won't spend the money on a sea of booze.

Two mechanisms should be used:

1. Administration decides which projects had real world effect, years or even decades after the work is done.

2. A directed social graph is built from citations and from working together on projects.

Money should be allocated to the network. Administration decides who did good work in the past; funding is allocated to everyone endorsed by those people; scientists vote with their grant money for projects that they believe in.
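
Purely as a toy illustration (nothing from the comment above, and every name, weight, and amount below is hypothetical), here is roughly how that allocation mechanism could be sketched in a few lines of Python: administrators seed money to people judged on their past work, and each funded person then passes a share of their pot along their endorsement edges in the citation/collaboration graph.

  from collections import defaultdict

  # Directed endorsement graph built from citations / past collaborations (hypothetical data).
  endorses = {
      "alice": ["bob", "carol"],
      "bob": ["carol"],
      "carol": [],
  }

  # Step 1: administration seeds funding based on judged past impact (hypothetical amounts).
  seed = {"alice": 100_000.0, "bob": 50_000.0}

  # Step 2: each round, every person keeps (1 - pass_through) of their incoming money
  # and splits the rest equally among the people they endorse; people with no
  # outgoing endorsements keep everything they receive.
  def allocate(endorses, seed, pass_through=0.5, rounds=3):
      funds = defaultdict(float)
      inflow = dict(seed)
      for _ in range(rounds):
          next_inflow = defaultdict(float)
          for person, amount in inflow.items():
              targets = endorses.get(person, [])
              kept = amount if not targets else amount * (1 - pass_through)
              funds[person] += kept
              for target in targets:
                  next_inflow[target] += (amount - kept) / len(targets)
          inflow = next_inflow
      return dict(funds)

  print(allocate(endorses, seed))
  # e.g. {'alice': 50000.0, 'bob': 37500.0, 'carol': 62500.0}

The mechanics are simple; the hard part the thread is arguing about is who gets into the seed set, and that decision still sits with the administrators.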


I've never known an administrator who had even the faintest clue what science was all about, let alone which previous work was 'good' or which projects would have 'real world effect'. If they had insights on such things, they would likely be working in science, not administration.

Indeed the mess we are in is because of administrators who think organizing academics is like organizing work on an assembly line. They find something quantifiable and insist on more of that. Papers are quantifiable, so they go for that. To try to get at quality, they count citations. Always it's counting, and I'm not sure if that's by analogy to the assembly line or because the administrators are challenged to do much more than count.


I think this would lead to a hellpit of sycophantic behavior. And, genius or not, I would not trust any group of people to be immune to the influence of the sycophants.


Would that be worse than the current disaster though?


Not necessarily. Still, a drastic change asks for strong evidence of improvement.


Honestly your proposal isn't too different from what I've seen going on in many university administrations. My initial reaction is that it is the current disaster.


I'll just say that whatever it is, it has to be corruption-proof. And in the current environment, I regularly see gaming the sorts of things you're suggesting.


Maybe overly limited funding for science is the problem? On the whole it all seems to me as if capitalism and the general problem of the (projected by too many?) "Leistungsgesellschaft" (achievement society) might have invaded science to a much larger extent than it used to?

It’s just a wild guess looking at it from the outside, I don’t have any experience with proper science at all.

To me this stuff just always reads as “ah great so that’s how we killed the potential for that (classic/next-gen) Star Trek future”.


Capitalism offers more money to scientists, but you're right, we need more money for science.


One of the great promises of the Basic Income idea is that within many domains, it eliminates project funding as a constraint,

with the anticipatable outcome that a great many people "waste" their time on uninteresting, incorrect, ludicrous ideas,

and a few people who might have been fundable today, do solid real work,

and sometimes, a few people who would NOT have been fundable (per OP) do work which Changes Everything, which might not have been done today otherwise.

Those of us reading this, we live in a surplus economy.

We could fund everyone.

Indeed, we should, because it would over a reasonable time frame, pretty likely Change Everything.

All it would cost is fewer 100m long yachts for a few thousand sociopaths.


How do we capture external value from letting people stay in an environment where they can stay curious and work on whatever they want at the time?

Does it always escape?

Could everyone work like this? (Surely not)

How can people that want to work like this do so while others sort of harvest the value that may not be harvested by those who are consistently curious and researching?


I'm not sure I understand what you think the difficulty in capturing the value of research would be.


If someone moves from one thing to the next without publishing and/or producing something, then the value is not captured.


If they don't produce anything, there doesn't seem to be any value to capture.


Research and novel thoughts can have value to broader society.

What I'm getting at is how we can keep the curious curious while also capturing value from them. Let some people stay in this state (I am mostly one of them) and have others follow them around implementing their ideas. Some of us have too many ideas and good thoughts.


This was a memorable passage from the post for me:

"This is because all efforts trivially lead to some results. It’s an entirely different matter whether it is an expected one or not. But such is the world we find ourselves in that anything short of a ground-breaking discovery is treated as an unacceptable embarrassment. And thus, every marginal increment is touted as ground-breaking, and every deviation is repackaged to look like an intended result (which is again touted as ground-breaking)."

It's even worse than this in my opinion, as the reasons ground-breaking discoveries occur are generally not why the field thinks (or asserts?) they occur. So it's not just the results that are misrepresented, it's the whole chain of things leading to them, and what comes afterward: how they came about, who was involved in what capacity, who considers them ground-breaking or not and why, and so forth and so on.


The second (quoted) paragraph is from the book which is definitely worth reading.


We complained about greedy publishers making millions, and touted open access as the solution. I used to get a couple of review requests a month, with maybe a six-week deadline. Now I get 10 review requests a month with a two-week (max) deadline from open access journals, which are making even more millions than the greedy publishers by publishing basically any crap for a ridiculously high fee. So now I decline 90% of these requests because it's just absolutely meaningless, and the requests go to other, less critical people (who write quicker reviews), which means the papers are even easier to publish. In the end, the whole thing is just a joke. If I wanted to really keep up with my research area I'd have to read 10 papers a day.

The right model should be like the Oscars. Write your paper, put it up on arXiv for free, and let people read, review, and react to it. At the end of a certain period (months or a year depending on whether it's a journal or a conference), the committee chooses a bunch of them and publishes them officially, somewhere. And leave all the rest of the crap out.


That's my general sense of how things should be, although increasingly I see issues with it around basic "visibility" dynamics, the search engine problem, and so forth.

For example, Google seems to be downweighting search results older than a certain age, and I've noticed that if you go that route, it can be difficult to find unpublished/preprint-type papers past a certain age unless they appear in the press. I think this might exacerbate fad dynamics, because then the papers that people know and remember are those that get a lot of buzz. This happens now of course too, but I think it could be magnified in a different system.

You could introduce new search engines and so forth, but that adds a layer of complexity to it, and I think journals always have a way of drawing attention to certain new papers and maintaining their real-world availability.

I'm not sure I have a solution to this, but I've sort of become convinced that the approach you're describing, even though it's something I'm sympathetic to, has its own set of problems that aren't necessarily better. I also wonder if journals will just end up being reinvented, because there seems to be a logical train of thought that goes "well, pay attention to those you trust to find papers" -> article curators -> fickleness about what's curated -> curation by committee/group -> reinvention of journals.

Things are completely broken imho (in academics in general, not just journals) but I'm not sure what to do about a lot of it. Maybe the problems in academics have always been social weaknesses that get brushed under the rug, but now with so many people involved there's too much to brush.


Over the last 50 years the value/norm system of mediocre minds has won. I think it's as easy as that, and you see it everywhere.

Sadly I think it has to do with external threat: When there is one (like during the Manhattan project) the mediocre can accept a standard of excellence, because deep down they know what the alternative is. But when there isn’t one the cultural focus shifts towards mediocrity, and “politics” becomes the dominant game. And the less socially focused curious minds are easily outmaneuvered.


The financial and attention industries (and crypto, let's call that fin too) then gobble up a lot of the geniuses (and plenty of smart but not genius people).


I would say that we failed to establish an (educational) system that would produce a higher average of so-called "mediocre" minds. When you have an educational system that is, at its core, unchanged for more than 100 years, and where the only goal is to enable everyone to pass and not to learn, then it is hard to see any other outcome.


> When you have an educational system that is, at its core, unchanged for more than 100 years, and where the only goal is to enable everyone to pass and not to learn, then it is hard to see any other outcome.

I think this idea that the most important thing of all is that everyone passes is a good example of the mediocre mind’s value system. It’s utterly pointless. There’s plenty of work you can do without understanding calculus. But it does limit excellence, and (sadly) I think that’s the whole point.


It has changed dramatically: we went from teachers doing their own thing to having strict top-down curricula taught to the test, plus a dramatic drop in expectations.


Speaking of publication pressure, motivation etc.:

I spent my post-doc trying to tackle a rather fundamental challenge (in the architecture of analytical DBMSes). The results of my efforts took a long time to materialize, and their form was not a short article of the kind you put in conferences, but a long tract. I believe it is very useful for my (now former) field; but I could simply not get it published. Not because I was rejected for lack of novelty or anything - it was simply not in the "least publishable unit" form, and not easily (or at all) reducible to that form. The large chunk of preliminaries required motivation in use, and the motivation required the preliminaries, etc. But in my field, people don't publish new research as monographs. Bottom line: I put the monograph on arXiv, very few people noticed it, and life goes on without it having left a mark.

Plus, by the time this was done, my post-doc had ended, and my weird former career and lack of "publishable paper generation skills" meant I didn't have much chance at another position.

So, in a sense, my personal experience is a bit of the flip-side to that of Feynman, described in the post. Still, the sentiment definitely resonated with me.


The skill here is to find some ways to chunk the final output into smaller sizes. That usually means compromises and sometimes publishing some stuff that you want to reference later in lower level (or only tangentially related) conferences just to get it citable for later work. It's usually not fun and sometimes a huge compromise so I fully understand (and applaud) that you didn't go that route.

If you're still interested in picking up an academic career: I noticed you spent some time in Europe (if the link under this post is your publication). I'm not sure about the Dutch system, but in Germany there are universities of applied sciences. The requirement for tenure as a professor is usually PhD + 3 years of work experience in the field outside academia. It's very teaching-focused and you have to slowly ease into research again, because you'll start doing it as a one-person show, but it's a decent option for an academically inclined person with a PhD to "come back" into the system from odd/unfitting career paths. Not sure if this construct exists in other countries, but my guess would be the answer is yes. These days there are also plenty of positions that are international programs, so language might not be an issue.


> it was simply not in the "least publishable unit" form, and not easily (or at all) reducible to that form

I know a couple of people from grad school who worked on massive infrastructure projects ranging from designing the ISA to the Verilog implementation to compilers, simulators, and everything in between. I think they faced similar challenges in that it's a lot of important work but not easily publishable. It's unfortunate that we don't have a good way of recognizing this type of work.


A computational model for analytic column stores

https://arxiv.org/abs/1904.12217


Try treating the conference paper as a 15-page-long abstract. You can still try to publish now; you don't need to be a postdoc for that!


A (tenured) professor told me he presents these ideas as a 2x2 for new untenured faculty: one axis is whether you work on what you think will be fun, or on stuff you're not interested in but that you think will get you tenure. On the other axis, you get tenure or you don't. Evaluate the likelihood of happiness in each box.

He thought it broke down roughly like this:

fun + tenure = Awesome!

fun + no tenure = Not bad, you still got paid for six years to work on fun stuff, which is probably more than you'll get anywhere else.

no fun + tenure = Okay, but was six years of no fun really worth it?

no fun + no tenure = What am I even doing?

He thought it was clear that working on fun, interesting stuff before tenure maximized happiness. Your mileage may vary.


While I love 2x2 matrices, there are two incredibly important things missing. The first is the career of the PhD students you (as the faculty) are mentoring and supervising. Sure, you might do fun things, but if it ends up not helping their career, then you as the faculty have really failed them.

This issue of students is something often missing in discussions of the modern research enterprise. Speaking as a computer science faculty, if a PhD student wants to compete for an academic job, they need about 1 publication per year. Sure, back in the 1970s and 1980s you could get a CS faculty job with just 1 good publication, but that's not the case anymore. Of course, everyone is doing this, which then leads to more publications overall.

The second is the overall funding of science. There's an unstated compact that the public is funding science to help solve important problems and lead to breakthroughs that can advance the economy. This is the framework that has been in place in the USA since WWII thanks to Vannevar Bush, and is why NSF, NIH, and other funding agencies exist. Sure, as researcher, I strongly agree that we should be doing things we view as fun. But those fun things should have some angle as to why others should care. Another way of framing this is Pasteur's Quadrant, which is good science and good applications.

The more general advice I give to all students (undergrad, MS, and PhD) when looking for jobs: do something that is (1) fun (because there are many job opportunities in our field and life is short), (2) something that will keep pushing your boundaries (because you're smart and will be bored otherwise), and (3) good for humanity.


There's a vicious cycle here: one must work on "important" problems in order to get funding, which mostly goes to pay too many grad students, whose work allows new grant applications to happen, which then requires more grad students. As a result, departments deliberately take on many more grad students than they will reasonably be able to place in tenure-track jobs. Thus every incoming class of PhD students contains a large fraction (more than half) whose careers will never be served by holding their noses to an unpleasant grindstone. Much in the way humanities departments exploit PhD students for cheap teaching labor, science departments exploit PhD students to keep the stream of indirect funding coming in. A big grant that requires a lot of assistants provides a lot of indirect funding for hiring and paying administrators.

(Also, I would argue that advancing the economy is not actually part of the compact for the NSF, which is really a jobs program for scientists based on the experience during WWII that having scientists around in a pinch was really useful.)

Interestingly, not every field is on this economic treadmill. Mathematics (pure, not applied) seems to declare problems important based on whether they're unsolved and subjectively interesting. And economics, unsurprisingly, doesn't consider it rational to run large PhD programs that flood the market with competitors and reduce the market value of an economics PhD.


As someone who has had tenure, this is exactly correct. The catch is that that process doesn't end with tenure.


> But such is the world we find ourselves in that anything short of a ground-breaking discovery is treated as an unacceptable embarrassment. And thus, every marginal increment is touted as ground-breaking, and every deviation is repackaged to look like an intended result (which is again touted as ground-breaking)

This is a very nice quote.


Richard Feynman was a math genius; this is why he could derive groundbreaking results "effortlessly". There's this narrative around Feynman that he derived new physics results just for fun and out of intuition, which I believe is pretty wrong. If you read Feynman's work you'll see it's full of heavy maths and it's not intuitive at all (at least if you're not a genius like him). I don't think that the way he worked can be applied to everyone, because when you're not a genius and you do things for the fun of it, it's very probable you'll end up wasting your time. This is why you usually have advisors who help you to find important problems that need to be solved.


> X was a Y genius, this is why they could Z.

This is a non-explanation. Chalking something up to genius is just a specific kind of chalking it up to magic. "Here's this person, how does their mind work, what goes on inside it? Magic! They're a genius!"

It may be so that the way Feynman worked followed from the way he thought. If so, that doesn't mean we just throw up our hands and recommend it to people who fit our genius-pattern. Instead, we need to explain genius, which involves treating it like any other object of scientific study. Once we do that, then we can figure out how to cultivate it. That's what this article should inspire.


I don't know if it's intrinsic or because he started learning/playing with math from an early age, but in any case there's no doubt his mind worked "differently".

There are people I know who can quickly understand complex ideas that would take regular people several hours/days to grok. Some humans just have different kinds of brains; either they're born with it, or through their love/training their brain becomes great at understanding knowledge. I don't think that's very controversial?


The entire point of Feynman's worldview is that he is not "great", he is just an individual, and plays to his individual interests. Those interests become strengths because of the natural interest.

It is not about being the best. It is about contributing something that you actually care about.


> when you're not a genius and you do things for the fun of it, it's very probable you'll end up wasting your time. This is why you usually have advisors who help you to find important problems that need to be solved.

Also add the case where there are no great examples to learn from. Most radical ideas require mastery of the topic. Incremental work can be done by copying and simply adding one step on top. Advisors have gone through all of this experience. Most people underestimate the power of good mentors.


Feynman didn't think he was a genius; he said that he was just an ordinary person who worked very hard. The story in the article actually points this out: he was just curious, which led to very beneficial working methods. The idea of geniuses is more of an excuse than an explanation.


Ordinary people tend to not do very well with the Stokes formula. Much less with infinite sums (which is essentially what Feynman diagrams are for).

Lots of geniuses are unaware of the difference between their intelligence and other people’s.


You can't just start calling people who have a few years of math and physics training geniuses... Infinite sums are a very simple concept once you have some math background, and Stokes' theorem isn't that complicated either (once you have the right training).


I've always been a big fan of the "Feynman method of Problem solving":

Write down the problem. Think very hard. Write down the solution.


Me too. But I guess it does border on that famous meme:

1. Draw some circles. 2. Draw the rest of the f*ing owl.


Personal anecdotes from a family member of mine who is just breaking into genetic research at an Ivy:

- Power-tripping principal investigators

- Cutthroat competition leading to actual sabotage (a researcher deliberately social-engineered their way into a former lab and unplugged a sample freezer)

- Gossip that makes Mean Girls look tame

- Abysmal, poverty-tier wages for people who have PhDs


That's the tip of the iceberg really. Pretending the iceberg doesn't exist and doesn't hit ships is also part of the problem.


But you still should trust the science


I agree that unaligned incentives have led to the replication crisis, but I fail to see how research driven by "curiosity" and not by "success" will align those incentives. The author admits that poor incentives have determined the course of academia/research, but I think rather than doing away with incentives altogether and just letting people be "creative", we should find better ways to align incentives, and encourage a system of reliable peer review to restore trust in research.


Talented people have intrinsic motivation and don't need artificial incentive systems.

Extrinsic rewards or punishments are at best poor substitutes and at worst destroy intrinsic motivation.


I think I agree with this. The "great minds" will always (or let's say often enough) be driven by intrinsic motivations. So the goal should probably be to give them "good enough" environments so that they can concentrate on what they deem interesting instead of struggling to "survive".

It's of course a bit tricky because sometimes it would be nice to have an accelerator or Google compute power at the ready and sometimes you can do excellent work as long as you have a roof over your head and are well fed and have a piece of paper and a pen :D

It's a complex problem, especially with competition from other academic institutions and industry. I feel like the current system is ok-ish but many things are horribly wrong. I still feel like, by and large, good academic minds will end up in positions where they have a good environment to do what they want, but the cost of this is that there's a lot of cruft, and mediocre stuff will end up in essentially the same or better environments.


Also, as a mathematician I have little trouble if I get little money (as long as I can live decently). I cannot imagine at all what it costs to fund a biology department, or the LHC.


They know too much free money destroys the talent.


>it is not uncommon for many to inflate marginal results or engage in dubious practices, leading to a noise that greatly burdens the reviewers of these journals/conferences ... It is no surprise then, that many low-quality (or downright incorrect) works escape the sieve.

Eh, I blame the reviewers for this in large part. Every reviewer craves novelty like a drug addict. They'll reject papers because "it's not novel, just a bunch of engineering/implementation", or reject them because a mere 1.5x improvement does not meet the bar (Is it even science if it's not an order of magnitude?). Then people will cry about dubious papers. The academic community deserves this.


In my experience you find a broad spectrum. Keep in mind that today's reviewers are tomorrow's authors. Some people in academia are incredibly arrogant, view themselves as the next Feynman, and dismiss marginal advances or small results: anything short of perfection, with lots of work behind it, isn't worth publishing. But you also have the honest ones who can recognize value in small advances.

But in general you are right: we deserve this because the amount of arrogance is too high, and more often than not you have to deal with someone who firmly believes they deserve the Nobel Prize.


Would someone qualified mind explaining, in layperson's terms, how the dynamics of wobbling plates led to QED, keeping in mind I barely passed partial diff-EQs all those years ago? A cursory search online only leads to commentary on Feynman's memoir.


I'm not sure there's any strong or direct connection in the academic sense. My recollection from reading the rest of the Feynman story that was quoted is that the wobbling plates problem lifted him out of a self-perceived slump or depression. So what's important is the impact that working on a fun and unimportant problem had on his state of mind and emotions, more than any particular insight or inspiration that problem had on QED stuff.


It's not that the dynamics of plates gave him insights about QED. Working on a simple problem helped him get back into the working rhythm of his PhD years.


I don't have a full explanation of the connection that would be satisfying. However, I think it mainly comes down to the symmetry group of spinors and their 720-degree symmetry. It is called the plate trick for a reason, after all: https://en.m.wikipedia.org/wiki/Plate_trick
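
For anyone who wants the concrete version of that 720-degree statement (standard spin-1/2 quantum mechanics, nothing specific to the linked article), the rotation operator about the z-axis is generated by the Pauli matrix sigma_z:

  R_z(\theta) = e^{-i\theta\sigma_z/2} = \cos(\theta/2)\,I - i\sin(\theta/2)\,\sigma_z
  R_z(2\pi) = -I    (a 360-degree turn multiplies the spinor by -1)
  R_z(4\pi) = +I    (only a 720-degree turn acts as the identity)

The plate (or belt) trick is the mechanical shadow of that sign flip: SO(3) is doubly covered by SU(2), so a path of rotations only closes up after two full turns.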


This quote says to me imposter syndrome can tap on your shoulder even if you’re as good as Feynman, so no need for anyone else to take it personally.

And then I thought to myself, “You know, what they think of you is so fantastic, it’s impossible to live up to it. You have no responsibility to live up to it!”… I realized that it was also true of all the other places, including my own university. I am what I am, and if they expected me to be good and they’re offering me some money for it, it’s their hard luck.


Feynman's imposter syndrome was actually justified; he couldn't really work at the same level of impact as most of the other Nobel laureates. His Nobel prize work was largely an extension of Dirac's previous work on path integrals, and the bulk of the work that showed it to be equivalent to the traditional operator expansion method developed by Schwinger was done by Dyson, which was really the crucial piece. In fact, Feynman couldn't even solve the path integral for the hydrogen atom; that was done by Kleinert. In one of his books, he recounted an anecdote about Schwinger who, having made a mistake in a Bessel function, spent a few minutes deriving the solution from scratch to refresh his memory.

It's basically for this reason that he "retired" from theoretical physics relatively shortly after his Nobel prize, he just couldn't keep up with the rest of the research being done.


Could you please give a reference for this anecdote? I could not find any in a quick Google search. Thank you.


It is one thing to pursue something out of curiosity, and another to spend hours upon hours every day for weeks on the same thing.

I think your curiosity might give way to other obligations at some point which is why you have to be somewhat goal oriented.

Worth noting that I have spent hours pursuing a hobby as a kid: programming. And I think that is probably true of many of the readers here.

I think the computer lends itself to this in a weird -- perhaps unique -- manner. The act of programming, or writing, is like a video game in that you get this reward just for the act. And it is easier to spend hours for days for weeks programming (or writing) than say mulling over other matters.

One trick for turning a task you cannot spend time on into a task you can spend time on is to turn it into an activity. This is why mathematicians have to write out their equations and diagrams when they're mulling plates.


There is some strange disconnect between citizen and worker - a citizen has tenure unless they violate fairly high bars, but a worker has no tenure - yet we expect better results? Maybe tenure is something we should consider in business?


Citizenship is not about tenure. Transgressions by citizens are punished by governments, but not by removing citizenship. The state punishes "its own", while outsiders may be removed or denied a path to joining, i.e. gaining citizenship. A sovereign state is not like a club.


Not sure how it connects to the post, but workers certainly have tenure in the context of unions. It is, however, arguable whether unions yield better results, at least for the customer.


Hah. As for the wobbling plate, it reminds me of the super secret revelation of what happens when you juggle nuts and wrenches in spaaace!

https://medium.com/illumination/why-a-tennis-racket-led-russ...


This is a lengthy diatribe born out of the frustration of having to constantly produce a piece of work, not out of will or pleasure but from pressure to show that one is doing some work. I am ignorant when it comes to politics in academia; however, isn't this the reality of academia? One has to show work irrespective of quality if one is to maintain one's position?


It’s not even the “politics” of academia. Publications are what bring in grant money and prestige. Grant money brings in overhead (a cut that the university keeps instead of the researcher) and prestige brings in donations. So it is the economics of it.


Most academics are struggling to get tenure. They're not Feynman, who had the good fortune of not facing the threat of perishing.


Speaking of Feynman, does anyone know that one poem he either wrote himself or stole and edited a bit?

All I remember is it was the same sentence, repeated, with a different word overemphasized each time to show how you can change the meaning of a sentence with a bit of… emphasis.


A lot of Feynman's stories come down to being dishonest for personal gain. His charm and charisma are great at hiding what would be pretty controversial stuff from a more stuffy/boring person.


can't help but shake my head as I read about Feynman turning down a fancy job to remain in academia, at humble Cornell (and later, Caltech I guess).


That's all very well if you're Feynman...


That's all very well if you're starting an academic job in the 1940s. Being Feynman would not have saved him in 2022.


Feynman downplays his own talent and appetite for work (though he did occasionally talk about the latter), and it's natural to discount anecdotes like this because of that reality. But I don't think that changes the lesson; it means don't take the wrong lesson.


And what are the mediocre coders to do, if only the best coders get hired!


glue existing libraries and frameworks together, then come home and have a good life



