I think the example from the history of astronomy the author provides can be turned against him. It was the accumulation of astronomical observations and the refinement of ever more exact measurements that provided the basis for the scientific community's transition from the geocentric to the heliocentric system, a transition that took between one and two hundred years, depending on when one lets it start (Copernicus or a predecessor) and end (Galileo, Kepler, or Newton).
If we look, for example, at the career of Kepler, we see that he had to accumulate expert knowledge of both systems, and that he arrived at his later so-called laws not by a sudden insight but through tireless work, trying again and again over several years to make sense of series of measurements that did not really fit either the Ptolemaic or the Copernican system.
It was not the epicycles that worried the late medieval astronomers, but the fact that, according to their geocentric theory, the earth was not really exactly at the center of the deferent. Correspondingly, however, the sun was not really at the center of the Copernican system either. Both systems lacked elegance, and both were more or less equally in accordance with the observational data. To settle this debate and make progress, the strategy of the most famous astronomer of his time, Tycho Brahe, was to collect better measurements. And it was these accumulated measurements that enabled his pupil Kepler to develop a solution. In a way, he was lucky: only the data for the planet Mars, which he considered first, was exact enough to match an ellipse. His data for Jupiter or Saturn would not have allowed him to come up with one.
Accumulated knowledge was not an obstacle, but the basis for Kepler's insights. That we might get a different impression is a result of the simplifications that happen after a scientific community reaches a conclusion. At that stage a theory becomes textbook knowledge: a student no longer needs to acquire the accumulated knowledge of the previous period, but only the successful doctrine. These doctrines appear, and are all too often presented, as ingenious insights from geniuses, instead of as culmination points of the collective work of generations of scientists and scholars.
He figured out that the orbit of Mars was oval, and tried to approximate it with an ellipse while continuing to look for its actual shape. It took him a while to realize 'maybe it really IS an ellipse'.
No, ideas are not getting harder to find. Ideas are getting harder to discuss.
There has never been a riper time, with such ample reason, to propose and discuss new and radical ideas. And there has never been a more hostile time for doing so.
In physics we’ve lost 50-70 years to string theory and related nonsense, and the people still working on effective quantum field theories have had to re-brand as working on “quantum information / quantum computing” to keep paying rent and keep their lab desks. There aren’t many serious QFT people who doubt Everettian epistemology, but it’s still nothing undergraduates hear about. Most serious non-string gravity work comes out of Perimeter and Marletto/Deutsch.
In economics and finance we still put our hand into the sink shredder of the Chicago-style strong-form EMH, as resoundingly and empirically disproved by Simons (may he rest). And we continue to embrace Friedman-style supply-side economic policy, without peer the nastiest wealth transfer ever in absolute terms, taking from those facing scarcity and giving to those with surplus at the barrel of a gun. It never trickles down.
In the fundamental epistemology of our day we are surrendering the reins of influence to people who market chat bots trained to lie convincingly at scale.
Knowledge, and the new ideas that in the limit approximate the derivative of knowledge, are under attack. They’re under attack by some rich guy’s choice for the job he did well and decided to keep in the family.
If the rent in a city is too high, you are not going to get the MOST interesting restaurants, bars, and clubs. You are going to get only the businesses that will DEFINITELY convince an investor to write a check to spend on such high rents, regardless of whether that is a good idea or not.
The PhD cohorts at R1 universities haven't really gotten any bigger than they were 50 years ago. The number of academic jobs hasn't really grown either. The only people having success in the system are the types of people who seem low-risk to the system.
So of course we should expect a decline in innovative ideas as time rolls on.
The only way to reverse this is to literally create more tenured jobs (or perhaps temporary tenure, e.g., ten years of guaranteed employment) and to increase the size of PhD student cohorts so that they are large enough for iconoclasts to fit in again.
You make a really good point. Defining risk in terms of financial ROI, trying to assess it algorithmically, and default-gating more and more endeavors behind such risk assessment is a surprisingly coherent model for a lot of widespread modern rot.
Thanks for being one of the few commenters questioning the microeconomic angle itself. In my view, the problem is that no rational economic agent will ever embrace the idea that research should be expensive. Well, a very few of us do: we realize that we are competing against Nature and the Universe (not other economic entities), so one of the sensible avenues left to us is to try to recuperate all the losses that have already gone into past research. (Maybe Kepler was trying desperately to salvage Hellenic heliocentrism? He also had a thing for Platonic solids, after all.)
Reinvent the way academic careers work. No need to even have the concept of tenure. No need to get a PhD degree to be recognized as someone researching something.
Anyone should be able to publish research without being on such restrictive terms. The article's main takeaway is that the institution of academia has perverse incentives that hold back innovation. I believe the way jobs and degrees are structured are part of that.
How do you decide which researchers to invest in? Ultimately you need some kind of institution to organize this, and I promise you it will end up very similar to a university.
Also, tenure is generally seen as a very important part of research. Without tenure your livelihood is dependent on your output and your relationships, which creates numerous perverse incentives. You can't research things which might lead nowhere, or you'll lose your job. You can't prove some other senior researcher wrong, or they might work to have you fired.
With tenure, all of this goes away: you are free to choose your own path in research, empowered by the knowledge that you have no superiors who can cost you your livelihood if you step on their toes, or their friends' toes. Of course, this is the ideal. In practice, many tenured professors actually want to become ever more wealthy and powerful and are not content with the guarantees of their tenure.
> I mean, anybody can do research and publish their papers in whatever journal.
Not really true. Publishing research in a reputable journal without an academic affiliation is many times more difficult. Sure you don't need a PhD to publish, but you pretty much need an academic affiliation.
Good or bad, researchers without an affiliation are often taken as crackpots.
Regarding the number of researchers, of all of my friends who got a PhD only a single one did it because he liked research. For most it was a stepping stone for a totally unrelated career in industry and for some just the easiest option due to lack of an alternative.
Given that, is it surprising that progress does not scale proportionally? Maybe, maybe not. For sure, my comment is not backed by anything but anecdotes.
> Regarding the number of researchers, of all of my friends who got a PhD only a single one did it because he liked research.
This is sad; luckily for me the percentage is higher among my friends. I suspect your experience is closer to the statistical mean (perhaps because there's a good chance your friends are a few decades younger than me and many of my friends).
Also: I bailed out because I didn't like research, or rather because I very much liked the practice of research but only ever worked on super obscure things, and I wanted to make a difference I could see. I also didn't want to deal with the culture of research funding and the related bureaucracy. Not that there's anything wrong with those who do like that stuff! It just seemed not worth getting a doctorate; it definitely would have made some things easier, but in retrospect I made the right call (for me!).
There is a lot of basic research done that does make a difference but where that difference is unknown to the original researcher -- perhaps separated too far in time. I have found value in decades-old PhD theses in the life sciences, mainly physiology.
> There is a lot of basic research done that does make a difference but where that difference is unknown to the original researcher -- perhaps separated too far in time. I have found value in decades-old PhD theses in the life sciences, mainly physiology.
Have you ever tried tracking down the author to express your appreciation? It might make their day (or month!).
That's actually the system working as intended. There are not nearly enough faculty positions, or even non-faculty research staff positions, to accommodate all the PhDs who graduate. The presumption is that most people who graduate will go into something other than academia. If you want to see the system failing, look to the humanities, where most people who get a PhD are legitimately interested in academia, and yet they will, in the best case, be stuck as terminal adjuncts.
Put another way, I think one of the author’s arguments is “what if we’re pouring more and more effort into research in an area, because our primary abstraction is wrong?” That would make it harder and harder to continue to roll the ball up an increasingly steep hill. And I think in some fields, they may be right. It’s akin to how in math some problems are very difficult in Cartesian coordinates but trivial in polar coordinates. For instance, it seems to me that the existence of the “wave-particle duality” in physics and our inability to explain quantum-scale phenomena in a way that makes intuitive or logical sense to people is the canary in the coal mine that we took a wrong turn a while back. That’s not to say it’s true in every field, but that’s one that has stood out to me for a while.
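The coordinates analogy is easy to make concrete. As a minimal sketch (plain Python, invented for illustration, no claim about the physics), a circle is the single constant r = R in polar coordinates, while in Cartesian coordinates it is an implicit relation you must solve or parameterize before you can do anything with it:

```python
import math

# A circle of radius 2: in polar coordinates the entire curve is the one
# constant r = 2; in Cartesian coordinates it is the implicit relation
# x**2 + y**2 = 4, which offers no direct way to enumerate its points.
R = 2.0
points = []
for k in range(360):
    theta = math.radians(k)
    # the trivial polar description converts directly to a Cartesian point
    points.append((R * math.cos(theta), R * math.sin(theta)))

# every generated point satisfies the clunkier Cartesian equation
assert all(abs(x * x + y * y - R * R) < 1e-9 for x, y in points)
```

Same curve, two descriptions: one is a constant, the other a constraint to be wrestled with. Choosing the abstraction is most of the work.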
I think another cause could be bureaucratic. From what I understand, several hundred years ago, research was funded by rich patrons who wanted some credit in relation to discoveries. Nowadays there is a giant, lurking, centrally planned grant machine that distributes money to researchers. And as we know from economics, central planning becomes increasingly untenable as the system becomes larger. Results get worse as the complexity skyrockets.
Additionally, if we are going to blame the “burden of knowledge” in any capacity, we have to acknowledge the abysmal education system (in the US). An anecdote I will never forget: in 6th grade we had a student who had recently immigrated from India. By his own telling, he was an average student there. But compared to native students, he was light years ahead in math. We learned multiplication in third grade, division in fourth grade, long division and fractions in fifth grade, or something like that. He had learned multiplication in first grade, and all of division in second grade. While our smartest sixth graders were grappling with pre-algebra, he was bored in classes with the eighth-grade algebra students. Since experiencing that, our education system has seemed to me deeply flawed; there really is no reason an efficient and effective public education system should take 12+ years to have a child ready for college, and then another 4+ in college to have them ready to contribute to an isolated field.
> in 6th grade we had a student who had recently immigrated from India. By his own telling, he was an average student there. But compared to native students, he was light years ahead in math. We learned multiplication in third grade, division in fourth grade, long division and fractions in fifth grade, or something like that. He had learned multiplication in first grade, and all of division in second grade.
Whether he knew it or not, he was lying to you in claiming he was average. India's education system is notoriously poor.
> "what if we’re pouring more and more effort into research in an area, because our primary abstraction is wrong?
Absolutely; perhaps "thinking of knowledge in terms of physical environments" also helps to illustrate why – sometimes the only way to get where we want to be is to redefine where and how we begin
Concretely, looking up an unconquered mountain: it may seem as though many approaches are viable, but when an attempt proves otherwise, the only option might be to return to base and consider a new origin of approach
I think this also helps illustrate why "progress by accrual alone" is so inherently flawed. Sometimes you must begin again, in whole, but also in part – because "you can't get to anywhere from anywhere"
In fact, in almost all domains other than science, beginning again is such an effective remedy for progress dead-ends that it's standard operating procedure
Writers know what's up – kill your darlings. Software engineers too, famously – measuring software progress by lines of code added is counterproductive – we learn to be unsentimental, to delete and reconsolidate often. This is the continual redefinition of the "origin" we continue on from
Though it does require a change in perspective: written words and lines of code, not as asset but as liability. Redundant words, in code or literature, are contrived complexity: unnecessary, and after a while, prohibitive to progress. I think science hasn't done this yet? – we know that naming things is hard, but perhaps, for science, calling every artifact 'literature' doesn't help...
I also think that science catastrophically misunderstands "beginning again". Finding new origins is easiest for those who haven't been conditioned to only see one. It doesn't take magical genius, just (selective) ignorance. This is why (non-trivial) revolutionary paradigms often come from outsiders (or historical scientists who did not complete the endurance test of the modern day burden)
I can't help but think that scientific genius is based on "eh, fuck that, i'm sure there's a better way"
It's as if, when Kuhn wrote "Structure", a million normies cried out in pain, and – because they'd already been conditioned to one origin, and couldn't stand the idea of "newbs" getting credit – vowed to prove that "progress by iterative accrual alone" was all that was necessary. And here we are
> This is why (non-trivial) revolutionary paradigms often come from outsiders (or historical scientists who did not complete the endurance test of the modern day burden)
Can you give me an example of this in the last century? Or even historically?
Why artificially limit the (historical) set-of-all non-trivial revolutionary paradigms to the last century?
Current academia seems to treat the (implicitly modern) PhD as a minimal barrier to credibility (which is curious for several reasons). But given that the modern PhD is a relatively recent invention, this heuristic ought to exclude every paper written by those who predate this new form/qualification. Of course, historic progress seems to be revered most of all
The point isn't to suggest that the modern form was beyond historical figures (though I suspect, given the nature of earlier scientific progress, many might choose not to endure the modern form), but more to frame the question "had they (historical figures) been conditioned to think narrowly, within the contrived 'lanes' of modern isolated disciplines, would they still have thought as they did?"
All of Hofstadter's "analogies which shook the world" (Surfaces and Essences) necessarily depend upon thinking across arbitrarily plural scopes of phenomena, and deeply – with commitment to the premise that the deepest understanding of our universe they might contribute to the endeavour frames an implicit "universal general domain" of characteristic forms which apply universally. This is the opposite of isolated special-domain thinking/ stay-in-your-lane/ etc
Education includes learning and conditioning. I wonder if the degree to which we stress test conditioning impacts ongoing learning, specifically open learning, learning across (artificial) boundaries of special-domains
I've repeatedly encountered research I'd have liked to use, but which was inaccessibly buried by the pipeline from academic research to patents, to optional failed startup, to bigco portfolio entombment. And wished for an industrial policy dialed more towards a progression from research, to open source and community exploration, to cottage commercial, to small-scale industrial. Or consider a slider from non-competes, to California I'm-working-at-a-competitor-tomorrow, to China and-I've-brought-stuff - I'd like to see more California and China than not. Our optimization for pharma and VC, often seems terminally ill fitted for progress along many vectors.
> But then, Copernicus came along with the heliocentric model which, in its simplest initial form, made worse prediction than the tuned-up Ptolemaic model. But the burden of knowledge was dissolved in an instant. Improving the Copernican model meant shifting orbital paths from perfect circles to ellipses. It had nothing to do with the epicycles and perihelions of the Ptolemaic model and none of that burdensome knowledge was necessary to expand the frontier anew.
Disagree here. It does not make it easier: to show why a new model is right or superior means having to understand the old models well enough to show they're wrong or suboptimal. Hence, knowledge.
I think people have recognized that the current research grant system tends to reward safe, status-quo ideas and discourage potentially revolutionary, risky long-shot ideas. It's been talked about for a long time now, maybe even decades at this point.
There isn't an infinite amount of grant money to go around, and of course handing it out to any crackpot that comes along is going to be a nonstarter. It seems to me that the solution is somewhere in between the two extremes.
Besides the burden of knowledge and the flaws of academic funding, another factor that could explain the slowdown in scientific progress is this: the world has more things to be entertained by, including books, comics, games, movies, music, p@rn, and social media. These have increased dramatically over the past decades, while the number of hours in a day has not increased at all.
The cost of complexity is inherently wicked. We could discuss its impacts on HR, on R&D, on supply chains, on education, on manufacturing, or on project management. However, let's stick to the core: the vast majority of investment-related decision-making systems are still stuck in a finite-dimensional reality, barely evolved from a spartan economic rationalism derived from double-entry book-keeping, functionally based upon entrenched inefficiency and a lack of timeliness. Regulators, insurers, and other parts of society with ill-deserved critical veto capability do not just support or require but effectively celebrate this uncompromising backwardness, when it should instead be banished. We desperately need holistic decision-making that takes into account factors lying outside this purview. Academics, entrepreneurs, frustrated youth, competent industrial craftspeople, and environmentalists are unlikely allies at the coal face.
“ Ptolemy and his astronomical ancestors explained these “retrograde” motions with the extra loops you see in the map above called “epicycles.” By the 15th century astronomers had accumulated centuries of meticulous measurements and incorporated them into complex orbital paths, matching their observations. Learning these models and taking enough measurements to improve one of them took an entire lifetime of monastic devotion to studying the stars. The burden of knowledge was immense.
But then, Copernicus came along with the heliocentric model which, in its simplest initial form, made worse prediction than the tuned-up Ptolemaic model. But the burden of knowledge was dissolved in an instant. Improving the Copernican model meant shifting orbital paths from perfect circles to ellipses. It had nothing to do with the epicycles and perihelions of the Ptolemaic model and none of that burdensome knowledge was necessary to expand the frontier anew.”
Yeah, a cool example, and sure there are others from around this time where "we" figured out that previous thinking was just fully wrong. But the author then also says:
> There are thousands of other examples.
Really? Thousands? Are they all similarly examples of pre-scientific method/ pre-industrial revolution knowledge? How many of these "thousand" examples have happened since 1850? Why didn't he list any that have happened in the last 50 years?
If we had a rather good way of attracting, selecting, and retaining top scientists, then, by the power laws inherent in any creative/competitive endeavour, we'd expect that drastically increasing funding to get more people employed would yield only very marginal improvements, as the wider base of talent won't contribute all that much.
Heretical opinion: we should have much stronger merit-based gating on access to university. You can get in if you can demonstrate facility with real analysis, matrix algebra, and Bayesian reasoning.
Imagine how much the general level of educational noise would be reduced in universities, and how much more space there would be to do meaningful research.
We compile knowledge into institutions to reduce the burden of knowledge, distilling vast information into simple phrases. However, the rapid technological and cultural changes over the past 80 years have hindered the formation of new institutions. As technological improvements slow down, we will likely see more established institutions emerge, reducing the burden of knowledge and allowing us to break through the current knowledge ceiling.
The comparison the author makes with astronomy doesn't support their point at all.
First, it actually supports the notion of a burden of knowledge, by conceding that improving the Ptolemaic model took huge amounts of time spent learning its intricacies.
Secondly, the Copernican alternative worked not because it was simpler, but because the Ptolemaic model was wrong. There is never a guarantee that a model we currently use is wrong and can be replaced by a substantially simpler model. Maybe someone brilliant will come along and find a simpler model than quantum electrodynamics, but maybe they won't: it's entirely possible that, say, QED is the right model, and we can at best make it even more complex to explain more details of phenomena.
Finally, tools only get you so far. Tools don't necessarily encode the scientific knowledge you need to advance fields; they are a separate track from it.
The author's main idea is that knowledge is not always cumulative: sometimes new ideas disrupt and replace old ones, and in some cases even reduce the burden because they are also simpler.
The problem is that most new ideas are, in fact, cumulative, and hence increase the burden at a faster rate than disruptive ideas reduce it.
The idea of a "burden of knowledge" only requires that most knowledge accumulates. The fact that there are exceptions that do not accumulate does not disprove it.
My observation is that both of those could be linked. We might be reaching the limits of technology.
It means that to push the boundaries we need more money to buy or construct new tests; in other words, you need exponentially more money. Which leads to a kind of blockbusterization of science: you don't get a hundred blockbusters each year; instead you get two, from a team with a track record. Which sounds a lot like institutional decay.
Exponentially more money isn't just a sign of institutional decay. The resources and energy required to discover X-rays are a grain of sand on a beach compared to the energies required to discover the Higgs boson.
And that's just energy. Compare the cost of making an X-ray tube vs. the LHC.
Science used to be able to snatch low-hanging fruit. Now those are picked, but people still expect the same rate of innovation.
This sounds like the decay, not the rising cost.
I’m sure every researcher understands this, but not every executive or shareholder.
The lack of trust in your workers / focus on hyper growth is the issue.
Like you say, discoveries become more expensive over time, but we're seeing similar institutional decay in non-research institutions as well, so that can't be the only factor.
Like, how much more expensive is it to teach a single college student now than it was 20 years ago? In all honesty, it's probably much cheaper, especially when you start looking at online classes.
So why is college in the States so much more expensive? It's largely because of administrative bloat, which, among many other weird incentives, is a symptom of decay.
And why would you need administrators? IMO, it's because you're unwilling to trust small departments or individuals to teach effectively.
> Like you say, discoveries become more expensive over time, but we’re seeing similar institutional decay in non research institutions as well,
Sure, but they might also be a matter of landing on the saturation curve. For example, chip manufacturing is getting more and more expensive because it's pushing up against the limit of the atom. That's caused by physics, not economics. What is caused by economics is the expectation of exponential growth.
Knowledge is not always cumulative, but you have to be aware of the claims it contradicts in order to refute them. This is a point in Kuhn’s “Structure of Scientific Revolutions” that I think is neglected.
It's crazy how much institutional decay there is basically everywhere in this era. I grew up hearing stories about institutions that were crazy effective compared to the ones I actually encountered. Sure, humans have always had their follies, but the society we currently occupy is obsessed with often-nonsensical metrics, rewards con artists disproportionately for gaming those metrics, and creates significant barriers for anyone trying to navigate the world through other means. There is no faculty in any academic institution untouched by the Reagan-era push to make colleges into degree factories, by the administrative bloat caused by incursions of an increasingly muscular and obtuse finance sector (which also affects most other institutions), or by the pervasive instinct to automate decisions based on dubious metrics, and most adults understand this at an intuitive level
We are in a period of widespread outright institutional collapse. Institutional decay should be the null hypothesis when considering the causal factors behind any widespread problem
Institutional knowledge is even actively destroyed. There is only so much you can maintain with written procedure; rather, for all of human history, institutions have mainly relied on the people and culture that make them up. But the "modern wisdom" is that you avoid such reliance on people--the employees, the rank-and-file--at all costs. Executives must maintain absolute leverage against workers, so institutional knowledge has to be ignored as a factor. Every employee must be maximally expendable and easily replaced. You know that one senior team who knows how to reliably carry out some important function? Too bad, we're firing them (or denying them raises, or replacing their manager with a toxic sycophant, same diff) and then we'll just have to manage.
Even non-commercial organizations where a healthy culture might still form are affected due to carry-over from the much larger corporate sector. I mean, nobody working anywhere can expect to stay with the same firm until retirement; that's just not the done thing anymore.
>We are in a period of widespread outright institutional collapse
Who is "we"? This thesis is easily tested by zooming out from the US for a second (I know, a very hard exercise). The burden of knowledge is visible everywhere, even in countries that have drastically improved their governance. Lee Kuan Yew used to lament the increased levels of education at the expense of family formation; experts are getting older everywhere from India and China to Indonesia and South Korea; and making the same gains in computing performance is harder and harder whether you're in Taiwan or Arizona. The capital required for economic growth, drug-discovery costs, and every other reasonable proxy for the cost of new ideas also increase everywhere.
Institutional collapse is completely coincidental to the basic mechanism that is perfectly well laid out in the article. To build better knowledge you need to account for the old knowledge and then some, so pretty much by definition it's an information-theoretic necessity that better ideas get more expensive.
Try to imagine a world where the theory is actually true: there is no burden of knowledge, and the real culprit is institutional collapse. Then you'd have countries where life expectancy grew faster than when we invented soap and antibiotics. You'd have places making advances in computing as we did in the 80s and 90s. You'd have places where VCs invest less money every year but get higher returns every year. It'd be a completely bizarre world compared to the one we're in.
I think you're right to point out that some "burden of knowledge" is essentially inevitable, especially within a particular field or targeting a particular metric (e.g. life expectancy, though of course that's one that's so broad that there will be considerable influence from factors other than knowledge advances at the technological frontier). Burden of knowledge shows up in individual endeavors too, with most skills following a learning curve to mastery that looks somewhat logarithmic
But I don't think the article is arguing that there is no such thing as burden of knowledge, nor am I. In fact, the article has a whole section addressing this, concluding that since it's discussing a phenomenon that shows a marked shift in the last few decades, an inherent and permanent problem like burden of knowledge isn't adequate on its own to explain the change.

The article is arguing against treating this phenomenon as the sole or most important factor behind the widespread sense that it's hard to do research, and I stand by the assertion that the more relevant driving factor right now is institutional decay. The faculty of universities are in a near-constant state of complaining about administrative bloat in their institutions, which has increased dramatically and relatively recently. There are also a lot of methodological issues that have come to light through the widespread cross-disciplinary replication crisis that got some press in the last decade, and which seem to stem largely from misaligned institutional incentives: pressure from increasingly muscular industries about the nature and results of research they're interested in; pressure from publishers, both traditional journals and the journalistic organizations that in practice publicize scientific results; and careerist pressures that create publication biases as well. For this specific case, I think the article makes the argument more eloquently than I can.

My comment made the further claim that institutional decay is such a widespread and serious problem in many of our institutions today that it should be our first guess for why any outcome we care about, one that can reasonably be assumed to be mediated by some important institution, is worsening over the last few decades. I call this the null hypothesis because that is the traditional name for this kind of default assumption.
As with anything complicated compressed into a single articulable concern, the reasons why research is perceived as slowing down are no doubt attributable to a variety of factors, and saying this perception isn't primarily a burden-of-knowledge issue isn't equivalent to the claim that burden of knowledge doesn't exist at all.
This is a natural outcome when governments create rules that in effect solidify a market practice. It becomes irrelevant over time and fails to change because the rules will not allow for it, and even if they did, committees are not known for being industry leaders.
In education the solution is simple and effective - cut all federal student aid. The educational institutions that have strong industry ties and therefore the best outcomes will persist. Those that were warming chairs to hand out fake degrees will go under. Let it fail for the admins, because it's already failing the students.
Your posted suggestion, of defining the best outcomes as immediate job success, would solidify a market practice of lowering fences between town and gown (viable during only the past 3 or 4 generations) as the only viable way forward for educational bodies that intend to survive.
Because I agree with your general statement that market situations lose relevance once legislated, I think the principle more naturally extends to mean:
It is better to consider how to keep universities, whether research institutes or liberal arts universities, a distinctive part of "the whole spectrum of techniques by which one generation transmits its insights and abilities to the next" (Prof. Edsger Dijkstra's words) than to enshrine a set of commercial principles by which every piece of that spectrum must play.
Immediate job success is a fickle master; it would be a shame to LinkedInify any society. In 1996, Dijkstra said: [0]
> The University with its intellectual life on campus is undoubtedly a creation of the restless mind, but it is more than its creation: it is also its refuge. The University is unique in that on campus, being brilliant is socially acceptable. Furthermore the fabric of the academic world is so sturdy that it can absorb the most revolutionary ideas.
> But it is not only a refuge for the restless minds, it is also a reservation for them. It does not only protect the restless minds, it also protects the rest of the world, where they would create havoc if they were let loose. To put in another way, the fence around campus is essential because it separates two worlds that otherwise would harm each other. The fence ensures that we have relatively little direct influence on the world "out there", but we would be foolish to complain, for our freedom to be as radical as we like is based on the fact that, for at least the first 25 years, industry and the world-at-large ignore our work anyhow. Currently, there seems to be a world-wide tendency to try to lower the fence; the effort strikes me as ill-directed.
He had also been writing similar things in 1986 [1] and 1994 [2].
No one can live in a capital market without capital. Education has always been about capital markets - uneducated people are highly limited in their use. There has never been a school that was built for "free", it's always come from capital one way or another.
It used to be primarily private capital, from people who believed in its purpose. Outside of the top 1%, now nothing is built without students and taxes footing the bill.
If it cannot survive on its own merits then its existence is an abuse to the people it claims to support. Turn it into something else and let real solutions arise naturally.
No it hasn't? That's some insane revisionist history, and this is what I blame Reagan for: the attitude that universities, and even education broadly, primarily or solely exist to produce vocational training.
The original and, to this day, primary unique purpose universities serve is facilitating research. Capitalists try to claim credit for all the progress in science and innovation that's happened since the industrial revolution, but the institutions that evolved into what we now consider academia predate it significantly, and it's important to note that they were not always viewed even by their benefactors as having immediate value in a direct economic sense. Most of their educational capability is a side effect of developing competency in training new researchers. People who fell off this track still often got a quality education and became better-rounded, and people who could get to a university tended to be high-status, so for a while it was a decent-fidelity signal for the competence-status mix that hiring decisionmakers cared about. Trying to get everyone in on this not only devalued that signal, but has caused a lot of this institutional rot. I think we agree on the end of that story but not the beginning of it.
Vocational training and especially apprenticeship should be encouraged and enabled more, and shoehorning universities into this role was misguided from the start.
Ideas of how it should be and the actual circumstances today are completely different. That ideal is abusive because it is not reflected in reality, and it takes advantage of young, naive adults.
What are you on about, mate? Universities significantly predate even the concept of capital. Not universities as a concept, but specific universities that you can go to today - notably Oxford and Cambridge, but many others in Europe. And their purpose has almost always been scholarship, not employment. Most modern universities stem from institutions meant to find better ways to understand God's work, and perhaps to better worship it. The Catholic Church was the primary "investor" for a long stretch of time, at a time when it was a more powerful "state" than almost any other in Europe.
If you can find me a university free of a capital market, I am more than willing to be wrong. The idea of universities generations ago is not at all what they are today.
You have claimed that "education has always been about capital markets". I have explained that it hasn't. That we today live in a capitalist society, and that every single aspect of our lives is touched by capital to a greater or lesser extent, is a completely different claim.
Furthermore, you suggested that this (obviously false) history of education being driven by capital markets should be taken as a guide to letting capital control our education even more, as if the problem with universities today is that they are not capitalistic enough. The actual history suggests exactly the opposite.
You cannot live in a capital society without capital. There is not nor has there ever been an education system disconnected from industry until government subsidy and "support" did it. This is because once there is free money there is no longer a natural market effect and organizational failure is acceptable and funded.
It's crazy how people just make shit up and say it so confidently. Most education prior to the industrial revolution was considered a luxury for the wealthy, and its value was not considered in terms of producing economic value directly.
> Most education prior to the industrial revolution was considered a luxury for the wealthy,
In fact this is still the case in many ways. There are loads of kids studying things that are not directly economically useful, mainly for their own enjoyment and as a luxury status symbol.
This narrative is very heavily promoted by right-wing propaganda outlets. I wonder what you mean by "things... as a luxury." Do you mean MBAs, or history majors? And what do you mean by "loads?" Do you have any data to back this up?
I think they mean the general study and refinement of culture and knowledge for its own sake--the kids who go to college/university for language arts, fine art, music, theatre, history, philosophy... these classes are widely attended, but have a reputation for not being useful for gainful employment. (Plenty of famous writers, poets, artists, actors, musicians, and journalists have such an education, but the rest of their classmates probably did not end up employed in the same sector.)
Side note: it's not just the far-right that disparages higher academia as effete elitism. The far-left has done so as well, notably during China's Cultural Revolution, and within Soviet Russia, Pol Pot's Cambodia, etc. Being an academic at the wrong time in history means you have a good chance of ending up in a gulag or concentration camp.
Yes, I have studied a bit of China's Cultural Revolution, and discussed it with several (strongly biased, of course) Chinese expats. They violently opposed the subjects that many Americans are privileged to receive via our "liberal arts" educations. I can't argue with that. All I can say is that I view China as a deeply conservative, even fascist nation. From my (limited) perspective, it is deeply racist and misogynistic in ways to which only the most conservative, right-wing Americans compare. So it is hard for me to trace today's China's lineage "leftward".
I also didn't mention the less "artsy" academic disciplines that might get you in trouble with the Party: mathematics, physics, astronomy, economics, biology, sociology, psychology... These can be seen as possible heresy and/or profligate navel-gazing that weakens the Glorious Nation. But it depends on the ideology and how useful the Party thinks your discipline is (for example, if expertise in your field is required to build nukes or launch rockets...)
Well, what degrees are actually necessary for the work? As in, if you don't have the degree, you cannot be allowed to do the work?
Medicine, law?
Pretty much all other degrees are a thing that you do because it's interesting, and you want to signal your interest in some area. People are trained on the job for the most part. I'm fairly sure I could have jumped into my career from high school, were it not for the social convention that smart kids have to go to university, and thus only graduates can be hired.
The sibling comment is right, btw, it's narrow to think of it as a right-wing talking point. Historically the left thought the same way.
It's a fact, money doesn't come from nowhere and for nothing. You can see it for yourself if you do the work. It doesn't matter how things were eons ago - it matters how it is now.
Yes. It matters how they are now. It matters what problems there are with that state and what we can do to change them. It matters what's worked before, and why, and in what context, because it's a source of information we can use to inform our decisions. If you're going to talk to me like you think I'm a baby, I ask you to at least say something vaguely useful, instead of prattling about "realism" that makes no substantive claims except that every facet of our current reality is inevitable. As someone who makes money by doing work in the current world, I would like for fewer decisions to be made by people who hold capital instead of doing work, and I think there are concrete steps we could take to rescue the important human endeavor of research from the influence of such people, not the least of which is restructuring how governments fund such institutions, prioritizing their independence rather than their hijacked function as yet another ill-conceived social mobility hack (actual social mobility comes from removing obstacles, not creating new ones and then conditionally subsidizing them).
PS: Since these are articles penned by academic economists rather than professional scientists, it'd be nice to hear criticism from an economic (or finance) angle.
If we look, for example, at the career of Kepler, we see that he had to accumulate expert knowledge of both systems, and that he arrived at his later so-called laws not by a sudden insight but through tireless work, trying again and again over several years to make sense of series of measurements that did not really fit either the Ptolemaic or the Copernican system.
It was not the epicycles that worried the late medieval astronomers, but the fact that, according to their geocentric theory, the earth was not really exactly at the center of the deferent. Correspondingly, however, the sun was not really at the center of the Copernican system either. Both systems lacked elegance, and both systems were more or less equally in accordance with the observational data. To settle this debate and make progress, the strategy of the most famous astronomer of his time, Tycho Brahe, was to collect better measurements. And it was these accumulated measurements that enabled his pupil Kepler to develop a solution. In a sense, he was lucky: only the data for the planet Mars, which he considered first, was exact enough to match an ellipse. His data for Jupiter or Saturn would not have allowed him to come up with one.
Accumulated knowledge was not an obstacle, but the basis for Kepler's insights. That we might get a different impression is a result of the simplifications that happen after a scientific community reaches a conclusion. At that stage such a theory becomes textbook knowledge: a student no longer needs to acquire the accumulated knowledge of the previous period, but only the successful doctrine. These doctrines appear, and are all too often presented, as ingenious insights from geniuses, instead of as culmination points of the collective work of generations of scientists and scholars.