Hacker News
The University Is Like a CD in the Streaming Age (theatlantic.com)
360 points by pbui on June 22, 2020 | hide | past | favorite | 282 comments



Y'all use University wrong.

The classes, etc. ... whatever!

During my time at Uni, I:

* designed a demonstrator car frame to show off a bunch of new manufacturing techniques, which was actually built in the end

* was part of a team who tested a new type of fiber optical vibration sensor on a sounding rocket

* worked with a PhD student on developing a sensor to measure water flow through a tank to validate his simulations

* wrote code that is now orbiting earth on board a CubeSat we built with a team of ~100 students

We had groups who built race cars, a group who won ~all of the Hyperloop challenges, a group which builds rockets (seriously, they have a lab in the basement of the cantina - the only rocket-powered cantina in the world), a group that designs, builds and flies their own sailplanes (no-one has died ... yet) - and those are just the ones I considered cool. All of these groups get access to top-notch workshop and lab equipment, and some of them are so present at their respective chairs that they end up driving a serious amount of the research happening there.

University is a place where teachers and students who are interested in a field meet to learn and research together. Modern Uni has often forgotten that, but if the students treat it as such it works in this way. We have to tell students how they can get positive experiences out of the current system - by trying to do so, they will change the system from within.

I honestly regret wasting my first couple of semesters before I learned about all of this stuff.


Your specific experience is great, and I'm glad for you.

The point here, however, is what University is for all people, not just for individual cases like yours where things align nicely, and where maybe your own character and personality (and some of your professors and class mates) helped you get a meaningful experience out of it.

My personal opinion is that the article is mostly right about what doesn't make sense for Universities in 2020. Many things are wrong, including providing access to it for ALL valuable students irrespective of income or wealth.


> The point here, however, is what University is for all people

Traditionally, that was not true, and it still is not true in places like Germany. A university is the place where you can get the highest education in a field, or ideally where you study a field on your own. Both require a lot of prerequisites that not everybody fulfills. I am talking about knowledge, the ability to keep learning even when something is not flashy, and the will to endure. And it should be like that; otherwise you lose the ability to train the brightest people of a society. For everybody else, there are applied universities, dual study programs, and well-established and respected vocational training programs, here in Germany.

Replacing universities with a netflix-esque experience may seem attractive to students, but it is doomed to fail to train them. The question of how much teachers should adapt to the wishes of their students is as old as learning itself. In the end the students have to trust that a teacher knows which material to cover to teach them the desired skills, facts, and abilities. For an example of that, search your trusted video-streaming platform for the film Karate Kid. ;-)


I really like the German higher education system. Unfortunately, the USA conflates universities and applied universities.

Most people who go to university in the USA go to what would be closer to a Fachhochschule: smaller colleges and universities that don't grant PhDs and whose professors' research output is closer to "write a paper in the summer" than "run a research lab".

Most people who have negative views toward their university experience in the USA, I think, feel that way because they thought they were going to a Fachhochschule but actually went to a university (or vice versa).

I don't quite understand how the USA can look at our education system, look around the world, and then conclude that technology will be our salvation. No. Just build a system that works like the perfectly functional and affordable systems in much of Europe.


> Unfortunately, the USA conflates universities and applied universities.

This. I wish this point were brought up more often in these discussions in favor of/against universities. People tend to have entirely different ideas when it comes to what a higher-level education should provide: For some people it's a scientific way of thinking about the world, for others it's just preparation for a future (and hopefully better-paying) job. Separating those ideas might be a good first step in solving this conundrum.


Thank you for this perspective, I've never thought of it but it makes a lot of sense


> No. Just build a system that works like the perfectly functional and affordable systems in much of Europe.

I don't see how that can happen until there is broader acknowledgement of what is actually appropriate in terms of education. No one is prepared to admit that university degrees from PhD-granting institutions aren't required, or are inappropriate, in all cases.

College is now a prerequisite for many jobs that don't need a university-level education. People are lazy and upping the bar is an easy option compared to handling the risk of choosing a "less qualified", but appropriate candidate.

Research drives investment while teaching doesn't. Even if you are relatively bad at (or ill-equipped for) research, you can still bring in money with this approach. In some cases only parts of institutions should be participating, but you aren't a real university if your focus isn't primarily on "world class research."

Students are stuck with a bunch of bad options. They are unlikely to want to disappoint parents who want them to go to college and be successful. If they go down an alternative route, then they'll be filtered out because systems are built around people having degrees.


What's not to love? They begin the class segregation early. Wait, what? Guess they are that way after centuries of pretty authoritarian governments. Everyone knows their place /s


The USA is just as bad, if not worse. The mechanisms are different but the effect is the same.

If you don't take honors/AP courses throughout high school, then you're cut off from an entire world of possibilities. And if you are in the wrong ZIP code, you might not have access to those courses at all. Do you know what percentage of US high schools offer AP CS?

The US segregates pretty heavily starting the last year of middle school / first year of high school. You can jump tracks, but it's hard (and the same is true in Germany btw). Some tracks aren't even available to poor (esp rural) folks. Wealth determines which tracks are available. Within the set of available tracks, teacher recommendations play a big role in which path a student is suggested to pursue. Biases exist, and people will often follow whatever path their parents followed.

Although you can get into some college without honors/AP courses, those lower-tier colleges are typically closer to something like an applied university. With correspondingly similar differences in outcomes. So, in the USA, your ZIP code and teachers' decisions from freshman year of high school do have a large effect on your life path.

Germany's system isn't perfect, but "egalitarianism" isn't a strong defense of the US system relative to Germany. If anything, the US system is worse. The idea that living in some particular part of the city might mean you actually can't get a gymnasium-level education is insane in Germany, but AP deserts are a de facto reality in large swaths of the US.

At least in Germany it's explicit, people realize what's happening, it's at least somewhat merit-based, and your parents' ability to purchase expensive real estate doesn't play such a huge role.


Apologies for the snark; I just remembered a comment a professor (an anthropologist) made that so much of German culture was shaped by the monarchical and authoritarian regimes that ruled over it - highly hierarchical and regulated to death. Not comparing to the US (and I'm not American), just curious how cultural differences play out. I think both the US and Germany still have it decently good compared to what you will see in other countries (just plain terrible).

Now, as for these two, I won't say one's situation is better than the other's, but it is curious that both models of public education have their origin in Prussia/Germany and seem to be keeping to their original 19th-century purpose of reducing social unrest.

I do see that it is more difficult to keep a good level of education across the vast extent of the US with the current system. It seems to be in need of an overhaul, but one way deeper than AP courses. Like, how would AP courses benefit a rural schoolkid other than maybe getting access to work in a city? Could we think there is any education for him to be a good farmer? (as one example; not sure if it is practical) Ideally, moonshot, he would be able to choose whichever path suited him best, whether more education or moving, etc...


Yes, both are relatively better than many other systems.

> Like how would AP courses benefit a rural schoolkid other than maybe getting access to work in a city? Could we think there is any education for him to be a good farmer?

I think this perhaps misunderstands the typical life paths of rural students.

20% of the US population lives in rural counties, but only 2% of the population works in agriculture. And even that overestimates the number of schoolchildren who will grow up to be farmers, because a huge amount of the unskilled farm labor in the US is done by first-generation migrants.

So the average rural schoolkid in the USA isn't going to work on a farm (and the ones that do will often be skilled workers or owners and will therefore go off and get agricultural degrees.)

The government runs a lot of jobs programs for rural areas. Things like state/federal prisons and military bases are intentionally placed in rural areas to provide employment opportunities. And of course national forests/parks tend to be in rural areas and provide a small but steady stream of jobs. We also explicitly subsidize hospitals and schools in those areas; the local tax/consumer base probably couldn't support those services on their own. So the government sector is the lifeblood of many small towns these days, at least as far as employment goes.


Thank you, your answer has helped me get a clearer picture of rural USA. I can see how your current government's tough stance on immigration makes sense to the people in those areas. What do you think could improve the growth of these kinds of areas?


University should still be open to everyone. Everyone should, independently of where they are and how rich they are, be able to study at a university if they are capable of doing so.

I believe that:

1. The entry barrier is not fair and is designed as a very narrow funnel you either fit through or not.

2. If you opened up a university even more and didn't allow anyone to misuse it (by sleeping at the university, playing games instead of listening, or collecting money while attending but not caring), it would not have any downside but might allow one or another mind to be part of it.

I also have the feeling that people who were able to get through a bad university don't mind it anymore and don't care to optimize it, because it worked for them. We are not a poor society anymore, and we have plenty of experience and technology. It's the duty of the university to try to help as many people as possible learn the stuff you learn at university. University is paid for by all people.

It's also the reason why Open Access should be mandatory.


You can't cater to everyone, especially if what you serve is dependent on existing skills, knowledge, or levels of understanding. One could maybe argue that a university might fix that by making dumbed-down versions of things or adding extra pre-learning options, but at that point it just becomes a supermarket. If you try to cover everything and everyone, you just end up being mediocre, which is not the point of a university.


I'm not saying you have to dumb things down etc., but I do say that the university today is not doing enough, and this is directly connected to modern technology, the internet, and digital media.


The problem in the US (as opposed to Germany) is that the barrier is not ability, but money. Wealthy families will easily place their kids at some of the greatest universities, while a poor kid needs to be a genius to go to a place like Princeton, or else will have to take on loads of debt to get a decent education at a state university.


I don't know... 9 years ago I graduated from a tier II state university that was highly affordable[0]. Additionally, it was locally known for producing the second-best engineers in the region (after Georgia Tech) and the best nurses in the region, and it has well-respected education and agriculture programs.

I leveraged my tier II undergraduate degree into a tier I graduate school experience and am now a PI/lead at a large defense contractor.

If you are smart, diligent, and stay out of too much trouble, you can take advantage of the cheaper/smaller state schools for undergraduate and push yourself where you want to go. It's not Princeton or MIT, but there is still plenty of affordable opportunity (especially in tech and engineering).

Granted, my college experience didn't have the glamour of the Division 1 football tailgates or the excitement of a big city, but those things cost extra.

[0] It was in a small town of ~30,000, so a local 3-bedroom apartment was $550/mo (split 3 ways was ~$190 a person), and tuition was about $4000 a semester. Currently tuition is $5000/semester and you can get $5000/year from the lottery scholarship.
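For what it's worth, the footnote's numbers pencil out roughly like this (a rough sketch, assuming two semesters per year and twelve months of rent, and ignoring fees, food, and books):

```python
# Back-of-the-envelope annual cost using the footnote's figures.
rent_per_person = 550 / 3          # 3-bedroom split three ways, ~$183/mo
tuition_then = 4000 * 2            # $4000/semester when the commenter attended
annual_then = tuition_then + rent_per_person * 12

tuition_now = 5000 * 2             # current tuition
scholarship = 5000                 # lottery scholarship, per year
annual_now = tuition_now - scholarship + rent_per_person * 12

print(round(rent_per_person))      # 183
print(round(annual_then))          # 10200
print(round(annual_now))           # 7200
```

So even at today's rates, tuition plus rent nets out to roughly $7k/year after the scholarship, which supports the "affordable state school" point.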


You did it right


> Replacing universities with a netflix-esque experience may seem attractive to students, but is doomed to fail to train students

To be honest, for very mathematical subjects the lectures are almost useless. Exercises and practicing are what give insights, determine the grades, and tell you which parts of the lecture are actually important.

I'd be curious how it is in other subjects, like literature or history. But I assume without writing texts, one doesn't get very far there either.


I used to think this way, but learning how to take math lectures really changed my university experience. To my first-semester self:

1. Go to the lectures, sit upfront (where you are definitely not going to use your phone or fall asleep)

2. Get the book or lecture notes (Skript) that correspond to your lecture. Almost all math lectures follow some sort of book. If unsure, ask your professor.

Just reading ahead for half an hour prior to the lecture will make _such_ a difference, as you won't be lost and can focus more on context than on definitions.

3. Feel free to ask questions. If you know where you are (see 2.), it's much easier to know what's wrong. Professors are human and sometimes forget to state some theorem they are using. It's really easy (and frankly surreal at first) to be that guy who points out errors to the professor (in a respectful way!).

If you're shy or you have the feeling that the problem is indeed on your end, take notes of things that are unclear and ask about them after the lecture. Professors usually love to talk about their subject!

4. Last but not least: Keep calm and carry on. It’s completely normal to feel overwhelmed at first ;)


Sure, it's possible to prepare beforehand and review afterwards. But this is so time-consuming. Everybody has to decide that for themselves, but IMHO Pareto-optimizing means not attending most lectures. At least that's how I studied, and it worked out really well. I agree with you that one can get a lot of value out of lectures, but that value doesn't come from just the 1.5 hours of sitting there listening; rather, it comes from being really concentrated for those 1.5 hours and spending time before and after.


> lectures are almost useless. Exercises and practicing is what gives insights

And community -- both the other students, and the professors. To motivate each other, and to calibrate expectations, and to fill holes in each others' knowledge. That seems like the big (academic) reason to be physically on campus, and ideally at a selective place.

My impression is that the same would be true of (say) history, but I don't really know.


While I agree that exercises etc. are the way to deeply understand the subject, I disagree with your take that math lectures are useless. I admittedly always struggled when I skipped lectures, and attending them helped me see the "bigger connections" and actually understand proofs, provided I was taking notes throughout the lecture.

In philosophy, however - which for me this semester is entirely online without any video chat or other interaction - the in-person lectures were highly valuable to me because they rely heavily on discussion. While I do have a paper here and there to write, my understanding mostly comes from attending the lectures.


My experience was similar to his.

University was never designed for all people. What do you think the admissions process is for?

The community college system is for all people. It’s open enrollment and will work with you wherever you are in your path to learn.

Universities bring together bright, motivated people to work on things that are difficult to do by yourself or spread around.

That is what they’re for.

We overload the system to also educate people who just want to do their assignments and get out. Those people would greatly benefit from an online education I believe. There should definitely be more options than for-profit “colleges” that mostly rip you off. This is a service that is desperately needed.


It's odd to me how "experience" has replaced "education" as the goal of universities. Like the comment says, "y'all are using it wrong":

"University is a place where teachers and students who are interested in a field meet to learn and research together."


Or, more accurately, universities don't make much sense for most applicants in 2020. For those who are willing to put in the time and effort building and researching things, it still sounds like a great experience. However, for those who are simply seeking a piece of paper while doing the absolute minimum to graduate, yes, it's probably not for them.


>is what University is for all people

I guess this is a problem of all people, then. You have a perfect opportunity of free time, networking, and connection with your peers; if you are not leveraging it, it's your own fault.


Few people seem to be able to cope with post-16 education. It's amazing the number of children who try to shoehorn University into their lives. It's not elitist, it's just hard. It's meant to be.


TBH, college for me was about finding myself as a person. Being away from home for 4 years, I grew so much as a human being in numerous and incalculable ways. Being grouped with thousands of others I've never met before -- you learn "human" skills in a way that can't be taught online.

Online education is great and all, but the crowd that argues that it is the end-all-be-all of learning is missing a key element of the experience.


For the amount of money you probably spent, there are a dozen better ways of "finding yourself."

I think about how much I spent over those 4 years. I could have travelled around the world during that time and come out of it with a much richer experience.


Education does cost too much, definitely. But in my case it led directly to a stable career path in ways that traveling the world could not. It set me up for life.


> University is a place where teachers and students who are interested in a field meet to learn and research together. Modern Uni has often forgotten that.

Universities are like venture capital investments... it's that one single class that will matter the most. One in ten classes will have the biggest impact on your life...

It was the night courses at the Art Institute of Chicago for Walt Disney.

It was the calligraphy class for Steve Jobs.


This is an interesting way of thinking about this. There were definitely a few classes I took that really made an impression on me and changed the way I thought about things. There were also a lot of classes that were highly forgettable.


Maybe time to bring the Little Blue Books back?

https://www.stitcher.com/podcast/mass-for-shutins-the-gin-an...


> University is a place where teachers and students who are interested in a field meet to learn and research together.

It would be nice to have places like this open to the general public and not gated off the way universities are. There are a lot of people who become interested in these things after university, people who didn't attend university who are interested in them, people who were interested and involved in these things while they were in university but then drifted away, etc.

It seems like we'd be better off trying to encourage the public at large to continually improve (and also connect more with their community and improve civil society) than acting like all of this should only exist in a 4 year window that many people aren't even a part of.


Our University has a makerspace that is frequented by a lot of people from outside the university. The trainers for the machines and such are also not from the university, but have gone through vocational training and are experienced metal workers, wood workers, textile workers, and so on. These interactions are definitely awesome for the students, who learn about their perspective, and the other people also gain insights into uni work, e.g. supporting students with prototypes for their thesis projects and so on.

I definitely think university should be open to all people, and there are a lot of cool things you could do if you attracted more people from different backgrounds.

If I ever move back to the countryside where I grew up, I definitely plan to start a makerspace that combines an open workshop, a youth center (with tutoring), and some low-level research work.

Actually, this conversation has me excited to go do that :)


When you work on your project, please consider including a "library" component with selected materials and resources on various topics. I know you can get all the info on the internet, but sometimes a smaller, local selection of books/resources can go a long way to solve the information overload problem.

If you get in touch with me by email, I can send you some links (math book recommendations and Kolibri Learning Platform which is like the best thing ever - works totally offline on localhost).


I think the issue is that so many jobs require a college degree for no purpose other than further general-education skills. I am of the opinion that public schooling needs to be extended to include these skills, and then we still need higher education where the exact experiences you described are the main point.

Far too many people go through college for no other reason than to get some degree; it doesn't even matter what degree. And it's no surprise, as it's much harder to find a decent job without any old college degree.


There's an extremely old joke (like 100+ years old; I can't find the source atm) that goes something like: "US universities are the best high schools in the world." And it's true. It's not safe to assume that someone with only a high school education will be able to write well, do basic mathematics, etc. The high school grads who go on to teach themselves how to program are extreme outliers.

The entire US education system needs an overhaul. You can't really fix our university system without making corresponding reforms in K-12.

A lot of it grounds out in trying to be overly egalitarian, realizing that doesn't work, and then hacking the system by introducing an honors/AP track. As a result, things are no longer egalitarian (the split between honors/AP and not happens freshman year or before). However, instead of having a gymnasium-like true high school liberal arts curriculum, we just moved some fairly boring college 101 courses into the last year or two of high school. Which really isn't the right thing to do.

We should now make the split explicit. And then, instead of teaching mediocre university 101 courses to the honors ("gymnasium"-like) students, we should simply teach the high school material, but at a much deeper level, in the "honors" track.

Then, the university system should split in a corresponding way, with non-honors students going to applied universities designed for them and honors students going to universities designed for them. NB: this split already de facto exists; there's a whole class of universities you can't get into without doing the honors/ap track in high school.


> there's a whole class of universities you can't get into without doing the honors/ap track in high school.

Which would those be? I don't have much insight into the "traditional" admissions system since I took quite a few years off from school between high school and college, but I didn't have any trouble at all getting into a well-regarded UC without any honors/AP classes. I did go to community college first, but I'd advise that for virtually everyone who's not funded by wealthy parents and interested in the social experience as much or more than the academic. No one needs to pay for the privilege of going to $TOP_SCHOOL just to do freshman level coursework.


> Which would those be?

Most of the top-tier R1 universities; i.e., ivies and similarly competitive institutions. Also, similarly competitive departments at otherwise less selective universities (e.g., UW's or CMU's CS programs. Getting into either university is considerably easier than getting into the CS program).

NB: I guess the sentence you quoted is slightly too strong. You can first go to community college and then transfer. But it's not the default path, and it requires some special effort. That's also true in Germany, by the way: you can go to university without going to a Gymnasium.


> The entire US education system needs an overhaul. You can't really fix our university system without making corresponding reforms in K-12.

Sure you can: have an opt-out model for students, like they have in Germanic countries, by the time they're in their mid-teens, with apprenticeships in various crafts and professions to help them into working life. It works really well, and it prepares them for a life in a robust manufacturing industry, especially now as the West is finally trying to decouple from its corrupt dependence on the CCP's horrible human-rights violations.

I'd say outside of STEM there really isn't a need for most people to go to a university campus, there really isn't. I often philosophized with grad students, though I declined an honorary position in the department, as I couldn't afford to deviate from my studies to debate moral ethics for fun, despite being promised a "fast track" career in law.

I could enjoy music without the theory or the knowledge behind it, and began to bedroom DJ; eventually I even got to meet and know people who would later become really famous EDM composers from the UK, by going to clubs, just talking to them on the forums, and eventually going on the decks myself in a club. Many of them were super eager to share their knowledge; some were classically trained and did Master Class series for free on Youtube!

I could go on, but I think most non-STEM degrees, and even the ones that don't require physical labs, like math or CS, can be moved to an online model, saving the resources for those that do require laboratories, guided lectures, and office hours for experimental designs and scientific journal publications.

I'm sorry to say, but if you go to university to study Classics, theology, or music theory (or any of the arts really), you are doing yourself a great disservice if knowledge is your real aim, as so many more opportunities exist outside of academia. And outside academia you aren't stifled by a curriculum that forces you to limit your breadth or understanding of a subject in order to fit the paradigm of what pretty much amounts to a churn system trying to maximize its profit model, to the detriment of what an education should be.


> I could enjoy music, without the theory or the knowledge behind it,

One of the most enriching classes I took was The Art of Listening to Music.

It was a grand tour through music history and practice up until the early 20th century. I learned to understand the compositional elements that go into making music, and while the class was biased towards western classical music, by the time the quarter was over I found I had an appreciation for an incredibly wide variety of music, whereas before I had listened to only a few genres.

> I'd say outside of STEM there really isn't a need for most to go to a University campus,

Have you ever met someone who can do 0(ish) delay translation between languages? With no accent? A non-university program to teach that would end up looking like a university. To do it properly the translator needs deep cultural understanding of the target language, professional training in pronunciation, and a lot of practice.

And why are arts programs useless? Sure art collectives can form and group buy resources, hire models, pay for trips to destinations, and bring in more experienced artists to help teach technique, but at some point again you've recreated a university.

None of this even goes into the value of the core classes. Having people educated about their own country's history and about world history, having an understanding of economic systems, geopolitics, and mathematics: these are all worthwhile endeavors. A university offers a curated selection of courses and subjects, with pedagogical methods that have, presumably, been tested over time (some over centuries!) to be effective.

Is the system perfect? Heck no; spiraling costs are evidence of that. But throwing away a system that has, at minimum, allowed for the past two centuries of dramatic growth of civilization seems a bit hasty.


>Have you ever met someone who can do 0(ish) delay translation between languages? With no accent?

I have met quite a few professional translators that work for the European Union and all of them add a significant delay to translation. And accent.

You are probably referring to speaking different languages like a native, but that is not translating anything: When I speak Spanish(as a native speaker) I just think in Spanish, the same in any other language like German or French.

In fact, if you speak, for example, French or Italian, you need to add some "music" to each phrase. Usually when you are translating you subconsciously carry those tones over to the other language.

E.g. if I translate from French to Spanish, I will carry the French intonation into Spanish even though I am a native Spanish speaker.


> I have met quite a few professional translators that work for the European Union and all of them add a significant delay to translation. And accent.

My point is more that learning this skill is an academic pursuit that involves a lot of training.

There is a delay, (word ordering, brain time) but the top translators I've met have zero accent. (Most impressive was meeting someone, not a translator IIRC, who had not only gotten rid of their accent but learned a California American accent!)

But again, doing all of this involves a lot of study, from the biology/mechanics of speaking, to the deep cultural knowledge that is needed to translate idioms.


> Have you ever met someone who can do 0(ish) delay translation between languages? With no accent? A non-university program to teach that would end up looking like a university. To do it properly the translator needs deep cultural understanding of the target language, professional training in pronunciation, and a lot of practice.

Yes. My friend from Croatia was born in Germany and worked as a barista in the kitchen I took over when I lived in Istria. He majored in tourism at a tech school and learned to speak five languages seamlessly to get an advantage over his peers for work: German, Croatian (which is really five languages, as it spans all the dialects of former Yugoslavia, and with enough effort Czech can be understood too), Italian, English, and French. The first four he learned by the time he was 15; French he learned online using things like chat-roulette, MSN chat, and watching French TV.

He wasn't alone, either; his roommates also spoke Swedish and Finnish, as they often had seasonal work in Scandinavia for the skiing season. It's not as rare as you think. I personally speak four (five if you include my broken Mundart/Berndeutsch), but after returning to the US several years ago, I can now only speak English, Castilian, and Italian (each with an accent), as I've gotten really rusty with no one to speak with.

My Japanese has gotten to the point where I have to force myself to at least listen to a few podcasts and watch anime to refresh, as my vocabulary is completely abysmal now. I'd never say I was 100% fluent in any of them, as a language is an ever-changing system of communication, but I could get by well enough to work and communicate without ever taking a class, just by immersing myself in the culture and country.

> And why are arts programs useless? Sure art collectives can form and group buy resources, hire models, pay for trips to destinations, and bring in more experienced artists to help teach technique, but at some point again you've recreated a university.

They aren't useless as an intellectual pursuit; it's their study in the university format, where you're coaxed to limit your scope and interests, that I take issue with. Especially when you consider the immense cost of doing so, for something so widely available online for free, or at a library, if you have the desire to search for it.

You can spend a lifetime on just the myths and legends of Ancient Greece, and some do, but in university you're often rushed and get only a superficial view of what was said and how it was viewed, one that gives you the illusion of understanding its role in Greek society, and you are graded on how effectively you can repeat what was said to you.

The Arts as a means to one's own edification are incredibly rewarding, and I too am a beneficiary of that.

As I said earlier, I received honors in Philosophy for my work in a General Education class under my post-doc professor, was invited to start upper-division courses under his guidance, and was awarded an opportunity to travel the country debating moral ethics in a collegiate division--all while we were facing budget shortages in the Health Sciences, which forced me to transfer to yet another university for the third time. This, I'm told, was another way to avoid going to law school: just take the bar exam and enter a career in law, where they felt I'd be an effective litigator.

It may have been a missed opportunity to refine my oratory and rhetorical skills, but the truth is I needed to graduate for financial reasons in order to start paying off the debt I had accrued, mainly owed to family and friends who helped offset my ever-growing expenses. Also, I had enough opportunities to do public speaking outside of university due to my personal interests.

> None of this even goes into the value of the core classes. Having people educated about their own country's history, and about world history, having an understanding of economic systems, geopolitics, and mathematics, these are all worthwhile endeavors.

Agreed, but these could be achieved rather well in compulsory K-12 education if we reformed it. I used to talk with my senior-year English Lit teacher in high school in my free time about various topics, and he wished he could engage other students in class the same way, but he was limited and forced to follow the curriculum for testing standards and public funding.

His job depended on maintaining the status quo, and being so close to retirement, he had given up trying. Things weren't much different at university, either: in my experience, the only ones who challenged the system were those who already had tenure, had published journals/books, and were close enough to retiring not to need to play the game any more.

That's one of the biggest issues with education in the US: teaching toward the exam is the constraint, not a lack of funds, resources, knowledgeable teachers, or material to share. The US has outspent most of the West on education for very little in return for most of my existence--and after my personal experiences in school, and those shared by Gatto, I think that's by design and not a flaw.

> with pedagogical methods that have, presumably, been tested over time (some over centuries!) to be effective.

Pedagogy is a very loaded word with many implications about how education is formatted, specifically in a regimented, conditioned format, the most prevalent being the Prussian model. I'll refer you to the books of John Taylor Gatto [1], who received New York's best teacher award and whose views on formal education in the US (and, I'd argue, in Europe and Australia as well) are far more valid than my own anecdotal ones; they reflect the discord and division the US class system relies on.

> allowed for the past 2 centuries of dramatic growth of civilization, seems a bit hasty.

You do realize that Hellenic-era Athens, with institutions like the Lyceum and other public forums in the Agora for philosophizing, proved to be the most literate and educationally engaged society/polis for most of Western history? One that has withstood not just centuries but millennia. The fact that we still reference much of what was taught back then in law and in certain sciences (not Aristotle's work on biology, of course, but topics like math, the nature of matter, and to a certain degree astronomy), and that it still holds up today, should be refutation enough regarding what is and isn't capable of withstanding the test of time.

I think we've deviated far from what an education was in the West, and what you're describing (Prussian pedagogy) is exactly what has many feeling so resentful for being sent to a sort of mind prison, where young minds are forced to do repetitive tasks in a designated manner optimized for a system that simply will not exist in their lifetime: expendable workers for tedious manual labor in factories.

I'll agree it has certain benefits. The US space program may have been a byproduct of ex-Nazi scientists in fields like rocket propulsion, but the engineers and techs who followed at NASA for the Apollo and Shuttle programs were mainly US-educated, and despite the Soviet Union probably having more people in STEM per capita, the US had a less oppressive, state-based system that favored them and ultimately allowed it to win the Space Race. This in turn paved the way for private space companies like SpaceX, which quite honestly would not be possible anywhere else on Earth. And as ITAR requirements demand, most employees outside of the first hires when it was created are products of the US education system, as you must be a US citizen to obtain clearance.

But this is a rare exception, one I admit exists.

I think the best thing to do is admit that formal education has limited purposes for much of the populace. The sooner we free people from the drudgery of such tasks and help them dedicate their time and effort to more fruitful endeavors that benefit society as a whole, the sooner we can stop feeling the world is going to end every second, because we'll have people engaged in sectors that help curtail those imminent dangers.

It's already been happening. You need only look at things like Extinction Rebellion, which was going strong even in a pre-COVID world, because young people are mortified by the state of the planet that previous generations will have left them, and they'd rather boycott school and protest than ignore that the world they're inheriting is terrifying. I bet if you found better outlets for their energy than street demos, and used the funds that go to public schooling, they'd be far better off planting trees, focusing on renewable energy, and recapturing wasted materials in certain industries under the proper supervision--better off in every possible respect. And they'd probably create more sustainable forms of employment in the process, ones they can dedicate themselves to.

When I was a farm manager in Hawaii, I'd get applications from all over the world, from people of all ages, for an unpaid internship. Unfortunately, because of budget constraints, I could only select 2-3 from outside Hawaii because of the program/grants we had with the Department of Ag.

Personally speaking, after seeing the impact Greta Thunberg's work has had among the youth, especially in a post-COVID world where these very institutions have broken down, it has probably shifted an entire generation's attitude toward the entire Establishment, and it will force them to collectively seek answers and look for change that is simply incompatible with modern formal education. Many felt their future depended on it; now they have been forced to see how these institutions have failed them.

1: https://www.amazon.com/John-Taylor-Gatto/e/B001K7S0AE/ref=dp...


Colleges sell prestige. That's why a Harvard education is free, but their license to discriminate is coveted and charged for.


For students who arrive at a university without already knowing these opportunities exist, it's not always apparent that they do exist and may be available to them.


The clubs are definitely advertising and trying to find new people.

What people don't know is how good this stuff is, that they can actually get it done in parallel to the classes and how rewarding and educational it is.

This is definitely something where kids with parents who went to uni have a benefit, since they will know more about how to navigate the system and about things like this.

I didn't have this benefit, my parents never went to uni, but we had a lot of books and dad taught me to poke at everything :)


True. But it's even more difficult to participate if you don't go. Even my fractionally-aware stumble through the resources available at the universities I went to opened up doors that I didn't even know existed.


In recent years it has gotten a lot more obvious how massive the network effect is for undergrads. Anyone in STEM knows to get research experience, and to bombard every professor in the department by email for a position, no matter what career they might end up in.

In fact, at my uni there is fear that going online will result in students deferring for a year, since undergraduate surveys are overwhelmingly in favor of in-person classes, to the point of students refusing to pay tuition for online offerings and extending their degree plans as a result.


Most of those sound like things that would usually be mentioned during Freshman orientation, so even if students don't know about all of them, they should be aware that those kinds of opportunities exist.


They show these kind of things in freshman's week don't they?


If you pursue a non-software-engineering major, you will at the very least be forced to use expensive equipment in a required lab class.


Good for you. You were lucky enough to be in a program with a firehose of resources (mostly money, but probably also high-quality professors). There are only so many flagship programs at any given university. The overwhelming majority of people who get a BA or BS in a given field will not be that lucky. How many students do the top schools in a field graduate? How many do the rest?


You could make this argument about anything in life.

Everyone has good and bad breaks. The mistake is thinking that we can somehow create the same experience for everyone.

The entire American experiment is based on individualism. We have to have some responsibility in our lives.


You did it right, with deep engagement. I dropped out of high school to secretly enroll in college. I felt high school was a waste of time, plus my home state is overly Christian and I'm not. I did not care about a degree, I wanted an Education. As soon as I could, I went to Boston and just started taking classes anywhere I wanted with any class of interest. I started with Harvard Summer School and continued with Boston University and MIT for a period of 5 years. I got to participate in the early 3D graphics research community, was a part of DeVanney & Mandelbrot's team that wrote Beauty of Fractals, participated in the AI research of the time, wrote my own 3D animation language, and finally graduated B.U. one semester shy of 4 more undergraduate degrees. I just did what interested me until I was forced to graduate.


Yes, and it's this kind of thing that the commercial model of the university and the competitive model work against. When students see it as "I give you money and you give me the piece of paper which lets me into the elite", none of this happens.


I agree whole heartedly with this comment. Until last year, my family and I lived in the undergrad dorms at UCLA (as a professor in residence). In the end the major thing I’d advise students to do is get involved. It might be research, but there are a million other things as well, and opportunities to explore these types of activities become a lot more rare after you leave college.

I wonder if over the long term there might be a way to replace that experience too, because I agree with the OP that it’s the most valuable part.


I'm guessing that for each of your bullet points listed you needed some initial set of technical skills, and that you had those skills before even arriving at University (otherwise the first semesters would not have been wasted.) I am curious whether there were any classes that you feel were more educational than these work experiences.


I went to a technical high school, so I definitely came prepared, and I also learned a lot of useful skills during all my semesters in classes.

I didn't mean that the classes were a waste of time, but I wish I had started doing extracurricular stuff earlier.

It is true that the clubs usually do not have much use for first-semester students, as they don't have many skills yet, but the bigger clubs have their own "course" systems and will definitely invest in helping you develop.

I actually think all my classes were useful and I am happy I had them. It's just that they were not what was fun, motivating, and special about my time at uni.

My space engineering class was taught by an actual astronaut, so that was a definite highlight.


Dang, sounds cool. Which university is this?

Edit: saw below you said Technical University of Munich


I had the same experience regretting my early semesters before I discovered research. It is what most of the professors are most interested in anyway. Lectures where one is sitting in a hundred+ student hall listening should 100% be replaced by online. But more resources should be given to the higher value add like research and networking.


You know, not everyone can study fulltime and take part in all other 'events'.

Would be great if that would be the case though.


Yup, the curriculum definitely has to be changed to allow more people to do that kind of stuff, along with financial aid.

I worked 8 hours per week through roughly all of my time at uni. My grades sucked, though. And some people had to work more to get enough money. I actually had enough help from my parents not to need to work, but I liked the extra cash and experience.

Helps that uni in Germany is basically free.


I went to a state school that still had one of the best CS programs around. Some decent teachers, a few great ones (and some noteworthy awful ones), and tons of under-subscribed hardware, and self-starters bouncing ideas off of each other.

Most of the time you could find an unused computer to work on your own thing. Frequently you could find several contiguous ones for you and some friends. Some of the most capable people in my class were pulling B-C grades because they spent so much time working on their own projects instead of cramming for tests.

I had hardware problems for four classes, but only one of those was a CS class, and by then I'd written enough code that I'd just shell into a machine to get a rough draft to compile, then spend a couple hours in front of the box debugging. Get an internship or programming job on campus, kids. Programming assignments are a lot easier once you've written real code.


This experience is highly atypical. You can’t get this just by “treating” it differently. The current bullshit situation is a stable Nash equilibrium. Of course, if you should have certain advantages, you could sacrifice those and change your payoffs.


Actually, I would argue even your approach does not really represent the real utility of a university. My list includes

1. failed to start a startup about hardware tutorials in local languages

2. failed at productionizing a propeller display; they are pretty popular now.

3. failed at productionizing a gas sensor with high resolution

The list of failures goes on and on, actually. But the real utility for me was the nice cozy university bubble, where the failure of projects isn't really ... accounted for.


You did use University correctly. But your example could just as easily have been prison. Some people use prison to get off drugs, get an education, learn a job skill, or develop a moral code and purpose.

Something that can cost so much, and require such a huge time commitment shouldn't have such a high failure rate of people who fail to "figure out how to use it".


Cool idea, prison is also cheaper and easier to get into.


I'm not so sure; the high-end prisons can get pretty competitive. A lifetime of violent behavior is certainly a lot of work.


Which university was that?


It's likely the Technical University of Munich, based on the Hyperloop piece. Though I had to do a double take, because it sounded incredibly similar to some of my experiences at Georgia Tech. If you want to know which kinds of universities do these kinds of things and do them well, it's useful to look up some of the student design competitions and see who regularly attends. From recollection, MIT, CMU, Olin, Dalhousie University, Vanderbilt, IIT, RPI, Embry-Riddle, and Harvey Mudd are a subset of the places where you can have this kind of experience.


Sounds similar to Delft University of Technology too.


I know many of our teams compete and cooperate with Delft, seems like a cool place!


Go Jackets! Yes, technical schools (schools focused on STEM) offer a lot of such opportunities. The other side of the coin is that if you don't have that inclination out of high school, you might miss out on such opportunities, and vice versa.


TUM it is.


How could the essence of this comment have been captured and communicated without running the risk of coming off as braggadocious?


It's meant to be a bit braggy, insofar as it shows off how cool that time was. I am a very normal guy with no exceptional talent or intelligence. Also, I am quite lazy.

It was the opportunities I had and the people I met that enabled me to have these experiences.

It's meant to motivate other youngsters to go try. That's why I sound a bit like a cheerleader as well ;)


I didn’t view it as bragging. OP was just throwing out examples of how best to utilize a university.


I don't view it as bragging either. But it's still interesting to see a list of concrete achievements lead only to the rather vague conclusion that "University is where you come together to learn and research".

I think higher education is an opportunity to shape a person during a crucial stage of their development as a young adult. It's a time when you get exposed to many different ideas, views, and beliefs; to learn how to have a healthy debate and formulate a well-founded argument; to reflect on what you have learned thus far and what you will do with it in the near and far future. It's a time to start discovering the world away from your parents and guardians.

To me, that's what higher education is about. It's that phase between adolescence and becoming an independent adult in your own right.

I don't think that's something you can ever acquire by passively consuming content via the Internet. Some of the good and critical things do take time to learn and aren't really quantifiable.


How many degree-granting universities do you think have access to this kind of funding and network? I'm guessing less than 5% in the US alone.


It's fine as it is.


Didn't say it wasn't; it was only a question, and AFAIK bragging is allowed here... but why is it the top comment? How does that work?


I’ve noticed newer comments (sometimes as new as one minute old) can sometimes appear at the top, and that may be an effort to get people to engage with newer comments instead of only seeing the stuff that was posted first.

Not sure if that applies or even how the system works, just a thought


> University is a place where teachers and students who are interested in a field meet to learn and research together. Modern Uni has often forgotten that, but if the students treat it as such it works in this way. We have to tell students how they can get positive experiences out of the current system - by trying to do so, they will change the system from within.

I agree with you that most do not take advantage of the opportunities that exist on some university campuses. Personally speaking, I had to do my flow cytometry training at the Scripps Research Institute (where several Nobel laureates were on staff) because my campus only had one FACScan, and it was often reserved months ahead of schedule. That was the only time I ever sought anything from what my university offered, and it failed to deliver. (I won't even talk about how hard it was to get into the lower/upper-division classes for my major.)

And while this ended up being a good opportunity for me in the long run, as it was free for me as a local student and helped me network for job interviews in the industry, the truth is this was happening at the same time I was working 40+ hours in catering as a prep cook/chef. I couldn't afford to eat outside of work due to tuition, books, and obscenely high rent, and I had to stop attending on a daily basis and set up a study group where we recorded lectures on voice recorders, because gas shot up to $5+/gallon and many of us were struggling to keep up with expenses as we all worked.

Hell, I showed up to my microbiology final after an 11-hour lunch-and-dinner banquet service for AMD only hours before the exam, and I had to tell my professor I needed a few drinks during the exam just to stay awake; he reluctantly accepted, as he understood what I had to do for work.

I think you HIGHLY underestimate how broken this situation really is, because of your idealized experience, and you assume it is a choice to have our attention so grossly divided, rather than a result of the economic distress it imposes on some students.

And this is mainly because universities can charge what they do and have never come back down to reasonable costs; they ultimately act as the gatekeepers into most industries and force you to contort your life and work to suit their demands, unless you can afford to stay in university indefinitely.

I hated university, mainly because 75% of it was a total waste of my time, 15% was useful albeit dated (experimental designs based on 19th-century reactions that have been refined and outsourced by industry for a reason), and the other 10% was actually really worth it, under really passionate professors who actually gave a damn about the subjects they taught.

That's a horrible ratio for anything with such a high cost; in fact, I saw exactly one slide (which they even glossed over) in my entire undergrad covering what I'd eventually do in my profession as a lab scientist in diagnostics.

This model has to be disrupted.

Also, I didn't need a school to teach me how to wrench on cars.

I began when I was 11, and even had a stint in professional motorsports as a privateer driver, working on my own chassis while I was in university, back when ASE shop foremen and race team chief mechanics were recruiting mechanics from the pits during the bubble era. Motorsports was actually my desired profession and university was my plan B, before the financial crisis dried up any sponsorship opportunities.

I still ended up in the auto industry, working for VW/BMW/Nissan after I left the sciences, because of my experience back then. Had the industry not been so wasteful and broken I might have stayed, but I chose to go into fintech and culinary instead.

My point is that university, with its severe costs, does not reflect the ever-changing job market it acts as vanguard for (assuming it ever did), and given its costs and time demands it cannot be justified. Much less when you have such blatant corruption [1] staring you in the face.

Had I not been so multi-faceted in my skill set, I would have been like most of my Millennial cohort: tons of unpayable student debt and ever-diminishing prospects in life, instead of the immensely rich careers that have spanned four different industries I can move between depending on the project. I'm a total outlier, and to be honest, even though my opportunity at SpaceX didn't materialize in a post-COVID world and I have to go back to supply chain/logistics, I still cannot believe most of it actually happened, given what a messed-up situation I had, going to university and then being thrown into the abysmal job market of 2009. I really feel for those currently graduating into this market, I really do. They're simply cannon fodder and will soon be forgotten in the news cycle, just like we were back then.

They'll find out just how expendable this system considers them really soon, when they can't pay their debts, which had already exceeded a trillion-dollar-plus bubble in the US prior to COVID.

1: https://www.latimes.com/local/lanow/la-me-college-admissions...


This is also a common stumbling block in an art school setting. Art school is about giving you the time and materials to create art, not teaching you art from scratch. The same is true for STEM.

University is an opportunity to do rather than just learn out of a book.


I was having another one of those moments yesterday where I was questioning the value of higher education while reading a HN thread about teaching yourself CS. There was a healthy debate about whether the typical CS curriculum helps a modern software engineer at all. Some thought the degree was stuck back in the 70's. Others thought there was value, but had trouble articulating it. It seems that when the information received was divorced from the credential, the value dropped precipitously. And this is a hard sciences degree with many "universal truths" to teach. Things get even more squishy when you get into the humanities which are much more idea, opinion and discussion based.

As the linked article points out, employers often look for the degree, but do not believe it is much of an indicator of quality and don't expect new grads to walk in the door with the needed skills. As a former recruiter I think that sentiment is pretty spot on. I don't think I ever asked a candidate about the details of their education. It was just never relevant in comparison to their experience.

So that leaves the on campus experience, which I personally think is amazing. It's like living at a country club for 4 years, crossed with a huge expansion in your world from living at home and going to high school. The new ideas from the classes are just a small part of the new ideas you get exposed to from other students and campus events. Plus with the increasing independence, it's a great gentle introduction to adulthood. Unfortunately all that is not available during the pandemic.

So the content of the classes is in question, the value of the credential is in question, the cost is ballooning and the on campus experience is on an uncertain hiatus. If ever there was a time to disrupt higher ed, that time is now.


For me, the biggest benefits of the university structure were 1) teaching me things I didn't even know existed, and 2) teaching me things I didn't think were important (but turned out to be).

(1) might be solvable with an online curriculum, but (2) requires an evaluation or end-game credential to force me to do it.

Other than that, as you mentioned, an even bigger benefit was social: the shared experiences with others, and the cross-pollination of ideas from these other people.


It's not necessary to attend university to learn things you didn't know existed, or to learn things that seem less important in the moment. Whenever I've taught myself something out of interest, when I've seriously set aside time to explore a topic, it's not hard at all to expose oneself to many new concepts. This is assuming the field has a wealth of literature to read, and CS is one of those fields. Whether you have access to the internet or hardcover books, it'd be difficult not to stumble upon writing that is at least as compelling as the average college professor's lectures.

Disclaimer: I am self-taught in CS, but also have an Engineering degree. I entirely agree that the social aspect of university is the largest benefit.


Maybe so, but there is a certain level of self-discipline and privilege needed to be able to study just from books. Lots of people on CS and other courses want friendly exposure to new ideas, not a recommended list of textbooks.


That's more of a personal thing. It's far easier to me to study a technical book on something I know is useful compared to cramming for exams, doing nonsensical assignments, forcing myself through a specific pace that fits a semester, dealing with professors that never seem to be having a good day, wasting time commuting to the faculty (assuming you don't live in campus), etc. I can do that for money, but having to pay and be less efficient at learning than just googling stuff is what drove me to drop out.


Same here, you summed up succinctly why I also have not followed through with a degree. The academic system is now more of a hindrance than a help, compared to studying independently and maintaining connections outside academia. Academia has failed to adapt to modern learning.


Do you view the ability to study and learn from books as a privilege?


Standard teacher education says that each student has a learning style that is most effective for them. https://www.learning-styles-online.com/overview/

A person who learns well from books in a solitary setting is at a huge advantage compared to someone who needs a social-kinesthetic style.


Although "learning styles" are widely believed by teachers, there's no evidence that they affect learning. They're basically horoscopes--vague enough to make sense on an intuitive level, but false [0]. In practice, tailoring lessons to this belief reduces effectiveness [1].

[0] https://www.theatlantic.com/science/archive/2018/04/the-myth...

[1] https://www.apa.org/news/press/releases/2019/05/learning-sty...


It certainly is in comparison to someone raised illiterate, or in any number of unfortunate situations. But in comparison to the privilege of a university education? I'm a bit surprised by the suggestion. If self-taught knowledge is really the marker of privilege in 2020, I'd expect it to be far more attractive than going to university by now. On the contrary, it is not, because learning from books has a much lower bar. You can't use it to select for the well-connected.


Going to university requires more privilege than learning from a book.


One of the benefits of a college setting would be the gathering together of young intelligent people and exposing them not only to new ideas but how to evaluate, value, and compare ideas for their relative worth. I wonder how much open and free debate of ideas even exists anymore, having given way to the inculcation and indoctrination of "correct ideas." I have long believed that the best way to eradicate racism is to allow racists and anti-racists to have open Lincoln-Douglas style debates on college campuses: allow racists to explain their ideas and positions and the tenability of their position will be exposed for the naked fear and hatred that it is. While not quite as crucial as the previous idea, debates of ideas, languages, operating systems, etc can and should happen as part of a CS curriculum. Regardless of the field we end up in, the need to be able to state one's position and defend it against a counter-proposal is something common to all fields (not just law).


You’re making the incorrect assumption that racists are arguing in good faith and are willing to change their minds after a well thought out intellectual, level-headed discussion. But history has shown us this almost never happens. The majority of them are basing their hatred on emotion and not rationality.

Instead, racists use their platforms to intimidate at-risk minority groups and intimidate them into silence. White supremacists are actively trying to silence those they disagree with, often with violence, not trying to have an intellectually honest and open discussion.

Read about Karl Popper and The Paradox of Tolerance.


I assume you include anti-white racism? Because there seems to be a lot of it about at the moment.


There’s no such thing as anti-white racism. Racism is prejudice or discrimination backed by institutional, structural power.

Therefore, a black person prejudiced against white people (as an example) is meaningless because black Americans have historically had very little power in the United States.

Racism by white Americans has historically been backed up by the power of every institution in the United States, which has often meant state-sponsored violence against people of color.


I think most people recognize that this is the "new definition" of racism, or sometimes people refer to it as structural racism. But when people use the term racism, they usually mean the standard dictionary definition, which is about being prejudiced against someone because of their race.

It's probably better if people used "structural racism" to define what you mean here. Similar to how "sex" retained the original definition of sex, and "gender" was used for the social construct. If people tried to say "There is no such thing as being born with a binary sex, because sex is a social construct" then it would be equally confusing.


The "old" definition of racism was something like "having an opinion about someone based on their race" and was essentially indistinguishable from the concept of stereotyping, where someone "judges a book by its cover" and makes assumptions based on what someone looks like.

Today we understand that that kind of stereotyping is a universal attribute of human cognition and therefore not explanatory. That is, you can't get from that definition of "racism" to the society we see around us today. Every human does it--everyone is "racist" by that definition--yes, including black people. And yet black people are still far worse off, on average, than whites in American society.

So that definition of racism is useless. It doesn't explain why things are the way they are, and we can't change the way our brain works at such a fundamental level anyway. It has even become a joke--see Stephen Colbert's riffs on "I don't see color."

So yes, the concept of racism has evolved to include power. Without power, stereotypes don't have an opportunity to cause actual harm. You have to consider power to understand why American society, on average, treats white people and black people so differently.


Aha! And so we stumble across the Orwellian redefinition of language: if you don't like a word, just change its meaning until it suits your agenda.


What counts as "changing the meaning of a word" and what counts as a higher, more complex, and more abstract understanding of (originally) lay concepts? For example, if the common conception is that "math" means something like arithmetic, and mathematicians point out that it has a much broader, abstract, complex, and encompassing definition within their field of study, are the mathematicians engaging in Orwellian redefinition of language?

The academic feeling among people who study these topics (sociologists, anthropologists, philosophers) is much closer to what GP defined as racism than the definition you (probably) had in mind.


How many of said academics do you honestly, seriously, truly think don't have leftist political tendencies? You seriously expect to get a completely unbiased opinion out of them? Come on. No matter your political leanings, surely you are willing to admit socialist beliefs prevail in non-STEM academia?


By this definition, if Adolf Hitler had gone to live in Israel, a majority Jewish democracy, he would no longer be a racist, because his “prejudice” was no longer “backed by institutional, structural power” - have I got that right?


No, it's not like that, GP has the wrong conception of the topic. Racism is the thought or ideology alone. This means that the hypothetical Hitler in Israel still fulfils the definition.

Conflating racism with the power to put the supremacist wishes and ideas into practice is somewhere between pointless and negligently dangerous.


As a foreigner, and a frequent traveler to the US for some 20 years, I find it hard to believe it is still present in any more than a marginal way at US institutions. Can you point me to some examples?


One example is how the police are far more likely to shoot and kill unarmed people of color in the US[1].

Another example is how people of color in the US are far more likely to be incarcerated than white people are, especially for non-violent offenses.[2]

There's a reason the George Floyd protests have been so intense: his death was not an isolated incident, but part of a pattern of continuing discrimination against people of color in America by those who control the power structures.

Sources:

[1] https://www.washingtonpost.com/investigations/protests-sprea...

[2] https://www.sentencingproject.org/publications/color-of-just...


Some contemporary evidence would make this argument stronger. (I say contemporary because I guess we all understand what Hitler did.)

My feeling is that a "platform" on Twitter where you abuse and troll people is very different from a platform where you have to step up in public, make a speech, have people challenge your points and so on.


> allow racists to explain their ideas and positions and the tenability of their position will be exposed for the naked fear and hatred that it is

People are pretty susceptible to passionate speech by charismatic figures; much more susceptible, in fact, than they are to facts. This just seems like an opportunity for racists to win hearts and minds.

And anyway, there's tons of open and free debate on college campuses (at least, there was, when they existed in The Before Times), but generally it's debate about methodologies, not ideas. Everyone's more or less on the right (which is to say, left) side of history, they just have different ideas about, how to, for example, dismantle systemic racism, or end homelessness.

Though probably the more frequent topics for debate are more along the lines of beer pong strategy.


> People are pretty susceptible to passionate speech by charismatic figures; much more susceptible, in fact, than they are to facts. This just seems like an opportunity for racists to win hearts and minds.

The unstated assumption of this premise is that some people (never us, always others) cannot be trusted to make good decisions, and thus need to have their decisions made for them, for their own good. It advocates for human subjugation.


"The unstated assumption of this premise is that some people (never us, always others) cannot be trusted to make good decisions, and thus need to have their decisions made for them, for their own good."

You can make good decisions if you have full information and you put a lot of time into it. I hope you're aware that not all racists are dedicated to the search for truth. Sometimes they tell partial truths, mislead, and lie.


Saying it's a bad idea to invite racists to speak publicly about their racist ideologies on university campuses is human subjugation?

ok pal.


The OP was endorsing debates of ideas, not one-sided passionate speeches. In fact the example of a Lincoln-Douglas format was suggested.

https://en.wikipedia.org/wiki/Lincoln%E2%80%93Douglas_debate...


The “debate” on racism was lost a long time ago. I would hope it’s well understood it’s a net negative for society. I don’t think debating it over and over again as if there’s merit to racist ideas is remotely a good idea.

It’s like letting the flat earthers continue to teach science classes even though we know the earth is round.


> I would hope it’s well understood... [that racism is]... a net negative for society.

Sure, but there are many racism-adjacent topics that are not so clear cut. For example, affirmative action is a "racist"--in the sense of discriminating based on race--policy, but there are serious, intellectual arguments made both for and against it. I put "racist" in quotes there because there's also presently a debate on the changing definition of racism: is it racist if a policy or stereotype is race-based but well-meaning and positive? Is "power" required, and does this immunize minorities from being racist? Is it racist to criticize a long-held cultural belief or practice such as circumcision?

I think all of these topics are worthy of open and honest debate. (Incidentally though, I think that the best way to shut down flat earthers and other conspiracy nuts is also through open debate...)


I was referring to OP's point that there's somehow an equal place for racism and anti-racism in debate. Conversations about race are difficult because inherently racist people are like a bad-faith black hole.

Also, I would love to see a single example of a serious flat earther changing their mind.


"having a debate" != "teaching a class"

> The "debate" on racism was lost a long time ago.

When is the last time you asked someone who is a strident racist to explain their ideas in the form of a full essay or 30-minute presentation? Have you ever seen this done without the implicit or explicit assertion that the target of their animus is a lesser human or perhaps not human at all? Anyone with two brain cells to rub together -- including and ESPECIALLY impressionable young people raised in racist environments -- can see the absurdity of such an argument. The very act of exposing the ideas to open discussion and forcing the proponent to defend them with logic and reason is what ultimately undoes them. Will some still recalcitrantly cling to such ideas? Sure, but by that point it will be clear to anyone paying attention that it's not for a reasonable purpose.


We have Youtube, and you can go there and watch Bill Buckley make all the same stupid assumptions racists do and have them demolished by James Baldwin.

The problem with modern debates is that they are not honest representations of the subject matter. They are exercises in tactical bullshit, meant to be rapid-fire, points-winning pseudo-arguments instead of constructive, intellectual discussions of the subject matter.


Racism is anti-logic. Full stop. It's an ideology that reinforces itself only through the continued rejection of information. Racism is used extensively to study the backfire effect in psychology.

The only way for such an absurd system of beliefs to exist is to either rely exclusively on morally bankrupt individuals, or to heavily lean on confirmation bias and reject any anti-racist ideas.


I mean, this is not true? Impressionable young people raised in racist environments are impressed by racist arguments - whether emotion-based ones or pragmatic ones.

Simply put, one's own supremacism feels good. Even if you know it is based on a false argument, it can still feel good. Compliments feel good even if false. It is definitely not perceived as a threat.

Also, my own supremacism is practically beneficial to me. I gain from it even if it is based on false arguments. That makes supremacist arguments much easier to accept.


Racism appears in many forms; identifying it is not as straightforward as it might appear to be. Debating racism would probably include many of the ways in which racist behaviors manifest themselves, which isn't clear from the get-go. E.g., we now know that certain policies such as redlining, the war on drugs, and harsher criminal sentences affected minority communities disproportionately. Were those policies racist?

The current President doesn’t want immigrants from African countries but is ok with immigrants from Norway. Is that racist?

And so, the debate over racism won’t be over for a while.


Those are great topics of research. But platforming racists as just people with different ideas is a false equivalency.


I agree. Debating racists does not necessarily mean giving them a platform though. They may already have one. The current US President is well regarded as a racist but he has a platform regardless of his racism. This is true of others as well; it’s not just KKK members who are racists, it can be anyone, especially anyone wielding power via institutions.


What would you call racism, then, if not a different idea?


What's a racist, and who decides?


I think understanding the history of racism and why it's attractive to the masses is a great start in continuing healthy discourse over the politics and economics of the matter. You might think that there is no merit, but you can only convince others to rid themselves of racist ideas if you understand where they are coming from.


"What is the dignity of a human being?" and "Who is a human being?" are the two questions that fundamentally undermine a racist. It's quite possible that many racists hold their opinions out of ignorance rather than malice so an open debate is an opportunity for meeting someone in dialogue and remedying their ignorance. Those who are exposed to the truth and reject it pass over into malice and those people are their own worst enemy: leaving them in their self-inflicted echo chamber of rage is the worst possible fate that could befall them. It's not what I hope for racists, but I do pity those who end up in such a state.


> but you can only convince others to rid themselves of racist ideas if you understand where they are coming from.

Absolutely disagree. The appeal of racism is well understood; its only effect on society is to subjugate people and separate them along racial lines.

> healthy discourse over the politics and economics of the matter.

"healthy discourse" implies that there is some kind of equivalency between racist and anti-racist; to believe that is to be willfully ignorant of the situation and power structures that exist in the US today.


I'm glad you made the flat earther connection here. I was thinking the same thing and you validated my thought process


> eradicate racism is to allow racists and anti-racists to have open Lincoln-Douglas style debates on college campuses: allow racists to explain their ideas

Allow people to point at one of the nonwhite people in the audience and say, "I don't think this person belongs in the country, who's with me?"

Because that's what it is: for one side, it's a debate with no stakes, but for the other it's their exclusion from the university and all the things that are subject to gatekeeping by getting a degree. Potentially for being allowed to live in peace, or live in the country at all.

Letting that kind of speech be platformed is just a more subtle way of hanging a "whites only" sign on your university.


Do you think there's any way universities can analyse living racist ideas, like a virologist analyses a deadly disease, handling it in an isolated lab and only through thick gloves?


They can and they do .. but it has to be done in the third person and not by advocates. Effectively this is what "post-colonial studies" is about.

The opportunity existed to put statues of racists behind glass, for example, away from their positions of public prominence and celebration. I don't think too many people would object to the university of Oxford having a statue of Cecil Rhodes that they kept in a cupboard.


> whether the typical CS curriculum helps a modern software engineer at all

The purpose of a computer science curriculum isn't to train a software engineer, it's to train a computer scientist.


Yes, I absolutely agree, which is why the popularity of the degree is so utterly curious. Companies do not need many computer scientists and the students largely want high paying jobs, not to continue in academia pushing the bounds of theory.


Most degrees in most subjects go in-depth on topics you probably won't ever use in a bog-standard industry role.

Do I regret it? No, it was fun, interesting and it's cool when this hidden knowledge does come up at work.


True, but it seems traditionally many schools have offered Computer Science, but not any other type of Software Engineering or programming degree and CS became the de-facto "I want to be a programmer" degree.


I've hired people, but wouldn't say I have a great deal of experience in the area, so here's an honest question for you: do candidates with a college education perform better at their jobs?

In my (limited) experience the answer is definitely yes. Not only do they have more exposure to the concepts we need them to understand (if not necessarily the specific tools), we have just had better luck with them as employees in general. They are more likely to do small things, like all the administrative overhead that's part of a job: responding to requests in a timely manner, having good written communication skills, showing up to meetings on time, etc. I am the first to admit this is a small sample size and a lot of the cultural fit could be that the workplace is already filled with college educated staff, but I'm curious what your experience is after the hiring process is over.


I never shied away from interviewing a person without a degree, and honestly tried to never count that as a mark against a person since making it in the professional world without a degree is a challenge that I think shows a certain character. A lot of doors are just closed by default and that can be tough on a person's sense of self-worth.

That said, I think the degree holders were better candidates usually, but I strongly suspect it wasn't because of the education. Colleges attract people who are good at carefully following complex instructions, have a healthy respect for authority, and can delay gratification in the extreme. They are also experts at navigating a system from within. The degree acts to filter out people who are too independent to fit well within a hierarchy.

I don't think I ever noticed much difference between different kinds of degree holders aside from a broad separation between analytical degrees and humanities degrees, which again might have had more to do with the initial filter than anything taught in the classes.


> have a healthy respect for authority, and can delay gratification in the extreme.

What do these qualities have to do with being an effective software developer?

> The degree acts to filter out people who are too independent to fit well within a hierarchy.

Wouldn't people that fall into the category of "free-thinkers" make for better engineers — or at least engineers that are more capable at creating novel solutions to hard problems?


Delayed gratification means that you are willing to work on tasks that are not immediately fun, or are frustrating, or when the manager is not looking. It means that you are more likely to keep doing tasks whose benefit is in the future (like writing tests). It means that you don't need a pat on the back and managerial supervision so often to stay motivated.

Lack of healthy respect for authority typically means more fights about who will do which task, and a greater likelihood of unilaterally refusing to follow random parts of coding standards. It may mean refusing to open or close Jira tasks. It may mean that analysts have a much harder time convincing that one person to do what is needed, again.

Software development is typically strongly team-based work and requires quite a lot of coordination among people with opinions. When one person decides he does not accept the authority of an analyst, architect, or manager, yes, there are consequences for the whole team.


There's a difference between actual respect for authority and knowing when to shut up as well. I could not like management but I have nothing to gain from getting into a fight.


If we cannot have your love, then your fear will suffice.


One isn't always creating novel solutions to hard problems. Sometimes there's boring work that has to get done; respect for authority (understanding that the boring stuff needs to be done) and delayed gratification (working on the boring stuff) are thus useful.


> Wouldn't people that fall into the category of "free-thinkers" make for better engineers

No. However, "free-thinkers" can launch their own companies. It is like people saying that Steve Jobs would not get hired at Apple today, as if it were some great insight. It is simply that a company is looking for a set of skills and is willing to pay money for them. SJ would rather launch a new company "Apricot" than work at Apple.

Besides, 'free-thinker' is just a euphemism and part of the LinkedIn clichés nowadays. And the definition of "free" is restrictive anyway, bound to cultural and societal norms.


How much of the ACTUAL scope of work needs to be novel? I'd say a limited amount. There is a LOT of work that is just running ops. For example, a billing system doesn't need to be novel it's been pretty well worked out many times.


This may be the case, but it doesn't mean that those skills came from college.

There's a strong selection bias at work - people with good administrative skills and communication skills are much more likely to go to college in the first place. We need to be asking if those people needed to go to college to develop those skills.

In my experience at university, mostly the people who were the strongest - the people you wanted on your teams - were strong right from the start. They had some inherent skills which university developed and honed, but didn't create.


> There's a strong selection bias at work - people with good administrative skills and communication skills are much more likely to go to college in the first place. We need to be asking if those people needed to go to college to develop those skills.

Agreed, but a university education reveals those people to me in a way that other evaluation criteria I have will not. I have lots of tools to evaluate whether a person has basic skills, but relatively fewer to determine whether they can do that other stuff. This is especially true as the reference and employment verification process is more limited than it used to be. Most companies I call can't tell me "don't hire this person" like they did 15 years ago.

It doesn't help that, working for the government, I have limited options once I do hire someone. I am under pressure to make the right decision the first time since unmaking that decision is a great deal of painful time and effort. I suspect a number of mid-level managers like myself are in that position, which is why the degree continues to carry so much weight.


I've worked with a fair few different people over the years, and I've never seen a correlation between a CS degree and skill.

I'm biased in some ways as someone with a philosophy degree, but I've easily run rings around CS grads, and seen some who have absolutely no capacity to program. Most of the non-CS programmers have A degree, just not a CS degree. Often a technical one - I've known chemistry, engineering, and architecture degrees.

I think you're right in guessing that it's your culture pushing out the good non-CS grads rather than them being worse. It'd be interesting to hear whether you do the typical American whiteboarding algo nonsense that YAGNI.

This generation might be different, as CS became an obvious choice for the sort of people who may previously have chosen a wider range of subjects. I think it's a shame, as ultimately I see a CS degree as a waste of time; I've studied the concepts myself thinking myself deficient and never actually found any value in the lessons apart from intellectual curiosity.


> I think you're right in guessing that it's your culture pushing out the good non-CS grads

The parent post didn't mention computer science.


As another humanities degree undergrad they really let the side down, didn't they?


I wasn't necessarily referring to CS, but the parent comment was, so I get where you are coming from. It does reveal how our biases in this area impact our views, however. I suspect most industries that are hiring doctors, engineers and scientists will not take anything less than a college degree, preferably an advanced one. I think we find this discussion acceptable in CS because it's a field where the stakes for failure are relatively low, rigorous testing is possible and -- most importantly -- self training is possible due to the accessibility of tools. This isn't the case if you are building a bridge or performing a medical procedure. In those cases you likely want every advantage you can muster to weed out candidates, whether that's fair or not.


It depends.

There's probably a survivorship bias of some sort.

You're probably not interviewing for someone to write a bleeding edge piece of software, so you're probably looking for someone who's capable of showing up, writing some code and not killing anyone.

I've found the reason people got the degree to be a bit more telling. The best performers were those who felt their knowledge was limited and wanted to gain more of it. The worst performers were those who got the degree and thought the whole programming part was behind them.


Here in Poland there's a split between Computer Science (mostly lots of theory and fundamentals) and Software Engineering, which gives you some of those as well, but also prepares you for a typical programming job in a typical company.


It's the same here in Sweden. We have degrees called Högskoleingenjör (which is three years) and Civilingenjör (which is five years) that put more focus on practical and industry-relevant courses than the more science-focused Bachelor of Science and Master of Science programs do.


I'm pretty certain that's true for a lot of countries other than the US. It makes sense after all.

Don't expect American universities to make any changes to their curricula to support people's actual objectives, though.


Same here in Sweden, and I think also the rest of the Nordic countries.


Many of these arguments boil down to "those damn lazy kids these days". Being in a learning-focused environment has value. I have a dual CS/History bachelor's degree. The analysis/rhetoric and writing skills that I built on the History side of my education are at least as valuable to me as the CS skillsets.

I'd go back and challenge the original implication of the article, that the university is absurd, like a CD. Characterizing a CD as absurd is pretty short-sighted -- it's the standard buy-vs-rent argument. If you've ever followed a sport that has been in the middle of a cable contract dispute, the downside of the stream is obvious -- broadcasting is awesome until it isn't.

I still buy music CDs or iTunes tracks. Access to an unlimited catalog doesn't do much for me, and owning perpetual rights is, to me, a better value, as it is for many things. That doesn't mean that streaming is "bad", just as online learning isn't "bad". But universities aren't obsolete because other options exist.


I just ripped a new CD yesterday. I get a holistic album experience with all pregaps preserved. Something streaming and digital downloads won't provide.


> employers often look for the degree, but do not believe it is much of an indicator of quality and don't expect new grads to walk in the door with the needed skills

In other words, requiring a college degree (even in an irrelevant field) of applicants is a form of class gatekeeping for technical professions.


Feels like that sometimes, doesn't it? Then you find an employer in said technical profession who understands how their field has become inefficient at sourcing new employees and they seize on the opportunity to open the gate and let 'em in, thus creating a competitive advantage in the hiring pipeline and diversity in culture.

The real tragedy is when all the others who can't/won't/don't open the gate cry, whine, and complain about not being able to find talent. Sometimes, when they are really pathetic, they have the gall to go to their local economic development agent with their hands out for a subsidy for hiring assistance.

So those who adapt to the times end up paying taxes into a system that, in turn, gives that money (in part) to those who don't. Perverse much?


Honestly, the CS degree is weird.

Some classes are super basic and some classes are Nintendo hard. The classes that are in the middle of the road are very few, but those are the ones that are most useful.

Operating Systems, Data Structures, Parallel Computing, and Software Engineering II (basic project management) were notable for being interesting, difficult enough to keep you from slacking, and endlessly applicable for programmers.

The CS degree isn't for programmers as my old professor once pointed out in a high level class: "Computer Science is about Programming like Biology is about pretty girls and microscopes". Those looking to go into research can take the CS coursework and probably get more use out of it.


I've always felt that a CS degree (at least based on the curriculum I went through) is really three separate and only partly related areas of study crammed together:

- Teaching people how to program for the first time. Most of us had never actually programmed before, so you've got to have the intro classes that both teach _how_ to think through breaking a problem down into steps, and also the specific syntax of expressing those steps in a specific programming language, moving on to the typical concepts like OOP and classes and whatnot.

- Actual CS. Data Structures, Algorithms, Operating Systems, Compilers, and the various more math-ish / "fundamentals" content.

- Software Engineering. The practical, hands-on work of actually writing meaningful code, working as part of a team, requirements, using version control, testing, etc. Some classes like specific web dev topics or something probably fit into here too.

Granted, I was going through this almost 20 years ago, and I only have my own experience to base this on, but it seems like this is a reasonable broad description of what a CS degree is trying to cover.

I feel like that gave me a good fundamental background, but at the same time 99% of what I do on a daily basis now is all stuff that I either learned on the job, or in my own free time.


> I was going through this almost 20 years ago

10 here.... your comment holds up.


CS is more akin to mathematics or physics.

What CS needs is an equivalent engineering program. It is amazing to me that software engineering is not considered a valid engineering discipline like civil or mechanical engineering (which apply math and physics).


As a former tenured professor I've been watching all of this with a bit of empathy, anxiety, concern, and schadenfreude. Many of my best friends and colleagues are at the university and it still forms a huge part of my life.

In speaking with many students and colleagues, one thing has struck me, which is that many of the current undergrads are enrolled because they want to get away from home. They want some true transition to adulthood, where they live on their own and develop into an independent adult with a vocation and adult identity. This is maybe what I've heard articulated here and elsewhere about being exposed to new ideas, etc. but maybe isn't quite the same. It's something I never really quite appreciated the importance of until recently, but looms large in their minds. Many of them, even though they're taking courses "off-campus" and online, are still in their college town living in their apartment, and so forth. They didn't go home, they just stayed put.

The analogy with books and music is telling because for a while there, too, there was a lot of hype about how the internet would lead to some radical reorganization and disruption of those industries. And in many ways it did. But many of the power inequities, the monopolies, and so forth are the same, and the fundamentals are the same. We've migrated a lot from radio and physical media to streaming services, but the players are still there. I suspect something similar will happen with universities; we'll just see a lot more diversity in what form an undergraduate education actually looks like. For some it will involve 4 years in a dorm / on-campus setting. For others it might be more eclectic. There are also some things that just can't really be duplicated at home: an undergrad isn't going to go out and purchase a mass spectrometer for the most part, or PCR equipment, or have a storage room full of embalmed anatomy specimens to dissect.

One thing that's important to be mindful of is that this on-campus experience many of us remember is in many ways evaporating anyway. I don't mean to say that's ok, or acceptable, or whatever, but sometimes we talk about the experience of being on campus and discussing things with others in person, forgetting that a major societal issue right now is that what used to be the norm for everything increasingly is not. So we have these chats online, on HN comment sections, or over IM, or videoconferencing anyway, even if you're not in college. Whatever might be lost from the on-campus experience is being lost everywhere to some extent; I point this out only to suggest that if we're ok with it in some domains, why should college be special? Or is college the canary that we should be paying closer attention to?

I agree with the perspective of the author of the article that something fundamental needs to change, but my guess is 10 years from now everything will look very different and the same at the same time. I also think there's many problems with universities and higher education that have nothing to do with the cost-benefit calculus in some abstracted sense, but rather to do with how it should be paid for, whether universities are funded well enough, whether they're getting those funds from the right places, whether administration is too top-heavy, whether employers and society abuse interpretation of degrees, and so forth and so on.


"...where they live on their own and develop into an independent adult with a vocation and adult identity."

One can see where you are coming from, since these desires are pretty normal around that age, right? I think that the trade-off here, however, is twofold:

1) That lifestyle is expensive and puts many young adults into debt before they've even learned and lived financial independence.

2) It delays the inevitable hard work and time that we all must put in contributing as a net positive to the real world economy.

Add 1+2 together and the output in large numbers yields people in bondage lacking the mental fortitude to actually sustain and prosper independently as an adult over the course of their lifetime.


I don't disagree with these points, but I do think many individuals entering college are looking for some kind of extended [rite of] passage or something into adulthood, and I do think college provides that for many.

I don't necessarily think it has to be that way, and I don't necessarily think it's the best way, certainly not for everyone, but I think for a lot of people it provides one way. Right now I think it's a broken path, but I'm not sure what other paths there are to replace it at the moment, given that employers (for the most part, maybe not entirely) seem to be abandoning that role.

To me the bigger underlying subtext in a lot of these discussions is the vacuum that currently exists in terms of providing a safety net for people as they find their way. I think this could be provided by employers or something but I really don't think it is at the moment; colleges might also be flawed in being too expensive or not really delivering on their promises or something, and maybe we're at an inflection point. But if you think of spaces where society is ok with people figuring things out, or learning, or not really being fully developed yet, college is a major place. Outside of that it increasingly feels like people are treated as machines, interchangeable and replaceable, with little to no sense of potential or anything of that sort.

Believe me, I feel like universities are very broken at the moment in many many ways. I just don't really see them going away in the near future, even if they are remade, and I think there's a good opportunity for society to really remake them in a way that's more useful, whatever that is. It may involve restructuring what a college education entails, or paying for it with public funds, or whatever, but that opportunity is there.


Thanks for sharing your reply/viewpoint. There is no single correct answer, and it's not so binary; I believe self-employment and entrepreneurship are underrated in this dynamic.


This is a wonderful comment, thank you.

The most valuable part of college for me (I can say now a decade later from graduating) had nothing to do with the content and everything to do with the experience and connections I made.

My professor helped me land my first job. My classmates ended up in the industry and helped connect me to opportunities and some became (so far) lifelong friends.

This would have been much less likely if I took online courses while living with my parents.


> One thing that's important to be mindful of is that this on-campus experience many of us remember is in many ways evaporating anyway.

I read here that you're not talking about the pandemic specifically, but about the increasing digitization of society as socializing, work, entertainment, and education all go online. And you're right: this is a huge social issue that connects to mental health, social cohesion, political life, social skills gaps, and how we relate to each other. It's a change we are crashing into without any idea of the consequences. And yes, maybe digitizing our formative coming-of-age experiences is a step too far, or maybe we just need VR tanks.


Thanks for this comment! Universities are where the majority of scientific research is conducted today, and I hope that won’t go away anytime soon (severely restricting immigration and funding might change that). Many of the professors I worked with considered teaching to be an obligation; they would rather spend their time on research (which doesn’t mean they don’t teach well; some do, others don’t). As long as we continue to fund research, universities themselves won’t be going anywhere.


> As the linked article points out, employers often look for the degree, but do not believe it is much of an indicator of quality and don't expect new grads to walk in the door with the needed skills. As a former recruiter I think that sentiment is pretty spot on. I don't think I ever asked a candidate about the details of their education. It was just never relevant in comparison to their experience.

Universities aren't tech schools; their purpose is not job training.

My understanding is that employers used to be comfortable providing pretty extensive job training, and they probably should start doing that again. IMHO, it's wishful thinking to expect any kind of school to produce new grads that can "walk in the door with the needed skills," especially in a field as diverse as software engineering.


When Netflix appeared, more movie watching happened. When MP3s appeared, more music got listened to. But when my school broke for COVID, the amount that got learned dropped sharply. I teach at a college; my wife teaches at a high school, and it was the same there too.

OK, maybe instructors are not up on what to do; for sure, I'm not. Maybe it is that the tech is not there yet; it all seems to me to be so static. But perhaps the great majority of 18, 20, and 22 year olds aren't going to learn as much material at home while also working at Target as they would on campus with someone teaching them. I'll venture that people who have significant socioeconomic and cognitive struggles are certainly going to do worse in a climate where they are on their own. So I worry a lot about what would replace in-person college.

The top post about how a person can make and do wonderful things is ... wonderful. But very, very few people will do it. Not disparaging that poster at all, in fact quite the opposite, just making a historical observation.

> So that leaves the on campus experience, which I personally think is amazing. ... The new ideas from the classes are just a small part of the new ideas you get exposed to from other students and campus events. Plus with the increasing independence, it's a great gentle introduction to adulthood.

Yes, it is great to be given space to grow.

> employers often look for the degree

Yes, having access to a good job is an important part of what people need to get. Of course, there are other things. Some understanding of civic responsibility comes to mind in the current climate, for instance.

Even if we just focus on work, say with a CS degree, evidence to employers that candidates passed being expected to write well in a political science class, to understand something about data in a statistics class, and to have an overview of their field of speciality in a theory class, are things that they may be glad they have in an employee a decade from now. I get that many employers may be looking to what they need now, but I'm only saying that it is not dopey for at least some of them to be thinking along those lines.


> indicator of quality

Any other hiring managers have difficulties with Bachelor of Arts in CS versus Bachelor of Science?

The BA applicants don't seem to have the logic and concept knowledge that the BS applicants have.


> ... was a healthy debate about whether the typical CS curriculum helps a modern software engineer at all. Some thought the degree was stuck back in the 70's. Others thought there was value, but had trouble articulating it.

I'll try a resolution of this dilemma: A few years ago I was talking with a university CS (computer science) prof, and his fast reaction was, IIRC, "The purpose of computer science is to identify the fundamentals of computing." I.e., make a scientific study of computing and find its basic laws.

E.g., there is the subject of computational time complexity, which for a given problem tries to find the fastest possible algorithm. It turns out this is not an easy subject; e.g., we are still struggling with the question of P versus NP, and a lot of well-informed people believe that solving that problem would be a biggie.

My guess is that the issue is cultural in that Einstein's E = mc^2 was seen as fundamental to the universe and other fields of research would like to get results also at that level of importance, fame, prestige, whatever.

So, apparently in computer science, making a scientific or engineering study of software engineering mostly loses out to pursuing some version of E = mc^2 for computing. E.g., one such result is the Gleason bound (in the Knuth volume on sorting and searching in his The Art of Computer Programming), which shows that Ω(n log n) comparisons are required in the worst case to sort n items by comparing items (records, keys, ...) two at a time.
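For anyone who hasn't seen it, the flavor of that result can be sketched with the standard decision-tree argument (my own illustration, not Gleason's or Knuth's derivation; the function name is mine): a comparison sort must distinguish all n! input orderings, so its worst-case comparison count is at least ceil(log2(n!)), which grows as n log n.

```python
import math

def comparison_lower_bound(n: int) -> int:
    # A comparison sort is a binary decision tree whose leaves are the
    # n! possible input orderings, so its worst-case depth (number of
    # comparisons) is at least ceil(log2(n!)).
    return math.ceil(math.log2(math.factorial(n)))

for n in (4, 8, 16):
    # Compare the information-theoretic lower bound to n*log2(n),
    # the order of growth of mergesort's comparison count.
    print(n, comparison_lower_bound(n), math.ceil(n * math.log2(n)))
```

Since mergesort makes roughly n log2(n) comparisons, the bound shows no comparison-based sort can beat it by more than a constant factor.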

One response is that if medical schools were run like research university computer science departments, then no one would want to go to a hospital no matter how badly they hurt! I.e., a researcher deep into P versus NP might never have had occasion to use containers, Lisp, Rust, or even SQL.

Of course, in medicine we have some medical schools in university research teaching hospitals where some of the best medical research is done. And it is easy for a practicing physician to conclude that mere clinical experience is not enough for the needed progress and that fundamental research in biochemistry, cell biology, DNA/RNA, the immune system, etc. is necessary, crucial, and really the best hope for the needed progress.

So, it is fair to say that software engineering could be taught like medicine, not just as an introduction to research but for practice: clinical practice, professional practice. E.g., if you want to be the CIO of a large bank, insurance company, shipping firm, on-line retailer, etc., then you need to know how to manage an information technology (IT) department with 2000 employees to do well on data security, reliability, performance, software quality, software documentation, software system designs and revisions, growth, system management, etc.


> as employers embrace new skills-based certifications, many students may question the value of the traditional four-year degree

Of course employers will embrace skills-based certifications, as they essentially offload the cost of any specialized training to the employee. Until now, it was not uncommon for the employer to pay (directly or indirectly) for the additional training.

But I'm not sure if getting a specialization certification without having a solid background on the topic is a good idea. It will be like the hordes of self-taught plugin programmers that made the Wordpress platform a liability to run.

In the end, the problem is not the value of the four-year university degree. The problem is its price. Pretending that the price can't be lowered, so we have to investigate alternative modes of higher education, is both untruthful and undemocratic.


It may well be impossible to walk back what has happened to universities. Large institutions move slowly. There are too many stakeholders. Even when they can change, it's inevitable that forces will resist the change, resulting in half-measures.

So it may well be more expeditious to start over. There are many other models, and if nothing else, whatever success those models have will provide more capacity. That may also draw away students and force them to lower prices.

It still won't be easy. The price rise is driven in part by a mythos about the value of the four-year degree. People will continue to see education in any alternative as "lesser". And it doesn't have to be "lesser". I am a big fan of the work being done at Signum University, which has been working for years to establish an online model that's more than just a MOOC. It's actually closer to what a liberal arts education was supposed to be, the thing that gave four-year universities such prestige for producing well-rounded students. They don't teach computer science; they teach literature. And that means teaching dialogue, discussion, and insight, rather than being a glorified vo-tech school with a bunch of humanities thrown in. I'd love to see CS education move to a place where it taught how to craft software rather than teaching us to be mechanics.

Even when that happens it will be seen as "lesser" for a long time. We use the four-year school as a marker that the student can at least jump through the hoops, speak and write at least marginally competently, and can do the mechanical, testable aspects of the job. It doesn't actually do that very well, but people think it does, and don't believe anything else can.


> So it may well be more expeditious to start over. There are many other models, and if nothing else, whatever success those models have will provide more capacity. That may also draw away students and force them to lower prices.

The prices are high because universities nowadays are run (more or less) as enterprises. Starting over without changing that won't lower the prices. It will just increase the profit margins.

The article author seems to be under the impression that professors have a strong say on how the universities are run. They don't. And my feeling is they will have an even less significant role if we start over.


> So it may well be more expeditious to start over. There are many other models, and if nothing else, whatever success those models have will provide more capacity. That may also draw away students and force them to lower prices.

Don't you think the process of starting over is already underway? We've had alternate training and evaluation methods (certifications, MOOCs, and vocational schools) for a long time. What do you mean by "start over" in this context?


What I mean is something that receives more of the cachet of universities. As you say, we've had MOOCs and similar for a long time, but you're still going to have a hard time being employed with just that, especially for the best-paying jobs. They connote somebody who can do mechanical work but without creativity or communications skills.

My suggestion would be to promote those "soft" skills without the overhead of buildings, a football team, dining halls, etc. You can discuss literature, history, writing, etc. online almost as well as you can in person. Those are the things that differentiate a prestigious 4-year degree from a vo-tech or 2-year degree, and it's the prestige that has ultimately driven people's willingness to pay so much for university. (Well, that and some dishonesty and poorly-thought-out policy when it comes to student loans.)


There are many more problems with CS degrees than just their price.

Would anyone here be comfortable running a WP plugin from the average 4 year CS grad? It is rare to find a CS grad who can write production code right out of college.

Let’s be clear, skills based certs are terrible at measuring things. You know what else is a terrible skills based test? A 4 year CS degree. I’ve interviewed a number of degreed working programmers who can’t solve even basic interview questions. Not gotcha questions, not difficult algorithms, but basics (read a file, sort a list of words, etc).

There’s lots of ways to learn, and this kind of talk just adds another barrier to people coming from non-traditional backgrounds. I understand this, because, as a self taught engineer, I’ve been the person at the table listening to engineers talk about how important a CS degree is and how you just need one.

Every time you make the point that you absolutely need a CS degree you’re telling someone who learned via another avenue that they aren’t good enough. At a time when we want more diversity and inclusion in software this message is the wrong message.

We need to think about how to bring education to more people through channels that work for them instead of telling them to fit themselves into a box that was designed for someone else.


> In the end, the problem is not the value of the four-year university degree. The problem is its price.

It can be both.


> In the end, the problem is not the value of the four-year university degree. The problem is its price.

I don’t know. The piece of paper is worth more than I paid for it, but that’s just an issue of employers gate keeping.

If I think back to every course I took in college I would say all but maybe 4-5 were a waste of my time. All these ancillary required courses so that I can receive a “well rounded” education that I needed to get a job, but didn’t actually want to take.

Even those 4-5 useful courses were woefully out of date. I could have done a better job learning that material with 6 months on Pluralsight or some other learning platform.


> In the end, the problem is not the value of the four-year university degree.

It's not worth it IMO, even if education is virtually free here on the better side of the big pond.

Even at 0 EUR, it's still a lot of time and effort that could be focused much more effectively.


Looking back on my education 20 years later... obviously the math degree gave me the skill set to take advantage of the opportunities handed to me. (It’s how I got back into grad school and out with a PhD in just 4 years; it let me work with Mike Abrash.) But the skills I learned in school that have helped me as a person were from the courses that weren’t “certificatable”: literary analysis, my beloved history, the philosophy I always hated.

I worry in a certificate-driven education that we’ll leave those things behind.

On the other hand, the person I know with the best grasp of history, philosophy, and literary analysis barely graduated from high school and did a 4 year stint in the Navy: apparently, being stuck on a ship for years gives you ample opportunity for self-study. Maybe what we need is just boredom, opportunity, & books, and not more schooling?


I graduated an engineering program at a pretty decent school a few years ago.

My school was quite explicit about the idea that they were trying for an occupational approach to education. We were there to become engineers, if there was any hint of the qualifier “well-rounded”, it was mostly lip service: compelled writing courses (albeit with great professors), a ridiculously superficial ethics course, and a credit system that strongly discouraged taking courses from non-engineering colleges (some of our sister colleges in the university offered world class liberal arts courses, peers, and professors).

In my writing courses, there was a popular sentiment (unintelligible to me) to rebel against doing any work for the class. A student said, in class, “I’m studying to be an X Engineer, I shouldn’t be here.” I managed to take a few humanities courses outside of the engineering college, to count towards these requirements, and (along with random books at the library) they easily provided the most mind-expanding material I encountered in college.

I totally agree with you that certificate-oriented degrees have a tendency to drop the important lessons of exercising the “muscle of thought”. In my opinion it’s a big loss: my life today would certainly not be as rich without those lessons, earned outside “my field”. But we have to reckon with the reality that many students are not interested in these “extra” features, for better or for worse. There’s a real demand for rote education, even if its cash value is lesser, as we suspect it to be.


Yeah, the #1 thing I got out of college was critical thinking. History and philosophy are where I learned that. Would it have come without college? Maybe, and certainly yes for some. I do worry about a learn-one-task education system and what that means for society at large.


With the degree attainment rate at only 30% in the US, a share that is similar in many countries around the world, the "learn-one-task education system" is what society at large is already accustomed to.


Is this an argument against my concern? I can't tell. I think poor education is a major issue in our (world) society and makes people more vulnerable to things like disinformation which is rampant today.


I would say that you have to make sure that there is no real barrier first:

Bring education online.

THEN the university and learning spaces have a chance to provide whatever 'education online' can't.

That would free up resources and actually make it possible for education to evolve.

If everyone is listening to the same lecture every year without improvement, we miss out; but redoing the whole lecture every year also costs resources. Optimize your lecture instead of redoing it every year.


University right now is a bundle deal.

* You get lectures and assignments.

* You get schedules and deadlines that discourage procrastination, if you're the sort of person who leaves things until the last minute.

* You (hopefully) get detailed feedback on assignments from a real human.

* You get big manicured lawns and marble pillars.

* You get health insurance.

* You get in-person small-group discussions (at least in some subjects) of advanced topics.

* You get a bunch of people of your age and social background, all uprooted from home and looking to make new friends at the same time.

* You get subsidised gyms and sports clubs and interest clubs.

* You get to network and meet people.

* You get buildings full of academics with office hours where you can basically just walk in and they'll explain almost anything to you.

* You get parties full of drunk young people, of all genders, some looking for relationships, others for casual sex.

* You get access to journals and databases and software that usually costs $$$$$ (and computer labs with it all already set up)

* You get a weird 'future middle class' social status where you can drink and party and not get a job and go into debt - yet be treated as someone successful.

* You get an easily understood explanation for that gap in your resume.

* You get internship opportunities - where you can get your foot in the door at big employers, while being paid and taught how they do things.

* You get loans for your living costs, despite having no income or credit history.

* You get access to a library with more (serious, intellectual) books than you could read in a lifetime.

* You get to leave home, but with training wheels if you're not ready to cook and clean and laundry and pay bills all at once.

* You get supply control, from 'weed out classes' and limited numbers of student places.

* You get in-person exams that are at least moderately difficult to cheat on.

* And yes, you (hopefully) get a credential at the end of it. Maybe even with a good 'brand'.

Online courses can certainly deliver lectures at a much lower price than the ruinous cost of US universities. But I think part of the reason the likes of Coursera aren't on their way to replacing conventional colleges is because they're missing so much else from the bundle.


I am mostly in agreement with you. MOOCs are awesome, but they typically require a fair amount of motivation (which is enforced at a university). I can't say definitively that most people are sufficiently motivated to follow a given path on their own.


Great list. You missed an important one: Being selected for admission to validate your worth.

The entire HS experience (in the US) is formulated around this validation. Simply being chosen is a reward in itself.


Yes and no. Yes, getting admitted puts you above the other high school grads. No, it doesn't put you anywhere compared to the college grads. "Admitted to X University" on a resume is a question mark, not an exclamation point.


It's hard not to be pedantic here. CDs still have their place, and a lot of streaming is terrible; either there are advertisements, or DRM, or fidelity costs, or limited libraries due to licensing problems, etc.

I think the metaphor still goes the same way: the old fashioned way may not be perfect, but there is some real value there that is not necessarily replaced by steaming online courses.


All the problems with streaming you mentioned are completely artificial. They are not actual technological problems with streaming.


They're not artificial, they're inherent to the business model. So I agree that the limitations I mentioned don't have anything to do with streaming technology, they crop up time and time again because of some of the limitations apparently required by a streaming business model.

And so in that sense I think it remains a useful metaphor. What sort of business models will be profitable when it comes to streaming courses online? Will they always put the student and learning first, or will the viable business models impose limitations, or push things in the wrong direction?


The further in time I am removed from my university years, the more convinced I become that the social interaction was the true root of the experience's value. To quote Stephen Leacock's 1920 take on university life:

"The real thing for the student is the life and environment that surrounds him. All that he really learns he learns, in a sense, by the active operation of his own intellect and not as the passive recipient of lectures. And for this active operation what he needs most is the continued and intimate contact with his fellows. Students must live together and eat together, talk and smoke together. Experience shows that that is how their minds really grow.

[...]

If I were founding a university -- and I say it with all the seriousness of which I am capable, I would found first a smoking room; then when I had a little more money in hand I would found a dormitory, then after that, or more properly with that, a decent reading room and a library. After that, if I still had money over that I couldn't use, I would hire a professor and get some textbooks."

Source: https://news.library.mcgill.ca/stephen-leacocks-the-need-for...


Excellent, I would be interested in your university.


> This transition is likely to appear first in technical degree programs, where it is relatively easy for students to certify their skills online

This is dangerously false. The world isn't software engineering.

Engineering disciplines are in many ways fundamentally about learning to use expensive, special-purpose equipment to design, monitor and control expensive, special-purpose, reliability-critical infrastructure. Sometimes, as in the case of nuclear engineering and radiological sciences (the topic of my undergrad degree), access to this equipment and training is regulated and restricted by the host nation-state.

Perhaps some disciplines (Mechanical, maybe Civil) are democratized enough that something like a Hackerspace membership could complement an online degree program, but for others (Aerospace, NERS, Biomedical) I just don't see how you train people and verify competence without a centralized campus and training equipment.

Also, it may not be a good assumption that Software Engineering will remain as democratized as it is today. Web Development, sure, that's in some sense inherently by-and-for consumer devices. But things like IaaS, Machine Learning, and physics / engineering simulation (this last is what I know the most about) increasingly occur on specialized hardware that's at best inconvenient and at worst impossible to learn on a typical consumer device.

Changes in consumer devices themselves are also in some sense de-democratizing tech learning. My wife has worked in the higher education space in operations, and still does a lot of UX research there as well, and in her experience the technical competence of students with business-standard tech platforms (mainly e-mail, but also things like spreadsheets and word processing) is regressing, and COVID-19 is exposing how poorly current university infrastructure serves students whose only device is a smartphone (common among students from lower-income backgrounds).

We may have benefited from a blip in history where a) B2B tech vendors were unusually successful at selling their budget B2B products to consumers and b) the performance requirements of a consumer product segment (i.e. games) drove the hardware innovations underlying a high-growth B2B market (ML on GPGPU).


Not the right analogy, but the article is mostly on point--higher education is in the process of being "unbundled".

> I need no convincing of the value of campus life and in-classroom education. I recognize that online platforms can’t perfectly replace what we deliver on campus. But they can fulfill key pieces of our core mission and reach many more students, of all ages and economic backgrounds, at a far lower cost.

> Our industry has been so stable for so long that we’ve conflated our model with our mission.

The author then rhetorically asks what the mission actually is, and answers with

> As educators, we strive to create opportunities for as many students as possible to discover and develop their talents, and to use those talents to make a difference in the world.

which sounds good and all but seems pretty far removed from the sausage that is actually being made on college campuses. I, for one, am looking forward to seeing what new models arise over the next couple decades, and I hope my child has a wider variety of options than I did.


Is it even the mission, or has it ever been? Universities aren't so much about identifying and fostering talent as they are about molding the manner in which individuals approach the world, while giving them the tools necessary to navigate and succeed.

And yes, campus life is part of that. It is similar in purpose to Rumspringa: the freshly-minted independent adults are given a taste of freedom in a controlled manner, while surrounded by reminders of the path they are on.


I still buy CDs and rip them to MP3s.

They work when cellular data isn't available, they have excellent sound quality, they're easy to organize, and there's no vendor locking me in and spying on my listening habits.


Do you have the option to use something better than MP3?

How's the 'loudness war' going with modern CDs?


I previously used FLAC but I like taking my whole library with me. HQ MP3 is good enough for me.

The loudness war is something of a myth. I normalize my audio, so I don't really notice. As for the absence of dynamic range, I don't really care. I don't listen to classical.
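Tangentially, the normalization mentioned here boils down to measuring a track's level and applying a fixed gain toward a target. A minimal sketch in Python (RMS-based, with a -14 dBFS target I picked arbitrarily; real tools like ReplayGain and EBU R128 use perceptual loudness models instead):

```python
import math

def normalize_gain_db(samples, target_dbfs=-14.0):
    """Return the gain in dB needed to bring the track's RMS level
    up (or down) to the target dBFS. Samples are floats in [-1, 1]."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return target_dbfs - 20 * math.log10(rms)

def apply_gain(samples, gain_db):
    """Scale every sample by the linear equivalent of gain_db."""
    factor = 10 ** (gain_db / 20)
    return [s * factor for s in samples]

# A quiet 440 Hz sine wave, one second at 44.1 kHz, ~-26 dBFS RMS:
quiet = [0.07 * math.sin(2 * math.pi * 440 * t / 44100) for t in range(44100)]
gain = normalize_gain_db(quiet)  # roughly +12 dB for this signal
louder = apply_gain(quiet, gain)
```

Note this only corrects level; it does nothing about the dynamic-range compression the loudness war is actually about.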


I don't think the loudness war is a myth: https://en.wikipedia.org/wiki/Loudness_war

But it appears to have significantly abated, due to normalization by streaming services and changing opinions of sound engineers over the last decade.


> changing opinions of sound engineers over the last decade

Don't sound engineers oppose the loudness war? My understanding is that they'd far rather be doing work they could be proud of, but they get orders to ramp up the volume.

You're right that it absolutely isn't a myth.


Something of a myth: the idea goes that the loss of dynamic range somehow diminishes the quality of the audio recording, but that's not necessarily true.


University was a chance for me to grow up. I was a young 18. I got to leave home, meet a bunch of people, have some fun. I grew up a lot (kinda, ha) over three years and was ready to work when I was done. I also got a degree (which was far cheaper back then).

The paper degree never really opened any doors for me - meeting people probably did, growing up definitely did. 95% of the work I got/did that followed was because I taught myself PHP (and WML, LOL) for my dissertation project. If I'd knuckled down, I could have learned that in a few weeks without the degree.

With all this in mind, how long before enterprising firms come up with a learn-and-work option? I'm sure with a bit of thought you could make money off a hundred or so smart 18-year-olds. Give them a campus environment, free meals and board, some pocket money - then 12 hours (or whatever) of lectures a week, and the rest of the time they actually work for you doing practical things, putting into practice what they're learning (or just doing a bunch of stuff that immerses them in a real-life company). After a year or 18 months, they have something for a CV, contacts, they've got to grow up a bit, real-life experience... Plus no student debt etc. I'd guess the company(ies?) involved could also have their pick of the smartest ones. Everyone wins.


> Our industry has been so stable for so long that we’ve conflated our model with our mission.

This had already become apparent to me long before online education took off. Back when I called up the local university to ask if I could enroll in a few classes, a la carte, in order to flesh out my knowledge of some subjects that my alma mater's program didn't cover. They informed me that those classes were considered advanced, and therefore were only offered to students enrolled in the college's degree program.

In other words, the only way they would let me take those one or two classes would be as part of a package deal for earning a second bachelor's degree, majoring in the same area of study as the one I already had.

The only way I can make sense of a policy like that is if you think of the fundamental model of higher education as being essentially unchanged from what it was in the medieval period: A sort of atomic package that's a rite of passage that gains you access to a certain stratum of society as much as it is an education. Under that model, no, maybe it didn't make sense to take just one class, any more than it makes sense to walk into a jeweler's and ask them to chisel a chunk off of a diamond ring and sell it to you. But, nowadays, that way of thinking about education is deeply anachronistic.

And classist, too. Failing to meet the needs of people who can't afford to just drop everything and become a full-time student with anything more helpful than, "Well, maybe you can dig a 500m deep financial hole, jump into it, and we'll meet you at the bottom," is simply a travesty.


Every time I see articles like this I can only assume the writer either majored in CS or in a humanities major that doesn't involve any physical work. If you study any sort of science or engineering you will find that even the simplest equipment used in lab classes costs hundreds, if not thousands, of dollars. Cost aside, you simply cannot trust an undergrad to run a centrifuge without in-person training.

People also greatly underestimate the cost, and overestimate the speed, of self-teaching. Personally, I learned more in my introductory Mechanical Engineering courses freshman year than I did in all my attempts at self-teaching. This of course doesn't even include the subjects I did not even know about before. There is an overhead cost per subject to self-teaching, and I've yet to see anyone teach themselves the number of subjects that a typical undergrad education involves.

College is also the first time someone finds themselves in the company of enough competent people to, say, do an engineering or CS group assignment that doesn't involve major hand holding.


There are also things I learned in CS (Big-O notation, building algorithms from scratch, working in groups) that I don't think I would have picked up easily on my own (or known it was important to learn). But it's still theory I use all the time when I design and build things at scale.
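As a toy illustration of the kind of payoff that theory has (my own example, not the commenter's): the classic O(n) vs. O(log n) contrast between linear scan and binary search on sorted data.

```python
def linear_search(items, target):
    """O(n): check every element until we find the target."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1

def binary_search(items, target):
    """O(log n): halve the sorted search space each step."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(1_000_000))
# Both find the same index, but binary search does ~20 comparisons
# in the worst case instead of ~1,000,000.
assert linear_search(data, 999_999) == binary_search(data, 999_999) == 999_999
```

Without the vocabulary to name that difference, it's hard to even notice which of the two you've written.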

There was a lot I didn't learn in University and that I picked up while working, because I kept my skills up.

I know people who graduated from University who only knew Java and never did anything outside of the curriculum. I know other students who played with hardware, hooked up sensors, did assignments in different languages each semester (if the professor allowed it) and went out of their way to learn as much as they could.

You get out of any education what you put into it.

I do think in-person education is really important though, even in things like Comp Sci. I do think it should be affordable. I went to a small school in a small town and IIRC, my tuition was <$3k per semester (not including books, food and housing of course). It was the early 2000s and I had roommates who worked grocery stores and print shops, and were able to either pay for school with those jobs or pay off their loans within a year or so.

During that same time, I met people from other schools with $20k in debt (and that's considered very low today!).

No education should cost $100k. That's truly insane.


> The University Is Like a CD in the Streaming Age

And that is a good thing! A great deal of the most interesting music is not available on streaming. Thanks to CDs (and the people who pirate them) we can still have access to unique recordings.


Top universities can still keep their most valuable asset: a piece of paper with their name on it, scarce. They likely will and be fine. Those degrees will still open many doors.

The lower tier universities that charged tens of thousands of dollars to offer a minor leg up for your resume among the pile of resumes when applying for a job though are in big trouble. Ultimately colleges will more clearly shift their value proposition from "here is where you learn key material" to "here is where you learn to be an adult" and survive I think. Most kids out of high school are idiots and need some sort of structure to latch onto as they leave adolescence / the nest.


How much of a role do the certification bodies (the groups who decide who can grant a bachelor's degree) play in the cost of college tuition? I'm a big fan of the classical quadrivium, trivium, and Great Books model of education; why couldn't one complete all of the studies at home, spend six to twelve months in an intensive course of examination and debate to validate the level of education (which could be done online), and receive a bachelor of arts in "the humanities" (or bachelor of letters or whatever one wants to call it)? I think we would have a much more powerful crop of critical and logical thinkers if education were handled thus.


Universities will survive. They do change. Originally U's didn't pay profs, but only gave them a stage to recruit students to tutor. Their money was made from tutoring. Many kids in U's were 14 or 15 years old until the 18th century. Tests, standardized tests, and multiple choice tests didn't always exist.

This might be a low point, but in 50 years I see them having great power. Democracy will once again shift from geographical ridings to a 'guild system'. For example there might be a senate seat that can only be held by a licensed doctor, and only licensed doctors can vote for who will be in that seat.


Societies tend to see the most progress when a greater number of people gather together. While we may find a way to accomplish that through online instruction, our current attempts to do so have been spotty at best.

One of the reasons brought up for online education is likely one of the things that hinders it: the idea of being able to study wherever and whenever you want. As liberating as that may sound, it keeps people apart in both space and time. Granted, social factors probably play a greater role. For some reason, people don't seem to form the same types of connections with each other in the digital world.


Students at elite schools do NOT attend "to learn facts." Most such students are smart enough to learn anything they want to learn on their own.

Elite universities like Harvard, Yale, Princeton, MIT, Stanford, etc. are attended mainly (a) to interact socially with and learn in-person from other individuals who are smarter, more talented, more knowledgeable, and/or more connected in diverse ways, (b) to join a kind of 'exclusive networking club' with lifetime membership benefits, (c) to learn the rituals and norms of this club, and (d) to get a credential that also confers lifetime benefits.


In this analogy, here's the vintage vinyl version (for those who can understand French): the College de France, free since 1530. Adding some irony to the metaphor, their courses are available online via streaming.

https://www.college-de-france.fr

No, they don't give out degrees, but it's one option among many for open knowledge sharing.


I've done a good bit of research into online programs and there seem to be, with a few exceptions, only online CS programs and online Liberal Arts programs. Of those, the only "useful" degree is CS, and the quality of the programs varies greatly.

For the most part this makes sense: a lot of cool or useful degrees aren't really possible to deliver as quality programs online, because the labs require expensive equipment that isn't available to consumers (be that for price or other reasons).

On an interesting note, I'm disappointed there aren't more math programs. It's a moderately useful degree with wide applicability and doesn't have much in the way of physical constraints. The only issues I can imagine are ones that are common to online degrees in general. Yet only a few schools have one.

> Indeed, that unbundling is already happening. Employers such as Google, Apple, IBM, and Ernst & Young have stopped requiring traditional university degrees, even for some of their most highly skilled positions

So... developers and tangential professions, right? Self-taught developers and those with non-traditional educations are hardly new. I doubt these companies are willing to waive degrees for many "highly skilled positions".

I think I read some time ago about Elon Musk claiming he was removing degree requirements from job listings[0], and yet if I search for engineering positions at Tesla, many require a master's degree in an engineering discipline. Every time I hear something about how useless degrees are and how much better other options are, I assume they don't know anything about the world outside software.

[0] https://www.businessinsider.com/elon-musk-college-not-for-le...


Cynically speaking, the purpose of University has always been to connect very rich people with very talented people. In this way the elite can fertilize each new generation with fresh blood carefully selected from the masses. This is good for the very rich, as they know they need talent to remain at the top, and is also good for the very talented, as it gives them a path to that top.

Unfortunately, opening higher education to a much greater proportion of the population (50% in the UK?) has somewhat torpedoed this mechanism. The rich will become more stupid, and the talented will find less opportunity to establish themselves.


For some programs, like Georgia Tech's OMSCS, online students in given courses are known to do better on written exams than the in-person class[1]. This could have a lot to do with demographics: online students have a harder survivorship curve to contend with, already have employment, and face little reason to continue with the program if they are discouraged or struggling. Nonetheless, it's an interesting data point!

[1] https://dl.acm.org/doi/pdf/10.1145/2876034.2893383


This is a pretty bad take. The article doesn't consider the numerous disciplines like chemistry, biology or physics for which often expensive physical resources, equipment and facilities are a necessity — really any course of study with a significant amount of time spent in a laboratory.

Call me paranoid, but I am very suspicious of the voices pushing for all-digital education. Of the students whom I know, none want this. It's not what they signed up for, and many feel shortchanged by their universities continuing to charge them full price for a subpar learning experience post-COVID.


> It's not what they signed up for, and many feel shortchanged by their universities continuing to charge them full price for a subpar learning experience post-COVID.

As soon as they convince employers that a digital education is a reasonable substitute for in person classes, the traditional university will be over.

Why would you put yourself into $100k worth of debt for a piece of paper when you can get the same piece of paper from a purely online university for a fraction of the cost?


What the students I've talked to are saying is, "why put ourselves into $100k worth of debt for online-only classes?" If they're supposed to be doing lab work, why pay any money at all for a class without it? Tuition prices are unchanged even though higher education has temporarily transitioned to remote-only.


The best thing that college did for me is get me out from under my parents' roof and their influence. College gave me a chance to develop my own beliefs and ideas by being surrounded by people from many different places.

College changed me, by forcing me to be surrounded by different thoughts and feelings of people that I normally wouldn't have come into contact with in my home town.

In the few short years I was in college I went from a staunch religious conservative to an atheist liberal. This would never have happened to me had I stayed at home.


One very dangerous thing I notice about university is that it encourages age discrimination, and that propagates to industry hiring. Very few people with families will choose to go to university full time, and hence can't further their knowledge. Discrimination is also so rampant that even an online degree is not given the same weight as the classroom one, even though the classes are exactly the same (for example, Stanford SCPD). So we definitely need a movement to stop this discrimination and kill these so-called universities.


I'd never gladly give up all of the things about college that had nothing to do with coursework. Yes, over the years the discipline, fluidity, and acumen required by and acquired from mentors were invaluable. But that was maybe one-third of the value.

To describe the rest? Let's say: the concentration of similarly-oriented students and the time to explore and grow in an intense and stimulating environment. But - then - college was a lot less expensive.


If I needed inspiration for a new online educational program I’d look to video games to see what’s going on.

Take Rocket League for example. This is a game that’s incredibly simple to get started: you drive a car and hit a ball. But to master advanced techniques, like dribbling in the air as a literal rocket, it takes hundreds to thousands of hours of practice.

To practice in a consequence-free environment there are standard and community-created training packs. To gain skills relative to your peers there’s a matching engine to create “fair online games”. To compete openly, there are ranked matches with precise criteria for getting into and falling out of a rank.

Furthermore, the community is incredibly supportive of questions, show-and-tell streams, and is light-hearted enough to create their own memes. There are even tutors who will review your games play-by-play with you to fix your game sense.

There’s so much good stuff here to pick apart on how to build good programs that are safe, inspiring, and lead to real skill progress for the participants. The university does a lot of it naturally but from what I’ve seen from Rocket League, you should be able to push a lot of those benefits into online formats.


I think vocational education could have been better conveyed online, but with perfect pre-recording and editing. Coursera and Edx are a good example for a forerunner though.

As a college student myself, this year, this semester is the worst for me -- I feel really lazy, without any faith to keep me going, whereas with face-to-face activity I feel motivated to engage. But I hope this semester really is the exception.


>Coursera and Edx are a good example for a forerunner though.

Are they really though?

Online lectures have been pretty much a solved problem since maybe the 80s. (Although, of course, video lectures didn't become widely available until digital and YouTube.)

Arguably, broadcast video should do more than faithfully replicate in-person lectures, but it's a lot easier and cheaper to point a video camera and add some PowerPoint than to do something more multimedia. See also: virtual events.

Automated grading of assignments works reasonably well for narrow classes of problems. (Like programming -- and even then, for the most part, it can only tell you whether the answer is correct.)

Discussions are mostly a tire fire--although they can be better for small groups with selective admittance criteria.

A lot of people don't even like it when MOOCs stick to a schedule to provide some structure.

So, I've taken some MOOCs and gotten some value out of them but I think on the whole you have to give them a pretty mediocre grade.


> As a college student myself, this year, this semester is the worst for me -- I feel really lazy, without any faith to keep me going, whereas with face-to-face activity I feel motivated to engage.

The in-person energy of a University environment isn’t given enough credit in these discussions about the relevance of Universities in modern times.

Physically going into a building dedicated to education and sitting among peers who are also there to learn is a strong motivator for learning. The social cues of seeing your peers pay attention helps align everyone toward learning. Being among your peers is a good reminder that taking the education seriously is important for remaining competitive in the workforce after graduation.

Contrast that with online courses, where students utilize the same device they use for gaming, social media, and browsing the internet to also view remote lectures. No one will notice if you’re chatting with your friends during the course, or browsing Reddit the whole time, or if you have the TV on in the background. Sure, some people are good at sitting down, focusing, and paying attention, but many others struggle without the context shifts, social cues, and social pressure of a traditional learning environment.

HN has been having heated conversations about how self-teaching and free online courses are going to replace expensive universities for as long as I can remember. Yet in all of my experience interviewing candidates with an extra emphasis on giving self-taught and non-traditional applicants extra attention, I’ve never seen any self-taught candidates who come close to their University educated peers. I’m sure there are great self-taught people out there, but on average it appears that a real, in-person University education really does something extra to prepare people.

I think there are a lot of intangible or hard to pinpoint aspects of a University education that won’t be replaced any time soon by pure online courses.


Yes. For online learning, distraction that is easy to reach is a problem.

We can further generalize it to remote working, which surprisingly draws the same parallel you described between school and workplace.

> I think there are a lot of intangible or hard to pinpoint aspects of a University education that won’t be replaced any time soon by pure online courses.

I have mixed views on this. I think this is a matter of authenticity: whatever you read, are taught, and are trained in at school should be reliable, consistent, and correct (which doesn't mean it is the most up to date, however). It also serves as a (referential) benchmark for most of your skills.

However, at the end of the day this is a matter of the ebb and flow of information -- you see that there are some people who can succeed with limited public information on LeetCode, Codeforces, etc., and they are truly successful and competitive. But most people also get preference for the background of their school, e.g. CMU or MIT, in that they are offered more confidential information (like what type of algorithm question Microsoft/Google would present to you), more training (better understanding/explanation of materials), more networks (alumni), etc. to begin with.

If all of this information and these assets were open to the public and easily accessible, universities, or perhaps all centralized education facilities, would truly meet their demise -- their business advantages and potential would be almost totally zero.


It's a real struggle for us. The "doing" part of vocational education doesn't really work at home because of the equipment needed. It's not like typical lecture classes, since it is damn hard to weld at home, for example. Never mind the Commercial Driver's License training, which requires use of the simulator until we are confident you have the skill for the actual big rig.

I would actually say vocational is much harder to deliver remotely than traditional classes. Heck, laboratory assignments for the sciences are already computerized and don't lose as much value as vocational ones.

We have to do hands-on work by appointment and keep the book work remote.


I'm going to be generous and assume by 'vocational' the parent meant in a CS sense (things like database and network configuration that you usually learn after college) and not in the broader sense (like welding and hair cutting where hands-on is the majority of the value).


Oh, well, sorry I got confused, but vocational has a very specific meaning in US academics, and I guess it's on my mind these days since this has been one of our big problems. I'm still hoping that there will be some industry-acceptable solutions.


> "I think vocational education could have been better conveyed online, but with perfect pre-recording and editing. Coursera and Edx are a good example for a forerunner though."

Completion rates for Coursera and similar online courses are notoriously poor. See, for example, https://www.insidehighered.com/digital-learning/article/2019...


I think the title is clickbaity. As mentioned in the article there is more to University than just classes, there is the social aspect of campus life etc. So it's not like the Uni is getting obsolete because of online learning platforms.

That being said, I think there is indeed big potential to offer quality content at a reduced cost, and make knowledge accessible to more people, especially in countries where access to university is extremely costly and hence access to education is not equitable.

Having social, human, one-to-one interaction with professors like Deleuze or Grothendieck (who, for example, were both teaching at French universities) is not quite the same as watching a video online. So I'd say Uni is the color Dolby-surround cinema experience, and online learning is the black & white TV at home.


The value of that $75k/year was never from the classes themselves but from the social proof, networking, and opportunity to live among other like minded young people at similar stages of their lives (and exposure to different people/ideas through that). The classes themselves are essentially commodities. A lot of highly ranked universities don’t even have great teachers for many of their classes.

Universities that derive much of their value for being a proxy for some combination of intelligence/privilege can keep their value going into the future because they’ll likely remain just as selective if not more so over time. It’s the not-particularly-selective, small private colleges and universities which have their business model under threat, IMO


For those like me drawn in by the bad headline, FYI the article itself uses the obviously more correct analogy of live entertainment in the streaming age. (TBH I didn't read it, just skimmed to see if they were really going to use only the bad analogy)


American higher education discussion is usually misplaced because it’s a middle class jobs discussion with extra steps.

Create the factory jobs which don't need a degree, and many people will be happier, and colleges will no longer be part of a market of non-normally distributed outcomes.

People learn in a variety of ways, and the abysmal completion rates of MOOCs clearly indicate that IT-enabled education does not solve the problem that causes the symptoms.

Gamification, intervention, and multiple steps to keep completion rates up still result in large drop-out rates.

Online education will only work for a few people, and will fall far short of its promise for everyone else.


I always expected the drop out rates for MOOCs to be high. The cost of entry is low.

If you look at organisations like the Open University, it's a different picture. People have to put their hands in their wallets, and that seems to be enough to fix the problem.


That underscores how our assumptions and solutions don't make sense - free, quality education available at any time? That was the holy grail we thought we had achieved when Salman Khan's videos started getting shared.

Instead we have to concoct filters and tools to get a ~12-13% average completion rate.


I heard a great thought experiment once: would you rather have a Stanford degree without the education, or a Stanford education without the degree?

20+ years of experience have taught me that anything but the first choice is utterly insane.


I like the distinction between a university and an institution focusing on practical higher education (German “Fachhochschule” seems to convey the concept). This should not be about difficulty and prestige, but one’s mindset and what they want to do in life.

Research-oriented higher education is not likely to become obsolete, and neither is applied practical education. What can (and, arguably, should) go away is expecting an institution originally intended for the former to provide the latter.

Unfortunately, in many countries universities have grown to become the default option, any other path is considered inferior.


> I like the distinction between a university and an institution focusing on practical higher education

The USA already has Fachhochschule. See: any non-phd-granting regional branch campus of the state's university system. Places like https://www.ucmo.edu/.

The problem is that we don't admit that our Fachhochschules are Fachhochschules, and students at our Fachhochschules therefore often don't understand that they are at a Fachhochschule. People in the US talk about university as if it's one monolith, when in fact we have at least three very different types of universities with very different purposes: elite finishing schools (top-tier LACs and the Ivy League), universities (state flagships and large R1 privates), and applied universities (low-tier LACs and state branch campuses). And then students make dumb choices because all of those universities offer the same educational programs.


I sometimes miss putting on a CD and listening to the whole thing. Not every album is good for that, of course. And I can do that just as easily with any media player. Something about the physical media made me more likely to listen to the whole album, though.

I was a student in a Physics class in the mid-nineties that used connected HP calculators to periodically survey the class to get a quick handle on everybody's understanding of the material. I was a student in another class where the lectures were recorded, and once in a while if the professor was out of town, a TA would roll in a VCR cart, and we'd watch the lecture. It was interesting to watch new technology get adopted with very different approaches and outcomes.

Now we have K-12 classrooms using iClicker and there are open source solutions like https://github.com/qlicker/qlicker. This, to me, is one of the better potential uses of technology: helping educational professionals customize curriculum and pace.
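At their core, clicker systems like these just tally responses and flag concepts the class hasn't grasped yet, which is what lets an instructor adjust pace mid-lecture. A hypothetical sketch (the function name and the 70% threshold are my own assumptions, not from any particular product):

```python
from collections import Counter

def summarize_poll(responses, correct, threshold=0.7):
    """Tally clicker responses and report whether the class 'got it'.

    responses: list of answer choices, e.g. ['A', 'B', 'A', ...]
    correct:   the right answer choice
    threshold: fraction correct below which the material should be revisited
    """
    counts = Counter(responses)
    frac_correct = counts[correct] / len(responses)
    return {
        "counts": dict(counts),
        "fraction_correct": frac_correct,
        "reteach": frac_correct < threshold,
    }

result = summarize_poll(['A', 'A', 'B', 'C', 'A', 'B'], correct='A')
# 3 of 6 answered 'A', so fraction_correct is 0.5 and reteach is True
```

The value isn't in the arithmetic, of course, but in surfacing it to the instructor in real time.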

We recently moved to a school district where there has been an investment in developing curriculum. It has been a learning experience for me as a parent to see how this can widen the subjects a student can cover, and the depth they can go into. It's not obvious, nor is it easily measurable, so I fear it is an easy target during budget discussions. Examples include units on it being ok to fail/take risks (I forget if this was 2nd or 3rd grade), using the bee bot (https://www.robot-advance.com/EN/actualite-beebot-educationa...) to teach teamwork and problem solving at a young age, and using Google Slides to create presentations about subjects like the weather.

As a parent of a 3rd grader and 6th grader, I see the differences between the educational approaches for the different ages, how people learn, etc. I'm in agreement with the author of the article that I'm looking forward to see how technology is integrated in such a way that it accelerates/improves education, without removing some of the intangible benefits of collaboration, etc.


Like many things in life, this isn't a black and white issue. Higher education isn't for everyone, some people benefit greatly, for others it is a waste of time and money. It shouldn't be a requirement for moving up in life but one of the options. There should be different paths for acquiring the skills required to advance in life. For some it should be university level higher education programs, for others it should be vocational schools and apprenticeships, etc.


While there is no argument that lectures can be done online, there is so much more than lectures on campus.

One thing that isn't mentioned is Labs.

People aren't going to be doing organic chemistry in their bathroom, nor are they going to be firing clay pots in their ovens.

I know I wouldn't have had access to all of these things from home.


Serendipitous meeting of minds in the common room / rose garden / library cafe / lab / river bank / Kings Arms / The Mill / student bar.

If you have no idea what this feels like, go back to your old University town and try to dip back in, COVID notwithstanding.


Is that an editorialized title? The title on hackernews doesn't convey at all what the article's title and subtitle do:

> Are Universities Going the Way of CDs and Cable TV?

> Like the entertainment industry, colleges will need to embrace digital services in order to survive.

Completely different meaning.


Yes, but like a CD I can get a very high-quality experience at a university in person. Like streaming, I can get a weak, self-driven experience remotely. I guess a CD can still be a real low-quality recording too, but I was fortunate to enjoy being at university.


In that the people producing the content-- adjunct instructors and grad students-- don't get paid much, but they get paid a helluva lot more than they'll get in whatever harebrained disruption scheme Silicon Valley comes up with for the university?


College is a huge scam imo. Paying thousands of dollars for a class on calculus that has 100 plus people in it is robbery. You can easily get that knowledge online for free. The only reason to go is for the piece of paper.


In many countries (except the US) such a course would be free or quite affordable. It's just a matter of policy.


You make a good point. I should have clarified that college in the US is scammy. In other countries with reasonable tuition it is quite a good deal.


Obviously right?

I mean, look at the state of universities today: around the globe, every semester, a professor stands in front of a class presenting material, and they will do this again and again.

Every professor does it a little bit differently, but the main message is the same. In math it's probably even closer between universities, while history or social studies might be further apart.

Where is our central/global learning platform? One that tracks all progress, lets you choose which professor explains certain topics to you, and offers tools to support you, nuggets of wisdom explaining small parts of the lecture?

YouTube explained to me a few concepts I just didn't get while listening to the professor.

This is ridiculous!

Try to find lectures for free online which are above 101 courses. The video quality is shit, no exercises, nothing.


I think you have a fundamentally wrong conception of what it takes to learn advanced concepts. In most cases you can't autograde or award points for everything, it also isn't what University education should be about. It is sad enough that during the Bologna reforms in Europe University education was "industrialised" as much as possible (ECTS points, core modules, etc.).

In the best case a University education in a STEM field will lead you pretty quickly (within 3-5 years) to do research in a lab that does cutting edge work. Most of that lab work can't be distributed or done online, nor can any of the preparatory lab courses in small groups with constant supervision (which are a large part of Physics, Chemistry, Biology, Medicine degrees). Even in subjects such as math and computer science a huge part is exercises that are graded and then discussed in small groups of 10-20 people. This wouldn't easily translate to an online experience either.

Universities like Cambridge even offer 1:1 supervision in math and physics; this can be totally awesome and is not replaceable by video chat or online forums.


I studied for a year alongside my job.

My biggest challenge was finding material that explained certain concepts to me. Stuff like "Why can you transform this mathematical notation into this short form? How do you know that? Oh, I learned that in the gym." YouTube helped, other students helped, but it was time consuming.

It would have helped me much more if I had had a proper central source of high-quality material. That would have allowed me to understand things better and more easily, with less effort from the people around me.

THIS should then free up those people to do more 1:1 supervision (which might be common in Cambridge but is not the same everywhere). It should also allow those people to optimize how material is explained.

And this would allow more people to get an education, which is key.

And I don't think this should start only at the university level. Why not start when you are a baby, with material for your parents?


Well, these centralised resources are books. There are high-quality (almost standard) books for most subjects. Want to learn analysis at a graduate level? Pick one of the popular books, Rudin for example, and work through it. The same goes for quantum field theory (Peskin & Schroeder, Zee, Srednicki, ...) and tons of other subjects.

The lectures are often only a guided tour through one of those books (at least at the undergraduate and early graduate level). In many ways lectures are still a huge advantage, of course, because most of these books are too long to be useful (I still feel some guilt when I think about how little I absorbed of the 500+ page experimental condensed matter book we used). Often the exercises then cover things that were not explicitly explained in the lecture but can be worked out with the help of the book.


Those are books where you read one page and then research that page until you've got it.

The density is too high, and there is no use of modern technology at all to link to explanations or guidance.


How would you scale assessment of exercises further than, e.g., Coursera already does?


Coursera is not the central learning platform; it's a course platform.

You don't log in to see your learning graph, which would lead you to a CS bachelor's degree. You need to choose a course which will give you a CS bachelor's degree.

Imagine you log into Coursera and you want a comparable degree. You would choose courses to learn "nodes" of your learning graph, but you could choose which ones. You could learn the security topic from a course by someone from Harvard, or from someone in Germany.

You choose your language, you choose your medium. It might be that a lot of text plus 1-2 explainer videos helps you more than a 2h lecture.

It might be that you can choose between implementing a program or doing a presentation.

It might even be required for you to listen to a presentation from other students. Or create your own small programming tasks for others.

This could become the biggest, best curated, crowd-created collection of educational material in the world.


I think your sentiment is great overall and I agree with it in essence. But realistically, there are way more obstacles to something like this than just creating a platform, as I'm sure you can imagine.

Especially when you mentioned above, "why not start when you're a baby", by which I think you meant starting at a young age. Kindergartens, preschools, and schools are unfortunately not used only as places for education. Parents NEED their kids to be in someone else's care for a good part of the day. The current pandemic has made that even clearer, if it wasn't already.

Additionally, the sheer number of people that need education is not only enormous, it's growing. Not only that, we are all different, with different needs. Some don't have computers, some don't have internet, or paper, or pens, or food. Some won't be able to learn how to read from home or learn math. Some are deaf or blind.

Again, that's all before getting to higher education. Universities, despite their bad record and resistance to change, are a place where a lot of research happens. I'm not talking only about the US. Let's forget about the US for a second since the scope of your comment goes beyond that. We are talking about identifying, recording, and distributing courses from professors from around the globe. How do you decide who's good enough? How to decide who's getting paid and who's not? Most Universities and Colleges have more than one professor per subject area.

Again, I agree with the sentiment but I wanted to share a larger perspective because I thought your comment was interesting and worth discussing.


I thought about this as well and the benefit of such a platform would be that depending on what your role is, the platform can do different things for you:

- You are a teacher -> you can make sure you teach related material like the rest of the world, you can share and exchange learning material, and you can track your students.

- You are a parent -> you don't need it often, but you might want to see early if your kid is struggling, or get some extra material for your kid to exercise with.

- You are in a government position -> you can use it to build your curriculum.

- You are an aid organization -> you can print out material for some remote village without internet.

I'm not saying that I have fixed/analysed all the issues arising from it, but I do see a huge benefit if we as a society centralize this more than it is right now.


And to add to my comment:

this platform could even have features like spaced repetition built in!


Check out Kettering University. Top-rated engineering school in Michigan. You're required to have 2 years of engineering co-op experience before you can graduate.


Until you can stream touch, smell, taste, vision, and sound in a way that can pass the Turing test, streaming is a cave painting if the university is a CD.


I don't think they are like CDs. You interact with lots of people on campus, you have lots of laboratory classes, and much more.


I am wondering if contributing to a 529 plan still makes sense given where things are going.


It has always seemed a crazy disconnect to me that so many in academia will endlessly complain about the expectation that what they teach prepare their students to obtain a high-paying job while happily cashing the checks and asking for more.


I have found that academics often find their dependence on students embarrassing, and the idea of earning befuddling.

The university is a kind of "eternal adolescence" which promised to provide for dysfunctional high-IQ types without the trappings of such obligations.

With the marketization of education, academics find themselves obligated to their students (how undignified!) and for the first time needing to justify their worth (how are they meant to do that?!).

These are, of course, things that everyone else has been doing for the past two centuries -- now, finally, The University is getting a shock to its system.

I was a person who, for the longest time, associated such pseudo-noble attitudes with "being educated". Now, I think the whole place is infantilizing and infantilized.

It is a very bad place for one's ideological health: it inculcates a kind of outrage that anyone should be obligated to do anything; a sort of pathological utopianism artificially sustained by state funding.

I think it's a large part of where extreme student activism comes from: a kind of rage that the circumstances of our life are constrained at all. Shouldn't there just be infinite money and infinite time? Shouldn't the whole world just be a campus?


IDK what university you're talking about.

When I was a PhD student, I worked 60- to 80-hour weeks doing research and teaching. My advisor brought in much more grant money than was spent on our lab equipment/salaries (and worked even harder than we did!). Our lab ran a (pretty healthy) profit, even before considering tuition and patent royalties.

I spent some time teaching. Again, 60 hour weeks and all of my advisees landed six figure jobs after graduating.

To the extent that the sort of institution you describe exists (low accountability), those professors are probably making less than the high school teachers in the same town. For example, an assistant professor at UCMO makes $50K-$60K; a high school teacher with several years of experience and a PhD in Missouri makes closer to $70K-$80K (and with a much better pension): https://h1bdata.info/index.php?em=UNIVERSITY+OF+CENTRAL+MISS... And in terms of time/effort, the two jobs are pretty comparable.

The well-paid lazy professor doesn't really exist anymore, at least in STEM. Professors are either well-paid researchers expected to bring in substantial grant money (which is a hard job with long hours), or they are relatively poorly paid teachers.


Humanities departments.

Most science and engineering depts have been doing research justification, financing, hiring from industry, etc. for a while.


Remains available when the Internet goes out?


I just wanted to say: Derek Banas


The article doesn't mention the political radicalization students are subjected to at university. Having attended a decent school for two years after the military, I never in my life saw such extremism concentrated in one place.

I believe that radicalization combined with the psychological effects of the lockdown is at least partially responsible for the social unrest we're seeing today.

And now that I have my own children, I hope there are viable alternatives by the time they turn 18.


A good question! Just like the CD is just a vehicle for music delivery, the university format is a dated system for delivering advanced education.



