Yes - it's similar to the situation in high school where there are kids who want to learn and kids who disrupt the class because they're lazy, unmotivated, troubled, or seeking attention.
Automated teaching and testing, combined with social sharing, automate that dynamic. There's less attention-seeking, but much more passive-aggressive subversion.
But that's not the scariest thing. The scary thing is that if it's a STEM field these students go on to get jobs in which they have no competence. This is truly catastrophic if you want software that works and buildings that don't collapse.
Worse - the skill they've learned best is gaming the system and hiding their incompetence.
It's a double failure - of culture as well as knowledge.
Kudos to the prof in the story for handling it so well. Most profs won't.
The underlying issue is that there's been far too little research into the social consequences of automating all kinds of interactions.
The 70s utopian ideal of "Give everyone a computer to empower them" turned out to be ridiculously naive. What happened instead is that various dysfunctional economic and cultural patterns were automated and enhanced.
Culture as a whole has no defences against this because hardly anyone has realised that it's a problem inherent within the culture-amplifying effects of automation, and not an unfortunate byproduct that just sort of happens sometimes - and who knows why?
> But that's not the scariest thing. The scary thing is that if it's a STEM field these students go on to get jobs in which they have no competence. This is truly catastrophic if you want software that works and buildings that don't collapse.
And it's also why we can't trust a degree to demonstrate competence, leaving companies to figure it out themselves with LONG multi-part interviews.
I really appreciate this comment, especially the bit about how automation amplifies culture - that's something I've felt for a long time but you've stated it eloquently.
Schools want to churn out more students; industry wants more fresh grads.
Online quizzes/assignments (which are vulnerable to cheating) and Leetcode screener questions (which are just a little better than rote memorization) are how schools and industry react to scaling issues.
I feel like everyone would have better outcomes if we could somehow be satisfied with less growth.