
The Machine Stops by E. M. Forster is another very good one:

https://www.cs.ucdavis.edu/~koehl/Teaching/ECS188/PDF_files/...

And re-skimming it just now I noticed the following eerie line:

> There was the button that produced literature.

Wild that this was written in 1909.


It's such an amazing short story. Every time I read it I'm blown away by how much it still seems perfectly applicable.

The key point:

> Seeing as the great majority of students spend over 80% of their digital device time using these tools to multitask, the automatic response for a great majority of students using these tools has become multitasking. Unfortunately, when we attempt to employ digital devices for learning purposes, this primary function quickly bleeds into student behavior.

> This is why, when using a computer for homework, students typically last fewer than 6 minutes before accessing social media, messaging friends, and engaging with other digital distractions. This is why, when using a laptop during class, students typically spend 38 minutes of every hour off-task. This is why, when getting paid as part of a research study to focus on a 20-minute computerized lesson, nearly 40% of students were unable to stop themselves from multitasking. It’s not that the students of today have abnormally weak constitutions; it’s that they have spent thousands of hours training themselves to use digital devices in a manner guaranteed to impair learning and performance. It’s also that many of the apps being run on those devices were carefully engineered to pull young people away from whatever they were doing.

> And perhaps this is the key point: I’m not saying that digital technologies can’t be used for learning; in fact, if these tools were only ever employed for learning purposes, then they may have proven some of the most important academic inventions ever. The argument I’m making is that digital technologies so often aren’t used for learning that giving students a laptop, tablet, or other multi-function device places a large (and unnecessary) obstacle between the student and the desired outcome. In order to effectively learn while using an unlocked, internet-connected multi-function digital device, students must expend a great deal of cognitive effort battling impulses that they’ve spent years honing, a battle they lose more often than not. (Of course, schools do often try to implement blockers and restrictions, but this opens up an eternal cat-and-mouse struggle, and the mice are very good at finding ways to evade the cat.)


Really jarring, reading that. I think I was in middle school when AOL and the "internet" became a thing for me (lol), and sure, there was a lot of time-wasting stuff (chatrooms, games, etc.), but there was a huge, huge field of just exploration and learning. I cut my tech teeth on that: minimal parent supervision, no gamifying or artificial motivators, just my curiosity.

I feel for kids nowadays. It was the wild west back then, everything was basically unrestricted and nobody had any clue of the consequences, but we didn't have companies actively trying to addict us to stuff.

No idea what the answer is.


The interesting follow-up here... there is no reason these effects should be restricted to children. Like, if children can't learn with devices in a classroom, it suggests executives can't learn in an office (and might give a hint as to why we haven't seen expected productivity benefits driven by it).

But again, if the effect were this strong, I'd really expect to see broader evidence (even just at a national level based on digital uptake).


These authors have big Google Docs of evidence, https://jonathanhaidt.com/reviews/. But if you read them, you will see the effect is (AFAICT) limited to certain populations. There is a significant fraction of students who do have trouble with executive function and staying on task, and who will fail to do their homework because of social media access. Then there are the other students who have no trouble staying off social media when they have to do homework.

Part of this is because the prefrontal cortex (associated with logic, willpower, discipline, focus, etc.) doesn't finish developing until about age 25.

Until then, folks who haven't built up those abilities themselves can be reliant on the adults in their environments who are older than that age.

https://pmc.ncbi.nlm.nih.gov/articles/PMC3621648/


> (and might give a hint as to why we haven't seen expected productivity benefits driven by it)

I'd be shocked if that's not a significant part of why. Most folks will get more work done when their only alternatives are trashcan basketball or doodling, versus... the Web.

I suspect another cause is that a great deal of the application of computer technology in organizations aims to improve a certain kind of legibility of processes, which is something management loves a great deal. But the cost of attaining this legibility is high enough (including in hidden or hard-to-track ways) that any benefits are neutralized, or all-accounted-for costs actually go up.

[EDIT] A third cause is probably that median ability to use computers remains very low among office workers. There continue to exist offices where knowing how to copy-paste(!) for more than just bare text, or extremely-basic spreadsheet use beyond "put numbers in it" makes you a wizard. I'm not kidding.


I'd love to have a distraction-proof workstation that would force me into my IDE or whatever design doc I'm working on and block out everything else.

But it's not possible, because the job requires all these gateway-drugs-to-distraction to be at the forefront of your workspace:

* keep Slack open in case you're needed in that support thread

* keep a browser open so you can google the API docs for something (that's how I ended up here right now)

* keep Spotify playing in the background so you can drown out the noise of the open office / work-from-home environment



The effect is absolutely that strong, even on adults. My anecdata in IT overwhelmingly supports that claim. Be it educational institutions, large enterprises, SMBs, or just Mom and Dad with their cell phones, the proliferation of distraction boxes has reduced critical and rational thinking abilities that are foundational elements of learning. After all, why try to reason out what you could just look up online? And if you can get the answer somewhere quicker, well, now you can also skim Twitter or Instagram with the time you saved.

During my brief stint working IT for private schools, with their SMARTBoards in every classroom, Meraki APs blanketing their 300-year-old campus structures, and Chromebooks in the hands of every student, the feedback I got was that students hated having technology always with them (to the point of breaking their Chromebooks on purpose), while teachers would deliberately not report broken technology (like their SMARTBoards) so they could force kids off of electronics and into a textbook or journal. Despite the often adversarial relationship between students and teachers, both cohorts acted unconsciously towards the same outcome of less technology.

This early experience has also informed my perspective on the role of technology in the workplace as a force amplifier rather than a mandatory toolset. It’s why I’m often fiercely resistant to any “new” technology coming in that doesn’t solve a problem we’ve already identified, as blindly expanding the IT estate just adds to the noise of the enterprise and detracts from the signals important to the business.

Even the younger folks (20-30) I find community with outside of tech spaces bemoan the over-reliance on technology in general. They aren’t luddites by any stretch of the imagination, and they love BlueSky and Instagram and TikTok and all the usual social spaces where their friends are, but they’ve engaged in more active resistance to technology as a necessary component in everything they buy. This same cohort is often an ally at work, because they seek to push products or solutions that remove technology interactions from the daily grind through automation, rather than dragging in the latest toys like we (millennials) did.


There has been consistent reporting for two decades that screens lead to measurable reductions in attention span, and three decades of reports linking the internet and digital culture with mental health issues. What evidence is missing?

People with an Internet-connected screen appear to have a short attention span because they have instant access to a multitude of things that they're interested in, competing for their attention with whatever you want them to be focusing on.

What you want them to be focusing on is no longer the path of least boredom, like it was in the previous era; the path of least boredom now goes through their mobile device.

People's ability and willingness to concentrate on something they're interested in has not changed one iota. That sort of biological change takes hundreds of thousands of years of evolution.

Observations of the behavior of people interacting with tech could just as easily support the wrong argument that people's attention spans have increased. Just look at how somebody can play the same game for 11 hours straight, right?


> People with an Internet-connected screen appear to have a short attention span because they have instant access to a multitude of things that they're interested in, competing for their attention with whatever you want them to be focusing on.

This is incorrect. There have been repeated studies that show a distinct decline in individuals' ability to consume and process long-form text. Folks' brains are literally remodeling towards ADHD-like behaviors.


But that's more of a developmental situation in the individual, having to do with their education.

I would expect, say, individuals not going to school past grade two to show a declined ability to multiply 12 by 11.

People's handling of long form text may be off from several decades ago, but it's still better than their illiterate ancestors 500 years ago.


It has literally nothing to do with education. It's the brain's own neuroplasticity responding to overstimulation:

https://longevity.stanford.edu/lifestyle/2024/05/30/what-exc...


I suspect it's not screens alone, but whether one is creating with screens or consuming with them. Different parts of the brain.

I'd be particularly keen on evidence affecting primary outcomes - e.g., are people genuinely more productive or healthier? Attention span is an interesting metric, but if it doesn't directly affect how much work you can do, or the quality of it, in a meaningful way, I am less fussed.

Given that reports on multitasking consistently show it degrades performance, this kinda seems like a slam dunk?

Mokie Coke! Mokie Coke!

----

https://scifi.stackexchange.com/questions/218596/70s-or-earl... for those not in the know.


Mono-repo is not the same as mono-service. You can deploy multiple services out of a single repository.
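
As a quick sketch (the directory and service names here are invented), a single repository can hold several independently deployed services:

    repo/
      services/
        billing/            # built and deployed as its own artifact
          Dockerfile
        auth/               # separate artifact, separate release cadence
          Dockerfile
      libs/
        shared-models/      # code shared by both services
      ci/
        deploy-billing.yml  # one pipeline per service
        deploy-auth.yml

The repository boundary and the deployment boundary are orthogonal; each service keeps its own build and release pipeline.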


Also a large number of animals cannibalize the weak (chickens, for example). Now, I presume that you hold humans to a different standard for that behavior - why?


I don’t. Humans are animals, too.


Ah - well, at least you're consistent.


Now let's say that instead you threw the keys to your unbreakable safe to your friend across the Atlantic Ocean. And you say that you didn't know that the people entering were cops. And your friend won't give the keys back. The evidence may exist, you cannot access it, neither can the police. The court has no jurisdiction over your friend and you have no authority to force your friend to give you the keys back.

At that point, whether you are in contempt or not depends on the answer to the question "did you know that the cops were entering to look for evidence before you threw the keys?" Whether the judge holds you in contempt or not is a function of the free choice of the judge and is not related to the answer to the first question (though whether or not the judge should hold you in contempt is a function of what the judge believes about what you believed).


> When people are pressured to meet a target value there are three ways they can proceed:

> 1) They can work to improve the system

> 2) They can distort the system

> 3) Or they can distort the data

https://commoncog.com/goodharts-law-not-useful/


This is already happening - I recently saw a resume which included "Added AI-driven code reviews" as an accomplishment bullet point (the person was working for a large consulting firm).


The great thing is, Liquibase even allows you to run arbitrary programs as migrations using `customChange`: https://docs.liquibase.com/change-types/custom-change.html
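
As a minimal sketch of what such a change class might look like (the class name and logic here are invented; the interface is Liquibase's `CustomTaskChange`):

    import liquibase.change.custom.CustomTaskChange;
    import liquibase.database.Database;
    import liquibase.exception.CustomChangeException;
    import liquibase.exception.SetupException;
    import liquibase.exception.ValidationErrors;
    import liquibase.resource.ResourceAccessor;

    // Hypothetical migration that runs arbitrary (non-SQL) logic.
    public class BackfillUserSlugs implements CustomTaskChange {

        @Override
        public void execute(Database database) throws CustomChangeException {
            // Anything can go here: shell out to another program,
            // stream and rewrite rows, call an external service, etc.
        }

        @Override
        public String getConfirmationMessage() {
            return "User slugs backfilled";
        }

        @Override
        public void setUp() throws SetupException {
            // One-time setup before execute() is called.
        }

        @Override
        public void setFileOpener(ResourceAccessor resourceAccessor) {
            // Grants access to files shipped alongside the changelog.
        }

        @Override
        public ValidationErrors validate(Database database) {
            return new ValidationErrors(); // nothing to validate here
        }
    }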

Though you can get a long way with just specifying the appropriate contexts before you kick off your migrations (and tagging your changesets with those context tags as well): https://docs.liquibase.com/concepts/changelogs/attributes/co...
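
Putting the two together in an XML changelog might look like this (the ids, author, and context names are made up):

    <!-- No context attribute: selected under any context filter -->
    <changeSet id="1" author="svieira">
        <addColumn tableName="users">
            <column name="slug" type="varchar(255)"/>
        </addColumn>
    </changeSet>

    <!-- Only selected when the "prod" context is passed at runtime -->
    <changeSet id="2" author="svieira" context="prod">
        <customChange class="com.example.BackfillUserSlugs"/>
    </changeSet>

Then `liquibase update --contexts=prod` (spelled `--context-filter` in newer releases) runs both changesets, while a run filtered to some other context skips the second one.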


    Our principal exports, all labelled and packed,
    At the ends of the earth are delivered intact:
    Our soap or our salmon can travel in tins
    Between the two poles and as like as two pins;
    So that Lancashire merchants whenever they like
    Can water the beer of a man in Klondike
    Or poison the meat of a man in Bombay;
    And that is the meaning of Empire Day.
Songs of Education: II. Geography by G. K. Chesterton

