It's perfectly doable. I set up Linux Mint for a non-technical family member on a laptop almost a year ago. They use it for browsing the Internet, watching things on streaming services, etc. I just told them to click on the small shield icon that pops up in the system tray now and again to install system updates. I visited them yesterday and asked to see the laptop out of curiosity. It's fully updated, fast, and does everything they want.
When updates or apps go wrong on Windows, you sometimes need to use PowerShell. Sometimes it's easier to use PowerShell for admin tasks just to avoid the janky settings menus. What's the difference?
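To make that concrete, these are the kinds of repair commands that typically come up when Windows updates or apps misbehave (a sketch of common examples, run from an elevated PowerShell prompt; not an exhaustive troubleshooting guide):

```powershell
# Scan and repair protected system files
sfc /scannow

# Repair the component store that Windows Update relies on
DISM /Online /Cleanup-Image /RestoreHealth

# Restart the Windows Update service after a repair
Restart-Service -Name wuauserv

# Re-register a built-in app whose entry has broken
Get-AppxPackage Microsoft.WindowsStore | ForEach-Object {
    Add-AppxPackage -DisableDevelopmentMode -Register "$($_.InstallLocation)\AppXManifest.xml"
}
```

None of this is conceptually different from dropping into a terminal on Linux to run a package manager; the difference is mostly in which platform gets called "user-friendly" for it.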
I think the comparison with nuclear tech is missing one crucial aspect: ease of access. Nuclear tech has been misused, and a sufficiently funded and motivated malicious actor could get their hands on it in some way to cause harm, but it is, for the most part, out of reach.
AI, on the other hand, is already being used by every hustler looking to make a quick buck, by students who can't be bothered to write a paper, by teachers who can't be bothered to read and grade papers, by every company that can get it to avoid paying actual people to do certain jobs... Personally, my problem is not with AI tech in itself, it's with how easy it is to get your hands on it and make literally anything you fancy with it. This is what a lot of the "AI for everything" crowd can't seem to grasp.
> Personally, my problem is not with AI tech in itself, it's with how easy it is to get your hands on it and make literally anything you fancy with it. This is what a lot of the "AI for everything" crowd can't seem to grasp.
It's easy to look at the negatives of a technology and ignore its positives. Especially with one like AI.
Great point. Though the issue still lies in human intent, not technology.
Shaking up traditional education methods, like paper writing and grading, can lead to more efficient learning and more free time, as demonstrated by MOOCs and online universities. Exponentially growing online spam and disinformation might make the problem more obvious to people and push us, collectively, back toward more credible information sources. We might need to adjust tax laws for companies that employ AI, but it could have positive effects. I think it's too early to catastrophize, even if I am sure the technology will be used with malicious intent by some.
I think the so-called "replication crisis"[0] might have something to do with it, particularly in psychology.
The misaligned incentives to publish frequently to have a nice-looking list of articles to show when you next apply for a grant means there's tons of flimsy research that goes unquestioned. It's also fairly attractive to jump on specific bandwagons and publish noise just to get your name out there. A lot of these meta-studies are looking inward, at the field itself and what is currently accepted, and finding that a fair bit of it is of very poor quality, if not straight-up nonsense.
I think, overall, it's a good thing. Research should not be focused exclusively on new knowledge. We should also be validating what others put out there, to make sure it's worth listening to.
> it seems a lot of our modern afflictions come down to an gross imbalance of exercise, diet, sleep, and social connection
If you think about it in an evolutionary timescale, the way most of us live in the West these days is horrendously incompatible with the sort of life we evolved to live. Thousands and thousands of years were spent out in nature, in small communities, eating certain types of foods, engaging in physical activities, etc.
The sit-on-a-chair-all-day, look-at-screens-all-day lifestyle is a comparatively new development, and neither our minds nor our bodies are suited for such an existence. That's enough to cause us a fair amount of trouble. Add all the socioeconomic issues you mention into the mix, and it all starts to make perfect sense to me.
> Now Apple just moved the search to the OS via an API call to its server, and people are noticing the traffic.
I'm not sure how photo hosting services doing this for the past 2 decades is related to this when the author of the post explicitly mentions he doesn't use Apple cloud services or products that would trigger such behaviour. This was the OS analysing someone's images, stored locally on their personal computer, and calling back to an API for no discernible reason.
> I presume a lot of time is wasted grading papers and it seems like something an AI that can see pages should be well suited to automating...
I guess if grading papers is "a waste of time", then we shouldn't be surprised that students can't be bothered to write the papers themselves in the first place. Which raises the question: if students use AI to write their papers, and teachers use AI to grade them, what's the point of any of it?
Hard work is a means to an end. People want to work hard when their efforts are likely to have some meaningful result—higher pay or a promotion, if talking about a job; a good grade or admission to some institution or program, if talking about education; having a positive impact on someone else's life. Judging by the tweet author's bio, I assume he's focusing on employees and jobs.
Despite what founders and CEOs might think, most employees have no interest in working towards their "vision", particularly if that requires a large sacrifice of personal time and energy in exchange for barely enough money to survive and nothing else of any importance. Probably an unpopular opinion to express on this site, but it is what it is.
Don't expect people to "work hard" for you, especially if you're not going the extra mile yourself to compensate and acknowledge said hard work.