
It is especially sad that VC money is currently being spent on developing AI to eliminate good jobs rather than on developing robots to eliminate bad jobs.



Many machinists, welders, etc. would have asked the same question when we shipped most of American manufacturing overseas. There was a generation of experienced people with good jobs who lost those jobs, and white collar workers celebrated it. Just Google “those jobs are never coming back” and you’ll find a lot of heartless comparisons to the horse and buggy.

Why should we treat these office jobs any differently?


Agree - also note that many office jobs have been shipped overseas, and also automated out of existence. When I started work there were slews of support staff booking trips, managing appointments, typing correspondence, and copying and typesetting documents. For years we laughed at the paperless office - well, it's been here for a decade and there's no discussion about it anymore.

Interestingly, at the same time as all those jobs disappeared and got automated, there were surges of people into the workforce. Women started to be routinely employed for all but a few years of childbirth and care, and many workers came from overseas. Yet white collar unemployment didn't spike. The driver for this was that the effective size of the economy boomed with the inclusion of Russia, China, Indonesia, India and many other smaller countries in the western sphere/economy post Cold War... and growth from innovation.


US manufacturing has not been shipped out. US manufacturing output keeps increasing, though its overall share of GDP is dropping.

US manufacturing jobs went overseas.

What went overseas were those areas of manufacturing that were more expensive to automate than it was to hire low-paid workers elsewhere.

With respect to your final question, I don't think we should treat them differently, but I do think few societies have handled this well.

Most societies are set up in a way that creates a strong disincentive for workers to want production to become more efficient, other than at the margins (it helps keep your job safer if your employer is marginally more efficient than average).

Couple that with a tacit assumption that there will always be more jobs, and you have the makings of a problem if AI starts to eat away at broader segments.

If/when AI accelerates this process you either need to find a solution to that (in other words, ensure people do not lose out) or it creates a strong risk of social unrest down the line.


If I didn't celebrate that job loss am I allowed to not celebrate this one?


The plan has always been to build the robots together with the better AI. Robots ended up being much harder than early technologists imagined, for a myriad of different reasons. It turned out that AI is easier, or at least that is the hope.


Actually I'd argue that we've had robots forever, just not what you'd consider robots, because they're quite effective. Consider the humble washing machine or dishwasher. Very specialized, and hyper effective. What we don't have is Generalized Robotics, just like we don't have Generalized Intelligence.

Just as "Any sufficiently advanced technology is indistinguishable from magic", "Any sufficiently omnipresent advanced technology is indistinguishable from the mundane". Chat GPT will feel like your smart phone which now feels like your cordless phone which now feels like your corded phone which now feels like wireless telegram on your coal fired steam liner.


No, AI is tremendously harder than early researchers expected. Here's a seminal project proposal from 1955:

"We propose that a 2 month, 10 man study of artificial intelligence be carried out during the summer of 1956 at Dartmouth College in Hanover, New Hampshire. The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it. An attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves. We think that a significant advance can be made in one or more of these problems if a carefully selected group of scientists work on it together for a summer. “


GP didn't say that AI was easier than expected, rather that AI is easier than robotics, which is true. Compared to mid-century expectations, robotics has been the most consistently disappointing field of research besides maybe space travel, and even that is well ahead of robots now.


> well ahead of robots now

I am not working in that field, but as an outsider it feels like the industrial robots doing most of the work on TSMC's and Tesla's production lines are, on the contrary, extremely advanced. Aside from that, what Boston Dynamics or the startups making prosthetics have come up with is nothing short of amazing.

If anything software seems to be the bottleneck for building useful humanoids...


I think the state of the art has gotten pretty good, but still nowhere near as good as people thought it would be fifty years ago. More importantly, as of a year ago AI is literally everywhere, hundreds of millions of regular users and more than that who've tried it, almost everyone knows it exists and has some opinion on it. Compare that to even moderately mobile, let alone general, robots. They're only just starting to be seen by most people on a regular basis in some specific, very small geographical locations or campuses. The average person interacts with a mobile or general robot 0 times a day. Science fiction as well as informed expert prediction was always the opposite way around - robots were coming, but they would be dumb. Now it's essentially a guarantee that by the time we have widespread rollout of mobile, safe, general purpose robots, they are going to be very intelligent in the ways that 20 years ago most thought was centuries away.

Basically, it is 1000x easier today to design and build a robot that will have a conversation with you about your interests and then speak poetry about those interests than it is to build a robot that can do all your laundry, and that is the exact opposite of what all of us have been told to expect about the future for the last 70 years.


Space travel was inevitably going to be disappointing without a way to break the light barrier. Even a century ago we thought the sound barrier was impossible to break, so at least we are making progress, albeit slowly.

On the bright side, it is looking more and more like terraforming will be possible. Probably not in our lifetimes, but in a few centuries' time (if humanity survives).


Forget the light barrier, just getting into space cheaply enough is the limiting factor.

Barring something like fusion rockets or a space elevator, it's going to be hard to really do a whole lot in space.


I think the impact of AI is not between good jobs vs bad jobs but between good workers and bad workers. For a given field, AI is making good workers more efficient and eliminating those who are bad at their jobs (e.g. the underperforming accountant who makes a living doing the more mundane tasks and whose job is threatened by spreadsheets and automation).


I worry about the effects this has on juniors…


I think AI, particularly text-based AI, seems like a cleaner problem. Robots depend on AI, robotics hardware, batteries, compute, and societal shifts. It appears our tech tree needs stable AI first; then we can tackle the rest of the problems, which are either physical or infrastructural.


Capitalism always seeks to commodify skills. We of the professional managerial class happily assist, certain they'll never come for our jobs.


A serious, hopefully not flippant question: who are "they" in this case? Particularly as the process you describe tends to the limit.


I would guess that "they" are "the capitalists" as a class. It's very common to use personal pronouns for such abstract entities, and to describe them as behaving in a goal-driven manner. It doesn't really matter who "they" are as individuals (or even whether they are individuals).

More accurate would be something like "reducing labor costs increases return on capital investment, so labor costs will be reduced in a system where the economy organizes to maximize return on capital investment". But our language/vocabulary isn't great at describing processes.


Poor phrasing. Apologies. u/jampekka nails it.

Better phrasing may have been

"...happily assist, confident our own jobs will remain secure."


Thanks. Not putting this onto you so I'll say "we/our" to follow your good faith;

What is "coming for our jobs" is some feature of the system, but it being a system of which we presume to be, and hope to remain a part, even though ultimately our part in it must be to eliminate ourselves. Is that fair?

Our hacker's wish to "replace myself with a very small shell-script and hit the beach" is coming true.

The only problem I have with it, even though "we're all hackers now", is I don't see everybody making it to the beach. But maybe everybody doesn't want to.

Will "employment" in the future be a mark of high or low status?


The problem is that under the current system the gains of automation or other increased productivity do not "trickle down" to workers that are replaced by the AI/shell script. Even to those who create the AI/shell script.

The "hit the beach" part requires that you hide the shell script from the company owners, if by hitting the beach you don't mean picking up empty cans for sustinence.


> Will "employment" in the future be a mark of high or low status?

Damn good question.

Also, +1 for beach metaphor.

My (ignorant, evolving) views on these things have most recently been informed by John and Barbara Ehrenreich's observations about the professional-managerial class.

ICYMI:

https://en.wikipedia.org/wiki/Professional%E2%80%93manageria...


An interesting view is that people would still "work" even if they weren't needed for anything productive. In this "Bullshit job" interpretation wage labor is so critical for social organization and control that jobs will be "invented" even if the work is not needed for anything, or is actively harmful (and that this is already going on).

https://strikemag.org/bullshit-jobs/



