> I think we have a tendency to mentally move the goalposts when it comes to this kind of thing as a self-defense mechanism. Years ago this would have been a similar level of impossibility.
Define "we". There are all kinds of people with all kinds of opinions. I didn't notice any consensus on the questions of AI. There are people with all kinds of educations and backgrounds on the opposite sides and in-between.
I mean, you can just as easily make the claim that researchers shift goalposts as a "self-defense" mechanism.
For example...
How's that self-driving going? Got all those edge cases ironed out yet?
Oh, by next year? Weird, that sounds very familiar...
Remember when Tesla's Autopilot was released 9 years ago, and the media began similar speculation about how all the truckers were going to be automated out of a job by AI? And then further speculation about how taxi drivers were all going to be obsolete?
Those workers are the ones shifting the goal posts though as a "self-defense mechanism", sure, sure... lol.
Well, there's a difference between the situation with self-driving and with language models.
With self-driving, we barely ever saw anything obviously resembling human abilities, but there was a lot of marketing promising more.
With language models, when GPT-2 came out everyone was still calling it a "stochastic parrot", and even GPT-3 got the same label. But now there's ChatGPT, and every single teenager knows it's capable of doing their school assignments for them. And as a dev I know it can write code. Yet not many people expected any of this to happen this year, nor were those capabilities promised at any point in the past.
So if anything, self-driving was always overhyped, while the LLMs are quite underhyped.
We actually saw a lot resembling human abilities. It just turns out that it's not enough to blindly rely on it in all situations, and so here we are. And it's quite similar with LLMs.
One difference, though, is that self-driving isn't of much economic use if the backup driver still has to be in the car and paying attention. Whereas partially automating programming would make it possible to use far fewer programmers for the same amount of work.
Define "we". There are all kinds of people with all kinds of opinions. I didn't notice any consensus on the questions of AI. There are people with all kinds of educations and backgrounds on the opposite sides and in-between.