The GPT family has already shown more than a 50x productivity increase by solving not one but hundreds, perhaps thousands, of tasks with the same model. We used to need far more data, models were more fragile, and finding the right architecture was a problem in itself. Now we plug in a transformer with a handful of samples and it works.
I just hope LMs prove to be as useful in software development as they are in their own field.