
I do, but I think it's a misdirection.

You're basically making the "guns don't kill people, people kill people" argument, with LLMs instead of guns: "A gun on its own is just a mechanical device. Only by assembling it into a gun/ammunition/shooter system does it gain the potential to do harm, and only by performing the act of shooting an innocent bystander is harm actually done. Therefore, we should only regulate the act of loading a gun and shooting someone instead of mere possession or distribution of a firearm."

With firearms, the argument is usually rejected because such a regulation would obviously be impossible to enforce: If someone already has a gun and ammunition, they will just need a few seconds to load it up and pull the trigger. No cop could force them to only shoot at legitimate targets.

The analogue with LLMs would be: "An LLM on its own is just a collection of numbers. Only by deploying it into a software system does it gain the potential to do harm, and only by executing the system and producing malicious output is harm actually done. Therefore, we should only regulate the deployment of LLMs instead of their storage and release."

You could make the same counter-argument as in the pro-gun case here: such a regulation would be impossible to enforce. The interesting thing about open-source LLMs is precisely that you can deploy them on your own hardware without bringing any third party into the loop. Companies can deploy them in their own data centers, hobbyists on their own consumer machines; some person could just run Llama3 on their laptop solely for themselves. There is no way a regulator could even detect all those deployments, let alone validate them.

That's why I find the argument disingenuous: You could make a case that the harms caused by unregulated, home-deployed LLMs are much smaller than the benefits - but that would be a different argument. You're essentially arguing that the regulator should hamstring itself by leaving unregulated the one part where regulation could actually be enforced (model training and release), and regulating only the "deployment" part that can't be enforced.



