
> Legal writing is ideal training data: mostly formulaic, based on conventions and rules, well-formed and highly vetted, with much of the best in the public domain.

That makes sense. Labor-impact research suggests law will be a domain hit almost as hard by language models as education. Almost nothing happens in court that hasn't occurred hundreds of thousands of times before. A model with GPT-4-level capability, trained specifically for legal matters and fine-tuned by jurisdiction, could replace everyone in a courtroom. Well, there's still the bailiff; I figure that's about 18 months behind.



