Hacker News

Audit, verify, but also modify.

I don't know why anyone thinks that an AI that just trots out commonplace solutions to legal problems is something that most people even want.

GPTs might be able to handle boilerplate basics, but since they cannot actually understand nuance, they're not going to help with really specific things.

Which makes you wonder why a non-LLM system -- an expert system, a database, some sort of DSL even -- isn't better than this stuff.

Why, if there is rigour and precedent, is there not a solution that doesn't start from a foundation of waffle and imprecision?




> Why, if there is rigour and precedent, is there not a solution that doesn't start from a foundation of waffle and imprecision?

Law is a fact-specific discipline, laden with jargon, and one where the answers can change on a pretty frequent basis. It's not far off the mark to say that the answer to every legal question is "it depends." What you need is a system that can take a moderately free-form explanation from a user, tease out the legal aspects of that explanation, and match them against the legal database (which almost certainly won't use the same terms the user did!) to arrive at an answer. And in the inevitable scenario where the user failed to provide enough information, it also has to recognize that more information is needed and query the user, in their own terms, to get them to provide it.
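A toy sketch of that loop makes the point concrete. Everything here is invented for illustration (the rules, the synonym table, the questions bear no resemblance to a real legal knowledge base); it shows the shape of the "match facts, else ask follow-ups" cycle, and why the translation step is the hard part:

```python
# Toy sketch of the loop described above: match a free-form description
# against a tiny hypothetical "legal database," and ask follow-up
# questions when the stated facts are not enough to pick an answer.

RULES = [
    {
        "issue": "security deposit return",
        "needs": {"is_tenant", "lease_ended"},
        "answer": "Deadlines for returning a deposit vary by jurisdiction.",
    },
    {
        "issue": "eviction notice",
        "needs": {"is_tenant", "received_notice"},
        "answer": "Notice requirements depend on the lease and local law.",
    },
]

# Map everyday phrasing onto the facts the rules use; the user's words
# rarely match the database's terms directly.
SYNONYMS = {
    "is_tenant": ["rent", "renting", "landlord", "tenant"],
    "lease_ended": ["moved out", "lease ended", "lease is up"],
    "received_notice": ["notice", "evict", "kicked out"],
}

QUESTIONS = {
    "is_tenant": "Are you renting the property?",
    "lease_ended": "Has your lease ended or have you moved out?",
    "received_notice": "Did you receive a written notice?",
}

def extract_facts(text):
    """Crude keyword matching standing in for real language understanding."""
    text = text.lower()
    return {fact for fact, words in SYNONYMS.items()
            if any(w in text for w in words)}

def consult(text):
    """Return an answer, or the follow-up questions still needed."""
    facts = extract_facts(text)
    best = min(RULES, key=lambda r: len(r["needs"] - facts))
    missing = best["needs"] - facts
    if missing:
        return {"ask": [QUESTIONS[f] for f in sorted(missing)]}
    return {"issue": best["issue"], "answer": best["answer"]}
```

So `consult("My landlord kept my deposit after I moved out")` resolves to the deposit rule, while `consult("I got an eviction notice")` comes back asking whether the user is renting. Even this toy exposes the fragility: the synonym table can never anticipate how people actually describe their situations, and that translation gap is exactly the part the parent comment says LLMs handle better than anything before them.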

LLMs are decently good at handling the language-translation aspects, far better than any previous AI approach, so it's not hard to see why people try this with LLMs.


There are all sorts of legal treatises that solve this problem.



