
> we wouldn't have terms of service; instead, we would be given a structured 'legal code' file

They're called terms and they're parsed by meat parsers. We call them lawyers and judges. They are more powerful than any computer. The system they've built has been working for hundreds of years. We have lots of people who understand how it works.

If you make the contract the code, you just replace lawyers with coders. You'll still have drafting errors [1]. You'll still have things looking like X but really doing Y [2]. And you'll still need somewhere to adjudicate conflicts.
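
To make the "looks like X, really does Y" point concrete, here is a toy Python sketch (entirely hypothetical, not the actual DAO bug) in which a one-line drafting error turns an escrow into a free-for-all:

    # The author meant to cap each party's withdrawals at their own balance,
    # but wrote the check against the shared pool instead.
    class Escrow:
        def __init__(self):
            self.balances = {}   # party -> amount deposited
            self.pool = 0        # total held by the "contract"

        def deposit(self, party, amount):
            self.balances[party] = self.balances.get(party, 0) + amount
            self.pool += amount

        def withdraw(self, party, amount):
            # Intended: amount <= self.balances.get(party, 0)
            if amount <= self.pool:   # <- the drafting error
                self.pool -= amount
                self.balances[party] = self.balances.get(party, 0) - amount
                return amount
            raise ValueError("insufficient funds")

    escrow = Escrow()
    escrow.deposit("alice", 100)
    escrow.deposit("bob", 5)
    escrow.withdraw("bob", 100)   # "succeeds": bob walks away with alice's deposit

If the contract is the code, that bug is the contract.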

The law is fine. Our backlog is in the parsers [3]. Replacing lawyers and judges with AI seems more desirable than some "code is contract" society.

[1] https://en.wikipedia.org/wiki/Decentralized_Autonomous_Organ...

[2] https://en.wikipedia.org/wiki/Malware

[3] https://www.wsj.com/articles/in-federal-courts-civil-cases-p...




He's asking for a standard, not an API to various oracle machines currently in existence.

The "system that works for years" relies on subjectively-defined terms often fiated by those oracle machines (thus only deterministic given the 'institutional knowledge' of 'accepted practice' depending on various jurisdictions) which aren't actually written down or encoded anywhere.

Because anyone who would be in a position to see that and write those down usually has a job instead acting as an oracle machine for decisions made based on that knowledge.

And that's just one of the gotchas involved. If the 'ideal' solution is just to encode 'common practices' ad hoc as issues arise, you might as well roll your own legal system at that point.


> He's asking for a standard

We have predictable standards codified in statute and case law. Most cases settle predictably. A few make new law. These receive disproportionate media attention because they are defining edge cases. It doesn't matter if your law is in code or text at that point. If you have an investigatory law saying "books and papers," and a prosecutor seizes an iPhone, you have to produce new law.

This ambiguity is a product of the imperfect mapping between abstract concepts and the real world. There will never be agreement on what all words mean. "Is this a chair? Yes. Is that a chair? No, it's art. Who says? Let's debate." That is a product of language.

> which aren't actually written down or encoded anywhere

Here's a random, recent Supreme Court opinion [1]. I don't have a legal background; I find it highly readable. Furthermore, the citations are pretty easy to follow.

> anyone who would be in a position to see that and write those down usually has a job instead acting as an oracle machine

Not everyone trained in law is a legislator. (Or judge.) Many legislators, judges, lawyers, professors, students and random people opine on the law in books, blogs and other materials. This is a public debate. When courts face an edge case, lawyers from both sides cite these materials. After that, once a precedent is set (and recorded), it is respected until a new edge case arises or the statute is revised by the political system.

[1] https://www.supremecourt.gov/opinions/16pdf/14-9496_8njq.pdf


> This ambiguity is a product of the imperfect mapping between abstract concepts and the real world. There will never be agreement on what all words mean

And even if there is agreement, these meanings may change over time.

It is not uncommon for laws to be deliberately abstract precisely to accommodate future developments.


> The "system that works for years" relies on subjectively-defined terms often fiated by those oracle machines (thus only deterministic given the 'institutional knowledge' of 'accepted practice' depending on various jurisdictions)

The same could be said about countless technical standards...

> which aren't actually written down or encoded anywhere.

... even if they are written down.


Of course, my idea is wildly simplistic and naïve. But wouldn't it be nice?

I suppose in a sense, using court cases to set precedent is like implementing a regression test: "We found a bug in this law, next time you hit this edge case, treat it like this."
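
To stretch that analogy slightly, a minimal pytest-style sketch (statute, categories and names all invented, riffing on the "books and papers" iPhone example upthread):

    PROTECTED = {"book", "paper", "letter", "diary"}   # what the drafters enumerated
    PRECEDENTS = {"smartphone"}                        # what a court later added

    def requires_warrant(item: str) -> bool:
        return item in PROTECTED or item in PRECEDENTS

    def test_statute_still_holds():
        # Existing behaviour must not regress when new precedent lands.
        assert requires_warrant("diary")

    def test_smartphone_edge_case():
        # "We found a bug in this law, next time you hit this edge case,
        # treat it like this."
        assert requires_warrant("smartphone")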


No, it would not be nice. It would be terrible.

What you're advocating is the "smart contract" delusion.

You're aiming to replace the messy uncertainty of conventional human-mediated legal agreements with the clear, deterministic rigour of computer code. Have you seen how bug-free computer code is in the real world?

Smart contracts are a perennial fascination for technologists working in an area where everything can be determined cleanly and clearly, if only in principle. So the prospect of using computers to sort out all those annoying grey areas in human interaction is tempting: if you don’t understand law (which involves intent) but you do understand code (which does precisely what you tell it to – though probably not precisely what you meant), then you will try to work around law using code.

(You know the uncertainty you're feeling about your life being run by legal documents you don't understand? That's how other people feel about their lives being run by your code.)

Legal issues just don’t work that way – the purpose of law is not to achieve philosophical truth, but to achieve workable results that society can live with. It regularly shocks technologists that the law is fuzzy, circumstantial and indeterminate, and that an arbitrary string of data can in fact be tagged with the intent behind it at the time it was written. Because what a human meant, what they were thinking at the time, is a lot of the point.

For smart contracts to work as advertised, we would need to create a human-equivalent artificial intelligence to understand what people meant the contract to do.

A much-touted advantage of smart contracts is that the code is public, so anyone can check and verify it before engaging with it. The problem is that it is extremely difficult to tell precisely what a program might possibly do without actually running it. (Shellshock lurked in bash for 25 years without being spotted.)

Smart contracts work on the wrong level: they run on facts and not on human intent, but real-life contracts are a codification of human intent, which will always involve argument and ambiguity.

It's times like this I miss Groklaw.net.


I completely agree. Still, wouldn't it be nice if code did what we meant instead of what we said, too?

Not everything can be solved with code, but it's fun to think about, and leads down some interesting paths, even if they're ultimately impossible or impractical.

What if we could encode ambiguity? Legal code could have some wiggle room. Of course, you'd want to be able to do things like ask 'Is X legal under Y circumstances' and receive an answer, and at that point you effectively have human-equivalent artificial intelligence.
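
As a crude sketch of the "wiggle room" idea (every rule and name below is made up): instead of forcing every query down to legal/illegal, the code gets a third answer that explicitly punts to a human.

    from enum import Enum

    class Ruling(Enum):
        LEGAL = "legal"
        ILLEGAL = "illegal"
        UNDETERMINED = "needs human judgment"   # the encoded wiggle room

    def ruling_for(action, circumstances):
        # Clear-cut cases are decided mechanically.
        if action == "park" and "fire lane" in circumstances:
            return Ruling.ILLEGAL
        if action == "park" and "marked space" in circumstances:
            return Ruling.LEGAL
        # Anything the drafters didn't anticipate is flagged, not guessed at.
        return Ruling.UNDETERMINED

    print(ruling_for("park", {"marked space"}))    # Ruling.LEGAL
    print(ruling_for("park", {"unpaved field"}))   # Ruling.UNDETERMINED

The hard part, of course, is deciding everything that lands in that third bucket.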


Well said!



