Hacker News new | past | comments | ask | show | jobs | submit login

You are dismissive, but I assure you this kind of thing will keep lawyers and courts busy for some time to come.

This raises all kinds of interesting legal issues that have no obvious resolution:

* can agency be delegated to an LLM?

* can an LLM create a contract on behalf of itself? another? an organization?

* does the answer change if the person(s) or organization(s) want the LLM to be able to form contracts?

* are contracts created by an LLM bound by the statute of frauds?

* what happens with unspecified contract terms given that an LLM has perfect knowledge of the UCC?

* does the parol rule apply to the LLM conversation prior to the formation of a contract?

And on and on. Law students all over the world are busy writing law review articles about these questions.




As a lawyer (but not your lawyer, this isn't legal advice), the actual questions themselves all seem pretty obvious under current contract law:

* You can't actually delegate agency to a computer. The Restatement of the Law of Agency says an agent must be a person.

* No, an LLM is not a human, so it can't make contracts in any respect.

* If you agree in an actual contract to be bound by the black box of the LLM, then the LLM will govern the terms thereof. You could theoretically make an unconditional offer to agree to such a contract if you really wanted to.

* Any electronic record is a writing for the statute of frauds, so unless you're piping your LLM to TTS to their speakers without any record, it should satisfied as a written memo in the above case when you really want to have ChatGPT sell your house.

* Again, LLMs can't form contracts. If the company actually accepted a customer's offer, they'd look at the intents of each actual party and parol evidence. What the LLM "knows" is irrelevant.

* It's hard to imagine a scenario where an LLM is involved in an integrated contract.

It is rather interesting to imagine how a court would handle a scenario where a customer actually thinks they are making a contract with a company through a chatbot though. Generally anything a computer does is just going to be seen as preliminary negotiations. When the customer "agrees" with the computer, it's legally the customer making an offer to the company, which then accepts the contract when they actually perform the contract/ship the order. I could see in some cases how companies could be bound by some form of reliance or quantum meruit or dinged for false advertising.


This is interesting. I'm a patent lawyer, so all of my contract law knowledge comes from a standard casebook, but I can can tell you that the analogous questions in my field are anything but settled.

> an LLM is not a human, so it can't make contracts in any respect

Probably the most contentious question relates to AI inventorship, which is disallowed under Thaler v. Vidal (Fed. Cir. 2022), but is already laughably out of date with respect to the technology. The decision draws a similarly hard line regarding whether an inventor must be a human or not based on statutory interpretation.

But the patent office will soon be inundated with AI-authored or AI-assisted inventions, if they aren't already. Applicants will simply just not admit it or possibly opt for trade secret protection instead. Meanwhile, other countries may not take such a hard line, and that IP will make its way to China or the EU or wherever.

Of course this is all science fiction right now, since it's an open question as to whether an LLM will invent anything useful, but it's not implausible. My point is that I believe the question will be revisited very soon.


> You can't actually delegate agency to a computer.

But...

> you agree in an actual contract to be bound by the black box of the LLM, then the LLM will govern the terms thereof.

This implies you can have a standing offer of a contract on the terms articylated5 by the LLM, with some specified method of acceptance, which suggests a different outcome than the “preliminary negotiations” you suggest for a contract where the LLM system is the frontend of the cobtract5 negotiation, provided that the reason that the outside party thought they were negotiating a cobtract with the company via thr LLM is that offer by the company.


> * You can't actually delegate agency to a computer. The Restatement of the Law of Agency says an agent must be a person.

Does Amazon manually review every purchase made on their site? This seems not to be true at least for some contracts.

The rest seem to fall along a similar vein. If you believe software cannot act as an agent for a corporation in creating sales contracts, then similarly it should follow that what you're suggesting is true -- LLMs cannot act as an agent.

But we know that is not the case in some circumstances.


Amazon accepts your offer of a purchase when it ships your order. It's not a contract when you click the purchase button—hence why Amazon can cancel your order when there's a pricing mistake or the item is out-of-stock.


And you believe that process has human review as a step?


These are interesting questions, but an Nth variation of "prompt injection" via an alternative ChatGPT interface is not, and Chevy are not going to be liable to sell a $1 vehicle just because ChatGPT said they are.


This isn't ChatGPT, this is the "Watsonville Chat Team". Sure, they may have used ChatGPT to make this offer to the customer, but how is that relevant? Do you think that if you use your phone's autocomplete to make an offer to a customer it is somehow not binding?


It says "Powered by ChatGPT" in the screenshot.

>Do you think that if you use your phone's autocomplete to make an offer to a customer it is somehow not binding?

Once again, just because ChatGPT said it, doesn't mean it's actually legally binding. This would be thrown out of court. It's no different than changing your name to "Free Chevy" and then claiming you're owed a free vehicle because those exact words appeared on the website.


> It says "Powered by ChatGPT" in the screenshot.

Okay, and? If I put "Powered by GBoard" on my emails they are suddenly not binding?


It also says "Please confirm all information with the dealership."

This is as obviously non-binding as anything could possibly be. All the dealership has to do is say "no, we don't sell cars at that price".


Easy enough to confirm with a representative of the dealership via their chat window that just sold the car for $1.

The conclusion is that you probably shouldn't trust ChatGPT to represent your company.


You can insist on your own personal meanings for clear statements as much as you like, but it won't have any effect on the legal interpretation of those statements.


Same goes to you. Not sure why you think that a representative on the company website is not representative of the company.


For what it's worth, it's actually labeled as ChatGPT at the top of the chat window. Of course it's lacking the "this is a chatbot and none of what it says grants you any rights whatsoever and you should triple check the things it does say because there's no guarantee it'll tell you the truth" disclaimer all chat bots should have, but at least the website isn't pretending you're talking to a human.

I think it's ridiculous to use ChatGPT for things like customer support. This time it's someone writing a basic prompt and expecting the AI to do what it says, but next time it could very well be someone who's unfamiliar with ChatGPT (remember that lawyer that thought ChatGPT was a search engine, and when confronted by the judge about made-up cases, doubled down and asked ChatGPT if ChatGPT was telling the truth?) who honestly believes they're haggling over a car with a real representative. Best case scenario they feel cheated by the company, worst case scenario a judge forces the company to honour the deal.


I don't agree that even this would be obviously dismissed.

Every law student learns about this case in Contracts during their first year of law school: https://en.wikipedia.org/wiki/Leonard_v._Pepsico,_Inc.

I'm sure some people said there obviously was no contract then either, but it was litigated at some length in federal court.

With respect to US common law, unless there is a statute or case "on point," it's potentially an open field.

A lot of AI-related litigation is happening right now to begin to settle some of these issues.




Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: