Tell HN: AI legal contract review is already screwing up
16 points by Kon-Peki on June 21, 2023 | 11 comments
Sample Size of 1...

I had lunch with a lawyer friend today, who told me that their morning was surreal. At their meeting to go over the latest contract draft, the client introduced new AI software they had purchased (who is selling this???) and its list of recommended changes.

Every single recommendation was worse for the client, and explaining why made the call run longer than it normally would have, so the bill to the client was higher. The lawyer asked if AI was really that stupid, or if their contracts were really that amazing, because if so, they should be charging a lot more for them.




> who is selling this???

I would bet money the software is querying GPT-4 with a prompt like "Please analyze the contract below and recommend improvements..." Then they turn around and market it hard enough to make a crypto company blush.

Maybe a very thorough fine-tune on a massive legal corpus from your jurisdiction, hooked up to a good database, would produce something interesting. But I doubt anyone has done that so quickly.
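
To be concrete, here's roughly the kind of thin wrapper I'm imagining, sketched against the 2023-era openai Python client. Everything in it (the prompt wording, the function name, the temperature) is made up for illustration; I have no idea what the actual product does under the hood:

    import openai  # pip install openai (0.27.x-era API); assumes OPENAI_API_KEY is set

    def review_contract(contract_text: str) -> str:
        # One generic prompt for every contract, with no notion of which
        # party the tool is supposed to represent -- which is exactly the
        # failure mode described in the original post.
        response = openai.ChatCompletion.create(
            model="gpt-4",
            messages=[
                {"role": "system",
                 "content": "You are a legal assistant reviewing contracts."},
                {"role": "user",
                 "content": "Please analyze the contract below and recommend "
                            "improvements...\n\n" + contract_text},
            ],
            temperature=0.2,
        )
        return response["choices"][0]["message"]["content"]

The point being: nothing in a wrapper like this tells the model whose side it's on, so the "improvements" can just as easily favor the counterparty.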


Well, reading legal contracts is definitely brain-melting work. I've talked to lawyers about the AI contract-summary idea as well. You'd probably want to start with standard, simple contracts aimed at individual consumers first.


I read what you wrote twice and I still don't understand what is going on. Who did what?


Company A hires lawyer to write a contract between Company A and Company B.

Company A runs contract through AI software, which recommends changes.

Lawyer spends a lot of time explaining to Company A that the recommended changes are beneficial to Company B and detrimental to Company A, and charges $X for every 6 minutes of explanation time.


Lol, this is funny. Thank you for the clarification. Lawyers win either way.


You know, it just hit me: either side can file a lawsuit over a contract, so "plaintiff" and "defendant" don't map deterministically onto who wrote the contract and who agreed to it. How can an AI trained only on court transcripts make any inferences about what is good for which party?


The same goes for Copilot and other code assistants. In the end, it will be more expensive to use AI copilots for writing code than to just write the code. It's still early, and people will take a while to catch up.


I'm not sure if I understand this completely either...


What was the AI software?


Probably Harvey


TLDR: AI snake-oil software pretending to be a lawyer hallucinated recommendations that were worse for the client, and ended up wasting more time and money than a human lawyer alone would have.



