
I run an accelerator out of a huge law firm in Canada. I'm not much involved with the firm outside of using the startup lawyers to get the work done for my founders, but lunch room chatting happens. It's been interesting to hear the discussion: inbound is actually increasing due to AI; AI is both giving bad advice (or more often, good advice badly) and drafting nonsense, or missing so much case law that it builds weak cases, etc. I don't think we will hire less, I don't think we'll downsize; I think we'll just take on work we said no to previously because it wasn't interesting or whatever, and I think client satisfaction will increase. (Also, AI tools for huge-firm lawyers doing big client files are comically bad: one large legacy player recently turned on the AI feature and it kept telling the lawyers to consult a lawyer.)


The tools will get better in time, with human assistance.

They will learn how to provide better results.

There's confusion today, but there's room for improvement.


I agree. One funny thing I observed: it seemed maybe last year we'd slow associate hiring, but they took a pretty measured wait-and-see approach and set up an AI team to basically end-to-end test every tool they could, very quickly. Coming out of the year, two things were clear: AI was still not great for the type of work they do (it's "enterprise law", think $1k/hr lawyers), though the EAs, paralegals, etc. will be able to get way more done in a day (ChatGPT or whatever is fine for their work); and 2025 was going to be a pretty busy year. I'm starting to think the firm will only grow as a result of AI.

A year ago I thought they would all be replaced, but as I spent time with them and saw their work, I realized I knew nothing about big law: half the job is being a therapist and talking people out of being emotional and brash before you can even get to the lawyering. There's lots of tacit knowledge too: this regulator has this guy who always rejects the application if the format isn't a certain way, or this judge never accepts that argument, or "yeah, I've negotiated against him 30 times, he hates yellow, so wear yellow."

The legal system is a very fundamental human thing after all.


"They'll just get better" is a huge non sequiter.


"we're just 10 years away from fusion bro"


> one large legacy player recently turned on the AI feature and it kept telling the lawyers to consult a lawyer

So the AI spoke the truth, and it gets criticized for it?


FYI, "Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes." is literally the first rule of HN in the comments guidelines. https://news.ycombinator.com/newsguidelines.html


Your application of this rule is misplaced. So I'll be curious: how did you get that interpretation? It seems that your comment is the snark.

The AI spoke the truth. Its application for this purpose was meaningless, and the AI even told the user that.


I apologize if I misinterpreted. An AI tool built for lawyers does not need to tell a lawyer to consult a lawyer (they're lawyers), especially a tool charging $5k a month that is supposed to... replace junior... lawyers. :)


Have they used GC.AI? I've heard some buzz about it, but I'm kinda skeptical of these tools myself.


No, but it doesn't look very good to me at all, tbh.

The best thing I've seen is https://www.spellbook.legal/ but our firm is too big for them right now.


What doesn't look good? Just curious!


Is that Relativity?



