No - some judge wrote to a family member recently: "I am seeing all these great briefs now," followed by a novice discussion of AI use. This is anecdotal (and recent), but it suggests to me that non-lawyers, with care, are writing their own legal papers across the USA and doing it well. That fits with other anecdotes I've heard here in coastal California about ordinary legal uses.
I think they're especially likely to hallucinate when asked to cite sources - they're particularly prone to making up citations - and a lot of the work my lawyer friend has asked of ChatGPT or Claude requires it to cite things. He has said it has just made up case law that isn't real. So while it's useful as a launching point and can in fact find real case law, you still have to go over every single thing it says with a fine-tooth comb. Its productivity impact is much lower than in code, where you can see immediately whether the output works.
My guess is it's because hallucinations in a legal context can be fatal to a case, possibly even career-ending; there have been some high-profile cases where judges have ripped into lawyers pretty harshly over it.