Honestly, you just sound salty now. Yes, it makes mistakes it isn't aware of, and it probably makes a few more than an intern given the same task would, but as long as you're aware of that, it's still a useful tool: it's thousands of times faster and cheaper than a human and has much broader knowledge. People often compare it to the early days of Wikipedia, and I think that's apt. Everyone is still going to use it even if we have to review the output for mistakes, because reviewing is a lot easier and faster than producing the material in the first place.
I've already seen other posts and comments on HN where people have talked about putting it into production. What they've found is that the burden of having to proofread and edit the output with extreme care completely wipes out any time you might save with it. And this still requires skilled editors/writers anyway, so it's not like you could use it to replace experienced writers with a bunch of high school kids using AI.