To be fair, it does strike me that not everyone will derive the same value from ChatGPT.
I have always wanted to rap, and it has written rap lyrics that are way beyond what I could possibly do myself. If I were Andre 3000 or Tupac, though, I wouldn't be impressed at all.
That is why asking experts in their own domain what they think about ChatGPT seems to miss the point to me.
Exactly. I was very frustrated in my attempts to make use of LLMs for programming until I tried using them with languages and technologies I wasn't familiar with. There, they improve productivity immensely.
The primary use-case of LLMs is to reduce the barrier to entry for tasks that used to require a lot of learning. In the past, some people were employed mostly just for knowing how to use some library or framework. That will no longer be the case. The value of knowledge that can be found in a book (not the same as "learned from a book") has been going down with digitization (the internet, Wikipedia, search engines), and with LLMs its value will be reduced to almost nothing.
You don't have to trust it any more than the top 5 Google results. Surely you wouldn't argue that search engines are useless because not everything written on the internet is trustworthy.
For programming, you can just test if it works. That's how most people[1] check if their own code is trustworthy, absent considerations of security against malicious attackers.
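For example, here's a minimal sketch of what I mean, using a hypothetical slugify() helper as a stand-in for something an LLM might suggest (not real output from any particular model):

    import re

    def slugify(text: str) -> str:
        # Hypothetical LLM-suggested implementation, shown only for illustration.
        text = text.lower().strip()
        text = re.sub(r"[^a-z0-9]+", "-", text)
        return text.strip("-")

    # "Just test if it works": spot-check the behaviour you actually need.
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  Already a slug  ") == "already-a-slug"
    assert slugify("") == ""
    print("slugify passed the basic checks")

A few asserts like that are usually enough to decide whether to keep the suggestion or throw it away.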
In general, knowledge and authority or trustworthiness are different things. I said this makes knowledge worthless; it does not make authority or proof of authenticity worthless. If anything, the opposite.
[1]: those who aren't able to always write 100% perfect code on the first try
I think the issue is that the authority is worthless. At least with Google you can see the sources of those 5 results and check their quality (e.g. spotting SEO content); ChatGPT is just 'trust me bro.'
There are efforts to make some LLMs cite their sources (which is hard, of course). On the other hand, existing LLMs are becoming more truthful over time (GPT-4 is more truthful than GPT-3, etc.).
Each possible application of the technology has a known required level of trustworthiness, and as the tools surpass a given threshold, new use-cases become attractive. Already at current levels of performance, there are many applications where it's easier to test a hypothesis, or to keep researching it, than to come up with plausible and relevant hypotheses in the first place.
No source of information is perfectly trustworthy and we have to deal with that. This is sometimes called "critical thinking" and might become a more important skill. There is also a good chance that AI systems can be trained to do this themselves and "fact-check" the responses of others while providing sources and reasoning, which would be one tool to increase the level of trustworthiness.
Or we could use our own brains. I imagine critical thinking is an important skill we shouldn't really outsource. But hey, we march to the beat of technology.
If you come to 'trust' AI for critical thinking, how do you then know when, or if, it has been manipulated? Surely it is naive to think governments and power centres won't do everything they can to make AI say the 'right' things.
I've come to much the same conclusion when getting it to produce prose. Having been paid to write for the last ~25 years, I would be embarrassed to copy/paste directly from ChatGPT - it manages to be consistently both bland and verbose.
But seeing how much trouble non-writers have in producing anything longer than a text message, I can see that it would have real value for them.
It's worth remembering how hard it is to make a start when you have no idea what you're doing, and the worst part of the learning process is when you know something is wrong but you don't know how to fix it.
ChatGPT seems to excel in a few narrow fields, but I think it can take a lot of people from mediocre/non-functional to somewhat acceptable and that's a pretty big leap.