Agreed. Not only do I think it’s worth it, I actually like that I can contribute. I’m getting so much good value for free that I think it’s fair. It’s a win-win situation: the AIs get better and I get better answers.
It’s not naive. The value these AI chatbots provide me is extremely high.
I’ve been writing code for many years, but one area I’ve wanted to improve is debugging. I’d always printed variables, but last month I decided to start using a debugger instead of logging to the console. For the past few weeks I’ve only been using breakpoints and the resume function, because the step-into, step-over, and step-out functions have always confused me.
An hour ago I sent Gemini screenshots of my debugger and explained my problem. It told me exactly what to do, explained what each of the step-* functions does, and walked me through it step by step (I sent a new screenshot after each step and asked it to explain what was going on).
I now have a much better understanding of how debuggers work thanks to Gemini.
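For anyone else confused by them, here’s a minimal sketch of how I now think about the step-* commands (my own toy example, not from the Gemini session):

    def helper(x):
        # step-into from the call below stops here, inside helper
        y = x * 2
        return y  # step-out from here finishes helper and stops back in main

    def main():
        a = helper(3)  # breakpoint here: step-into enters helper,
                       # step-over runs all of helper and stops on the next line
        b = a + 1      # step-over from the line above lands here
        print(b)

    main()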
I’m fine with Google getting my data; the value I just got was immense.
I got that from your first post. As in any game-theoretic context, a win-win is only possible in a non-zero-sum game with relatively balanced benefits. It’s clear that you can see the value you get, and maybe even quantify it. However, you can’t quantify the other side’s win, nor the degree to which it will affect yours in the relatively short term (a few years, tops).
Two things come to mind:
The less relevant one: as a coder, once there’s a good enough model (good enough = the benefit outweighs the cost), your "win" will drop to zero. Your contribution to making it drop to zero is non-zero, but you won’t get anything in return.
The more relevant one, longer term: you may end up being predictable. A good enough model of you could extract value from you personally forever, again without you gaining anything.
Both points can be argued against, or written off as unavoidable regardless. But in either case, your "price point" has been chosen arbitrarily, at least from your perspective; it’s not an informed choice on your end. It’s a bit like the Monty Hall problem: you picked a door with little information.
The act of sticking to the door you chose is why you're naive.
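To put numbers on the Monty Hall analogy, a quick simulation (my own sketch in Python) shows that sticking with the first door wins only about a third of the time, while switching wins about two thirds:

    import random

    def monty_hall(trials=100_000):
        stick_wins = switch_wins = 0
        for _ in range(trials):
            car = random.randrange(3)   # the prize is behind one of three doors
            pick = random.randrange(3)  # the contestant's uninformed first choice
            # the host opens a door that is neither the pick nor the car
            opened = next(d for d in range(3) if d != pick and d != car)
            # switching means taking the one remaining closed door
            switched = next(d for d in range(3) if d != pick and d != opened)
            stick_wins += pick == car
            switch_wins += switched == car
        print(f"stick:  {stick_wins / trials:.3f}")   # ~0.333
        print(f"switch: {switch_wins / trials:.3f}")  # ~0.667

    monty_hall()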