
Yes, God forbid we build powerful new tools that extend human knowledge, insight, and productivity in directions previously undreamed-of. Mah coppy rite is more important! Thereoughttabealaw!

As usual in these scenarios, the only real injustice is that the people who tried to stand in the way will enjoy the benefits of progress in AI alongside those who worked to make it happen. So it goes, I guess.




I'm no ally of copyright; however, as long as it's here and a thing to be dealt with, I'm not going to cheer on a company operating on the back of flagrant disregard for it. This isn't "code I'm using in a personal project that maybe only my friends will interact with." This is a full-fledged business, owned in part by, of all people, Microsoft: the people who rammed copyright down our throats for the last 3 to 6 decades while doing everything they could to cripple FLOSS.

I expect the absolute most aggressive enforcement of copyright in this case.

As to my more general assertion that AI is only getting the traction it is because of an industry looking to devalue its currently very expensive laborers: spend a bit of time around shareholder/management types and you'll soon understand why I think the way I do. Magical thinking, "as long as it makes my outlays lower" thinking, is par for the course. There are, in fact, social classes who see the "hired help" as something meant to be out of sight and out of mind, and lucky to get whatever they're willing to give.

Besides the above hot take, I also see AI as being fundamentally disruptive to the human social fabric. I'm not convinced that as a society we're even prepared to have a real conversation about a technology that at any time could cross a threshold into sapience. The choruses of such individuals as Carmack and plenty of other HN posters saying "it's just a statistical model" and "let's wait till it's at least at the level of a developmentally challenged toddler before worrying about those kinds of concerns" (where those concerns are the ones about sapience, and about where the line between "just a statistical model" and something more actually lies) only prove my point. The reductionist viewpoint will be stretched right up to the point that there's a court case where the public finds out that training or instantiating models that communicate with one another basically involves torturing a collective mind that no one bothered to see that way, because it was just so stupidly productive.

Hell, the outcome of said case would probably be a shift in research toward making constructs that just barely toe the line. Which misses the entire moral point.

You could say I'm fairly black-pilled on the matter. Humans can't even deal with one another, or competently raise their own children. We don't need to be committing terrible parenting on an industrial scale.

...If you've read through all of this, you're probably a better person than I currently am, but know there was a time I shared your attitude toward the subject matter. Then I really started to pay attention to how people treat one another, and how money actually gets earmarked for different things. The learning experience is something I'd not wish on anyone, but as you, and our shared friends the Tralfamadorians, say,

So it goes.



