Hacker News
When AI helps you code, who owns the finished product? (theregister.com)
25 points by beardyw 27 days ago | 37 comments



It depends on what you're developing and possibly for whom, but I think in general not that many will care about using AI or any other tool.

Nobody bats an eye when you use IntelliSense or another form of autocomplete, or a plugin with code snippets. I don't worry about the implications of using StackOverflow, or some plugins to do a bit of codegen for me (e.g. ORM mappings for my app from a live database schema), nor would I worry about getting an informal suggestion for how to write a particular method from a friend.

You'd have to be in a pretty corporate environment for AI or any of that to start mattering.


Perhaps nobody needs to own the code

Code should support human values. It's not clear that human values are best served by private code ownership. And AI code might best provide value as a publicly owned good.

That is why I write all my code under the AGPL, and think all AI assisted code should automatically be open source.


Property is a silly way to think about information.


Naivety is a silly approach to legal concerns.


Property is a silly way to think about property too, fwiw.


I take the view that if it's silly and it works then it's not silly.

Physical property seems to work, though I'm less sold on the effectiveness of intellectual property.


That's my position too. I'm just not sure the harms are worth the benefits.


> I've plastered a copyright notice at the top of the source - as I've always done...

Do people actually still do this?


If the vim tab completion plugin that I've been using for a decade wants to own all my code, it can have it


LOL


Interesting question. If the Google v. Oracle case had played out today, and Google had been able to prove an AI created an API that looked incredibly similar to Java's, would that have stood up as a defence? If code isn't owned by the developer or company that wrote it, then presumably someone else can't sue them for copyright infringement.

If that's true, then there'll be no effective way to ever sue a software developer over the code they wrote (or generated, or prompted, or whatever the term will be).


Well, it depends. It can be you, your company, a customer, a friend, a SaaS company that asked you to sign a CLA to contribute to their "open-source" project,…


When Stackoverflow helps you code, who owns the finished product?


SO commenters own their parts and you own yours, just as with books, movies, audio works, etc. Fair use, copyright, work for hire, paraphrasing, and attribution are things that predate SO. Comments on SO are licensed as CC BY-SA.


Tbf the same debate has raged over SO snippets for a long time.


I think we should think of AI less as a person helping us and more as a search engine that deeply understands what we want right now. And if anyone owns anything the AIs regurgitate, it is the authors of the training data.


If you use paint you bought at the store, who really owns the finished painting?

If you use a saw to cut wood, who owns the wood you cut?

For now, "AI" is a tool. Maybe that will change in the future when AI is indistinguishable from people and has rights and privileges, but for now it's just a tool, and tools do not transfer ownership.


If copyrightable creative source code is the toolchain precursor to an executable binary, can a creative LLM prompt be copyrighted as a toolchain precursor to non-creative source code? Should LLM prompts be versioned alongside generated source code?


Yes, the prompt is copyrighted, just like your post here is copyrighted. But I don't think there's any point in versioning the prompts. The responses are just too nondeterministic. If you feed it the exact same prompt again, you can get something quite different back. On the other hand, if giving the same prompt got you the same response (like a reproducible build), then absolutely yes, version those prompts.


Imagine a court which ruled that LLMs must have a reproducible mode, based on a session ID provided by the user, to establish a legally binding and deterministic connection between copyrighted LLM prompt and generated code.

Are LLM vendors allowed to permanently archive user-copyrighted prompts? Presumably the vendor's EULA can force the user to grant a non-exclusive license for any purpose.

> If you feed it the exact same prompt again, you can get something quite different back

If one trusted an LLM to generate code from the first prompt, it might be worthwhile to periodically evaluate new code generated from the same prompt. E.g. can the LLM generate a better implementation, or does testing of new output identify an ambiguity in the first prompt? Would a newbie developer have written better code after N months/years of experience?


Those are fun thoughts.

I've imagined a world where you can create a long, complex prompt that results in a reproducible response of code that does exactly what you want, and does it correctly. In that world, the prompt could be your versioned source and the build process could start from that. Implementing changes could be done by tweaking the prompt. Of course you'd still want to archive the "intermediate" source code, if only to protect yourself from an LLM outage.

In that world, you actually would be getting new code from the source prompt regularly, and yes, it hopefully would improve over time.

In our world however (at least for me), usage tends to be more conversational, so there really isn't even "a prompt" to save. Not to mention that I never use the output as-is.
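Even without reproducible generation, the "archive the intermediate source" idea above can be approximated today by recording prompt provenance in the generated file. A minimal sketch (the model name and seed below are hypothetical placeholders, not a real API): hash the prompt and embed it in a header comment, so the versioned prompt and the generated code can be checked against each other later.

```python
import hashlib
import textwrap


def provenance_header(prompt: str, model: str, seed: int) -> str:
    """Build a comment header recording which prompt produced a generated file.

    The SHA-256 digest lets you verify later that the archived prompt still
    matches the one that produced this file.
    """
    digest = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    return textwrap.dedent(f"""\
        # Generated code -- do not edit by hand.
        # prompt-sha256: {digest}
        # model: {model}
        # seed: {seed}
        """)


# Usage: prepend the header to whatever the LLM returned before committing it.
prompt = "Write a function that parses ISO-8601 dates."
header = provenance_header(prompt, model="example-model", seed=42)
print(header)
```

If a vendor ever did offer fully deterministic generation, the same header would be enough to rebuild the file from the prompt, making the prompt the true versioned source.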


I think the issue here is that it's really clear who owns the liability, so if your employees are coding "with AI," you need to understand that they don't know enough to know when they're putting you at risk.


If code needs an owner for legal purposes, then it should probably be the person or persons who commissioned the code to be created, if said ownership is not transferred to a different person or persons by agreement.


Whoever published the product first owns it. Alternatively, a special license for AI (mostly to differentiate it from spam) could be created for autonomously created or published code.


Well, of course it's the one giving the prompts, no different from when your boss takes all the credit for a solution you wrote from their instructions.


So it's Pope Julius II's Sistine Chapel and Francesco del Giocondo's Mona Lisa?


Wikipedia: "The Sistine Chapel is the large papal chapel built within the Vatican between 1477 and 1480 by Pope Sixtus IV, for whom the chapel is named. The ceiling was painted at the commission of Pope Julius II."

I suppose the Roman Catholic Church has always owned the chapel. It then owned the paintings as part of the chapel.

This is a case where the artist was so famous that we know he is the author. But he was never the owner; he was paid to work.


It in fact is. The artists "just" created the works. No different from your work if you work for your employer.

It is OpenAI's GPT, not Ilya's.


When I code for my employer, who owns the code?


When you contribute to an open source project on your company time - who owns your contribution?


When you draw an illustration with software stabilization, who owns the copyright?


When my oven helps me make a pizza, who owns the finished pizza?


That's why you eat the evidence to hide the crime.


When you use a spellchecker to check a text, who owns the text?

When you use a style guide, citation manual, or encyclopedia to write a text, who owns the text?

I'm sorry, but obviously if you produce something, whether with help from a tool or not, you are responsible.


When AI helps you code, who owns the finished product?

When a hammer helps you build, who owns the finished product?

... in other news, we've just learned that paintbrushes now have a say in art ownership disputes.


I made this. I made this.


Who owns the code? I guess you should never really admit that an AI coding assistant helped you during coding, and there you go... problem solved.

Edit: It's actually harder than this; the best option would be to roll your own AI coding assistant, either on-prem or in your private cloud.



