It also apparently copies the artist’s logo.

In a way, the “it just does what humans do but faster” argument is starting to follow the “a number can’t be illegal” trajectory.




Either way I support AI art and AI in other fields. Artists being mad that it's gonna take their jobs doesn't seem like a legitimate reason to halt human progress. It's just inevitable the way things are going.


This. People thought that photography would make artists obsolete, then Picasso came round (simplifying here, ofc). Who knows what crazy creative things humans will do in the presence of those tools?


Just because some trend feels inevitable doesn’t mean we shouldn’t oppose it. We shouldn’t be beholden to the march of technology like it’s some force of nature that we are powerless to resist.


Can you give examples of technology being successfully rolled back for social reasons (not because it was found to have dramatic side effects, e.g. asbestos) in a way that didn't coincide with a massive drop in HDI (like the fall of the Roman empire)?


Anti-WMD proliferation comes to mind, but perhaps that's a special case.


I wouldn't call that a successful roll-back, unfortunately; they've been somewhat contained at best, but they're still being produced and are more powerful than ever.

Those discussions aside, what I meant by social reasons was people wanting to see some tech go away because it's automating jobs.


Please save this comment to refer to when an iteration of Copilot starts to make you truly worried for the future of your career. Thanks in advance.


Yes, many of us will turn into cowards when automation starts to touch our work, but that would not prove this sentiment incorrect - only that we're cowards.


Dude. What the hell kind of anti-life philosophy are you subscribing to that calls "being unhappy about people trying to automate an entire field of human behavior" being a "coward". Geez.


Because automation is generally good, but making an exemption for specific cases of automation that personally inconvenience you is rooted in cowardice/selfishness. Similar to NIMBYism.


Yeah, and we should really do something about those abominations like Excel and Python that let ordinary people create programs without hiring us to do it the right way.


There is a big difference between "this is a tool that makes some things easier" and "this is definitely endangering the skill I spent a lifetime learning".

You will know it when you see it.


There is no big difference. The steam engine put 90% of the world out of agricultural work. That is a big difference. Now you can buy strawberries all year round and drink coconut water in Alaska.

Nobody will care where their art comes from any more than you care about how your food's field was plowed; and all their lives will be better for it.


There is a huge difference in how you will feel when the tool that might put you out of work shows up.

Keep your comment somewhere. Come back to it when you look at a new tool that promises to merrily trample on your entire field’s income and provide an endless source of “usable, I guess” substitutes. Let it provide solace as you stare into a future with no room for the craft you’ve spent a lifetime honing.


I don't see this as a likely outcome for programming assistants.

Software development is heavily labor-constrained; if Copilot can make everyone a 10x developer, we'll get slightly less than 10x the features-per-year on an industry-wide basis after contributors shuffle around.

The effect will be most pronounced in application development, where a team of 1-5 is about ideal for a coherent app made with taste, and that team could produce the output of 10-50 developers. Not such a bad thing.

Unfortunately this is unlikely to be true for visual art. I don't predict that making artists ten times as productive will meet a latent demand for ten times as much art. Could be wrong, but my sense is that about as much art is purchased as people want to buy.


What progress? We’re brute-forcing solutions without any way to learn from them. ML eliminates serendipity. I’m not strictly anti-ML. Horses for courses and all that, but I’ve got to admit I get weird, “The humans stopped learning and the computers started,” sci-fi vibes sometimes.


>ML eliminates serendipity.

This is a statement that is pretty quickly disproven if you actually pay attention to the generated art. Lately I've been seeing TikTok videos where people are using DALL-E to create "new aesthetics" - "vampwave", "neon apocalypse", etc.


Whilst automation in textiles, agriculture, etc. brought measurable benefits to society, this will have few as it stands. I can think of a dozen things that are worth automating before this. This leaves me suspecting that projects like DALL-E are gunning for Shutterstock's $500 million annual profit and that's it.


> This leaves me suspecting that projects like DALL-E are gunning for Shutterstock's $500 million annual profit and that's it.

Do you think AlphaGo was motivated by a burning desire to discover better board game strategies?


Elsewhere in this thread, people celebrate AI for various reasons. I'm not so impressed by the tech when the goal is just profit without some sort of ethical upside.


Stealing people’s work to serve as your training set is not human progress.

Sounds like the AI model should be paying royalties to every affected artist for the right to sample their work.


Every time I've been to an art museum (and this is a handful, not just SFMOMA), there has been a person or two in a room with some, well, art hanging on the wall. They had a notebook of the appropriate paper medium and art supplies and were copying the existing work.

This is not something new. https://www.realistartresource.com/the-tradition-of-copying

Copying of existing works is part of how an art student learns. That this one happens to be a math model at its core is an interesting philosophical problem. Using other people's work to serve as your training set is exactly what art students do - and the works they copied and the works that they have yet to produce are not royalty encumbered.

Nor should a mathematical model. It happens that the developers working on this problem have gotten it so that it can do its learning and creation many times faster than an art student in a gallery... but it still can't get hands and faces right.


> The other people's work to serve as your training set is exactly what art students do

“Training” an art student and training an AI model are vastly different, and your equating the two is, frankly, nonsensical and absurd.

An art student isn’t a trivial weighted model capable only of mapping stolen text prompts to stolen image representations of them.

> It happens that the developers working on this problem have gotten it so that it can do its learning and creation many times faster than an art student in a gallery

It hasn’t learned anything.

It correlates stolen textual descriptions with stolen images, and then regurgitates mash-ups of the same.

This type of AI model cannot produce anything other than purely derivative work stolen from others.


Ok... though I question the use of the word "stolen".

If I was to dabble in sci-fi art and made something that fit in the art style of Steward Cowley ( https://archive.org/details/terrantradeauthorityhandbookstar... ) do I need to credit the art?

When it comes to playing around in blender - my designs are obviously derivative of others - do I need to credit those artists? Even the ones that I don't remember more than a "I saw this print at a comic art show once..."

How original does my own work have to be before it isn't a mashup of stolen images that I half remember?


> If I was to dabble in sci-fi art and made something that fit in the art style of Steward Cowley … do I need to credit the art?

Probably, yes.

> When it comes to playing around in blender - my designs are obviously derivative of others - do I need to credit those artists?

Again, probably, but nobody is likely to care if you’re not actually selling your work.

> Even the ones that I don't remember more than a "I saw this print at a comic art show once..."

Then that’s not the prompt you should be starting with if your goal is to produce an original work.

> How original does my own work have to be before it isn't a mashup of stolen images that I half remember?

How original does it have to be before it’s not plagiarism?

Now, remove your ability for individual creativity, such that you cannot come up with an original idea. All you can do is plagiarize.

That’s the difference, here. This isn’t an AI trained to have creative thought, a genuine understanding of what it’s making, and original ideas. It’s an AI trained to regurgitate mashups of plagiarized works based on weighted correlation between the prompt and the (also plagiarized) descriptions of the works it’s regurgitating.


What do you think the artists "trained on"? Didn't they ever see anyone else's art? Don't you think it influenced their art?

If it's a 1:1 copy, I agree. If it's a "that looks vaguely like the style that xyz likes to use", I disagree.

And I assume you'd run into plenty of situations where multiple people would discover that it's their unique style that is being imitated. Kind of like that story about a hipster threatening to sue a magazine for using his image in an article, only to find out that the photo was of another guy entirely: he dresses and styles himself like countless other hipsters, so much so that he couldn't tell himself apart from the man in the photo.


Your entire point hinges on a false assumption: that “training” a human artist (or programmer) is the same as training an AI model.

It is not.

The AI model can only regurgitate stolen mash-ups of other people’s work.

Everything it produces is trivially derivative of the work it has consumed.

Where it succeeds, it succeeds because it successfully correlated stolen human-written descriptions to stolen human-produced images.

Where it fails, it does so because it cannot understand what it’s regurgitating, and it regurgitates the wrong stolen images for the given prompt.

AI models are incapable of producing anything but purely derivative stolen works, and the (often unwilling) contributors to their training dataset should be entitled to copyright protections that extend to the derivative works.

That’s true whether we’re discussing dall-e or GitHub copilot.


> The AI model can only regurgitate stolen mash-ups of other people’s work.

We all stand on the shoulders of giants. If you're very dismissive, I think it's easy to say the same about most artists. They're not genre-redefining, they carve out their niche that works (read: sells) for them.


The art it was trained on was apparently all Creative Commons, so perhaps artists should understand licenses before giving their work a permissive one.


All CC licenses, other than the “CC0 public domain dedication”, require attribution, but CC themselves have taken what I’d consider to be an ass-backwards position on the matter:

> At CC, we believe that, as a matter of copyright law, the use of works to train AI should be considered non-infringing by default, assuming that access to the copyright works was lawful at the point of input.

https://creativecommons.org/2021/03/04/should-cc-licensed-co...


>backwards position

It's not backwards. It's the same as if a human artist studied it.

Backwards would be thinking that any creative work you make is a derivative work of the many creative works you've seen in the past but aren't copying from.


> It's the same as if a human artist studied it.

It’s not a human artist, and it can only regurgitate mash-ups of work stolen from others.


Where is this said? I'm looking at the collection of artists this site is happily offering to rip off (https://twitter.com/arvalis/status/1558632898336501761/photo...) and there's a lot of people with long careers making copyrighted work. I really can't imagine finding a ton of CC work by Bernie Wrightson or Wayne Barlowe or Brom or Junji Ito, for instance.



