It seems fundamentally different to put in a ton of work building 3D models, putting together scenes, etc., versus typing a description into a text box and seeing what pops out.
I may be wrong, but I get the sense that computer art was welcomed by people actually working in the field (did professionals criticize the computer graphics in Star Wars or Wrath of Khan?) and it was mostly the lay public that saw it as somehow not real. The opposite seems to be true for AI "art."
> It seems fundamentally different to put in a ton of work building 3D models, putting together scenes, etc., versus typing a description into a text box and seeing what pops out.
People at the time also said using a computer was fundamentally different from putting in a ton of work into building physical models.
A lot of tech adoption is motivated by economics, so the argument that "before it was more work, now it's less work" will almost always apply regardless of the specifics. I don't think it's a useful thing to focus on. It's almost a moral argument: I deserve it because I suffered for it, but he did it the easy way, so he doesn't deserve it.
In fact, I would even go further. I would say it's part of the definition of technology. What is technology? Technology is a thing or an idea, created or discovered, that makes work easier and/or cheaper.
I agree that it's not useful if we're looking at practical stuff. It doesn't matter to me if my table was built with ten hours of human work, or ten seconds.
But for creative work? I think it matters a lot. You used the phrase "creating art." I don't think it counts as "creating" if there's no work going into it. Typing some words into a prompt box and getting a video out is not "creating," any more than doing an image search and printing out an image of a painting is creating a painting.
Printers are extremely useful devices, but they don't create art.
But there are subtle signs that the old ways made art different.
People do more practical effects, and they also miss the era of physical set filming[0]. I personally am bored seeing the latest GPU able to create gazillions of whatever, because I got the memo: GPUs can do everything. I get more magic out of seeing what people did with very little.
Don't get fooled by the "people reject evolution every time" argument.
[0] Technology can distort the focus onto the tool and away from the art. Films used to have to arbitrate between various tricks to get a scene to work; now apparently people don't. They film bits and postprocess everything later. The tech allows infinite changes, but the cake has no taste.
The core difference is in the amount of intentional decisions being made by the artist. A prompt, no matter how specific, still delegates a lot of work to what is essentially chance. Something like Blender does make it easier to do certain things, but you still have to actively choose to do them. This is why AI-slop, no matter how detailed, will always feel off. It lacks the deliberateness of a human artist who knows exactly what they're doing and why at every level.
Did the Wrath of Khan have any CGI? The only scenes I remember are the jarringly bad computer displays at various points on the Enterprise. If I recall correctly, the rest of the movie used traditional VFX: models, compositing, etc. I personally find the battle scenes in that movie, particularly the nebula scene, to be beautiful and among the best space battle scenes ever. Despite what others think, I also think that the first Star Trek movie is both a technical and narrative masterpiece, so YMMV.
Thanks, somehow I forgot about that scene. Pretty great by 1982 standards… a little lame by modern standards, although I could imagine that this is exactly the kind of snazzy but low-res simulation a scientist of the future might generate.