As someone whose role involves some elements of image editing, Adobe's GenAI feature is a welcome tool... however, as someone who's read 1984 and is concerned about the future tidal wave of scams, I'm not so keen on every a$$hole and his/her dog being enabled.
I love Adobe's GenAI, but I also hate it. The problem is it has been programmed with some pseudo Judeo-Christian moral guidelines. Trying to edit my gf's beach pics to remove crap from the background, it freaks out on some bikini pics and pops up an angry yellow box, basically lecturing you on your life choices for not covering your skin entirely with clothing. You have to chop the image into new documents, fix each part, and then reassemble them so that it never sees the whole.
Then there is the fact that it marks any image touched by AI as made with AI (and the user cannot disable it), so the metadata makes it look like you fabricated the whole image. I used GenAI the other day to extend a blank wall so I could crop the image better, but I had to strip the metadata before uploading it to social media, or the post would get big flags saying "MADE WITH AI".
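For anyone curious, stripping is trivial: just re-save the pixel data into a fresh file. A minimal sketch in Python using Pillow (filenames are made up; note this drops ALL metadata, including color profiles, and re-saving a JPEG recompresses it):

```python
# Minimal sketch: re-save an image without any of its original metadata
# (EXIF, XMP, embedded provenance manifests, etc.). Assumes Pillow is installed.
from PIL import Image

def strip_metadata(src_path: str, dst_path: str) -> None:
    img = Image.open(src_path)
    # Copy only the raw pixel data into a fresh image object; none of the
    # original file's metadata blocks come along for the ride.
    clean = Image.new(img.mode, img.size)
    clean.putdata(list(img.getdata()))
    clean.save(dst_path)

strip_metadata("wall_extended.jpg", "wall_extended_clean.jpg")  # hypothetical files
```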
Interesting that you cite “pseudo Judeo-Christian moral guidelines” when I think you'd be hard-pressed to find a Christian priest or Jewish leader who would preach angrily against bikinis, whilst you omit a certain religion that actively encourages its women to hide behind veils.
Great example, and that's where we should draw the line - thanks for your help, but let me do what's best in my scenario. Systems should stop acting based on morals and instead focus on values.
> The problem is it has been programmed with some pseudo Judeo-Christian moral guidelines.
That's because there are already a multitude of services offering to "nudify" people for cash (which customers then use not just for their own spank bank but for extorting women), and Adobe does not want to be associated with that kind of disgusting (and in some jurisdictions illegal) behavior.
Okay, but shouldn't you be able to edit the background of a photo that has a person in a bikini in it? Why couldn't they just prevent you from editing the actual human body or the clothing?
Law of unintended consequences. Better to be safe than have to make sure your tool can't be inadvertently used as a blackmail generator.
Consider the example given. A person in a bikini at the beach is a normal image that few find objectionable. Let's make it objectionable by only modifying the background.
We modify the bikini picture's background to make the person appear in a strip club on the stage. Or we make the person appear in the bedroom of an ex.
Blackmail generated by only modifying the background.
I'm not 100% sure they're taking the right route, but given the complexity of the situation I certainly understand a cautious and conservative approach.
> In the meantime, every predator/abuser on earth is sticking an airtag in some ex's car or purse, and everybody is OK with it.
No they're not, which is why Apple and Google were forced to implement "stalking detectors" as an add-on to their operating systems, along with stuff like regular beeping for the tags.
> Why not make the tools capable of everything and just punish bad behavior instead of preemptively censoring users and their usage of the tool?
Law of numbers. Basically, to pull off a convincing fake nude or whatever, you needed to be really good at Photoshop, and the tool itself had associated costs (buying it, or learning how to pirate it), which kept the number of fake nude photos very low. But now, with AI services doing it for a few bucks, or apps/model weights being released that let everyone with a reasonably recent NVIDIA GPU pull it off on their own? The amount of such abuse exploded, because the cost went down drastically.
Yes. It may happen in this particular instance to conform to a right-wing religious stereotype, but it's actually arriving there via left-wing Silicon Valley liberalism meeting fear of bad press and "cancelling" from other more-or-less liberal news outlets.
Another instance of some people's observation that ideology sometimes seems to form a circle more so than a line. Which is not entirely true, but isn't entirely untrue either.
I want photos to be a record of my perception. I want them to match what I see in front of me as closely as possible. I do not care if data is mangled to make this happen — in fact, the reason I love my iPhone's camera so much is that it mangles the raw sensor data so much. Apple's computational photography pipeline is state-of-the-art, and it is designed to make photos look as close to my perception as possible. All my favorite sunset photos have been taken on my iPhone.
I do not like these AI photo-editing features. Or rather, I do not like the use of them, and will not be using them myself. The technology itself is stunning and exciting! But it entirely defeats the purpose of taking a photo for me. If someone blinks in a group picture, I want the blink to stay. If there's some obnoxious piece of garbage in the background, I want it to stay, because the whole point is for a photo to be a representation of reality consistent with my eyesight at the time it was taken.
I truly love candid photos. I am open to computational photography and even AI processing, but it should exclusively serve the goal of making the photo more candid. I feel that the AI photo editing features being pushed out these days will only make photos less candid.
It’s funny, I’ve been photographing since my early youth (dodged a bullet and never went for the career path though), and I feel the complete opposite.
In my opinion, the picture shows what I want it to show. You can never capture reality anyway: a photograph is never a single point in time, but rather a time frame during which the sensor is exposed to light, a short video, all frames overlaid in a single still shot, if you will. This can be so short that the image appears to be what your eyes see, but it is by definition always an approximation. What’s more, an approximation with parameters under the control of the photographer.
Thus I, as the artist, get to decide what I want the image to be. If that means false color, a slow shutter speed with motion blur, or long exposure to light up a scene in the dark—or yes, AI edits—that’s my artistic freedom.
I can totally see where you are coming from — and I think I agree more in the context of purposefully artistic photography. However, for me personally, most photography is not artistic, but documentary. Totally support using all sorts of crazy techniques and manipulations for artistic expression tho!
Before I wrote a single line of code, I prided myself on my ability to add or subtract individuals from photos for family and friends, first using ClarisWorks Paint! My Samsung S22 can now do this within seconds, and better. I removed a TV in front of some liquor bottles on the wall; it filled the bottles back in behind it, but not even the same ones!
Before I became an engineer and I was doing odd jobs, I got hired by some rich guy (whose name was basically Donald Duck) as an assistant to do some data entry, but really the first task he gave me was to photoshop some photos he took of him and his son after they finished climbing a local mountain. He had me remove people behind them, and had me whiten his teeth.
I think there's a big publicity cycle going on currently due to the AI backlash, but in actuality, no one cares about any of this. People just want nicer photos. They're not sitting around philosophizing about 'what is a photo'. 99.99% of people would say the edited photo in this post is nicer and would prefer it. But if you show it to someone who is all excited about hating AI technology, of course they'll wail that the original was so much better.
Procreate is what more software should aspire to. It's industry defining tech run by people who care deeply about their users, and it costs $13 one time. No predatory bullshit, no multi-tier licensing structure, no subscription. Just some easy to use software that respects its users.
Hopefully we'll get some rise in use of image provenance:
> The Coalition for Content Provenance and Authenticity (C2PA) addresses the prevalence of misleading information online through the development of technical standards for certifying the source and history (or provenance) of media content. C2PA is a Joint Development Foundation project, formed through an alliance between Adobe, Arm, Intel, Microsoft and Truepic.
> The C2PA metadata for this information can include, among other things, the publisher of the information, the device used to record the information, the location and time of the recording, or editing steps that altered the information. To make sure that the C2PA metadata cannot be changed unnoticed, it is secured with hash codes and certified digital signatures. The same applies to the main content of the information, such as a picture or a text. A hash code of that data is stored in the C2PA metadata section and then, as part of that metadata, secured with the digital signature.
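Roughly, the mechanism is hash-then-sign. A toy sketch in Python using the `cryptography` package (the real C2PA spec uses COSE signatures and JUMBF boxes embedded in the file, not this simplified JSON; the manifest fields and filename here are hypothetical):

```python
# Toy illustration of the C2PA idea: hash the content, embed the hash in
# a metadata blob, then sign the blob so neither can change unnoticed.
import hashlib
import json
from cryptography.hazmat.primitives.asymmetric import ed25519

# Hash the media content itself.
with open("photo.jpg", "rb") as f:  # hypothetical file
    content_hash = hashlib.sha256(f.read()).hexdigest()

# Build a provenance manifest that binds metadata to the content hash.
manifest = json.dumps({
    "publisher": "example.org",        # hypothetical fields
    "device": "Camera X",
    "edits": ["crop", "exposure"],
    "content_sha256": content_hash,
}).encode()

# Sign the manifest. In real C2PA the key is backed by a certificate chain,
# so a verifier can also check WHO signed it.
key = ed25519.Ed25519PrivateKey.generate()
signature = key.sign(manifest)

# A verifier recomputes the content hash, checks it against the manifest,
# and verifies the signature; this raises InvalidSignature if anything
# (content or metadata) was tampered with.
key.public_key().verify(signature, manifest)
```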
"Like me, not everyone will have the stomach for it. In fact, some people are running in the opposite direction as fast as they can."
A tiny minority. Everyone will use this. Eventually, the system will remove the vaping "dinguses" automatically just as it automatically adjusts the exposure.
As if people didn't Photoshop the hell out of photos before. This "era" began 20 years ago. Look at Instagram: for the last decade these techniques were applied without AI.
Scale and quality are an important distinction. For example, rich people had paintings for thousands of years but the world changed when photography meant anyone could have better images for far lower cost, and then again when they became effectively free and ubiquitous.
I’d say generally they didn’t. The average person doesn’t have, or know how to use, Photoshop. Applying a filter is hardly the same thing as completely changing elements of a photo. This is far more accessible.
That’s a great topic for late night dorm bull sessions but it’s not what we’re talking about here. People know that photographers can be selective about what and how they cover things, but there’s been an understanding that photos are recording something real in a way which is no longer true at all.
Yes, people played darkroom and Photoshop tricks, but those were labor-intensive and hard to make pass a detailed examination. What we have just seen is the equivalent of jumping from the age of archery to machine guns, without much time for society to adjust and develop safeguards.
Yes, but you’re restating why I specified archery as the starting point: by WWI people had had multiple centuries to get used to firearms improving rapidly and changing ranged warfare from something requiring skilled warriors to something you could throw gobs of recently-recruited farm boys at. AI fakery has progressed at a much faster rate.
WWI was one of the worst things that ever happened in all of history, it destroyed many millions of lives and scarred an entire generation of people across the entire world and led pretty directly to WWII and the Holocaust. I'm not sure "that's how technology usually works" is a good argument here.
Ironically, it also led to Turing and modern computation. Some of the earliest computers were built to decode Enigma and to help develop the atom bomb, as far as I know.
Post-modernism is like salt and pepper. A small, appropriate amount of salt and pepper can make a meal great, but salt and pepper by themselves can never be a proper meal. In a similar vein, you need post-modernism to get a broader perspective on things, but that doesn't mean post-modernism, by itself, can be the only way to discuss things.