
That doesn’t make sense in the context of this tool.



It makes sense because the protective mechanism depends on the AI ingesting the picture as-is, with the added noise.

If the ingestion workflow alters the picture sufficiently, the protection can be lost, just as DRM protection is lost when someone re-records the data stream by capturing what is shown on the display device's screen.
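A toy sketch of that point: the snippet below adds small random noise to a stand-in image (real cloaking tools use carefully structured adversarial perturbations, not plain Gaussian noise, so this is only an illustrative assumption) and shows how a simple resize step in an ingestion pipeline averages much of the perturbation away.

```python
import numpy as np

# Toy model: a "protective" perturbation added to an image, then an
# ingestion pipeline that resizes the picture before the model sees it.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(256, 256)).astype(np.float64)  # stand-in image
noise = rng.normal(0.0, 2.0, size=image.shape)  # small perturbation (assumption)
protected = image + noise

def downscale2x(img):
    """Average 2x2 pixel blocks -- a crude stand-in for a resize step."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

resized_clean = downscale2x(image)
resized_protected = downscale2x(protected)

# High-frequency noise largely cancels out under block averaging.
original_noise = np.abs(noise).mean()
surviving_noise = np.abs(resized_protected - resized_clean).mean()
print(f"mean |noise| before resize: {original_noise:.3f}")
print(f"mean |noise| after  resize: {surviving_noise:.3f}")
```

The resize roughly halves the per-pixel noise magnitude here; real adversarial perturbations are designed to be more robust than this, but the general principle (transformations during ingestion can erode the protection) is the same.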


Then you either don't understand the tool, how generative AI models work, or what the analog loophole is.



