It is! I've made https://www.simplyrecipes.com/homemade-chili-crisp-recipe-74... recently to great success


State of Connecticut v. Swinton ruled that admission of digitally enhanced evidence was allowed. This is going to be a long road full of nuance before we land on a largely socially accepted test for how much adjustment is allowed in court cases.


There's a good rundown of many relevant cases in different US jurisdictions here [0]. It looks like, up through the date of publication, courts were pretty consistent in ruling that digitally enhanced images may be admissible if it can be shown that they are accurate replicas of an original and the enhancement was only used to make things that were already in the image clearer.

In the case of AI, I do think that will be harder to prove, because there's no specific algorithm being followed that can be shown to be unable to introduce misleading artifacts.

[0] https://www.jonathanhak.com/2018/02/17/image-clarification-n...


I don't see how this is related.

AI-enhanced images (as in enhanced with generative AI) have nothing in common with digitally enhanced images, other than maybe that they're both done in digital form. The process, how the image is transformed, and the result are wildly different and must be treated differently.

> At the trial, the state (plaintiff) introduced photographs of a bite mark on the victim’s body that were enhanced using a computer software program called Lucis. The computer-enhanced photographs were produced by Major Timothy Palmbach, who worked in the state’s department of public safety. Palmbach explained that he used the Lucis program to increase the image detail of the bite mark. Although the original photographs contained many layers of contrast, the human eye could perceive only a limited number of contrast layers. After digitizing the original photographs, Palmbach used the Lucis program to select a particular range of contrast. By narrowing this range, certain contrast layers in the photographs were diminished, thereby heightening the visual appearance of the bite mark. Palmbach clarified that the Lucis program did not delete any contrast layers; rather, the contrast layers that fell outside of the selected range were merely diminished. Indeed, nothing was removed from the original photographs by the enhancement process. Palmbach also testified that the Lucis program was relied upon by experts in the field of forensic science. The trial-court judge found that the computer-enhanced photographs were authenticated. Because the photographs satisfied the other requirements for admissibility, the trial-court judge admitted the photographs. Subsequently, the jury convicted Swinton. Swinton appealed.

edit: adding source https://www.quimbee.com/cases/state-v-swinton
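For anyone unfamiliar with that kind of enhancement: it comes down to a deterministic contrast-windowing step. A rough sketch of the general idea (this is not the Lucis algorithm, and the window values are made up):

    import numpy as np

    def contrast_window(gray, lo, hi):
        """Linearly stretch intensities in [lo, hi] across 0..255.
        Values outside the window simply saturate to 0 or 255 here."""
        img = gray.astype(np.float64)
        stretched = (img - lo) / (hi - lo) * 255.0
        return np.clip(stretched, 0, 255).astype(np.uint8)

    # e.g. enhanced = contrast_window(bite_mark_photo, lo=120, hi=160)

Every output pixel is a fixed function of the corresponding pixel in the original; nothing is invented, which is the distinction the rest of this thread is drawing against generative models.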


Perfect example. Digitally enhanced in this context should mean "bring out the details of already captured photons - so our eyes can see better" and nothing else. Generative AI is completely different. It's even in the damn name!


No? There is no nuance here. The Swinton case is afaict a discussion on basic deterministic contrast increase.

There is no element of "AI enhancement". AI enhancement means "apply a statistical model to make up information that is not present in the source data". Even if you take AI to be something more than a glorified statistical model, it cannot add any detail that is not in the original data.

What it can do is, based on the biased training data, create imagery that looks real and can (1) make jurors believe the images presented are real (because the statistical model is geared to making plausible imagery) and (2) introduce details that are useful to the side presenting the images. The first is bad because it means the jury believes the fake image over the lower quality source data because it looks better despite being fiction, and the second is no different from asking someone to improve the image in Photoshop.


Mousse is the same. Could be meat. Could be chocolate.


Do you have a source for your claim that wallet providers get access to all card transactions? Looking at the wallet, my linked cards only have the data that came through the mobile payment.


Some of us hold onto the past. I still have a Zip disk and drive (and a bunch of old media, all the way back to an 8-inch floppy), and I have a fairly large collection of 4K Blu-rays.


I am listening to some of my minidiscs at this very moment


I have vinyls


Real Programmers gyrate the needle by hand


While I generally agree with you, the claim that Trader Joe's entered into negotiations in bad faith and cost the other brands money by asking them to make changes to recipes is a bit beyond just competition. Presuming that claim is true.


It's absolutely because of inertia. Cutting a tarball is a very public indicator that the software is ready to be used. A git tag or branch doesn't signal "use this" as clearly as a release tarball does.

That said, this is absolutely going to be changing now. We obviously can't keep relying on tarballs anymore. We'll find a new normal that will work for a very long time until some other critical issue arises and the cycle repeats.


We can use tarballs - they are useful as signed artifacts, but only if we verify that they are reproducible.
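Concretely, "verify that they are reproducible" can be as mechanical as regenerating the file contents from the signed tag and comparing hashes. A rough sketch, assuming a local clone; the tag, prefix, and tarball name are placeholders:

    import gzip, hashlib, io, subprocess, tarfile

    def file_hashes(tar_bytes):
        """Map member path -> sha256 of its contents for a tar given as bytes."""
        out = {}
        with tarfile.open(fileobj=io.BytesIO(tar_bytes)) as tf:
            for m in tf.getmembers():
                if m.isfile():
                    out[m.name] = hashlib.sha256(tf.extractfile(m).read()).hexdigest()
        return out

    # regenerate the tree from the release tag
    regenerated = subprocess.run(
        ["git", "archive", "--format=tar", "--prefix=foo-1.2.3/", "v1.2.3"],
        capture_output=True, check=True).stdout

    with open("foo-1.2.3.tar.gz", "rb") as f:
        released = gzip.decompress(f.read())

    print(file_hashes(regenerated) == file_hashes(released))

Any file that only exists in the tarball, or differs from the tagged tree, shows up as a mismatch. (For autotools projects you'd also have to account for legitimately generated files that ship in the tarball but aren't committed.)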


If you're going to download both the tarball and the git repo and verify the reproducibility of the tarball, then why bother with the tarball at all? You already have the git repo.


A git repo can vanish overnight. Git is often used to snag source, but tarballs are still crafted afterwards even then:

https://news.ycombinator.com/item?id=39903813


Git repos only vanish if they have no clones. For the purpose of accountability, the "official" repo is not more privileged than any other clone that retains the same history, and can be verifiably recreated from any clone if it is ever lost or tampered with. (Assuming SHA-1 isn't too broken, that is.)

For archival purposes, nothing prevents people from creating a tarball that contains the .git directory as well, which would preserve not only the current state of the project but its entire history.
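That can be a one-shot operation; a sketch, with the URL and filenames as placeholders:

    import subprocess

    # mirror every ref, then pack the entire history into a single file
    subprocess.run(["git", "clone", "--mirror",
                    "https://example.com/project.git", "project.git"], check=True)
    subprocess.run(["git", "-C", "project.git", "bundle", "create",
                    "project-history.bundle", "--all"], check=True)

    # later, `git clone project-history.bundle project` restores the whole repo

The bundle is a single ordinary file, so it can be signed and archived exactly like a tarball.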


> Git repos only vanish if they have no clones.

And? You're speaking probabilities, not certainties. It's not relevant in terms of archiving. You don't guess, you don't hope, you simply 100% ensure that 10 days, 19 years, or a century from now you can build the same thing.

I agree that adding "extra stuff" to a tarball isn't a bad idea, and in fact, many already do!


If you can archive the tarball, you can archive the git repos. If for some reason you can't, you can cut your own tarballs from the git repo and then you don't have to worry about them because you made them yourself.


Archiving an entire git repo is serious overkill, and realistically untenable. Debian has 30k packages, if not more, and some of them are the Linux kernel.

My responses in these threads have been to the "why not build from git" logic championed here.

You want to build from a reliable, repeatable source. And as I mentioned, that can be git clone -> tarball, sure.


It wasn't sloppy. It was just luck that someone noticed half a second of extra latency on the second connection of a newly run sshd process and went down the rabbit hole. Had they just shrugged and moved on to more "important" tasks/deliverables, it would most likely have landed in production across the world.

I'm a tad reminded of https://xkcd.com/705/

We got so lucky here. We won't get lucky every time. We will have a massive breach one of these days.


I don’t think it was luck. I think some people are so in tune with their systems that investigating an anomaly like this is a frequent occurrence. This particular anomaly just happened to have an explosive ending.


Yes, I have met Andres in real life and I can totally believe that he is that in tune with his system. He wrote that he found this while benchmarking PostgreSQL and saw weird load from ssh. He does a lot of benchmarking of PostgreSQL patches.

But I would say it was also luck. If Andres hadn't been benchmarking on Debian Testing (or whatever system he found this on), this might have taken longer to discover.


It may not sound sloppy if you are used to today's apps and websites, but half a second is an eternity in CPU time. Half a second is also a very significant amount of time compared to normal ssh connection times with low network latency - if not Freund, then someone else would have noticed, complained, and this would eventually have been investigated. The only luck here is that it took less than two months for that to happen, but the attacker could have prevented this avenue of detection entirely by optimizing the exploit not to slow down the ssh process.
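The signal really isn't subtle, either; even a crude measurement like this (the host is a placeholder) makes a consistent ~0.5 s regression stand out:

    import subprocess, time

    # time a handful of ssh connections and eyeball the outliers
    for i in range(5):
        t0 = time.perf_counter()
        subprocess.run(["ssh", "-o", "BatchMode=yes", "user@host", "true"],
                       capture_output=True)
        print(f"connection {i}: {time.perf_counter() - t0:.3f}s")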


Kinda like Clifford Stoll!


A wildcard cert is a single cert that is valid for any subdomain. The specific subdomains you use are not exposed in the CT logs, just the fact that a *.example.com cert was minted.
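You can check what a server actually presents yourself; the SAN will show the wildcard, not the individual subdomains you visit (example.com is a placeholder):

    import socket, ssl

    hostname = "foo.example.com"
    ctx = ssl.create_default_context()
    with socket.create_connection((hostname, 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname=hostname) as tls:
            print(tls.getpeercert()["subjectAltName"])
            # e.g. (('DNS', '*.example.com'), ('DNS', 'example.com'))

Only that wildcard entry is what gets logged to CT when the cert is issued.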


They actually planned for the three major overtones and the nine minor theme variations from the start. You may disagree with the results, but it wasn't modified due to outside pressure.

