
Can't help but think of azidoazide azide when I see azides mentioned

DO NOT USE WEBP

JPEG-XL is the superior format. The only reason WebP exists is not because of natural selection, but because of nepotism (Google Chrome).

https://www.reddit.com/r/AV1/comments/ju18pz/generation_loss...


JPEG-XL is supported by exactly one browser. WebP and AVIF are supported by just about every browser.

I'd love to use JPEG-XL, but I'm guessing the only way to do that is also bringing along a WASM decoder everywhere I want to use it.


This situation might change if Google is forced to divest itself of Chrome. This is currently in the US courts, but it might take a while.

due to the aforementioned nepotism...

standards require some politicking and money I suppose


Standards do require that. Someone has to show up at the right meetings and conferences, and it's often necessary to contribute code too.

"Build it and they will come" doesn't work for products, and it doesn't work for standards either.


... and I want to add that there's no need to assume nepotism.

If you get code merged into something like Chrome, and it's big and goes unused for a few years, at some point some security-minded person will come along and argue that your code is an unused attack surface and should be removed again.


> ... and I want to add that there's no need to assume nepotism.

Sure, and while this is true:

> If you get code merged into something like Chrome, and it's big and goes unused for a few years [it's likely to get removed.]

it's also true that Google could have pushed JPEG XL instead of pushing WebP, which would have massively increased the usage stats of the JPEG XL code and saved it from removal. But (for whatever reason) Google decided to set things up to push folks to use WebP at every turn, and here we are.


Standardization is the most important feature a technology can have. Look at JS.

> Look at JS.

Given how much duct tape it took at times to get various browsers to behave, I would say JS is proof of the opposite. It succeeded in an environment where standards were a mere suggestion.


Don't they all run JS ES5 itself the same way? It's more that each has different feature sets (HTML5 stuff, webrtc, wasm, etc), which are callable from JS but would've been a problem regardless of the language.

Standardization is a big feature of JS, but it's also a surprisingly good language for its use cases. There was even good reason to port it over to the backend (NodeJS).

There are only two browsers, Safari and Firefox.

only Nightly

Upload a JPEG and it gets converted to WebP. The next person downloads the WebP and converts it to JPEG to actually edit it, because even things like Photoshop/macOS Preview can't edit one natively, saves to JPG, and uploads it, where it gets converted to WebP again.

The next person downloads the now JPEG'd, WebP'd, JPEG'd, WebP'd image.

Fast forward a decade and the original quality versions of the images no longer exist, just these degraded copies.


Even if you have image editing software that directly supports WebP input/output, you still have the exact same problem, because (like JPEG) it's a lossy format which loses fidelity with each successive generation of load/save over time.

If you intend to edit an image, then the original (if possible) and its edits should both be saved in a lossless format, even if the final published output is saved in a lossy format. If no lossless version of the original is available, then the highest quality version of the original should be converted to a lossless format for "archival" before editing, and that copy should always be used for editing purposes, rather than piling loss upon loss by editing the lossy format and re-saving. It's kinda like copies of copies of copies of MP3 audio. Eventually it becomes a soupy mess not worth using.

This doubles the speed of the degradation, because the image is re-encoded on upload regardless of size, double-stacking the encoding artefacts.

Well... kinda. WebP is a bit of an unusual format in that it supports both lossy and lossless forms of compression.

Most decent image editing software (Photoshop, Pixelmator, etc) will let you choose what you want.

https://www.adobe.com/creativecloud/file-types/image/raster/...

But if you're not a professional it would be easy to mix up the two and slowly end up with VHS level degradation.


relevant xkcd: https://xkcd.com/1683/

So, what you are saying is that JPEG-XL is superior, as long as your use case is insensitive to whether the majority of web users can view your content?

I get it but 2+ billion Apple device users is not nothin’.

They can render JPEG-XL; everything else will render the fallback format like JPEG or WebP.


Teams & developers will likely only choose a single format if they can, the one that most browsers support, because doing some content negotiation is more code, more work. It doesn't take away anything from the parent's point though: if JPEG-XL is more performant, it could reduce bandwidth requirements.

> because doing some content negotiation is more code, more work

It's actually not more work. The user's browser automatically handles the content negotiation and only downloads the image format it understands:

    <picture>
      <source srcset="photo.jxl" type="image/jxl">
      <source srcset="photo.webp" type="image/webp">
      <img src="photo.jpg" alt="Product photo" loading="lazy">
    </picture>
macOS, iPadOS and iOS get the JPEG-XL image, devices that can handle WebP get that, and everything else gets JPEG.

There are several image delivery services that will provide the best image format depending on the device that's connecting.
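For illustration, a minimal sketch (in TypeScript, with a hypothetical function name and preference order, not any particular delivery service's API) of how such a service might pick a format from the Accept header a browser sends with image requests:

    // Minimal sketch: pick the best image format a client advertises in its
    // Accept header. The function name and preference order are illustrative.
    type ImageFormat = "image/jxl" | "image/avif" | "image/webp" | "image/jpeg";

    function pickImageFormat(acceptHeader: string): ImageFormat {
      // Prefer newer formats when the browser advertises them; JPEG is the
      // universal fallback.
      const preferred: ImageFormat[] = ["image/jxl", "image/avif", "image/webp"];
      for (const format of preferred) {
        if (acceptHeader.includes(format)) {
          return format;
        }
      }
      return "image/jpeg";
    }

    // Example: a Chrome-like Accept header (no JXL) resolves to AVIF.
    console.log(pickImageFormat("image/avif,image/webp,image/apng,*/*;q=0.8"));

A real service would also send Vary: Accept so caches keep the per-format variants separate.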


I mean if you define superior not in terms of its technical merits but in terms of its blessed status with Google. It's not a terrible format, but it is very much "what is the worst quality we can reasonably get away with to save on bandwidth." Such a thing does have its uses.

At some point you have to be pragmatic and meet users where they are, but that doesn't mean you have to like that Google threw their weight around in a way that only they really can.


> I mean if you define superior not in terms of its technical merits

“Technical merits” are rarely, for anything, the sole measure of fitness for purpose.

Even for purely internal uses, internal social, cultural, and non-technical business constraints often have a real impact on what is the best choice, and when you get out into wider uses with external users, the non-technical factors proliferate. That's just reality.

I understand the aesthetic preference to have decisions only require considering a narrow set of technical criteria which you think should be important, but you will make suboptimal decisions in the vast majority of real-world circumstances if you pretend that the actual decision before you conforms to that aesthetic ideal.


just fucking use a polyfill to add support for JPEG XL. Or store JPEG XL and convert on the fly to JPEG to supply it to browsers that don't support JPEG XL.

No need to bring JavaScript into it.

In HTML, use `<picture>`: https://developer.mozilla.org/en-US/docs/Web/HTML/Reference/...

In CSS, use `image-set()`: https://developer.mozilla.org/en-US/docs/Web/CSS/image/image....


> just fucking use a polyfill to add support for JPEG XL. Or store JPEG XL and convert on the fly to JPEG to supply it to browsers that don't support JPEG XL.

Doesn't a polyfill imply more Javascript running on the device?


Only until the browser gets updated, at which point the polyfill stops being invoked automatically. It's self-healing.

It's a WASM module, and it really isn't very big. So if you need to store and serve many image files, potentially big images, as in preservation & diffusion software (like the stuff I'm working on), I felt I could afford to pay the extra few KiBs for that WASM.
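To make the self-healing behaviour above concrete, a minimal sketch (TypeScript; the probe image URL and decoder module path are placeholders, not a real library) of loading the WASM decoder only when the browser lacks native JPEG XL support:

    // Probe native JPEG XL support by trying to decode a tiny .jxl image.
    // The probe URL is a placeholder you would host yourself.
    function canDecodeJxl(probeUrl: string): Promise<boolean> {
      return new Promise((resolve) => {
        const img = new Image();
        img.onload = () => resolve(img.naturalWidth > 0);
        img.onerror = () => resolve(false);
        img.src = probeUrl;
      });
    }

    async function ensureJxlSupport(): Promise<void> {
      if (await canDecodeJxl("/probe-1x1.jxl")) {
        return; // Native support: the polyfill is never downloaded or run.
      }
      // Hypothetical WASM decoder bundle, fetched only by browsers without
      // native support.
      await import("/vendor/jxl-wasm-polyfill.js");
    }

Once a browser ships native JPEG XL support, the probe succeeds and the polyfill is never fetched again, which is what makes the approach self-healing.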

I enjoy webp lossless mode

I'm not going to use webp or jpeg-xl.

Yeah they're both a pain in that there's friction at all to open/use them.

JPEG XL is superior... just like how Betamax was visually superior.

The main problem with your argument is that people want JPEG-XL. The main reason it is not in Chrome is purely due to Jim Bankoski's limited judgement, intelligence, and foresight.

https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKc...


Just like how some people wanted Betamax. Politics drives adoption, not technical superiority.

"People" here are a tiny minority of techies that have emotional connection to the library they built and a group of OSSers which will take anything that bashes Google as a gospel.

Everyone else... is fine with JPEG. And occasionally shares HEIC pictures from their iPhones.


Thankfully, unlike people and physical hardware, software and algorithms can lie in wait in perpetuity until they are resurrected when the political winds finally change course or favorable conditions for their spread emerge.

> Thankfully, unlike people and physical hardware, software and algorithms can lie in wait in perpetuity until they are resurrected when the political winds finally change course or favorable conditions for their spread emerge.

XHTML 2 is waiting on that one... Oh well...


> just like how Betamax was visually superior.

Only for a very brief period. (Beta 1)


Palantir is the poster child for a global panopticon

We have been told our whole lives that nuclear winter will be the end of humanity, but in reality, biomedical scientists will exterminate us.

Not only that, but just a month earlier, South Korean scientists published another Virology Journal paper revealing that they had engineered a chimeric H5N1 virus using hallmark gain-of-function (GOF) techniques, combining gene segments from three different influenza viruses to increase the virus's heat resistance, alter host targeting, and enhance human cell entry.

"Recombinant viruses were generated using a pHW2000 plasmid-based reverse genetics system."

"Combining the R90K and H110Y mutations (22W_KY) resulted in a synergistic increase in thermal stability and maintained HA activity without measurable reduction even after 4 h at 52 °C."

"22 W HA and 22 W NA genes, along with six internal genomic segments (PB2, PB1, PA, NP, M, NS) from PR8 and a PB2 gene from 01310 containing the I66M, I109V, and I133V (MVV) mutations"

The study also confirmed enhanced antigen uptake and intracellular penetration in human cells:

"The highest level of intracellular entry was observed for BEI_22W_KY, confirming its superior effectiveness in penetrating cells."

Ref: https://virologyj.biomedcentral.com/articles/10.1186/s12985-...



How does one verify all those domains are not essential to the OS?

Microsoft explains what all (most?) of the domains are for

https://learn.microsoft.com/en-us/windows/privacy/windows-11...


Thanks. I think some apps do need to connect to Microsoft to deliver services like updates and antivirus.

Some are used by the products themselves to maintain reliability, like Maps and the Store.

Others are diagnostics, which can still be manually turned off.

Not sure why one would maintain a dedicated hosts file to block endpoints when you can simply configure this at the app level itself.


That "small sample of telemetry and spying domains" also contains login pages and update downloads, among others. You're just saying everything Microsoft is telemetry and spying, here are all their domains.

Same with all those Apple domains. This is the world we live in now. Operating systems and most applications are now backed by cloud services.

I regret to inform you that the inmates run the asylum, so your proposal will go nowhere. Banning ads? Making Google confess 'We were paid to say this'? Forcing Burger King to admit they're tricking you into midnight cravings? Politicians funded by ads would outlaw that idea faster than you can say 'campaign donation'. Your felony-for-persuasion is bold. It's also delusional.

This is the distinction between permanent resident and temporary immigrant.

An immigrant is someone who moves to a country other than their own with the intention of residing there for a significant period (usually at least one year), regardless of their ultimate intention to stay permanently or leave later.


You can just look up the definition of the word immigrant.

bro created an account just to respond lmao

Pointless suffering. Report violations to the CSE, Médecin du Travail, and Inspection du Travail.

It was a choice, I loved my job there. I had more exciting projects than most of my friends in the private sector!

Excellent way to get blacklisted and never work for the State again if you're a contractor, or end up in a low impact, boring job if you're a career worker.

In the USA most software engineers are FLSA-exempt ("computer employee" exemption).

No overtime pay regardless of hours worked.

No legal maximum hours per day/week.

No mandatory rest periods/breaks (federally).

The US approach places the burden on the individual employee to negotiate protections or prove misclassification, while French law places the burden on the employer to comply with strict, state-enforced standards.

The French Labor Code (Code du travail) applies to virtually all employees in France, regardless of sector (private tech company, government agency, non-profit, etc.), unless explicitly exempted. Software engineering is not an exempted profession. Maximum hour limits are absolute. The caps of 48 hours in any single week, 44 hours on average over 12 weeks, and 10/12 hours per day are legal maximums for almost all employees. Tech companies cannot simply ignore them. The requirements for employee consent, strict annual limits (usually max 220 hours/year), premium pay (+25%/+50%), and compensatory rest apply to software engineers just like any other employee.

"Cadre" Status is not an exemption. Many software engineers are classified as Cadres (managers/professionals) but this status does not automatically exempt them from working time rules.

Cadre au forfait jours (Days-Based Framework): This is common for senior engineers/managers. They are exempt from tracking daily/weekly hours but are still capped at a maximum of 218 work days per year (the rest of the year being weekends, holidays, and RTT days). Their annual workload must not endanger their health. 80-hour weeks would obliterate the mandatory rest this implies and pose severe health risks, making them illegal. Employers must monitor their workload and health.

Cadre au forfait heures (Hours-Based Framework) or Non-Cadre: These employees are fully subject to the standard daily/weekly/hourly limits and overtime rules. 80+ hours/week is blatantly illegal.

The tech industry, especially gaming/startups, sometimes tries to import unsustainable "crunch" cultures. This is illegal in France.

EDIT: Fixed work days


I think there is theory and there is real life. As a tech worker with a 20-year career in the private sector, I have always been on forfait jours, working more than 10h/day on average, for many years weekends included. I never got paid for extra hours. So I get what you say about the perception and the law. French law is protective (i.e., if I can prove it in court I'll get my extra hours paid for sure), but my career would end. Period.

> I'll get my extra hours paid for sure but my career would end.

Are you working in an area that is that specific? I'm French but I'm naive.


Some State services, such as the "Trésor", which oversees French economic policies, do not respect this at all, and require 12h work days most of the year. The churn is enormous, workers staying there less than a year on average.

> 218 rest days per year (including weekends, holidays, and RTT days)

Wouldn’t that be nice, 218 rest days? It’s 218 working days.

