
Because you can only browse pages from 2005 with KHTML (standards evolve and KHTML hasn't been kept up to date in that respect)

You don't need to save.

And it has nice features like highlighting the selection


You know that every respectable editor nowadays has an auto-save feature on focus loss? So just alt+tab from your editor to your browser and refresh.


> there are many cheaper options to get rid of an enemy, but those have been outlawed

I'm curious. What are these options?


Nukes, I think they mean nukes. Though "cheaper" is kind of debatable there; a fleet of SSBNs isn't cheap by any measure.


No.

(There might be side projects like the Squish test framework or the MCU port that are proprietary, but Qt itself is fully free software)


I guess you were not serious about the climate, as this will have no measurable effect on it, and with things like the Jevons paradox [https://en.wikipedia.org/wiki/Jevons_paradox] it could even have the opposite effect (more people using the app more because it is more usable)

But yes, for the love of your users, don't make an app using Electron :-).


Vacuous comment. What action an average person can take will have a "measurable" effect on the climate (excepting eco-terrorism)? In aggregate this kind of stuff is still meaningful.


> In aggregate this kind of stuff is still meaningful.

No, I don't think it is. The scale is so small that even if every Electron application was suddenly replaced by an equivalent super-lightweight one, I claim it would make no difference for the climate.

Other things can make a difference in aggregate, like eating less meat, using better modes of transportation, ... But I don't think the CPU cycles wasted by Electron make any difference, even in aggregate.


Electron apps are probably responsible for multiple gigawatt-hours of wasted energy annually.
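
A rough back-of-the-envelope sketch of that order of magnitude (every number below is an assumption picked purely for illustration, not a measurement):

    // Hypothetical estimate of energy wasted by Electron overhead.
    // All inputs are assumptions chosen only to illustrate the scale.
    const machines = 50_000_000;  // machines running at least one Electron app
    const extraWatts = 2;         // assumed extra draw vs. a lighter-weight equivalent
    const hoursPerDay = 4;        // assumed daily usage
    const daysPerYear = 365;

    const wastedWh = machines * extraWatts * hoursPerDay * daysPerYear;
    console.log(`${(wastedWh / 1e9).toFixed(0)} GWh/year`); // ~146 GWh/year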


This is not about "open source" vs "free software". This is about "permissive" vs "copyleft"


I'm probably oversimplifying things, but for me "permissive license" (MIT, BSD) is equivalent to the term "open source", and "copyleft license" ((L)GPL) <=> "free software". I know that the term "open source" also includes (L)GPL-licensed software, but the "free software" advocates resist being lumped together with software that they perceive to be less free, so they very much prefer "free software".


I don't know which advocates you have in mind, but the FSF does refer to MIT and BSD as Free Software licenses. For FSF, Copyleft is a desirable property (obviously), but not required by the four freedoms.


so less permissive = more free.

gives me something to think about.


All freedom is relative to the perspective you judge it from. "Free" in this case is measured from the perspective of the software (and more specifically the source code), not the developer. The basic philosophy is that the easier it is to lock up the source code and its development behind closed proprietary walls, the less free it is.

From the FSF perspective, your freedom to make the source code less free stands in direct opposition to the freedom of the source code.


The freedom prioritized is that of the user, not the developer working at a for-profit company. Copyleft software prioritizes users' freedom by attempting to prevent companies/developers from taking away their rights. (See also GPLv3's anti-Tivoization clauses: https://en.wikipedia.org/wiki/Tivoization which restrict companies in order to empower users)


Why not reuse the whole frontend then?


It's their itch to scratch, and it's good to have an independent implementation anyway


Because we want to plug into GCC, so that we can use the rest of the compiler chain to output to all of the architectures that GCC supports.


You still get that by using the same Rust frontend and using GCC instead of LLVM as the backend. A parallel, reimplemented-from-scratch frontend at best lets you find under-specified aspects of the current implementation. Not sure it actually helps the ecosystem though.


You're describing rustc_codegen_gcc


Correct. I’m saying that the justification provided:

> Because we want to plug into GCC, so that we can use the rest of the compiler chain to output to all of the architectures that GCC supports.

is invalidated by rustc_codegen_gcc and thus is not a good reason for a new frontend.


Discovery and improvement of "under-specified aspects of the current implementation" can have a large practical impact, and implementing a standard (the Rust language) more than once so that it ceases to have a dominant "current implementation" (rustc) is even more important.


That’s me being very, very generous and just restating the supposed goal of this effort, since rustc_codegen_gcc provides a much better solution to the “I want to be able to hit all the GCC targets” problem.

I’m not actually convinced that a new frontend helps as much for that effort, especially when weighed against the very real downsides we see in C and C++. I have a strong suspicion that addressing under-specified parts of the current implementation can be done more cheaply and effectively using other mechanisms.

Indeed, I don’t see the TypeScript community really complaining that there isn’t another implementation of TypeScript. And CPython remains the de facto Python implementation, with other distributions seeing less adoption due to network effects, and I don’t really see CPython benefiting from the existence of the other implementations.


> Indeed, I don’t see the TypeScript community really complaining that there isn’t another implementation of TypeScript.

There are actually numerous third-party implementations that can take TypeScript code and compile it to JavaScript. Babel can do it in pure JS/TS, SWC can do it in Rust, and ESBuild can do it in Golang.

The catch is that AFAIK none of these implementations enforce the actual type checks. This is usually called "type stripping" within the JavaScript community, but it's basically equivalent to the "Rust without the borrow checker" implementation that gccrs is working on.
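
As a concrete (made-up) illustration of what "type stripping" means, assuming a Babel/SWC/esbuild-style pipeline: the annotations are simply erased, and nothing ever checks them:

    // Made-up example: `tsc` rejects this, but a stripping-only transpiler
    // (Babel, SWC, esbuild) just erases the annotations and emits the JS anyway.
    function area(radius: number): number {
      return Math.PI * radius * radius;
    }
    const label: number = area("circle"); // type error: string passed where a number is expected

    // Stripped output, emitted without any checking:
    //   function area(radius) { return Math.PI * radius * radius; }
    //   const label = area("circle");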

Well, except that the Rust community is extremely ideologically opposed to "Rust compiler without the borrow checker" existing as a viable product, so you end up with the situation described in the article where that implementation is just used to compile a hybrid compiler with all the hard-to-implement type-checker stuff copied verbatim from rustc. That erases most of the spec-robustness benefits of doing a full third-party implementation, but there are still some practical benefits, like being able to bootstrap the compiler more easily.

And who knows -- if you hack the build to re-disable the borrow checker, the resulting compiler might run faster. (That's the main benefit of the third-party TypeScript implementations in practice.)


TypeScript was written very intentionally to be really easy to transpile to JS, which is why you see so many solutions. The type-checking piece remains not reimplemented despite high-profile third-party attempts to rewrite it in a faster language. The type-checking piece is exactly the thing that makes TypeScript TypeScript. And the reason the external frontend attempt failed is that it can’t keep up with mainline development, which is also true for CPython and its offshoots. In fact, TS is so easy to strip that the syntax is being standardized within official ECMAScript, so that type-checking annotations could be layered into JS code and engines would natively strip them without needing transpilation in the first place.

As for “Rust without the borrow checker”, it’s very much not the same thing as TypeScript transpilation for many reasons, among which is that there are many other language rules that would still have to be enforced. It’s akin to making a TS type checker that didn’t validate one of the language rules but enforced the others.

I’m not opposed to a compilation mode that didn’t enforce the borrow checker, but repeated tests of compiling various software packages reveal that the borrow checker is within the noise performance wise. So you’d need a more compelling reason than “maybe it runs faster”.


> repeated tests of compiling various software packages reveal that the borrow checker is within the noise performance wise.

Out of curiosity, do you have a source for this? The stuff I remember reading on the topic is from back in 2018 where they managed to get the NLL borrow checker to be not much slower than the old borrow checker (https://blog.mozilla.org/nnethercote/2018/11/06/how-to-speed...) -- but that took a concerted effort, and the borrow checker was overall a large enough contributor to compilation time that it was worth spending a bunch of effort optimizing it.


rustc_codegen_gcc is also a thing.


I guess this doesn't solve the bootstrapping issue.


Neither does this project if it contains parts that need a Rust compiler to be bootstrapped


It is the Rust compiler: the article describes how the (Rust) compiler is first compiled without the Rust code in it and then used to bootstrap the final version, which does have the Rust code in it


Incorrect, this is a reimplementation of Rust designed to be included in GCC, not the original Rust compiler.


You didn't read the part about compiling with the borrow checker turned off?


Nuclear material is put in glass which can last many thousands of years.

Geological studies of the locations where the nuclear waste is buried show that it would take millions of years for the isotopes to escape.

Meanwhile, the effects of low-dose radiation on health are vastly exaggerated. Every industry releases pollutants into the ecosystem that are far more toxic and dangerous, and those get barely any attention for some reason.


> Pluto - Pu. the closest available abbreviation

Did he not get the memo about Pluto not being a planet anymore?


Pluto is a small planet.

Actually, it is useful to distinguish between big planets, medium planets and small planets.

Big planets and medium planets differ in chemical composition, caused by the intensity of their gravity. Only the big planets can retain large quantities of the very abundant chemical elements that form mostly volatile compounds (H, He, C, N, O, Ne and S). The medium planets are strongly depleted in these chemical elements, in comparison with the average composition of their stellar system.

The medium planets and the small planets differ in their capacity of clearing their orbits of any other big bodies.

The celestial bodies which orbit the star, but which are not big enough to become quasi-spherical, are not considered planets.

By these definitions, the Solar system has 4 big planets, 4 medium planets and a large number of small planets, including Pluto and Ceres.

It is likely that the configuration of the Solar System, with some medium planets close to the star and some big planets far from the star, is typical for most star systems. Closer to the star, higher temperatures prevent the condensation of the volatile elements that contain most of the mass available for forming a planet, and fewer bodies condense around the smaller orbit to be accreted into the final planet.


Pluto is a planet, the same way Ceres and Makemake are planets.


And dozens more. It's just not a remarkable major planet.


It's not a major planet at all. It is a fairly remarkable minor planet, however: it's just close enough to see with optical telescopes, it's large enough to have hydrostatic equilibrium (it's round), and it's actually a binary system with the barycenter between Pluto and Charon, since Charon is also quite large.
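
A quick sanity check on that barycenter claim, using rounded published figures (treat the exact numbers as approximations):

    // Rough check that the Pluto-Charon barycenter lies outside Pluto itself.
    const plutoMassKg = 1.30e22;
    const charonMassKg = 1.59e21;
    const separationKm = 19_600;   // approximate center-to-center distance
    const plutoRadiusKm = 1_188;

    // Distance from Pluto's center to the system barycenter:
    const barycenterKm = separationKm * charonMassKg / (plutoMassKg + charonMassKg);
    console.log(Math.round(barycenterKm));       // ~2100 km
    console.log(barycenterKm > plutoRadiusKm);   // true: the barycenter sits above Pluto's surface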


"Not a remarkable major planet" does not imply "major planet"


If only there was an existing GUI library as mature as Qt4 written in these languages, so they could fork it.


There are some good Go and Rust UI frameworks they could contribute to, but that would mean contributing to a common goal rather than forking endlessly.

