It has been a few years since I last used Tor, but even back then you could download at megabit speeds over the network. I would assume that the situation has improved since, so I'm surprised that you seem to report the opposite.
Somewhat. Stallman claims to have tried to make it modular,[0] but also that he wants to avoid "misuse of [the] front ends".[1]
The idea is that you should link the front and back ends, to prevent out-of-process GPL runarounds. But because of that, the mingling of the front and back ends ended up winning out over attempts to stay modular.
>> The idea is that you should link the front and back ends, to prevent out-of-process GPL runarounds.
Valid points, but this is also the reason people who wanted a more modular compiler created LLVM under a different license - the ultimate GPL runaround. OTOH now we have two big and useful compilers!
When gcc was built, most compilers were proprietary. Stallman wanted a free compiler and wanted to keep it free. The GPL license is more restrictive, but its philosophy is clear. At the end of the day the code's writer can choose if and how people are allowed to use it. You don't have to use it; you can use something else or build your own. And maybe, just maybe, Linux is thriving while Windows is dying because in the Linux ecosystem everybody works together and shares, while in Windows everybody chips in to pay for Satya Nadella's next yacht.
> At the end of the day the code's writer can choose if and how people are allowed to use it.
If it's free software then I can modify and use it as I please. What's limited is redistributing the modified code (and, for the Affero GPL, offering a service to users over a network).
Good lord, Stallman is such a zealot and hypocrite. It's not open vs. closed, it's mine vs. yours, and he's openly declaring that he's nerfing software in order to prevent people from using it in a way he doesn't like. And he refuses to talk about it in public because normal people hate that shit and keep "misunderstanding" him.
--- From the post:
I let this drop back in March -- please forgive me.
> Maybe that's the issue for GCC, but for Emacs the issue is to get detailed
> info out of GCC, which is a different problem. My understanding is that
> you're opposed to GCC providing this useful info because that info would
> need to be complete enough to be usable as input to a proprietary
> compiler backend.
My hope is that we can work out a kind of "detailed output" that is
enough for what Emacs wants, but not enough for misuse of GCC front ends.
I don't want to discuss the details on the list, because I think that
would mean 50 messages of misunderstanding and tangents for each
message that makes progress. Instead, is there anyone here who would
like to work on this in detail?
He should just re-license GCC to close whatever perceived loophole, instead of actively making GCC more difficult to work with (for everyone!). RMS has done so much good, but he's so far from an ideal figure.
Most contributors are required to assign copyright to the FSF, so it's not actually particularly open.
If the FSF is the sole copyright owner, they're free to relicense it however they please. If no one else has any controlling interest in the copyright, the GPL doesn't restrict you from relicensing something you're the sole owner of (and it's doubtful there's a legal mechanism to give away rights to something you continue to own).
Again, the FSF under Stallman isn't about freedom; it's about control.
That sounds like Stallman wants proprietary OSS ;)
If you're going to make it hard for anyone anywhere to integrate with your open source tooling, for fear of commercial projects abusing it and never sharing their changes, why even use the GPL license?
But all the other delis are doing it as well. So is your supermarket. So is your farmer (somehow they figured out how to add salt to the veggies they sell at the farmer's market!)... Whaddya gonna do? Grow your own? THE SEEDS SHALL HAVE SALT TOO, has been decreed...
Then your social media & newsfeeds are buzzing about salted coffee, and your work has mandated salt in the coffee, insisting that it increases productivity, and if you’re not partaking you might fail your next performance review.
.NET ResX localization generates a source file. So localized messages are just `Resources.TheKey` - a named property like anything else in the language. It also catches key renaming bugs because the code will fail to compile if you remove/rename a key without updating users.
I've only ever seen three reasons given for Midori shutting down:
1) they were hitting C# limitations (and started working on custom compilers etc) (and people involved in Midori say Rust has already shipped things they failed to do)
2) there was a bit too much academic overeagerness, e.g. software transactional memory will kill any project that attempts it
Midori is certainly an interesting project, but no; I meant the old "code access security" model that .NET Framework had.[0][1] Administrators (and other code) could restrict you from doing certain operations, and the runtime would enforce it. It was removed in .NET Core.[2]
Okay, that looks really funky. Like, libraries explicitly state what access they have ambient authority to use, and then callers can be constrained by an access control list, or something like that. Really weird design.
I'd love to see someone put genuine thought into what it would take to say that e.g. a Rust crate has no ambient authority. No unsafe, applied transitively. For example, no calling std::fs::File::open; you'd have to pass in a "filesystem abstraction" for that to work.
I think the end of that road could be a way to have libraries that can only affect the outside world by values you pass in (=capabilities), busy looping, deadlocking, or running out of memory (and no_std might be a mechanism to force explicit use of an allocator, too).
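To make that concrete, here is a minimal sketch of the "no ambient authority" idea in Rust: the library crate never touches std::fs itself and can only do what the capability the caller hands it allows. The `Fs` trait, `count_lines` function, and `RealFs` type are hypothetical names invented purely for illustration, not an existing API.

```rust
use std::io;

/// The capability: the only route this library has to the filesystem.
pub trait Fs {
    fn read_to_string(&self, path: &str) -> io::Result<String>;
}

/// Library code: it can only affect the outside world through the
/// capability value passed in (plus looping or running out of memory).
pub fn count_lines(fs: &dyn Fs, path: &str) -> io::Result<usize> {
    Ok(fs.read_to_string(path)?.lines().count())
}

/// One possible implementation, owned by the application, which is the
/// only crate that touches std::fs directly.
pub struct RealFs;

impl Fs for RealFs {
    fn read_to_string(&self, path: &str) -> io::Result<String> {
        std::fs::read_to_string(path)
    }
}

fn main() -> io::Result<()> {
    // The application decides how much authority the library receives;
    // a test could pass an in-memory Fs implementation instead.
    let n = count_lines(&RealFs, "Cargo.toml")?;
    println!("{n} lines");
    Ok(())
}
```

The hard part, as noted above, is enforcing this transitively: no `unsafe` and no direct `std::fs`/`std::net` calls anywhere in the dependency tree, which is something the language and Cargo don't check for you today.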
(Whether that work is worth doing in a world with WASM+WASI is a different question.)