That time never existed; there were plenty of subgroups inside Microsoft that wanted to make useful software.
Plenty of people still in there do.
But it has never been a general policy of Microsoft that turned into actual culture and actions.
Because that would require a general vision in that direction, which would imply we wouldn't have every other version of Windows (Me, Vista, 8, 11...) looking like a prototype.
We would not have the UI mess with thousands of toolkits, the infamous env var window staying unusable for decades, or the W11 right-click "Show more options" menu that changes theme mid-flight.
We wouldn't have had the awful Windows Media Player of the '90s that couldn't play anything without a pack of spyware installed. Or IE6 being frozen into obsolescence. Or ads in the Start menu.
Teams file sharing and chat would not suck. Word would not destroy your layout because you moved an image one pixel to the left. Access wouldn't produce the most corruptible DB ever. File Explorer would not randomly take 7 seconds to start on some machines (see the recent Twitter trend), and its search would actually be useful.
The wizard to fix your internet issues would have solved a problem at least once. MS would not allow tons of crapware to be installed by 3rd parties. Python in the Windows Store would not have been made non-standard.
We wouldn't have had to wait until 2020 for an upgrade to the terminal. The registry would be self-documenting. Skype wouldn't have been destroyed after being bought. You would be able to get to your user directory easily out of the box. Cortana would be useful. Surfaces wouldn't slow to a crawl, because the software would be optimized for their hardware.
I can continue like that for hours.
Because those are not subtle issues. There are other teams at Microsoft that would never let that happen.
The priority of MS is to conquer the market. At some points that meant being a bully. Now it means pretending to be FOSS's BFF. It's always been about getting devs on the platform, and businesses on the hook.
But at no point in MS history has it ever been, when you look at the actual results, about making good software.
Good software like .NET, AoE 2 or Excel was made by a few of their accidentally amazing teams.
>Word would not destroy your layout because you moved an image one pixel to the left
I swear every time someone tells me that LibreOffice is simply not up to standard, I think of how much more sane its handling of floating tables and images is. Also,
>AoE 2
A company they killed to make Xbox's next monthly report look nicer.
There are currently 570K+ projects on PyPI, and 60K+ in the Debian repos.
It can take several months of work to get a single package approved into the official repos of a single distribution. And each distro has different rules and setups.
Now explain to me how you think this is going to work.
Also, do you plan to force everyone to use chroot or containers to replace their virtualenv systems to have variations on deps? Or maybe everybody should use Nix?
You only need the system package manager to provide the non-Python dependencies of Python packages. I'd expect that those are by and large already packaged in Debian and elsewhere, perhaps with some exceptions among small-time C (or Rust) projects that only exist to accelerate Python packages.
> Also, do you plan to force everyone to use chroot or containers to replace their virtualenv systems to have variations on deps? Or maybe everybody should use Nix?
Nix is a good fit for this, but so are Guix and probably Spack. With a sane implementation of PEP 725 (which I mention and describe in another comment on this post), users could freely choose whatever package manager they like to supply non-Python deps.
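To make that concrete, here's a rough sketch (not a real tool; the mapping table is invented for illustration) of what a client could do with a PEP 725-style `[external]` table: read the PURL-ish identifiers out of pyproject.toml and translate them for whatever package manager the user happens to prefer.

```python
# Rough sketch: map PEP 725-style external deps to names for a chosen
# system package manager. The PURL -> package-name table below is made up
# for illustration; real mappings would live in per-distro/per-manager data.
import tomllib  # stdlib in Python 3.11+

PURL_TO_MANAGER = {
    "pkg:generic/openssl": {"apt": "libssl-dev", "dnf": "openssl-devel", "nix": "openssl"},
    "pkg:generic/libffi":  {"apt": "libffi-dev", "dnf": "libffi-devel",  "nix": "libffi"},
    "virtual:compiler/c":  {"apt": "gcc",        "dnf": "gcc",           "nix": "gcc"},
}

def external_deps(pyproject_path: str, manager: str) -> list[str]:
    """Return the non-Python deps of a project, named for the given manager."""
    with open(pyproject_path, "rb") as f:
        meta = tomllib.load(f)
    external = meta.get("external", {})
    purls = external.get("build-requires", []) + external.get("host-requires", [])
    return [PURL_TO_MANAGER[p][manager] for p in purls if p in PURL_TO_MANAGER]

# e.g. external_deps("pyproject.toml", "apt") -> ["libssl-dev", "gcc", ...]
```

The point being: the metadata stays manager-agnostic, and the apt/dnf/nix mapping lives outside the package itself.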
> There are currently 570K+ projects on PyPI, and 60K+ in the Debian repos.
Not all PyPI projects require C code.
> It can take several months of work to get a single package approved into the official repos
This is a massive exaggeration. For Fedora it takes a couple of days, all of it being necessary review of the code and licenses. And yes, you do have to do that work; it's done by the distros themselves too.
One of the reasons Python is so popular as a scripting language in science and ML is that it has a very good story for installing Frankenstein code bases made of assembly, C and Pascal sprinkled with SIMD.
I was here before Anaconda popularized the idea of binary packages for Python and inspired wheels to replace eggs, and I don't want to go back to having to compile that nightmare on my machine.
People who have that kind of idea are likely capable of running k8s containers, understand vectorization, and can code a monad off the top of their head.
Half of Python coders struggle to use their terminal. You have Windows devs that live in Visual Studio, high school teachers that barely show a few functions, mathematicians replacing R/Matlab, biologists forced to script something to write a paper, frontend devs that just learned JS is not the only language, geographers begging their GIS system to do something it's not made for, kids messing with their dad's laptop, and probably a dog somewhere.
Compiling Python extensions is a nightmare because we allow Autoconf, CMake, Visual Studio, Bazel etc. to make it complicated and nonportable; when someone sets out to wrap some library for Python the quality of the result is limited by the quality of the tools and by low expectations.
A serious engineering effort along the lines of the Zig compiler would allow Python to build almost everything from source out of the box; exotic compilers and binary dependencies, not "Frankenstein code bases" per se, are the actual obstacles.
What you’re proposing here is essentially “if we fix the C/C++ build systems environment, this would be easy!”. You’re absolutely right, but fixing that mess has been a multi-decade goal that’s gone nowhere.
One of the great victories of new systems languages like Rust and Zig is that they standardized build systems. But untangling each individual dependency’s pile of terrible CMake (or autoconf, or vcxproj) hacks is a project in itself, and it’s often a deeply political one tied up with the unique history of each project.
> What you’re proposing here is essentially “if we fix the C/C++ build systems environment, this would be easy!”. You’re absolutely right, but fixing that mess has been a multi-decade goal that’s gone nowhere.
Not sure I would call it easy, as it would still take a lot of effort to update how PyPI works to account for these new capabilities, but that's exactly what the Zig compiler & build system solved.
Rust is completely hands-off when it comes to C/C++ dependencies, while Zig can package and build them. That's why I created https://github.com/allyourcodebase/.
For example, the allyourcodebase srt package pulls in:
- Haivision/srt, the upstream C++ project
- mbedtls
- googletest
When you run `zig build`, all three of these dependencies are downloaded and their build.zig is run if present (the first one doesn't have a build.zig, since it's just the vanilla C/C++ upstream project that we are providing a build script for).
The work to package everything must still happen, but once it’s done correctly you get the ability to build from any host for any target, which you can literally see happening in the CI runs of that repo: https://github.com/allyourcodebase/srt/actions/runs/10982569...
This kind of skepticism is exactly why I wrote this post. The details of actually coming up with a realistic upgrade path for PyPI are certainly much more nuanced than what I wrote in the post, but the core insight is that the C/C++ build intractability problem has been solved... and that you shouldn't depend on free big tech money if you can avoid it.
In my experience, `zig cc` is great at cross-compiling (which is a hard problem that it solves!) but doesn't actually help with the rest of the problem.
You still have to run cmake/autoconf/meson/whatever else, which is the part that's project-specific and often quite fiddly.
This has been my experience as well. I previously used cargo-zigbuild for packaging at work (and contributed to both cargo-zigbuild and Zig as a byproduct), and I still ran into several issues that I had to analyze and work around despite those tools.
Well if you're limiting yourself to zig cc, then all you get is a C compiler.
If you take the time to kick that soup of build systems out of your project and replace it with a build.zig, then you've got yourself a complete solution. It takes some effort, but it's perfectly doable; see https://github.com/allyourcodebase
The "upgrade path" is for individual packages (providing portable build scripts) and for client-side Python package management (actually providing and running next-generation tools), not for PyPI which already supports package metadata stating what platforms a package is compatible for.
You're the second person to post a link to a comment that's prominently highlighted on the issue I already linked. Yes, I've read that; nowhere do I see a statement like "Zig will always ship Clang with it" and instead I see a number of statements that imply it won't. I'm not even saying that getting Clang out of Zig is a bad thing. It's not like CMake or Rust come bundled with a C compiler.
I suggest you read it again then, because it heavily implies that.
> These use cases can still be satisfied by, again, an independently maintained project that combines Clang main() and Zig main() together. For users of these CLI tools, I don't expect there to be any difference in user experience.
That means when someone installs ziglang from their package manager, it will be able to build C.
Yeah, I've read it multiple times. Every time, it says to me "somebody somewhere else can package these things together, it just won't be Zig". There must be something about Andrew Kelley's communication style that clicks with other people but not me. But I just can't read into it what you say is there. Since this all boils down to interpreting one somewhat arrogant man's words, it ceases to be a technical discussion and just becomes an argument in semantic parsing. I'm just not going to comment much on Zig anymore as clearly other people know what's going on and I keep getting it wrong.
Zig is and will be shipping clang for a good long while. Not only will it keep shipping clang so you can build other code with it; they aren't removing LLVM support either.
It is not entirely clear what any of this means for the future, and in any case, it keeps changing all the time.
The promise to maintain LLVM support just applies to Zig code. That doesn't require Clang. LLVM alone is not a C compiler, and neither is Zig without Clang.
What very well may happen is that the "standard" distribution of Zig will continue to bundle Clang, while a "minimal" distribution also gets created which omits Clang, and thus cannot compile C code. However, none of this has been spelled out clearly yet, and so I think it's reasonable to say don't depend on Zig to give you Clang.
Zig is migrating away from LLVM as the mandatory backend, but it will still ship clang and LLVM so that it can build everything it needs, which is critical for its mission of building all your (code) base.
The typical build scripts, tools and terrible hacks that are adequate for a standalone C/C++/Fortran project fall short when building the same project as a Python extension, which requires portability and supported, not-too-custom build steps.
The ambition and usefulness of aiming for that higher standard are quite new: it has gone (relatively) nowhere because it has been a multi-decade non-goal that only a small minority of users cares about.
Hard-to-build Python extensions are basically .so files with some special symbols exposed.
As others said, the general task of "build libfoo.so" is very complex and currently requires a wide variety of build systems. I don't see why it will get any easier if we require this .so file to export some Python-specific symbols.
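For what it's worth, the "special symbols" part really is that thin. A rough sketch, assuming a CPython build where `_ssl` ships as a shared object (true on most Linux/macOS installs, not on fully static builds):

```python
# A CPython extension module is just a shared library exporting one
# well-known symbol, PyInit_<modname>, that the interpreter loads and calls.
import ctypes
import importlib.util

spec = importlib.util.find_spec("_ssl")
print("extension file:", spec.origin)   # e.g. .../lib-dynload/_ssl.cpython-312-....so

lib = ctypes.CDLL(spec.origin)
init = getattr(lib, "PyInit__ssl")      # the Python-specific entry point
print("init symbol found:", init)
```

Everything hard about producing that .so (compilers, headers, linked libraries) is exactly the same as for any other libfoo.so, which is the point being made above.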
It wasn't even intentional for Zig at the beginning, though. And any such solution should be usable across all platforms, while Nix on Windows is still not remotely usable (the latest attempt seems to be [1]).
Autoconf is easy; it's the highly bespoke build systems that someone thought would be a good idea, and that require the right phase of the moon, that are the challenge.
Wheels do predate conda (though manylinux did base itself on the experience of Anaconda's and Enthought's base set of libraries), and there were distributions like Enthought (or even more field-specific distributions like Ureka, and individuals like Christoph Gohlke) that provided binaries for the common packages.
What the conda ecosystem did was provide a not-horrible package manager that included the full stack (in a pinch, to help fix up a student's git repository on a locked-down Windows system, I used conda to get git; you can also get R and do R<->Python connections easily). And by providing a standard repository interface (as opposed to the locked-down and limited versions the other providers appeared to offer), conda pushed out anyone doing something bespoke and centralised efforts, so spins like conda-forge, bioconda and astroconda could focus on their niche and do it well.
Python shouldn't need a special lib for design by contract; `assert cond, "reason"` and `if __debug__` work very well for that.
Unfortunately, people don't know these are meant to be stripped in prod with `-O` (or `-OO`) so that the perf hit is zero. This means you can never be sure a 3rd-party lib you depend on isn't using an assert somewhere for a very important check you are about to disable.
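A tiny, made-up example of both idioms, and of exactly the foot-gun described above:

```python
# contract_demo.py -- design-by-contract with plain asserts and __debug__.
def withdraw(balance: float, amount: float) -> float:
    # Precondition: removed entirely when Python runs with -O / -OO.
    assert amount > 0, "amount must be positive"

    if __debug__:
        # Expensive sanity check, also compiled out under -O / -OO.
        print(f"checking invariant: balance={balance}, amount={amount}")
        assert balance >= amount, "insufficient funds"

    return balance - amount

print(withdraw(100, 150))
# $ python contract_demo.py      -> AssertionError: insufficient funds
# $ python -O contract_demo.py   -> prints -50, all checks silently skipped
```

Run it normally and the check fires; run it with `-O` and both checks vanish, which is great for perf and terrible if a library author used assert for real validation.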
Respectfully, I am not sure that I agree with that. I think if it were allowed to slip into, say, the public domain, then, yes, I'd agree.
HOWEVER, on a slight tangent from that point... if this company and its software have ever received any sort of taxpayer funds, then my opinion is that such software should have been open-sourced from the beginning. Publicly paid-for software should be publicly available. Of course, said business has every right to earn profits from providing services such as hosting said software for customers' convenience. But every citizen who paid taxes has a right to see (and access!) all the code... well, that's my belief anyway. ;-)
There should be a very high standard for "the government can force me to do [something]", and it shouldn't be thrown around as casually as "one of a thousand enterprise CRUD apps went away."