Hacker News | Borg3's comments

This, and great backward compatibility. I can still build an app targeting Win2000 and it will run on Win2000 onwards (Win10 and Win11 included). Unfortunately, it's starting to fall apart...

I guess they finally think they have captured enough "value" with Windows, so there is no need to keep every subsystem maintained. It must be very expensive to keep a 20+ year veteran developer sitting in a basement room writing code for some feature that doesn't generate much revenue. Sad truth. TBH I'd love to learn those subsystems and do it for free.

Exactly.. I avoid Visual Studio.. I try to build everything using MinGW..

Clang is the better alternative to MinGW because it can use the standard Windows libraries and avoids the need for an additional runtime.

Can you actually do cross-compilation (on a Linux host to a win64 binary) with Clang the same way MinGW does out of the box, though?

No. You cannot even do direct compilation with the same host and target using Clang alone.

LLVM doesn't come with the C library headers (VCRuntime) or the executable runtime startup code (VCStartup), both of which are under proprietary Visual Studio licenses. So to use Clang on Windows without MinGW, you need Visual Studio.


I use MinGW without any extra libs (no MSYS); it just uses the ancient msvcrt.dll that is present in all Windows versions, so my programs work even on Windows 2000.

Additionally the cross-compiler on Linux also produces binaries with no extra runtime requirements.


You can use Mingw-w64 UCRT or CLANG environments that come with MSYS2.

Compared to the older MINGW64 environments, those link against the latest UCRT, so you get almost the same style of executable as Visual Studio produces.

The only difference for C is that it uses MinGW exception handling and global initialization code, and it uses the Itanium ABI for C++.


But that's the point: I don't want the same style of executable as Visual Studio. Having to distribute a bunch of DLLs and having worse compatibility is pretty bad.

A major part of the incompatibility with older versions of Windows is just that newer VS runtimes cut off support artificially. That's it. Many programs would otherwise work as-is, or with just a little help.


Yeah, you can get away with this nowadays because Git itself installs two-thirds of the things you need anyway. You just need to finish the job by getting the package and putting the binaries in your Git folder. Bam! mingw64, clang, whatever cc you need. It all links to standard Windows stuff because you have to tell the linker where your win32.lib is. But this is true no matter the compiler; it's just that Visual Studio supplies this in some god-awful Program Files path.

Just msys2 it all

MSYS2 is horrible. It brings a massive runtime environment and is a bad idea to foist on users.

Aren’t you thinking of Cygwin, or the MSYS2 shell (dev tooling)?

The Windows-native software you build with MSYS2 can be shipped to and run by users that don’t have anything of MSYS2 installed.


He must be thinking of Cygwin as half of this is installed when you install git ;) Git Bash, etc…

MSYS2 is repacked Cygwin, though. It is literally the same codebase compiled with slightly different flags. You need a full Unix environment for Bash to run, not just the MinGW toolchain. The difference is that Cygwin aims to create a full Unix system, while MSYS2 provides just enough of a development environment to run bash, make, etc. to build native Windows programs with MinGW.

Git installs its own MinGW and MSYS2 pieces, but mostly compiled for a MinGW environment, so they consume Windows paths natively instead of using MSYS2/Cygwin path conversion. That's why all hell breaks loose with Git when you have a mixed PATH variable.


I think you’re underestimating or discounting the work the MSYS2 team put into their layered environments mechanism:

https://www.msys2.org/docs/environments/


I do appreciate it as a daily user of MSYS2. However, the bigger thing that enabled them is indeed the Cygwin project, since it unlocked the path to building things that rely on strict Unix tooling. autoconf, Make, bash, etc. cannot run under a pure Windows environment; they are too dependent on the underlying system being a Unix. To use pacman, bash, and make in MSYS2, you need msys-2.0.dll. They even cite Cygwin's URL in the runtime package [1], and they basically patch Cygwin to enable this [2].

Without Cygwin clearing the path, it wouldn't have been possible to build GCC for Windows without completely changing its build system. It would have been a DOA fork, whereas MinGW and PE32+ support are part of GCC nowadays.

The nice, genius part of MSYS2 is that it primarily encourages you to develop native Windows software with better cross-platform behavior, rather than Cygwin alone. If Microsoft had made a better, free-of-charge, more standards-adherent C compiler in the early 2000s, we probably wouldn't need MinGW to build cross-platform apps. Even now, MSVC is free of charge only for open source and individuals.

[1] "Cygwin POSIX emulation engine", https://packages.msys2.org/base/msys2-runtime

[2] https://github.com/msys2/MSYS2-packages/tree/master/msys2-ru...


Thank you for this insightful comment.

> MSYS2 just enough development environment

Doesn't it come with `pacman` too?


You need to update stuff somehow. `pacman` is a beautifully simple package manager. It covers just enough of the complexity without going overboard.

Yes, it’s wonderful.

A repacked Cygwin is one environment, but the default uses the UCRT from Microsoft.

They all have to use MSVCRT or UCRT to stay compatible with other Windows programs and APIs like COM. And AFAIK nobody has developed a C library that depends purely on Win32 system APIs (it is possible, just really hard). The difference is that Cygwin tries to create a semi-isolated Unix environment in which programs think they are running under a complete Unix system, much like Wine does.

MSYS2 is there just to provide the basics so you can develop programs that are Windows-native but use some of the tools with really strong Unix dependencies, like shells or Make. They depend on the existence of syscalls like `fork`, or on the forward slash being the directory separator.


It was not clear what the parent commenter was addressing; I was under the impression they meant 'compile against the MSYS2 environment', which is broadly Cygwin, yes, and which should not be forced onto a user.

Okay, but that just seems to be perpetuating the misunderstanding of what MSYS2 is intended for.

It gives you a *nix-like shell/dev environment and tools, but you build native software that runs on Windows systems that don’t have or need to have all/parts of MSYS2/Cygwin installed.


Example:

I built a network daemon using the MSYS2 CLANG64 environment and llvm toolchain on Windows 10.

Windows 7 x64 users could download the compiled single-file executable and run it just fine, so long as they installed Microsoft’s Universal C Runtime, which is a free download from Microsoft’s website.


> MSYS2 CLANG64

I get your point, although my point is that there is actually zero need for MSYS at all for this, even as a developer, and especially not the 'CLANG64' environment; those binaries are themselves built to run in the MSYS2 environment. This is how I cross-compile from Windows... to Windows with LLVM-MinGW[1]:

  > (gci Env:PATH).Value.Split(';') | sort
  > clang-21.exe --version
  clang version 21.1.2 (https://github.com/llvm/llvm-project.git b708aea0bc7127adf4ec643660699c8bcdde1273)
  Target: x86_64-w64-windows-gnu
  Thread model: posix
  InstalledDir: C:/Users/dpdx/AppData/Local/Microsoft/WinGet/Packages/MartinStorsjo.LLVM-MinGW.UCRT_Microsoft.Winget.Source_8wekyb3d8bbwe/llvm-mingw-20250924-ucrt-x86_64/bin
  Configuration file: C:/Users/dpdx/AppData/Local/Microsoft/WinGet/Packages/MartinStorsjo.LLVM-MinGW.UCRT_Microsoft.Winget.Source_8wekyb3d8bbwe/llvm-mingw-20250924-ucrt-x86_64/bin/x86_64-w64-windows-gnu.cfg
[1]: https://github.com/mstorsjo/llvm-mingw

I think you have it backwards, but I may misunderstand what you're saying.

I'm certain I haven't misunderstood the point of MSYS2's CLANG64 and other environments.

> These binaries themselves are built to run in the MSYS2 environment

I'm not sure if you're referring to the toolchain binaries or the binaries one produces with them.

The CLANG64, etc. environments are 100% absolutely for certain for building software that can run outside of any MSYS2 environment!

You can, of course, build executables specifically intended to run inside those environments, but that’s not the primary use case.

> (gci Env:PATH).Value.Split(';') | sort

I don't want to use PowerShell or Cmd.exe when doing dev stuff on Windows. I want to do CLI work and author scripts in and for modern Bash, just like I would for Linux and macOS. I want to write Makefiles for GNU make, just like...

Now, sometimes there are bumps and sharp edges you have to deal with via `if [[ -v MSYSTEM ]]; then`, similar guards in Makefiles, cygpath conversion, template/conditional code in sources, and so on. But that's a small price to pay, from my perspective, for staying in the same mental model for how to build software.


All msys2 does is give you a unified BSD experience and toolchain for compiling applications for any architecture and platform. Windows included.

There. I think that sums it up.


MSYS2 UCRT also uses the native Windows libraries (aka. UCRT).

Exactly. Git is supposed to be a DVCS, not a generic DVFS. Choose the right tool for the right task. I needed a generic DVFS to store my docs, so I wrote one. It's easy and quick and does its job :)

As explained, the storage backend in git is pluggable but still not flexible enough.

There have been efforts to store git repos in torrents, and to store git repos on crypto blockchains, but all face big architectural challenges: for starters, people want everything to be backwards compatible, and some want to be able to store some content elsewhere, while still keeping all existing use cases efficient.


Ugh.. sorry to hear that :( I am unemployed myself right now. It's really hard to land a job in tech.. Luckily, I don't need to flip burgers for now...

Who's gonna pay you to flip burgers with no experience doing it, and with everyone else needing a job as well?

Who’s buying $6.00 burgers when the old customers have been replaced by AI?

There is huge demand for low-skill labor in other industries: stuff like plumbing, HVAC, and a ton of other traditionally unsexy jobs that can barely keep enough people in a town to perform them, even at higher-than-normal cost.

I wouldn’t call plumbing and other trades low skill.

I agree. I didn't mean to disparage anyone. I have a massive appreciation (and some involvement!) in these trades. The amount of knowledge these guys have about their trade is impressive.

Those jobs don't often pay well until you graduate out of journeyman / apprentice, or are a business owner. They usually require some training and testing ahead of time. They also carry a higher risk of serious injury or death.

The average salary for a software developer in Montana is $88k/yr. The average salary for an HVAC technician in Montana is $58k/yr.

The average salary for a software developer in Oregon is $118k/yr. The average salary for an HVAC technician in Oregon is $74k/yr.

It's for sure less, but the gap is smaller than some might think. I think some markets (SF) distort the cost a bit.


You will not land a job as a plumber or in HVAC without prior or similar experience. People shun software developers at once, because THEY KNOW that you simply don't fit the typical candidate profile.

Also: people don't want to hire someone who has a 'better' education than them... :/


Oh, poor soul :) I had the same problem, and I solved it easily. I pulled my stuff off the Internet, keeping only a VPN overlay network..

The future is dark, I mean.. darknets.. For the people, by the people, where you can deal with bad actors.. Wake up and start networking! :)


Haha, yeah.. I'm using Notepad2 actually, because for a LOOONG time notepad.exe could not display LF-only files correctly... and Notepad2 has a few more features, but is still clean and lean.


Because ML and LLMs (not the fucking "AI", it doesn't exist yet) should be used to deal with the bulk boring stuff: filtering millions of images down to the interesting ones, transforming some data (though I'm not sure why an LLM is interesting here; I can write a script to do that), and so on... Not the creative stuff. That should be left for humans, because if we take it away, what's left? Low-skilled labour at best...


What's the problem? 1997? They were probably using a 10BaseT network; that's 10 Mbit... Using Novell NetWare would let you transfer data at ~1 MB/s.. quake.exe is < 0.5 MB, so the transfer would take around 1 sec..


Not sure what you mean by "problem". I said miniscule cancels out miniscule.


Networking in that era was not a problem. I also don’t know why you’re so steadfast in claiming that builds on local PCs were anything but painfully slow.

It’s also not just a question of local builds for development — people wanted centralized build servers to produce canonical regular builds. Given the choice between a PC and large Sun, DEC, or SGI hardware, the only rational choice was the big iron.

To think that local builds were fast, and that networking was a problem, leads me to question either your memory, whether you were there, or if you simply had an extremely non-representative developer experience in the 90s.


Again, I have no idea what you mean by networking being a "problem".


You keep claiming it somehow incurred substantial overhead relative to the potential gains from building on a large server.

Networking was a solved problem by the mid 90s, and moving the game executable and assets across the wire would have taken ~45 seconds on 10BaseT, and ~4 seconds on 100BaseT. Between Samba, NFS, and Netware, supporting DOS clients was trivial.

Large, multi-CPU systems — with PCI, gigabytes of RAM, and fast SCSI disks (often in striped RAID-0 configurations) — were not marginally faster than a desktop PC. The difference was night and day.

Did you actively work with big iron servers and ethernet deployments in the 90s? I ask because your recollection just does not remotely match my experience of that decade. My first job was deploying a campus-wide 10Base-T network and dual ISDN uplink in ~1993; by 1995 I was working as a software engineer at companies shipping for Solaris/IRIX/HP-UX/OpenServer/UnixWare/Digital UNIX/Windows NT/et al (and by the late 90s, Linux and FreeBSD).


Ok that's not what I said. So we'll just leave it there.


That's exactly what you said, and it was incorrect:

> This is exactly the shipping I'm talking about. The gains would be so miniscule (because, again, and incremental compile was never actually slow even on the PC) and the network overhead adds up. Especially back then.

The network overhead was negligible. The gains were enormous.


>> I said miniscule cancels out miniscule.

> You keep claiming it somehow incurred substantial overhead

This is going nowhere. You keep putting words in my mouth. Final message.


Jesus Christ. Networking was cheap. Local builds on a PC were expensive. You are pedantic, foolish, and wrong.

Were you even a developer in the 90s? Are you trying to annoy people?


I think you were using some third-party software for autocompletion. There was a project called Visual Assist, which was a pretty popular and powerful tool.


Visual Assist provided much better completion, especially for (then brand-new) C++98 if you used templates, the STL, etc. heavily, but VC had its own completion out of the box that was actually pretty good for plain C code like Quake. AFAIR the "IntelliSense" branding was also in place already.


Nope!


Something like this? http://borg.uu3.net/~borg/?gperf

This stuff is written in Ruby :)


Not exactly.

That link looks like a Ruby-level drawing demo (and I don’t see a public repo or docs behind it).

ruby-libgd is a native, in-process binding to libgd, focused on deterministic rendering and explicit pixel-level primitives, released under the MIT license.

On top of that, I built libgd-gis (also MIT) as a higher-level standard: map rendering, GeoJSON ingestion, layers, tiles, and reproducible output for GIS-style workloads.

ruby-libgd is the low-level graphics engine; libgd-gis is the map renderer built on it.


Yeah, it's private stuff.. I added GRX library bindings to Ruby so I can use Ruby to do 2D graphics.. I've been looking for something like this but couldn't find anything useful for me, so I made one..

Here is example/demo: http://borg.uu3.net/~borg/?grx/test

gperf is not a demo; it's my simple performance-counter drawing app.


Thanks for the clarification — that makes sense. My goal with ruby-libgd / libgd-gis is to provide a public, documented, reusable graphics and map-rendering stack for Ruby.

