I'm not a Windows programmer to any significant degree. But I did a tiny bit of digging and discovered that WinMain is just a function called by the CRT startup code in Win32 executables. The implicit startup code is pretty trivial and WinMain is largely unnecessary.
hInstance can be retrieved from GetModuleHandle(nullptr). hPrevInstance is always NULL. lpCmdLine can be retrieved from GetCommandLine(). nCmdShow can be retrieved from the wShowWindow field that GetStartupInfo() fills in.
and it works the same as the code in the article. Not a significant difference. But, I like peeling back magic code to make things just a little simpler.
Can someone who knows better tell me why this is a bad idea? Or, is WinMain just an idea from 30 years ago that didn't actually go anywhere?
> Or, is WinMain just an idea from 30 years ago that didn't actually go anywhere?
I believe it is a leftover from 16-bit Windows, where (at least originally) it actually had to be WinMain. Although even there it all depends on the compiler and runtime library.
Windows 3.x had no console subsystem, so Windows applications had to be graphical; command line apps were DOS-only. Then Microsoft introduced "QuickWin", which was a wrapper which turned simple (plain text-only) command line apps into graphical Windows apps. In QuickWin, you'd supply the main(), and QuickWin would supply the WinMain(), and Windows would call QuickWin's WinMain(), and then QuickWin would call your main(). Windows NT introduced a proper console subsystem, and then Windows 95 introduced this bizarre abomination in which 32-bit console apps were supported, but their IO was routed via a DOS program called CONAGENT.EXE. So your Win32 console app would call some DLL, which would spawn CONAGENT.EXE in a DOS Box, and then there was some VXD which the DLL and CONAGENT.EXE used to communicate, so your IO would go via the DOS Box.
In Windows NT (and descendants), WinMain is no longer necessary, but is still (partially) supported for backward compatibility. I guess a lot of people stick with it just by reason of tradition. It does help communicate (to a person reading the code) that you are dealing with a graphical app rather than a console one, so maybe not totally useless.
> Windows 95 introduced this bizarre abomination in which 32-bit console apps were supported, but their IO was routed via a DOS program called CONAGENT.EXE.
Windows 95 used the 16-bit COMMAND.COM as its default command-line shell, so doing it this way was probably necessary to make 32-bit console applications interoperate with the command shell (and support eg. piping and redirection between 16/32-bit executables).
> Windows 95 used the 16-bit COMMAND.COM as its default command-line shell, so doing it this way was probably necessary to make 32-bit console applications interoperate with the command shell (and support eg. piping and redirection between 16/32-bit executables).
I think that's got the arrow of causation reversed.
Windows 9x did it this way because it didn't have a 32-bit console subsystem. They couldn't have easily ported NT's 32-bit console subsystem to 9x/Me, because it was deeply tied in to how DOS Boxes are implemented (NTVDM), and that's radically different between NT and 9x/Me (which have basically the same architecture in that regard as Windows 3.1 in 386 Enhanced Mode). And it was also deeply tied into NT architecture components that 9x/Me lacked (CSRSS.EXE and LPCs).
And they used 16-bit COMMAND.COM as the primary shell, because without a 32-bit console subsystem, the value of adopting CMD.EXE was rather limited. It would have allowed some more advanced batch files.
Actually Microsoft did port CMD.EXE to Windows 95 and 98, but unclear if they ever officially released it. It was shipped in some Windows betas and beta SDKs, and some people got it from there and redistributed it (might not be technically legal but I doubt that anyone at Microsoft really cares, especially by now) – http://cygutils.fruitbat.org/consize/index.html
COMMAND.COM did piping using temporary files. I think even in NT versions, redirection works when starting a 32-bit console executable from a DOS app. I wish I had a 32-bit Windows VM handy to test that with. (Pity pcjs.org has Windows 95 but no NT versions, not even NT 3.1/3.5/3.51/4.0)
> I guess a lot of people stick with it just by reason of tradition.
Or nostalgia, or muscle memory. I kinda learned this one by heart as a kid, and could still recite it even in my sleep: "int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPSTR lpCmdLine, ... " - well, I forgot about nCmdShow. And don't ask me about WINAPI vs. APIENTRY (which is used in the article) - I've seen both used, and I'm pretty sure I picked up the former from Petzold (i.e. Programming Windows, edition ${whichever it was that I found in a library around 2002}).
There was also the other one, let's see if I remember. "Something something WndProc(HWND hWnd, UINT uMsg, WPARAM, LPARAM)"? That one I usually copy-pasted between projects - as all I ever needed was a window to draw on with OpenGL.
On that note, WinMain() isn't the only case I've seen of substituting main(). MSVC/Windows had at some point _tmain(), as in:
int _tmain(int argc, _TCHAR* argv[])
It's there so you could write code generic with respect to narrow/wide character strings - which, in the usual Microsoft fashion, was achieved by #define-s - depending on a flag, _tmain() expands to main() or wmain(), _TCHAR expands to char or wchar_t, etc.
The function MessageBox() also doesn't exist - actual system DLLs provide you with MessageBoxA() and MessageBoxW(), and MessageBox is again a #define, in the style of _tmain().
Anyway, back to overriding main() - SDL[0] also encouraged you to write an SDL_main() instead of main().
Thanks for the morning nostalgia trip and for filling back specifics of APIs I learned as a kid, without understanding much of them.
> Windows 95 introduced this bizarre abomination in which 32-bit console apps were supported, but their IO was routed via a DOS program called CONAGENT.EXE. So your Win32 console app would call some DLL, which would spawn CONAGENT.EXE in a DOS Box, and then there was some VXD which the DLL and CONAGENT.EXE used to communicate, so your IO would go via the DOS Box.
You mean 16-bit console apps running in 32-bit OS, or 32-bit console apps using 16-bit I/O?
Funny how those problems never die. I only recently learned about "DLL surrogates" and Windows mechanisms for mixing 32-bit and 64-bit code by having a 32-bit DLL proxied by a system-provided executable. I even wrote a DLL-proxying executable once, because the nature of legacy proprietary code is that sometimes you really need that 32-bit DLL the vendor, for whatever reason, refuses to rebuild for 64 bits...
> You mean 16-bit console apps running in 32-bit OS, or 32-bit console apps using 16-bit I/O?
I mean on Win9x/Me, 32-bit console apps are forced to use 16-bit I/O. The 32-bit console APIs are implemented using 16-bit MS-DOS IO via the VCOND VXD and CONAGENT.EXE. There was never a 16-bit console API for Windows, so all 16-bit console apps were either DOS or OS/2 1.x. (NT originally could run OS/2 1.x console apps, although the support was withdrawn at some point; 9x/Me never could, except for family mode executables.)
> I only recently learned about "DLL surrogates" and Windows mechanisms for mixing 32-bit and 64-bit code by having a 32-bit DLL proxied by a system-provided executable. I even wrote a DLL-proxying executable once, because the nature of legacy proprietary code is that sometimes you really need that 32-bit DLL the vendor, for whatever reason, refuses to rebuild for 64 bits...
That’s something I’d be interested to know more about. Microsoft put a lot of effort into ensuring interoperability between 16-bit and 32-bit code, but when it came to doing the same for 32-bit/64-bit, they decided not to.
None of main/WinMain/wmain/wWinMain is actually the function called on program launch. All of them are preceded by various setup calls (i.e. the CRT startup in your /ENTRY:mainCRTStartup linker flag).
The pre-main runtime doesn't do much, but it does initialize the WinMain parameters in a fashion not that dissimilar from your example. You could just do all that work yourself, but really, why should you? On most platforms you can get argc and argv[] through the right system calls and file I/O as well, but it's much easier to have the conventional runtime do all that stuff for you.
WinMain is just a convention (https://devblogs.microsoft.com/oldnewthing/20061204-01/?p=28...) for your C/C++/etc runtime to start executing the code you write. Linux isn't all that different, adding runtime code before the main() method where necessary.
This only applies to C++, but the CRT runs all constructors of global variables before main/WinMain. (Or rather, the CRT calls a special function that the compiler generates for this purpose and links into the executable.) In some codebases, that's quite a lot of stuff.
There is also initialization relevant for C code, e.g. strlen() will crash if you call it from the startup function directly without properly initializing msvcrt.
> On most platforms you can get argc and argv[] through the right system calls
Not (reliably) on Linux or, as far as I know, on similar systems. argv, environ, and the aux vector come from a horrible data structure the kernel creates on the stack.
The kernel just copies the data to the program's stack in a contiguous manner. Obtaining pointers to them can seem somewhat magical if you're writing a nolibc program but I wouldn't call it horrible.
I implemented it for my programming language with some rather simple assembly code:
I’m saying you can’t reliably get the information from syscalls. The runtime (i.e. whatever implements the actual entry point declared in the ELF headers) can get it reliably, as can any other code to which the runtime gives an appropriate pointer.
You can’t assume that /proc is procfs if you’re writing a low level runtime library.
If you can find the top of the stack, you can read the contents with reasonable reliability. But the top of the stack is not at a fixed address, and if you are writing low enough level code (container manager, init, etc), poking around in /proc at startup is not a great idea.
If you’re writing a real runtime library, none of this matters: the kernel passes a pointer in a register at startup.
(that does not look like an official repo, but it's the best I could find in terms of a github link. The file pretty much matches c:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\VC\Tools\MSVC\14.29.30133\crt\src\vcruntime\exe_common.inl on my PC.)
That fixes the problem, but goes in the wrong direction IMO. I argue that you actually should only use MessageBoxW, ignore all the macros, and just use wide strings whenever you interact with the OS. An ANSI-only program is broken in 2023, and actually has been for two decades.
I agree, but we're also talking about Win95, which didn't support Unicode until 2001. The example compiles fine so long as UNICODE isn't defined but assuming, as a 90s dev, you'll eventually want to support NT when they make it run games, you should probably leave the door open with TCHAR.
If you are dealing with user-created files you should always use the *W functions as filenames are not guaranteed to be valid UTF-16 and thus might not be accessible via UTF-8 *A calls. At least until Microsoft adds WTF-8 [0] support to the *A functions. You can use WTF-8 yourself internally of course so no need for wide strings outside your platform abstraction code.
With the standard, retro compiler as used in the 90s Developer Pack example, the compiler will assume you're calling the ANSI functions unless you manually configure it to use Unicode by default.
During modern Windows development you should probably default to wide characters, of course.
However, Windows 9X didn't actually support the 16-bit wide-char winapi. Apparently it did actually define the *W functions but they were nearly all just stubs. In 2001 Microsoft retroactively added support for the wide-char API to Windows 9X (according to https://stackoverflow.com/a/35329506/1185152):
> In 2001 Microsoft released an add-on for Windows 9x called the Microsoft Layer for Unicode (MSLU), which changed the W functions from failing-stubs to thunking proxies which converted the strings back into an 8-bit format and then called the A functions, so programs explicitly using W functions could run on Windows 9x.
Also, some APIs use buffers with a fixed maximum size, and the A API's buffer isn't big enough to handle the maximum possible data size for the W API.
And some APIs are Unicode-only, and don't have A versions.
For maximum reliability, better to use W APIs only and do the UTF-8<->UTF-16 conversion yourself.
However, in practice, if you just use A APIs only and let Windows do the UTF-8 conversion for you, it is going to work 95% of the time. But maybe at some point you'll hit that 5% where it doesn't. It is more likely if you are using more obscure APIs (like my example at the start of manipulating Windows Terminal Server sessions). More likely for system-level stuff than application-level.
> Until recently, Windows has emphasized "Unicode" -W variants over -A APIs. However, recent releases have used the ANSI code page and -A APIs as a means to introduce UTF-8 support to apps. If the ANSI code page is configured for UTF-8, then -A APIs typically operate in UTF-8. This model has the benefit of supporting existing code built with -A APIs without any code changes.
I had a blog post about this many years ago, if anyone is interested - but not about WinMain, which I must admit I have never really understood - something of an artefact of the MS linker, I think.
This is a work of fiction, filled with 75% pure fantasy, and the rest half wrong.
All the hardware stuff is pure fantasy; they were using an emulator. So all the pics and all that text, including nonsense like "3 PIC slots" and "two EDO RAM" -- Peripheral Component Interconnect is not even similar to a Peripheral Interrupt Controller; EDO RAM only works on a Pentium, and it's a type of chip, it's not countable -- this stuff is 100% pure fiction.
No, Win95 was not the first fully 32-bit Windows from Microsoft.
1. It's not fully 32-bit; it's a hybrid, with 16-bit code.
2. It's not the first; that was NT 3.1, which is fully 32-bit.
It didn't bring programming to consumers. Did consumers want programming anyway?
This is pure fakery, in broken English, from someone who doesn't really know their stuff -- and probably is not really old, as they claim. It's just playing pretend.
I did something similar a few years ago to get a feel for how a 90's developer would work developing GUIs using the tools that came with CDE.
I compiled CDE from source using a Debian I installed from debootstrap. Installing Debian that way was already a lot of work, but a good learning experience. Of course, a 90's developer with access to a UNIX workstation probably wouldn't have had to install the system or all the tools themselves.
After running CDE, I then did some quick but enlightening experiments with Application Builder. Although I liked the fact that I could use C, it was not as fluid, fast or cheap as Delphi was at the time. Actually, even the UI theme wasn't as attractive.
But it was fun. It felt powerful. The feeling that I was using a tool that had been developed decades ago, somewhat maintained, and compiled on a modern system thanks to the source being available was impressive. There was a feeling of integration. Some form of integration of ideas, ideals and philosophy. I mean... it was the same tools, but running on a modern system, modern hardware, modern compilers, and debug and analysis tools which use theories that didn't even exist when those tools were envisioned. And all that worked nicely because there was a form of consensus, an agreement, in following standards.
I understand now, from proper experience, why some people could see an aura of superiority around UNIX devs.
But this is someone saying they are "an old developer" and blogging about running an OS on old kit, in some depth, discussing what adapters they are using...
And there is no kit. No CPU, no motherboard, nothing. It's all a total fantasy. They are running a PC emulator on an M1 Mac, as far as I can see.
The author, downthread, says they were practising tech blogging.
This is the feedback!
Write about the real stuff you are using. Don't make stuff up -- EVER. No fantasy, no wishlists, no daydreaming. Use real kit, and if you can't, don't pretend.
If you are using an emulator, describe it. Show it. Say how you built it, show what options you picked, what you built it with, and how, and why.
Because this person made a bunch of stuff up, without making it clear they were making it up, and then they got the stuff they made up wrong. Even the facts and claims about the contemporary tools are wrong.
I have no clue about main loops in the Win32 API and I don't care, but I can't trust that that's right either when the rest of the post can't distinguish between a PIC and PCI, and can't count SIMM slots, and doesn't seem to know about the bits I do know.
It is the same issue as this famous Mastodon post from the end of last year:
«
He talked about electric cars. I don't know anything about cars, so when people said he was a genius I figured he must be a genius.
Then he talked about rockets. I don't know anything about rockets, so when people said he was a genius I figured he must be a genius.
Now he talks about software. I happen to know a lot about software & Elon Musk is saying the stupidest shit I've ever heard anyone say, so when people say he's a genius I figure I should stay the hell away from his cars and rockets.
»
This is a blog post about software development in early MS 32-bit tools. I know nothing about that.
But the blog post talks at length about 1990s hardware, something I know a lot about, and about 1990s MS operating systems, something else I know a lot about...
And that stuff is all wrong.
So it makes me not trust the bits of it that are about the stuff I don't know about, and since that is the core subject of the post, that matters.
Someone was trying to show off their knowledge, and it didn't work, and I am calling it out.
To add insult to injury he runs the PC emulator on an ... Apple. It's like putting Dracula in a Church.
But I get what you say and agree with you that it's pointless drivel. He shows all those pictures of motherboards and CPUs and Sound Blaster cards - and to what purpose? It's a darn emulator! They could show a picture of an alien's butt and it would make more sense; at least it would be clear it's all imagined.
Yeah and you can find the OP on LinkedIn, given the crap he posted here I wouldn't expect he'd have the brains to hide his true identity when he makes a fool of himself.
I didn't expect this to be posted on HN. I use the blog to teach myself how to write in technical blog English. Let me clarify two things:
- What I meant by restricted access to old computers is that it's hard to get old computers that are usable and haven't been dismantled and scrapped. Whenever I searched online, the listings were always from the US or Europe. No local e-commerce market has 486s available.
- The Real Programmer™ bit is tongue-in-cheek. I got my first job doing VB 6 stuff. The remark is just to make fun of those who look down on VB6 just because you can do C or C++.
I'm actually surprised you opted for VS97. I would think far more developers between '95 and '02 would be more acquainted with VS6, mostly due to its exceedingly generous (relative to later versions) licensing terms and its inclusion of VB classic. I don't think I used any Visual Studio environment other than 6 until well into XP and 2000's tenure. It sounds like your own experience was relatively similar.
Borland stuff is great for getting things done and is also what I'm considering if I ever have time for recreational retro-programming. But for an article, C++ with the Win32 API would give more to talk about.
1990's? Once it was 1995 you wouldn't be caught dead with a 486.
Also, Windows 95 came with a boot disk to run the installer so you didn't need DOS and MSCDEX.
I was surprised to see no mention of a Borland product like Turbo C++ or Turbo Pascal, but by the mid 90's those were out of sight. You'd be using Watcom or DJGPP to make DOS games in the early to mid 90's.
Also... where was MSVC 1.5? I forget what version I had before acquiring Visual Studio 4 or whatever everyone else had.
Processors got obsolete pretty fast at that time, and while in 1996 (or maybe 1997) it would have been hard to sell 486s, in 1995 I was still selling lots of AMD DX/4 machines to cost-conscious customers from my home business assembling PCs.
Small companies mostly ran DOS-based programs for applications like POS and accounting packages, and for those a 486 was more than enough. In fact, running Windows on those machines would bring no benefit, and it made loading Novell NetWare drivers into conventional memory a bit tricky sometimes.
I sold Pentiums at that time mostly to domestic consumers, with the occasional DX4/100 thrown in the mix. Once I got lucky and sold some 10 machines in a short period. SoundBlaster "multimedia kits" came in humongous boxes and for a few days, I barely had space to live in my bedroom.
It was a fun time. Around that time I built a POS application for Windows using Delphi, at a time when most companies were using DOS-based solutions. I could have built a good business around this, but I was young, naive, had no experience or inclination with sales, and looked chronically even younger than I was. So, after some time I quit trying to run a business on software by myself. Sometimes I feel I should have been more persistent.
I had a friend in middle school that also built computers. He wanted to start a computer building business with me. I still regret saying no all these years later. Would have been fun
I had my Packard Bell 486x up until I graduated high school in 2000. I still remember the one summer I mowed every lawn in my mobile home park so I could finally afford to upgrade the modem from the original 2400.
That PC introduced me to BBSs and then the internet. I learned QBASIC and html with that thing. Good times.
Yeah. I had a Cyrix CX486DX4 up until 2000 when my family got an AMD 1000 MicronPC (aka Crucial) running Windows ME. That thing was SUCH a headache until I upgraded it to Windows 2000. My parents were ALWAYS mad at me for “breaking the computer” but that was just the natural state of ME, it would just stop working after a couple weeks.
I got a newly home-built 486dx2 in 1998, upgraded it a few times up to an original K6 @ 200MHz, then replaced it with a K6-2 400MHz around 2000-2001?
Turbo Pascal 7.0 with hacked Delay to support faster CPUs was big (as was Turbo C++) up to 2000, at least in Poland. Those of us who targeted Windows often used Delphi, but all kinds of simple game development often continued in DOS mode till Windows ME/XP killed that path.
My girlfriend had a fancy, expensive 386 laptop until ‘98 that I used to get my degree, using stuff like Borland C++ and Java. I couldn’t even afford a computer until after I graduated. Installed Slackware on that laptop from a stack of floppies to learn Unix for my first job programming C.
I think it was Visual C++ 2 then MS Visual Studio 4.0. MSVC 2 could target the Mac, IIRC.
Borland C++ was the "industrial strength" Turbo C and Delphi the successor to Pascal. Watcom was a cool option and I played with it after it was open sourced. Cool that you could target dos, windows 16/32, os/2.
I had to stick with a Pentium 100 MHz from 1996 all the way to 2002 :( It would spend minutes swapping whenever I had to change browser tabs, but I had no choice.
I guess I didn't know enough about computers back then to occur to me to buy some more memory.
I had a 486 DX4/100mhz from around 1996 until 2001; I could not afford anything better. My Dad had a Pentium 166 but it was reserved for his work and was way more expensive than the 486, which was my personal computer, purchased for my own use - which in those days was mostly gaming.
The 486 was an absolute workhorse; however, one thing I do remember is that it struggled to play MP3s in Winamp - it would peg the CPU at 100% and make multitasking basically impossible. It wasn't until I upgraded from the 486 that I could play Quake and MP3s at the same time.
Being poor in a poor country, I had a Cyrix 486SLC from around 1997 until 2000. Even if it had 486 in the name, it was just a faster 386SX.
Besides having fun programming mostly under DOS and exploring any kind of software I could get my hands on (getting my hands on meant going to the computer lab at school or Uni and copying stuff to floppy disks), I've tried installing Windows 95 on it, but it was kind of slow so I reverted to DOS. The slowness could be in part due to me trying to compress everything with DriveSpace, because the HDD was tiny.
> 1990's? Once it was 1995 you wouldn't be caught dead with a 486.
I remember going from my 486 to something higher being rather expensive. Expensive processor, new mainboard needed and depending on what your non-ISA bus situation was, a new graphics card. Benefit, compared to the late-stage triple-digit-mhz 486s? Well, better floating point, but Quake came out a bit later. I remember mp3 decoding failing a bit with my 486/133...
And I saw plenty of Borland C++ used for Windows development.
No, you are wrong; even by 1999 some people were still running 486s instead of Pentiums. And some people in 2004 used FreeBSD with Lynx and Mutt on them.
Not so many, true, as AMD K6/K7's were good enough to be close to Pentium II and III machines.
My path was (C64) Basic -> Quick Basic -> (Borland) C++ -> (x86) asm -> Java -> Python. Mostly.
There's some Ruby, plain C, Lisp, Haskell, C# in there, and lately Ocaml and Rust. And lots of shell scripting in between. Never needed Pascal, and I stay away from Perl. Clojure is still on my wishlist, as is Scala (because of Spark, mostly).
I think almost all of us started with BASIC.
Then we all wanted to do more... and with more memory and do it faster... so I needed a compiled language.
So, I learned C, then Pascal, then VisualBasic. C forever corrupted me.
Then C++ was the future, so I learned that... and then the web came... and with it came CGI scripting with Perl. I really fell in love with Python, though. It was what BASIC should have been, but Python was slow, so I learned Go. Along the way I also learned Prolog, Lisp, Java, Javascript, Rexx and a host of different OS scripting languages. Now, I just look at a problem, and pick something that fits. Mostly. Except when I want to learn something new.
Almost perfectly matches mine, though you can put Logo before Basic. And some weird stuff between C++ / Perl, based on the language du jour in the Computer Science department. Plus Java. Don't forget Java.
BASIC (GW & Apple) was ubiquitous, but a BASIC compiler was incredibly difficult to find, even illicitly. And 13-year-old me had no hope of purchasing it for $500.
for me it was VB at home, turbo pascal at school, TI-86 basic for "cheating" at school (is it really cheating if you had to learn the math to write the program?). Then HTML+JS for websites, python and C++ for college, PHP for internship, C because I wanted to learn how PHP worked, and then onwards into real life.
(Actually it was back to the beginning as my first job was VB6+PL/PGSQL)
Oh, man. I had my TI-85 programmed to the moon. Built my own serial link cable.
Still remember my chemistry teacher let me use a stoichiometry app I coded during exams and welcomed any other students to do the same if they felt it was unfair. There wasn’t an App Store to download it from.
Really taught me the concept of mastering first principles.
No matter how much experience one gets it’s always great to get to be a beginner at something
Ah yes, fiddling with jumper switches in inaccessible locations, tiny screws in even more inaccessible locations, trying to get an IDE card into a slot without breaking something, cables that were too short. I don’t miss it.
I worked for a fledgling consultancy in 1995, and we spent a small fortune on a 4GB drive for the network server (which was HUGE in those days - in my previous role the entire IT department of 20-ish people had a 4GB shared network space to work off, which was plenty. Having that on a single drive in a single box was cutting-edge!). The drive took ~5s to spin up, and made this incredible "vvvvvvmmmmm" noise as it did so, as the inertia of god-knows-how-many disk platters was overcome by the motor. Once running, the machine then took another 30s+ to actually boot up into Windows NT, making clunking and whirring noises all the time.
Booting this monster up was the morning ritual, together with getting the coffee machine brewing and opening the blinds. Sometimes I hear a similar sound of an electric motor fighting inertia, and it always takes me back to those days.
Or you could get a secondary mechanical HDD :-P. Last year I got an 18TB mechanical HDD for extra storage in my desktop and the noises it makes are as if the platter were made of stone and the heads were digging into it to store the data :-P. After 10+ years of using only SSDs - and this HDD being quite a bit noisier than the HDDs I used for most of the late 2000s - the clunky mechanical sounds were outright nostalgic.
Some "new" hdds are absolutely insanely loud.. I got some recently where having 3 in an otherwise quiet case made them very audible from the other end of the house! I sent them back.
Looking at the datasheets for the WD Red Pro, for example, there's idle/peak dBA from 34/38 for the 10TB to 20/32 for the 20TB.
34 dBA idle. Are you kidding me?
I ended up settling on the 18TB btw, as that's all there was at the time, 20/36 dBA... and I'm quite happy with them; they're "normal loud". Noticeable sitting next to the PC, more so when active, but not from more than a few metres away - and ZFS doing a scrub no longer sounds like an industrial accident either, so that's nice.
Most hard drives have a little-known feature called “Automatic Acoustic Management” that lets you tell the drive's firmware to prioritize low noise over seek speed, so it gently accelerates and decelerates the moving heads.
Enterprise-oriented HDDs (which I assume includes your 18TB one) are a lot noisier than their desktop counterparts. I'd expect that e.g. a 3TB WD Blue would be virtually inaudible compared to your drive.
Yeah it is a datacenter HDD and noise isn't much of an issue there i assume. I have a couple of other systems with mechanical HDDs that are barely audible, but they aren't 18TB either :-P.
I bought that one because it was big, relatively cheap and read on /r/DataHoarder that datacenter HDDs are slightly less likely to have issues (not sure how realistic vs superstition that might be though, main reason i got it was because it was big).
I don't even care about the noise but I would want a mechanical drive because I suspect it will last longer. Most of my hard drives from the early 90s still work. When they fail, they fail gracefully--just mark a few sectors as bad, maybe replace a part of the head. I view most non-mechanical hard drives these days the same as cd-roms, usb sticks, etc., they can't be relied on for long-term storage.
I recall once noticing that my hard drive was acting a little funny. Every few seconds, it would do a little work and flash my HDD light. This was an ancient version of Windows, and I wasn't running anything fancy in the background.
Yup. I remember once with my desktop trying to figure out why it took like 10 seconds to access the D drive. I listened and could hear it trying repeatedly to whir up before it finally did. I don’t miss the tech but I miss the ambience.
> A real gold mine for a person like me who lives in a developing country with restricted access to old hardware.
Interesting, because if anything I would assume that old hardware would be more accessible in developing countries - I mean, that was the case 20+ years ago in my (still) developing country - I got my parts from an open air bazaar, where it wasn't weird to see 10yo items. Personally I was 5 years behind the state of the art.
My oldest piece of hardware was a dot matrix printer, which I used for school as recently as 2007. IIRC the aforementioned bazaar was the only place where you could find a replacement ribbon for it.
My hoarder relative still had some old crap from the 90s when I was clearing out his cellar a few years ago.
I think you can skip VC++97 and develop with msys and mingw. I haven't used those to target 9x in a long time, just older NT, so I'm not sure how up to the task it is these days. But I think it can get you more recent versions of C and C++ by that route.
I use a recent mingw-w64 with a recent GCC (I'd assume the latest versions could work too) to target Windows 95 (through 11, obviously). I prefer the mxe toolchain to cross-compile from Linux for convenience, but I do also compile with mingw-w64 on Windows in a VM. mingw.org will probably work too, perhaps with a little work. However, recently I had to patch mingw-w64 to restore support for Win 95 through 2000, which they'd broken only for the sake of implementing SetThreadName [1].
My experience in general is it's pretty easy to support Windows 95+ as long as you make use of newer APIs optional by dynamically loading symbols, and deal with small issues as they appear. So it helps to have users actually testing occasionally... because I don't.
As for your diff, I'd be inclined to call AddVectoredExceptionHandler etc. through GetProcAddress where available. (Off topic, but I like to use that API for logging access violations.)
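That GetProcAddress pattern might look like the sketch below (the function and type names are mine; the point is that AddVectoredExceptionHandler isn't exported by the Windows 9x kernel32, so you look it up at run time instead of importing it, and the binary still loads on old systems):

```c
#include <windows.h>

/* Sketch: resolve AddVectoredExceptionHandler at run time, so the binary
 * still loads on Windows 9x, where kernel32 doesn't export it. */
typedef PVOID (WINAPI *AddVehFn)(ULONG, PVECTORED_EXCEPTION_HANDLER);

static PVOID AddVehIfAvailable(PVECTORED_EXCEPTION_HANDLER handler)
{
    HMODULE k32 = GetModuleHandleA("kernel32.dll");
    AddVehFn fn = (AddVehFn)(void *)GetProcAddress(k32, "AddVectoredExceptionHandler");
    if (!fn)
        return NULL;  /* API not present: caller degrades gracefully */
    return fn(1 /* call this handler first */, handler);
}
```

The same shape works for any post-95 API: keep the import table limited to what the oldest target exports, and gate everything newer behind a NULL check.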
Yes, I guess I could do that, then maybe I can get it upstreamed.
However, I was misremembering a bit and misspoke before, both about how easy it's been to support Windows 95 (it was easy up until a few mingw-w64 releases ago... except for spawning processes) and about how recent my mingw-w64 is (I'm using 7.0.0, 3.5 years old, and simply haven't tried anything newer; the GCC version is independent within limits). Also, I have no idea what the minimum Windows targeted by mingw-w64 actually is; they don't say. I think they don't actually care about Windows 9x support, they just never did anything to break it before 6.0.0. I have doubts they'd accept such a patch.
The most important trick was you have to use a build of mingw-w64 with 'win32' rather than 'posix' threading ([1], and the rest of that thread is relevant too):
> the 'win32' threading support in mingw-w64 is the original one and supports all 32-bit Windows, while 'posix' threading (winpthreads) is a recent addition which is necessary to support C++11 mutexes and threads, but has higher system requirements. ... That's why mxe switched to posix threads by default in 2019.
I've played around with old Visual Studio versions in virtual machines and as much as I like the older Windows aesthetic, dev tools sure have come a long way.
For those interested in writing code for old systems using modern tools, you can write Rust code for Windows versions all the way back to Windows 95/NT 3.51: https://seri.tools/blog/announcing-rust9x/
NCommander is an incredibly smart programmer who dove headfirst into the Windows XP for Itanium source code just to get Space Cadet Pinball's 64-bit port fixed
8-Bit Guy has stepped away from putting paperclips into rare machines to build his own homebrew 80's style computer, as well as writing several games for Commodore 64, early DOS and others
Action Retro has slowly branched out into putting strange Linux distros and even BeOS on his PowerPC Mac collection
Cursed Silicon put himself on the map by building the "world's fastest Win98 PC" to prove another youtuber incorrect and has been doing a lot of Linux archeology with Gentoo
Phintage Collector recently showed off Windows NT's "OS/2 personality" to run OS/2 apps on NT (as opposed to windows apps on OS/2)
The Serial Port resurrected one of the venerable Cobalt Networks "RaQ" appliances and is planning to set up a 90's style web host for his patrons
Why not? I learned programming with QBasic, which included its own little IDE and docs in the form of a help menu system. Years and years later, for the nostalgia of it all I stood up a DOS emulator with QBasic on modern hardware and it was a good time.
Collecting CDs would never be as fancy as collecting vinyl, and collecting USB sticks would never be as fancy as collecting CDs (unless in some Fallout scenario). The tech world is so... transient.
And look at how bloated our development environment is today: a bunch of extensions, themes, and language servers. It may sound oxymoronic since I'm a Zoomer, but I don't want all this eye candy and too many nice-to-haves; I just want to write programs effectively without distractions, and I want my dev environment to be lightweight as well.
As an Emacs user: don't underestimate the amount of code running inside Emacs, especially if you install a lot of packages. I measure Emacs 28.2 as shipping 1.86M lines of elisp (a bit more than is in the source repo), the core is 540 kLoC of .c/.h files, and I have ~273 kLoC elisp amongst a small set of packages.
Sure, Electron itself is probably still an order of magnitude more.
I was a hardcore notepad++ user before this but decided to spend two weeks using nothing but emacs then, a year or so later, nothing but vim.
I had to load up VSCode for a demo I was giving. All of the options and plugins and IntelliSense and spurious warnings and stuff was crazy overwhelming.
Watching old videos from the time like the Computer Chronicles, I'm pretty surprised how many people were running OpenStep on Intel hardware. It seems like it really made a dent for a year or two there.
GUI development has never been as productive as Visual Basic in the 1990s and early 2000s.
You would just draw the UI, double click, and add code. Later versions also supported binding to data structures with one click and a few keystrokes.
Then we abandoned all that stuff in favor of the web, and stuff trying to badly imitate the web, like XAML.
It’s a bit shocking to me how totally out of fashion WYSIWYG is given how productive it can be.
If someone brought back that kind of experience cross platform with something like Go as the language, millions of programmer hours would be saved. Of course it won’t happen because nobody would pay for it even if it saved them thousands of hours.
I actually prefer expressing UI layouts abstractly in some kind of markup. Not necessarily in HTML and CSS, because the CSS box model is crazy, but something like XAML. In fact, last time I did a native application for macOS (many years ago), I decided I'd rather use the dead GNUstep Renaissance library, which is fairly XAML-like, than Apple's Interface Builder. I know I'm in a small minority here due to being legally blind, but drag-and-drop UI building was always a non-starter for me. I wonder how many sighted developers feel the same way for different reasons.
Have to agree... Although my version was Delphi + DevExpress components.
It was exactly like you describe it: drag components onto a window, arrange neatly, click and set data binding and properties, double click to add some code, voila - done - a crazy rich, quick, offline or online app, doing anything you ever wanted - real estate database UI, hotel management, custom spreadsheets, radio station playlists...
As much as I love web development, it definitely never came close to that level of productivity, simplicity and effectiveness.
Yeah, it was pretty cool for the solo developer, but for teams it was hard or impossible to merge changes or version control them. Having multiple people working on a big VB6 app was an unpleasant exercise in manually coordinating who was working on what, and developers frequently clobbered each other's work.
If only someone had the foresight to have created an App Store for VB6, we might still have it.
The Web solved the distribution/installation/update problem.
It's actually a little bit amazing to me that app stores work as well as they do. It's still higher friction to install an app than to visit a web site, but most people have accepted it.
Development was extremely efficient back then. Delphi was even better than VB6, with RAD.
It's a bit sad there hasn't been any progress here in 25 years.
However web applications are a more complex use case overall compared to desktop applications.
I'm sure VB6 was limited by today's standards but I, as a clueless teenager, could create GUI apps in minutes without any external access ever. It was magic.
I haven't really used .NET for GUIs since 2012, but throwing together GUI apps was pretty damn easy the last time I did it. With the added bonus of being able to choose between the various .NET languages
I wrote some industrial PLC GUIs using VB.net, and it very much felt like a more "serious" version of the old VB6 stuff.
When I was starting out as a professional programmer in the early 90s I had 3 books on my desk: the K&R C book, a book on BIOS/DOS and interrupts, and a book on Assembler. That was all the knowledge I needed at the time to do my work.
>Windows 95 is the first full 32-bit Windows that's built by Microsoft. It's a break from previous generations of Windows which brings 32-bit programming to consumer which previously was only available on Windows NT.
Any experiences using Delphi for Android/iOS (and cross-platform generally) development? It seems like the only cross-platform solution that wraps the platform WebView on each platform, has an imperative GUI and a RAD IDE, and produces native code.
Really nice API. Compare it to the Mac of that era, where you needed to create and maintain your own event loop and have a bunch of branches for when a certain thing was clicked.
To show a basic dialogue you had to continue to draw it over and over in your loop.
> To show a basic dialogue you had to continue to draw it over and over in your loop.
It wasn't quite that bad. Basic alert dialogs were fully handled by Alert(); event loops for more complex dialogs were implemented by ModalDialog(). Anything non-modal could certainly get hairy, though.
172 pages for dialogs, lol. Thank god for ResEdit! You're right though. If I recall, because of the single thread, certain built-in popups could more or less block. BUT - perhaps you wanted something a bit more tricky. Well, now you're back in it! :)
I always liked that ExitToShell() took you to a GUI.
I'd add QuickBasic 4.5, though I suppose that was late 80s. I still have the book that came with it around here somewhere. It was the first programming language / environment I used where I felt I could go beyond toy scripts.
"Hungarian Notation is the tactical nuclear weapon of source code obfuscation techniques; use it! Due to the sheer volume of source code contaminated by this idiom nothing can kill a maintenance engineer faster than a well planned Hungarian Notation attack."
Ten lines? Why so complicated? Back in my day we did it with just two lines:
mov ax, 13h
int 10h
And then you pushed your pixels to the video segment (A000h). So simple and elegant...
But my dad still mocked me because it was too complicated! Back in his day, you didn't need to bother with stupid video modes, and could just POKE the pixels to your screen!
Not only is this architecture-specific (it needs the BIOS to map a specific fixed range of memory), it also causes a lot of software interrupts. Interrupts, like exceptions, are _bad_.
In UEFI you have VESA Video Mode to handle most of the annoying mode selection stuff and GOP to handle...memory mapped framebuffer in linear address, so that you can just write directly to the framebuffer to push your pixel...that's even simpler.
You want double or triple buffering? Just make another memory buffer with the same dimensions and stride, and copy each buffer up a level on render. Easy.
Stop being so dramatic. The exact same code will still compile today, and you'll get your window in 10 lines or whatever. And anyway, it's not a window - it's a message box.
(You could probably put an honest-to-goodness window up in 10 lines, but you'd have to squeeze the code together a bit more than most reasonable people would consider permissible, and/or it'd be missing a few key aspects. But I bet 30 lines would be plenty for a full-on one, with its own custom window proc that has a WM_PAINT handler that does something, a WM_CREATE handler that pops a context pointer in GWLP_USERDATA, and a WM_DESTROY handler that frees the context. And a message loop. The code today would be basically exactly the same as it was in the Windows 95 days.)
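For what it's worth, a sketch of roughly that shape (minus the GWLP_USERDATA context bookkeeping, to keep it short); plain Win32 C with the ANSI entry points, same as it would have looked in 1995:

```c
#include <windows.h>

static LRESULT CALLBACK WndProc(HWND h, UINT m, WPARAM w, LPARAM l)
{
    switch (m) {
    case WM_PAINT: {
        PAINTSTRUCT ps;
        HDC dc = BeginPaint(h, &ps);
        TextOutA(dc, 10, 10, "Hello, Win32", 12);  /* the "does something" */
        EndPaint(h, &ps);
        return 0;
    }
    case WM_DESTROY:
        PostQuitMessage(0);  /* end the message loop below */
        return 0;
    }
    return DefWindowProcA(h, m, w, l);
}

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmd, int nShow)
{
    WNDCLASSA wc = {0};
    wc.lpfnWndProc   = WndProc;
    wc.hInstance     = hInst;
    wc.hCursor       = LoadCursorA(NULL, IDC_ARROW);
    wc.hbrBackground = (HBRUSH)(COLOR_WINDOW + 1);
    wc.lpszClassName = "Demo";
    RegisterClassA(&wc);

    HWND h = CreateWindowA("Demo", "Demo", WS_OVERLAPPEDWINDOW,
                           CW_USEDEFAULT, CW_USEDEFAULT, 320, 200,
                           NULL, NULL, hInst, NULL);
    ShowWindow(h, nShow);

    MSG msg;
    while (GetMessageA(&msg, NULL, 0, 0) > 0) {
        TranslateMessage(&msg);
        DispatchMessageA(&msg);
    }
    return (int)msg.wParam;
}
```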
Ah, don't mistake many of us collectively romping through the piles of outdated software and hardware with rose-colored goggles as something like that.
Nostalgia trips aside, I certainly wouldn't be eager to trade my three-decades-refined tooling and easy access to answers so readily. That old shit broke all the time. It may have been fun then, but I wouldn't want to return to days of squeezing kilobytes of RAM via autoexec.bat and config.sys tweaking, or agonizing over the perfect combination to keep my IRQs straight across all of my ISA slots.
Well, someone's hurt that their favorite IDE was derided once again 28 years later... :)
Seriously, VB was quick and fun to code in but ultimately just contained way too many crazy footguns to build large maintainable apps. It's one of those things I swore never to put on my CV for fear someone noticed and asked me for just one "one quick fix" to their mudball.
I applied for a VB6 job circa 2000 and was sent a timed exam that required detailed knowledge of its library. I was like, oh, you actually wanted me to really know it. At that point VB had had IntelliSense for two versions, and I couldn't imagine actually memorizing its crazy library.
Anyhow, a real programmer definitely got that job.
hInstance can be retrieved from GetModuleHandle(nullptr). hPrevInstance is always NULL. lpCmdLine can be retrieved from GetCommandLine(). nCmdShow can be retrieved from the wShowWindow field of the STARTUPINFO filled in by GetStartupInfo().
And, so you can just do
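something like this sketch (MyWinMain here is a hypothetical stand-in for whatever you'd otherwise put in WinMain; note wShowWindow is only meaningful when STARTF_USESHOWWINDOW is set):

```c
#include <windows.h>

/* Hypothetical stand-in for the code you'd otherwise put in WinMain. */
static int MyWinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmd, int nShow)
{
    return MessageBoxA(NULL, "Hello", "Demo", MB_OK);
}

int main(void)
{
    STARTUPINFOA si;
    GetStartupInfoA(&si);
    /* wShowWindow is only valid if the creator asked for it. */
    int nCmdShow = (si.dwFlags & STARTF_USESHOWWINDOW)
                 ? si.wShowWindow : SW_SHOWDEFAULT;
    return MyWinMain(GetModuleHandleA(NULL), NULL, GetCommandLineA(), nCmdShow);
}
```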
and it works the same as the code in the article. Not a significant difference. But, I like peeling back magic code to make things just a little simpler.

Can someone who knows better tell me why this is a bad idea? Or, is WinMain just an idea from 30 years ago that didn't actually go anywhere?