Windows NT 3.1 on Dec Alpha AXP (virtuallyfun.com)
170 points by jandeboevrie on Oct 3, 2023 | 91 comments


DEC's relationship with MS was fascinating

DEC sued MS over Cutler's contributions to NT. DEC thought they could sue themselves into MS's good graces. They had polo shirts embroidered with "Strategic Relationship"; I saw several employees wearing them, and the phrase was repeated often.

DEC couldn't figure out why Office performed worse on Alpha than x86. They sent an employee to MS, only to find MS had a single build server stashed under some desk. MS was compiling Office with -O0, obviously doing the absolute bare minimum to check a box and move on with their lives.

Source: I'm a former DEC employee


Fascinating. Dave over at Dave's Garage on YouTube talks about having a DEC machine, I believe -- if what you heard is true, I wonder if it was the sole machine you're talking about. Maybe he can shed some light from the other side (Microsoft).


IIRC - he mentioned that someone _else_ had a DEC machine, and actually used it as their dev box. The dev with the DEC box developed the kernel panic code, aka the blue screen of death, and blue was chosen because that's the default screen colour when the DEC box is turned on. The idea was to reset the colour to the default before printing the kernel panic message.

So while DEC NT is sort of a footnote, it did have this pretty profound influence : )


This is a bit of a game of telephone - NT Alpha shipped after NT 3.1 (i386 & MIPS) and the port was done almost entirely by DEC. The blue screen preceded the Alpha and was really based on the color scheme from the firmware on the MIPS workstation which Microsoft built internally. And, of course, the legendary SlickEdit, which was one of the original editors available on Win32.

After NT 3.1, Microsoft assumed primary responsibility for NT Alpha, although there were also some great people at DECWest still involved.

source: me, I'm the 'someone _else_' who owned all the Alpha stuff at Microsoft.


Windows 3.x's fatal exception screens were also blue.


Indeed; I think Windows 2.0 even had it. The story might sound plausible until the holes are poked in the hypothesis.


So were the color schemes on many 8-bit computers of the '80s, such as Ataris and Commodores.


I was running OpenVMS on VAX machines in that era. I couldn’t figure out why DEC wanted NT, a single user operating system, for their mini computers. It made no sense to me. As an aside I had an OpenVMS gig as recently as 2020. It was fun to dust off my old DCL skills.


> I couldn’t figure out why DEC wanted NT, a single user operating system, for their mini computers.

When did you figure out that NT wasn't a single user operating system?


NT's support for multiplexed interactive user sessions is piss-poor and has always come with a CAL restriction that was _absurd_. To this day there's no real "multiuser on consoles". (Yes yes, Citrix, RDP, PowerShell Remoting, OpenSSH: compare those with X and, I guess, also SSH, and they suck.)


I'm not sure what the requirement for purchasing CALs has to do with whether an OS is multi-user or not. I have administered or been a user of interactive remote NT user session systems going back to beta testing the initial release of WinFrame in the mid 90's. Although worlds better today, I would not have called it pisspoor even then. Today RDS is much more capable than X on a unix system.


As someone else in that space in the mid 90s I would agree that it was pretty remarkable nearly 30 years ago, especially considering the competition.


As far as I know NT 4 was not able to have 2 users logged on at the same time. So it was not a multiuser system.


A request for an official answer on CALs for Microsoft's SSH port is about to turn 6 years old:

https://github.com/PowerShell/Win32-OpenSSH/issues/926


While MS licensing is shitty, there is nothing ambiguous there; it's the same as everything else, be it RDP, SMB or Apache.

If you use the User CAL model then any user can have any number of connections from any number of devices used by that user.

If you have 100 employees using SSH (or Apache or whatever) - you need to license 100 User CALs.


Windows supports much richer remote access than anything built on X. I can log in using standard graphical login, my keyboard layouts carry over, audio is transparently handled, the screen size isn't tied to the existence of a remote monitor and 3d acceleration just works. And all that works out of the box with no setup necessary other than enabling it.

On Linux, the only workflow I could get working was to use x11vnc to control an existing local session. To log in, I have to run x11vnc as root, pass it some magic cookie (I have no idea what that is still, but some magic incantation from SO did the trick), then once I log in, the running x11vnc dies, and I have to restart it as a regular user to control the session I just authenticated.


You can just "ssh -X" into your remote server, then run an app. SSH will configure X authentication for you (the "magic cookie stuff") and set up the "DISPLAY" environment variable to forward X11 connections over an encrypted tunnel. The app will display on your local desktop. It will even work on a Mac, if you install XQuartz.
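For what it's worth, here's a minimal sketch of that workflow (the hostname and app are placeholders, and it assumes X11 forwarding is enabled in the server's sshd_config):

    # on the local machine, with an X server running (XQuartz on a Mac)
    ssh -X user@remote.example.com

    # now on the remote host: ssh has set DISPLAY and installed the xauth cookie
    echo $DISPLAY    # e.g. localhost:10.0
    xclock &         # the window opens on your local desktop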


you've never seen remote X, it was a decade ahead of RDP


RDP is great. It works fluidly over 56k modems. X “sucks” as it could never do that.

This argument works both ways.


False, I've done remote XDMCP in the early 2000s and it worked well on dial up.


I have an Alpha here with OpenVMS! It's fun to play around with. VMS was the first multi-user system I had access to, way back when I was a teenager in the late 80's.


I'd probably blame the compiler. It's a weird mess of DEC objects being glued together to run on NT (like the MIPS/PPC ones too). And if you wanted to make anything that for sure ran on all CPU types you'd build /Od (optimizations disabled). Doing /O2 or /Ox was a recipe to run nowhere.

The real shame is that the 64-bit beta compiler does a substantially better job on the Alpha, but like everything else, once everything started to gel, it suddenly imploded.


The compiler was definitely somewhat to blame. I was on the C/C++ team at the time, and C++ was definitely an afterthought bolted onto the existing C frontend of a big three-stage compiler.

I was on Digital UNIX so I didn't deal with VMS or Windows much. But Digital UNIX couldn't get out of its own way with object file formats (COFF vs ELF and whatever else), on top of the compiler inefficiencies.

A recipe for disaster when the MS employee doesn't want to do it in the first place, and you give them an artisan product that's hard to use properly.


damn so many potential questions, but I don't know if you can answer any as I know most people that were in those pivotal years are so cagey about it.


The single best x86 computer I've ever used was a DEC Pentium 90... updated eventually to a Pentium II-300. DEC hardware was stupendous. Other systems were (obviously) faster and smaller and probably cooler and more power efficient, but their daughter-card implementation of a CPU/RAM upgrade came from their obvious expertise in the server/scientific workstation market with non-x86 architectures.


and resulted in project failure. Grant Saviers was responsible for the PC division in the early nineties.

Oral History of Grant Saviers, part 2 of 2 https://www.youtube.com/watch?v=Od830KDrLUU

Oral History of Grant Saviers part 1: http://archive.computerhistory.org/resources/access/text/201... Oral History of Grant Saviers part 2: https://archive.computerhistory.org/resources/access/text/20...

>As DEC’s Corporate Vice President of PC Systems and Peripherals from 1990 to 1992 Grant successfully restarted DEC’s PC business from a dormant state and grew revenues to $350M and break-even profitability in 18 months.

@18 minute timestamp - they copied Dell's strategy and did pretty well, business was growing, and then DEC founder and CEO Ken Olsen decided to kill it. Grant got recruited to lead Adaptec.


Killed by Ken Olsen? That doesn't really make sense? They still had the PC division in like 1994, long after Ken was gone? Or am I getting the timeline wrong? Maybe he just slowed it.

I haven't yet listened to those Oral histories, but thanks for pointing out another that's focused on DEC.

Love those oral histories. I sometimes wish they were longer and asked more detailed questions about certain things. But I guess you can't keep people for 10h straight.

Edit: Where the hell is the video for Part 1? Doesn't exist?


From what I understand Ken killed the idea of using cheap commodity parts and moved PC division hardware design in-house, thus turning a profit-making venture into a huge money sink.

Part 1 doesn't exist in video form :(


The document I linked in my top-level post suggests that the PC division was doing OK by 1994, but that document was written by an insider who was not directly working on the PC stuff.

> Part 1 doesn't exist in video form :(

Guess I have to go old school and read it :)


Awww. My boss had a Digital HiNote that seemed like a pretty nifty laptop compared to the competition at the time. https://en.wikipedia.org/wiki/Digital_HiNote#/media/File:DEC...


What a pity... looking back at their '90s strategy, the question seems to have been "what should DEC be doing in the 90s, and how is it related to DEC's historical identity?", and thoughtfully engineered hardware, even if on a commodity architecture, seems like one of the answers.

Even if it wasn't the full-stack integration, from semiconductor fabrication to OS and networking, that they were used to.


Yeah that relationship is strange. Seems to me they should have used that Cutler/MICA thing to make a huge amount of money and help their PC division.

Not sure exactly what that 'Strategic Relationship' bought them. Seems to me all the companies making partnerships with Microsoft got shafted. Microsoft were always somehow able to turn getting sued into deals that were profitable for them in the long run, because most companies didn't want to drive home the dagger, but rather got a few concessions in some kind of partnership.


Not sure if this is true, but Microsoft (or Cutler) allegedly stole MICA source code and used it in NT, which was part of that lawsuit.

https://techmonitor.ai/technology/dec_forced_microsoft_into_...


Former VMS user here. The whole "affinity" program that DEC kept pushing just seemed confused and pointless. I think by the time DEC management realized that they had "played themselves" it was too late. And then the Itanium thing happened.


> And then the Itanium thing happened.

It's still funny to me how Itanium, just by its announcement, basically killed or significantly delayed most other architectures, only to turn out to be a dog.


Context is important.

All of those competing architectures were becoming prohibitively expensive to enhance.

Alpha never paid for itself, ever, the market share was too small. POWER and SPARC had multiple failed projects with enormous capital costs. MIPS hit a performance dead-end with such low market share SGI saw no way to rebuild. HPPA was bleeding HP dry.

Itanium may have hastened their demise but all of these archs had the same core problems with investment returns. The writing was on the wall, and Intel was offering something people desperately wanted. (Unfortunately for everyone involved, it did not pan out.)


I wonder how much of that was each one being largely a one-vendor operation. Even high-margin markets like servers and HPC can't compete with selling a billion cheap chips for white boxes.

If there had been an "Alpha Inc" akin to the ARM Ltd model, I wonder if the platform could have survived longer. It might have been able to tap firms that wanted to play in the high-end market but weren't interested in buying from a direct competitor (picturing those big beige Dell PowerEdge PII cubes, but with Alphas in them), or they might have gotten a better deal out of stuff like AMD using the EV6 bus for the Athlon.

Aside from that, I suspect there was a significant aspect of vapourware to the Itanium strategy. Peak Intel had a great manufacturing process and an endless bankroll; it was easy to assume that they'd deliver a product that would be impossible to compete with, so you may as well give up on another architecture. By the time the Itanium product shipped and everyone saw what a lemon it was, it was too late to reallocate the resources and make up for years of lost effort.



I understand that context. But even so.

POWER and SPARC continued to be developed, and it's hard to argue that either of them should have given up on their architecture. POWER is still being developed. It probably made sense for Sun to stop working on SPARC eventually, but not when Itanium was announced. Sun's problem was just that they spent quite a bit of money; they were just never actually very good at designing processors.

SGI was the most aggressive in telling everybody that Itanium would be the future, and they paid for that. They massively delayed MIPS upgrades (only to restart them once they realized it would take a while for Itanium to come to market). For a while after this they still made their money on MIPS, and all their attempts to push Itanium pretty much fell flat. So arguably it was smart for them to plan to eventually dump MIPS, but not how and when they tried to do it (also, I don't disagree that MIPS was a dead end).

The deal HP got would have been fantastic if Intel had adopted 64-bit PA-RISC instead of Itanium. Given what happened, HP wasn't really competitive with Sun's 64-bit SMP servers, and Sun made a killing on those things before the bubble. Once they bought Compaq, they were for a while selling SMP servers based on Alpha.

HP really should have continued to push Alpha after they bought Compaq. They already had VMS ported to Alpha and a large captive base willing to pay a little extra. Instead they aggressively ported VMS to Itanium, and thanks to the insane deal they made, Intel had to manufacture processors for them for a decade-plus.

They are a large enough company, and the next few Alpha versions had the chance to be really great, with things like advanced vector extensions, and they basically had the best processor team in the world.

So yes, it was correct that all these companies wanted to drop their development costs, but doing so before you really knew about Itanium and how good it was, that's questionable, and it messed up all their strategies. There was just an assumption that Itanium was going to be amazing. Granted, VLIW processors were all the hype. Sun also wasted a bunch of money on VLIW processor technology in the 90s, and unlike HP, they didn't do it with Intel.

Gordon Bell tried to hook up Alpha with Intel and make Alpha the 64-bit architecture that Intel went with. That would have been quite a different history. But for various reasons this didn't happen, and instead Intel went with HP and their next-generation VLIW idea.

P.S.: It would have been pretty smart if Sun had gone in on Alpha in 1992; then they could have saved on development throughout the 90s. DEC had built up huge fabs to handle Alpha, but almost nobody bought it. They were really looking for a high-value customer and couldn't find one.

Edit: > (Unfortunately for everyone involved, it did not pan out.)

Well it wasn't unfortunate for everybody. Sun made quite a bit of money for a few more years with SPARC. And so did IBM with Power. I think those two companies were happy with the Itanium failure.


I remember when Rick Belluzzo became CEO at SGI. He immediately started pushing Windows NT and Itanium, abandoning IRIX and MIPS. A lot of people wondered why, and it seemed like he was on the Microsoft payroll. Then he left after about 1.5 years, in 1999, to join Microsoft. The whole thing still seems fishy to me.

https://en.wikipedia.org/wiki/Richard_Belluzzo


He pushed the same thing at HP as well. And he left Microsoft soon after as well.

I think it's more that he was brought in by the board because those were his opinions.

The board was sick of paying for development of MIPS and IRIX. They believed their future was massive multicore Itanium Windows IT systems. And because they wouldn't have to do any of the development but would make the same kind of profits, they would just print money.

This of course backfired in a whole bunch of different ways.


Sun's last successful SPARC project was the UltraSPARC II, in 1997. Their subsequent projects, like "Rock" and "Niagara," were disasters. (USIII and USIV were little more than core shrinks of USII)

If Sun had given up the day Itanium was announced, they mighta come out ahead!


Well, technically successful, maybe. But lots of people still bought their servers even after the bubble. And those servers were expensive, with good margins.

Giving up instantly when Itanium was announced would have just made them a company that wasted a lot of money on porting things to Itanium (and Sun had quite a lot of software that's not exactly easy to port), only to not have a 64-bit system to build servers from, essentially at best offering good 32-bit x86 servers for the next couple of years.

To be sure x86 workstations and servers should have been a big part of their strategy already in the late 90s. But that's not the same as adopting Itanium.

SGI did exactly what you suggest: give up on their chips and OS. This was a terrible choice. Because Itanium was very late, they had to restart MIPS development and were not able to catch back up. Their Itanium products didn't sell well even when they finally arrived.


They also didn't really invest in a next-generation graphics system. Just a rehash of the InfiniteReality.

SGI also did have x86 workstations, first with a proprietary SGI chipset that required a special HAL for NT/2K, and then just a standard PC workstation.


SGI was already moribund in 1997. Executives made a conscious choice to pivot. It didn't work out, but it is real hard for me to believe they could have done better with a redoubled focus on MIPS development they could not afford.


Yeah, in 1996 SGI bought Cray for $740 million. They got CrayLink/NumaLink, but also a big 64-way SMP SPARC machine, which they sold to Sun for "significantly less than $100 million." It became the Sun E10K, which made a lot of money. [0]

Then a bunch of the graphics people (who did the GPU for the Nintendo 64) went to found ArtX [1], which got bought by ATI.

There was also the ill-fated Fahrenheit project.

[0] https://www.forbes.com/2002/05/06/0506sun.html#703713c16a5e [1] https://en.wikipedia.org/wiki/ArtX


Pretty sure 'CrayLink' was actually developed by SGI and just branded 'Cray'.


My bad; from Wikipedia... Looks like this [0] was considered NumaLink v1, hooking up 16 SGI Iris machines.

[0] https://en.wikipedia.org/wiki/Stanford_DASH


At least this proves that their marketing was successful :)


They were forced to redouble on MIPS because Itanium was so late anyway. And even after Itanium came out, making products around it took a while.

And I'm not saying not reacting to Itanium means go all in on your existing ISA and your existing business model. That wasn't my argument.

My argument was: don't give up everything that currently makes you money and that your customers expect, for a totally unproven architecture that, even if it works out, means you are just going to be a commodity provider with commodity software. Something totally outside the whole history of Silicon Graphics.


The more I learn about the history of computing, the more it seems like DEC's canceling of Project PRISM (and Project MICA) is at the center of a lot of history.

Project MICA would have yielded a very modern OS that could have supported both Unix and VMS interfaces, and it would have been only natural to add whatever Windows came up with in addition to that. PRISM and MICA seemed to be a real path forward for everything from workstations to multicore servers. We later saw with Alpha what DEC was capable of, and PRISM also did a lot of compiler work that was later used for the Alpha compilers. But that was quite a bit later, of course.

First of all, Cutler and lots of others went to Microsoft and developed Windows NT. Without Cutler, this project would look very different. Bill Gates was seriously arguing for having no virtual memory on this OS! Absolutely crazy to think what would have happened.

If DEC had expanded, rather than contracted, its team there, it would have made quite a difference, with two major companies developing a next-generation OS in the area. I think DEC was generally more popular with technical people than Microsoft.

Then for DEC, their workstation strategy went to MIPS/Ultrix, which was moderately successful, but that whole thing was then killed in favor of Alpha/OSF1. This created a lot of bad blood among the people who were stuck on MIPS/Ultrix. DEC was one of the few companies that really went all in on the whole OSF stuff. Just like for Sun, it might have been smarter to stick with a BSD base rather than a System V SVR4 base (I'd love to hear from somebody inside about this topic).

Because PRISM didn't happen, over the next couple of years the VAX line suffered from pretty bad price-to-performance. Management also seemed not committed to VMS as OSF and NT became big parts of their strategy.

So this post about running Windows NT+Alpha is really kind of a result of this canceling of PRISM/MICA.

I'm reading some books on DEC now, but during some online browsing I found this interesting document from somebody at DEC from late 1994: https://www.dgregscott.com/wp-content/uploads/2020/11/demise...

Gordon Bell's postscript to 'DEC Is Dead, Long Live DEC' was also a great read.


>Bill Gates was seriously arguing for having no virtual memory on this OS!

That’d surprise me. This was a time when one of the major limiting factors to NT adoption was it needing “a lot” of RAM. And he’d had plenty of experience with virtual memory, including his own company’s Xenix and OS/2.


Had DEC not cancelled it, NT would never have been, and Gates wouldn't have had the confidence to abandon OS/2.

So in the end we all won; just DEC had to die.

But DEC only saw AltaVista as a toy, something for showing off big systems, or doing personal desktop search. They missed such an incredible opportunity.


Not sure why you say Windows not doing OS/2 is so great?

OS/2 was a reasonably good OS, and a history based on OS/2 rather than Windows 95 and friends seems like it would have been a pretty good path forward.

And OS/2 being Microsoft and IBM would have made Microsoft not quite as powerful and dominant.

> The missed such incredible opportunity.

DEC was sold for around $10 billion; just a few years later AltaVista alone was sold for around $2 billion.

And AltaVista was valued at basically zero in the Compaq acquisition. Compaq seemed to really only value the services division; everything else they didn't care about.


My school had at least a thousand DEC MIPS/Ultra/pmax workstations in 1993. When they announced they were killing the line we bought thousands of Suns and HP-UX machines. I remember seeing 2 Alpha machines in a special computer lab and I could log in but none of our engineering software ran on it. Maybe in a few years but I graduated by then.


This is kind of the crux of the issue. Alpha came out with great performance in general (but pretty expensive), though with some issues related to the performance of Unix and strings.

They killed the MIPS/Ultrix line but gave people no path to move to Alpha/Ultrix. Ultrix was never ported to Alpha, only Alpha/OSF1, so a fundamental hardware and software shift.

It's easier to go to SPARC/SunOS or SPARC/Solaris, and they had lots of machines from cheaper to more expensive. Schools often bought the slightly cheaper workstations.

Alpha's 64-bit wasn't really a hugely important feature at that point in time. This is again where starting with 32-bit PRISM would have been a better move.

At the same time, their VAX line was still not Alpha; they bet on NVAX for quite a long time. So the VMS crowd didn't need to switch to Alpha.

DEC wanted to make Alpha an industry standard, but then they had little interest in working with Apple to get volume. The big workstation/server vendors had their own ISAs, and since DEC also wanted to compete in those markets, simply adopting a new standard from DEC was just not going to happen. And as a new standard for lower-priced machines it was also not going to happen: 64-bit was totally wrong for markets like that, and you would need Windows/Intel to even have a shot at it. It was also not good for embedded, so DEC went with StrongARM.

DEC just made really bad strategy choices starting in the late 1980s. By 1986 they had started the VAX 9000 project. That might be fine, but by 1988 you have to realize that project was dead and simply kill it, and rush out a VAX replacement based on PRISM with a nice, high-quality VMS port.


I had a stack of surplus-purchase Alphas I used for all sorts of things. We put the SRM firmware on all but one of them when we got them, but that one was for some reason unable to take it, and so was stuck with ARC and (as I recall) wouldn't boot Linux.

So I gave that one to my teenage nephew, who ran NT on it and became quite the Windows geek with a successful sysadmin career. "I ran NT on Alpha as a teen" is a pretty good resume enhancer, I guess.


I had one of the UDBs back in the late 90s and used it as an MP3 server/foot heater running RH 5.2 (I believe; it's been 25+ years). My wife made me recycle it when we were cleaning out our house before our first child.

I should have kept it as a combo doorstop and space heater.


I worked at Digital on the Alpha/NT team at the time. The AXP 150 "Jensen" was my daily driver desktop; that machine, like most other DEC hardware, was incredibly well built.


Was the HAL really that complicated?

It was listed as one of the reasons why Compaq had bailed on NT.

Also were you involved at all with the 64bit versions of NT?

It's cool that the checked build got out, and kind of funny how the compiler was hiding in plain sight for decades, but I'm glad I spotted it.


I enjoy having such a machine as one of the few machines I kept over the years.


We had one of those. Very quick, though it ended up as a server for MP3 files for the office.


After trying and failing to get MS Access to function on a DEC Alpha server, I finally gave up and installed SQL Server. It was a massive performance increase for many reasons, most of which related to MS Access being trash, but the better memory management in SQL Server was part of it as well.

Sadly MS dropped support for Alpha, and changed the pricing models on NT and SQL Server so much it became unaffordable for small businesses. In 2 cuts, I watched the future of databases crumble.


How big was the database? That database 'engine' thing was free for DBs under 2GB back in the SQL 7 days.


That’s great! I’ve got an old copy of Visual Studio and Windows NT 3.1 in the closet; it ran on MIPS, Alpha, SPARC and Intel. Back when Microsoft was cross-platform.


They still are, ARM is widely supported. There just aren’t many other platforms in wide use today to target.


After NT 3.1, I don’t think Microsoft was ever a single processor architecture company.

Beyond what NT supported, WinCE entered the picture in the very late 90s with ARM support among other architectures.


Even before then, their cash cow was BASIC implementations, initially on at least the 6502 and 8080.

Then Multiplan ran on most of the major microcomputers of the time.

It's interesting that strict Wintel is more the anomaly in their history.


Instead of SPARC, I wonder if you were remembering the i860, which certainly ran NT/OS2.


Whoops, it wasn’t SPARC but PowerPC. I’ve also got a copy of CodeWarrior for SPARC. Lots of oldies but goodies. A few big boxes of them. Someday I’ll have to do something with them. Not sure if I’d get sued if I ripped them and posted them someplace.


There was a SPARC port. I don't think it was ever public though.


I'm 100% sure you're misremembering it running on SPARC, NT never ran on SPARC.


Correct. NT 3.1 ran on x86, Alpha and MIPS. NT 3.5 added PowerPC support. As you say, SPARC was never supported.

Windows 2000 had both x86 and Alpha support up through the release candidate, but Alpha was dropped before final release. I actually had an Alpha workstation running Windows 2000 (for fun) back in the mists of time.


There's also the beta Whistler AXP64 build, which was used to port Windows over to 64-bit before there was any 64-bit hardware available that could run NT. The people in the virtuallyfun Discord got that running. (I am part of that! I have an AlphaServer 800 5/500 next to me; with some hacks to the setup and telling it which HAL DLL to load, it installed perfectly!)


Later there was Windows XP 64-Bit Edition for Itanic Workstations and there was Windows Server up to 2008 R2 for Itanic Servers.


Itanium XP was weird; it's like that super early AMD64 version where more of the OS is 32-bit than 64-bit, with no high theming.

It was still exciting at the time.


IIRC it was even weirder, because it was an XP-branded Server 2003 under the hood.


It's even weirder than that :) The initial release of XP for Itanium was more or less a stripped-down XP build with limited 64-bit support. They then quickly gave up on that approach and released a second version of Windows XP for Itanium, together with the launch of Itanium 2 support, which was, as you said, a rebranded Server 2003.


One could say that they tried really hard. ;)


There was an announcement that Intergraph would port Windows NT to SPARC: http://ftp.lanet.lv/ftp/sun-info/sunflash/1993/Jul/55.11-Sun...

Guess it never went anywhere.


It didn't. There are rumors it got out to a handful of alpha/beta sites, but it was never a supported product.


Does anyone remember the FX!32 binary translator?

https://en.wikipedia.org/wiki/FX!32

I remember Linux had some version of this as well as the ability to run OSF/1 binaries.


Sure do! I was a beta tester. Pippi (sp?) was my contact at DEC. It was cool that the more you ran an x86 exe, the faster it got.


I found a post about the Linux version, em86. I don't think it saved the results like FX!32 did.

Someone on this forum was the guy who made the Linux version, using FX!32 as the basis.

https://www.phoronix.com/forums/forum/software/general-linux...


Is it just me, or has there been a resurgence of interest in the Windows 3 era of computing lately? Both actual machines and the aesthetic; I see things flash by a lot.


Nostalgia from elder millennials such as myself fuels this fire. When computing was much more of an academic exercise than commercial. When new, always incompatible hardware and software was released. A higher variety of OSes and processor architectures, which is interesting to those of us who were too young to purchase equipment.

Etc…


There were so many cool exotic systems that I read about in books and magazines, but never had the resources or knew the right people to actually play with. Parents and school admins barely knew what that newfangled information superhighway was, and later on at best the stock response to "I want to learn C" was "we aren't paying $1,000 for Visual Studio" - GNU was a completely unknown concept until I stumbled across Slashdot some time in 2000. I imagine similar stories played out for many a mid-90s rural computer nerd.

Second-hand nostalgia, if you will.


This.

I saved up and bought a Sound Blaster Pro. I wanted an SGI machine that cost more than a car, not that I would have the slightest idea what to do after running the demos on it. That or one of those orange AS/400 boxes. Which I would have even less use for.


I think it's a backlash against touch UIs forced on desktop users: everything is flat, lots and lots of empty wasted screen space, huge buttons, no borders, no scroll bars, etc.


I saw one once :) For a very short time, the German computer retail chain "Vobis" would sell DEC Alpha machines with Windows NT 3.1. Too bad that didn't expand later on. It would have been great for the computer market if more competition to Intel had existed longer.


I remember reading during this era that Bill Gates' daily driver was a DEC Alpha workstation running NT... Don't remember the source.



