I do kind of wonder whether I'd have learned more Unix, faster, if I'd bought an SGI O2.
Windows was a remarkable disaster area that I could barely grow to tolerate, let alone love. Everything seemed to be a canonical example of how not to implement something.
> "taking the helm of Commodore"
Oh, if only. Anything to get rid of the idiot that ran them into the ground.
There were articles and possibly demos of Be stuff in the Amiga magazines my family subscribed to IIRC (I was about 10).
It was discussed as a possible replacement, seeing as Amiga was going nowhere under Commodore/Escom/Gateway/etc. QNX was also discussed in a similar way, and I remember getting a demo of QNX for PC on a bootable floppy.
Linux turned out to be a more popular replacement (for me, and for Future Publishing, who seemed to turn Amiga Format into Linux Format; at least many of the staff seemed to be the same).
And then Commodore more or less just sat on it with very minor incremental improvements, until by 1990 a 386 PC with VGA & a Sound Blaster had pretty much closed the gap with a similarly priced & specced Amiga 3000.
That's why (well one of the reasons anyway) the next generation of games like Doom originated on the PC and not on the Amiga.
Well, for the specific example of Doom, the Amiga's main problem was its planar graphics: great for scrolling platformers and SEUs, terrible for raycasters, since plotting a single 8-bit pixel takes eight read-modify-writes (one per bitplane) in a naive implementation. There were a lot of clever hacks to address this, but none of them could quite close the gap.
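For anyone who hasn't fought with bitplanes, here's a rough sketch in C (not actual Amiga code; the 320-pixel width and 8 bitplanes are just illustrative assumptions) of why a naive renderer pays so dearly on planar hardware compared to a chunky framebuffer like VGA mode 13h:

    #include <stdint.h>

    #define WIDTH_BYTES 40   /* 320 pixels / 8 bits per byte */

    /* Planar: the 8 bits of the colour index live in 8 separate bitplanes,
       so one pixel costs 8 read-modify-write operations. */
    void plot_planar(uint8_t *plane[8], int x, int y, uint8_t colour)
    {
        int offset = y * WIDTH_BYTES + (x >> 3);
        uint8_t mask = 0x80 >> (x & 7);

        for (int p = 0; p < 8; p++) {
            if (colour & (1 << p))
                plane[p][offset] |= mask;
            else
                plane[p][offset] &= ~mask;
        }
    }

    /* Chunky: the whole colour index is one byte, so one pixel is one write. */
    void plot_chunky(uint8_t *fb, int x, int y, uint8_t colour)
    {
        fb[y * 320 + x] = colour;
    }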
Dave Haynie has written extensively on the C= suicide (as he describes it); I think there are even a few posts on Quora. Lots of things that might have kept the edge were started, even prototyped and announced to developers.
Of course, once developers start migrating away, it takes far more effort to get them back, if at all.
Sure, but they had a robust community of gaming and software developers on their side. Hence why they should have pivoted to the lower-power niche of the market - that's precisely where their pre-existing community could still be a decisive factor. High-end systems (x86, PPC) were a whole other ballgame, even in the mid-to-late 1990s. The console space is even more proof of this - mass-market consoles in the 1990s and to some extent the early-2000s were a lot closer to the 3DO than to anything based on x86 or PPC.
Also, nitpicking but Doom itself was "originated" on the NeXT, not on x86-- and the NeXT architecture was quite comparable to Amiga both technically and in its focus on the multimedia space. If anything, it makes a better case for Amiga having lost their technical "edge" at some point.
Also, the NeXT is more or less a normal Unix workstation with an m68k CPU and a framebuffer; it does not have the convoluted architecture of various tightly coupled multimedia accelerators that the Amiga has. And while the Amiga's architecture was a good way to get impressive performance in the '80s, it is also the reason the Amiga could not keep up in the '90s, as removing all the NTSC-video-related stuff would have broken backward compatibility.
For the blinky lights, a MagicShifter suffices:
(I did the OS)
So an RGB LED would have to be represented by three entries in /sys/class/leds/
That's kind of inconvenient when those three LEDs are actually one RGB LED.
This is only how the LEDs are represented to user space; you still need a driver for the chips the LEDs are connected to.
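To make that concrete, here's a minimal user-space sketch of driving such an "RGB LED" as three separate sysfs entries. The "board:red"/"board:green"/"board:blue" names are made up for illustration; the actual names depend on the driver and device tree.

    #include <stdio.h>

    /* Write a 0-255 brightness value to one LED class device. */
    static void set_channel(const char *name, int value)
    {
        char path[128];
        FILE *f;

        snprintf(path, sizeof path, "/sys/class/leds/%s/brightness", name);
        f = fopen(path, "w");
        if (f) {
            fprintf(f, "%d\n", value);
            fclose(f);
        }
    }

    int main(void)
    {
        /* "One" RGB LED, addressed as three independent LEDs. */
        set_channel("board:red",   255);
        set_channel("board:green", 0);
        set_channel("board:blue",  64);
        return 0;
    }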
At the time I was so incredibly jazzed that I didn't need to "telnet" to the campus workstations to complete my Unix-based programming projects, since there was a full standard Unix API and an IDE (CodeWarrior IIRC) came with the machine. And compared to any other computer, the UI was mind-blowingly responsive under multitasking. It was also the platform I used for my first side-project (a graphical IMAP email client).
I thought I was quite the tech rebel at the time, thinking I had gotten on board during the nascent stages of this OS, but it seems that by the time I became a devotee, it had already suffered a major internal setback with AT&T's cancellation of the Hobbit CPU. I had always assumed (mistakenly) that they switched to PowerPC because it was "better".
That's tautologically true, but something like the RT preemption kernel patch is still a basic requirement for a truly responsive system. Once that's merged, you then have buy-in for that use case and can start fixing all the things that are broken in user-space, as well.
Additionally, there's a lot that can be fixed by simple tuning. There's no excuse for UI input locking up when a system is low on RAM, for example. Any tasks that are critical to input on the system's console-like channel (whatever that is - it might be a serial console, UI graphics, a network connection or even an audio input subsystem plus some speech-recognition AI) should be running with elevated priority, so that (1) they're always responsive, no matter what, and (2) when the system starts thrashing, they can still grab enough of whatever system resources are available (RAM, CPU, device I/O), so that the user can survey the situation and kill the offending process. Linux distributors are dropping the ball here - this sort of system-wide tuning is their job!
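As a rough sketch of the kind of tuning I mean (the SCHED_FIFO priority of 10 is arbitrary, and this needs CAP_SYS_NICE or root), the console-critical process could do something like this so it keeps getting scheduled and stays out of swap even when everything else is thrashing:

    #include <sched.h>
    #include <stdio.h>
    #include <sys/mman.h>

    int main(void)
    {
        struct sched_param sp = { .sched_priority = 10 };

        /* Put this process in a real-time scheduling class... */
        if (sched_setscheduler(0, SCHED_FIFO, &sp) != 0)
            perror("sched_setscheduler");

        /* ...and pin its pages so they can't be pushed out to swap. */
        if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0)
            perror("mlockall");

        /* ...run the input/UI loop here... */
        return 0;
    }

This is roughly what chrt -f does from the command line; the point is that distributions could ship this as policy for the compositor or console instead of leaving it to each user.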
> Any tasks that are critical to input on the system's console-like channel (whatever that is - it might be a serial console, UI graphics, a network connection or even an audio input subsystem plus some speech-recognition AI) should be running with elevated priority
This approach is prone to priority inversion, and it doesn't help with the underlying problem: programmers have written tons of synchronous I/O all over the place, and the call chains are often highly non-obvious. For example, most Linux desktop environments will freeze if you use an authentication provider which isn't instantly responsive, and that prevents things like interacting with menus, closing windows, etc., because various low-level libraries do things like check settings, which at some point means getting your list of groups, which is normally very fast because it's cached. I got some patches into pam_ldap years ago to set socket-level timeouts so it would recover from something like a single socket error, but the better solution would be fewer dependencies and interfaces which don't have tons of different things happening on a single thread.
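For reference, the mechanism those patches boil down to is just per-socket timeouts, so a blocking read against a dead LDAP server fails after a few seconds instead of hanging whatever thread happened to trigger the lookup. Not the actual pam_ldap code, just a sketch of the setsockopt() calls:

    #include <sys/socket.h>
    #include <sys/time.h>

    /* Apply send/receive timeouts to an already-connected socket. */
    int set_socket_timeouts(int fd, int seconds)
    {
        struct timeval tv = { .tv_sec = seconds, .tv_usec = 0 };

        if (setsockopt(fd, SOL_SOCKET, SO_RCVTIMEO, &tv, sizeof tv) != 0)
            return -1;
        return setsockopt(fd, SOL_SOCKET, SO_SNDTIMEO, &tv, sizeof tv);
    }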
Priority inversion is one of the main things addressed by the PREEMPT_RT patchset - without a fix for the priority-inversion problem, you don't really have an RT system. Aside from that, I agree that we should move towards having async interfaces as the default whenever slow I/O is a realistic possibility. I'm not sure how many people actually use something like PAM-LDAP, though.
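Concretely, one building block of that fix is the priority-inheritance mutex: a low-priority task holding the lock gets temporarily boosted to the priority of the highest-priority waiter instead of starving it. A minimal user-space sketch with POSIX threads (the ui_lock name is just for illustration):

    #include <pthread.h>

    pthread_mutex_t ui_lock;

    void init_pi_mutex(void)
    {
        pthread_mutexattr_t attr;

        pthread_mutexattr_init(&attr);
        /* Whoever holds ui_lock inherits the priority of the highest-
           priority thread blocked waiting for it. */
        pthread_mutexattr_setprotocol(&attr, PTHREAD_PRIO_INHERIT);
        pthread_mutex_init(&ui_lock, &attr);
        pthread_mutexattr_destroy(&attr);
    }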
Not something where each computer is a special snowflake, with a fragmented developer experience around which libraries are even available on the system, and a lack of integrated tooling, forcing the trend of just bundling a web browser with the application.
Note that I included "desktop oriented OSes" in the same group as Be.
As for systemd, the very fact that Linux was late to the init-replacement party that the commercial UNIXes held years earlier says it all, in terms of it being something a large majority actually wants.
AIUI, these days you can just develop for Flatpak (if you're making a proprietary application) and declare your dependencies as needed. But do note that Flatpak is not a kitchen-sink "framework" or "platform": it's just something that addresses a number of fairly well-understood issues with the prevailing, mostly FLOSS-oriented, development model. (One reason the issues are well understood by now is that other "solutions" have been tried before, such as the LSB effort -- and found wanting.) This is key to decreasing complexity and making the system as a whole more comprehensible and surveyable. The commercial UNIXes you point to are far closer to your view of "each platform being a special snowflake", and big frameworks make this issue worse; they don't ameliorate it.
Setting the bar kinda low aren't we?
It can be hard to find the time to get productive in a new programming language. But if you have some spare time, this project could provide the motivation!
If that isn't "Near Death", then what is?
So... IBM definitely thought there was something of value there.
ARexx started as retail software that Wishful Thinking Development sold (like WShell). Commodore acquired it years later.
I developed for the Amiga and used OS/2, even on PowerPC machines. They did not look like each other. Although BOOPSI was object-based, it was still something bolted on top of the 1.x APIs. For example, BOOPSI had base classes for gadgets and images, but windows were an entity that lived outside altogether. OS/2 had the same problem: application windows were handles. But inside WPS, everything was a SOM object.
In the Amiga Workbench, everything was ad-hoc, even if some semblance of object-orientedness appeared with things like AppIcons, which required cooperation from the applications. They were still not objects: I am fairly confident that AppIcons were implemented through calls to workbench.library, not by creating a new BOOPSI object. I remember this stuff because I had lots of arguments with some very vocal OS/2 developers.
I still don't see what IBM would have needed from Commodore. It was actually the other way around: IBM extorted patent fees and cross licensing from Commodore. They had more patent lawyers in Boca Raton than C= had engineers in West Chester (my source: Dave Haynie, who dealt with such lawyers and had probably the most useful Commodore patents, the Zorro ones).
And as I mentioned, ARexx came out in 1987, in the 1.x days, as a commercial product, years before Commodore adopted it and before the OS/2 events mentioned. It had no license from IBM (what for?), since Will Hawes wrote it himself for the Amiga's architecture.
I do have some vague memory of IBM licensing something for OS/2 from the time. That weak reference was all a search turned up, though there are quite a few links of people talking about it, including a near duplicate of the conversation we're having, on The Register. So if it's an urban legend, it definitely gained some legs somewhere. My CATS newsgroup archives, Devcon handouts and other stuff from that era are long gone.
That's not very helpful, sorry. :)
The one really unambiguously great thing that C= did was the custom chipset for graphics and sound. That was the area they should have continued to progress in; perhaps they would have ended up out-competing SGI and eventually inventing the 3D accelerator.
Startups are, almost by definition, exercises in turning disasters into opportunities.
MIPS and ARM were quite well established by that time; TI DSPs similarly. Redoing the hardware was certainly an option.
And that's exactly what Be did, switching to dual PowerPC processors instead.
Even though I know how this ends up (they move to PowerPC), I felt left hanging when AT&T canceled the Hobbit processor and they weren't sure what to do next. It was like the feeling you get at the end of a chapter of a great novel.
When Jobs hired Sculley, the company was on the way up. Today, it's losing customers, developers, money, and market share. [..]
Apple doesn't need a CEO, they need a messiah (or a crash test dummy). And any problem that requires walking on water as a solution is, you'll grant me, a problem ill-stated. Still, there may be a way to make the search for a new CEO easier
Sad to have watched BeOS go away. It was and still is pretty amazing.
Free from day one?
The big problem for any competitor to Windows was that nothing had the driver support that Windows had. Steve Jobs solved that by supporting only his own hardware. Drivers were a huge issue.
Interested to hear if Gassee talks about that in this series.
That's great in theory, but how do you then pay the developers?
Be was losing about $10M per year from the outset. It never came anywhere close to breaking even. The idea was that either this takes off (in which case you can make money a lot of different ways even if you give the core product away) or the threat that it might forces Apple to buy the whole thing; either way you get your money back, and JLG gets what he wants, which is and has always been to be "right" even when he's monumentally and obviously wrong.
[ JLG led Apple product development when it was failing, Be, which failed, and the Palm spin-out that made an OS so lousy even _Palm_ never bothered using it in any products. The man is often cited as a "guru", presumably in the same sense as when some random guy is brought in by gullible actors or musicians who've suddenly got a lot of money ]
In the late 1990s it looked as though it would probably close its doors and the investors would get nothing (technically they'd get the source code, branding and so on, but those are basically worthless), which happens all the time with that sort of business.
But the dot com boom saved them. Not Be, the investors. An Offering was written which under normal circumstances would get laughed out of the room. Ha ha ha, you have a product that nobody uses, and you want a nine figure sum of money for a business with a failed Apple executive and a bunch of hackers, No. But at the time you could write blah blah blah "Internet" blah blah and people wouldn't bother reading past the bottom of the first page of your IPO because they already had their wallets open.
So now the original investors had their money back, and gradually the big institutional hitters (e.g. pension companies) could exit too: you take $50,000 of BEOS stock and sell it to a thousand Be fans at $50 apiece, and when it turns into $4 of cash five years later you're fine, and they learn a valuable lesson about investment.
Be got a little bit of money left over. The idea was to pivot into making software for "Internet appliances". You know, the computer next to your stove, or that multi-purpose device in your gym which... oh right, yeah, no, those aren't a thing. The idea went nowhere and Be's business went down in flames. JLG got to go ruin Palm instead.
Well, they weren't a thing at the time, and couldn't be a thing given the severe lack of wireless, always-on Internet connections. They definitely are a thing today, but too late to save Be.