
The Acorn Archimedes was the first desktop to use the ARM architecture - lproven
https://spectrum.ieee.org/tech-talk/consumer-electronics/gadgets/why-wait-for-apple-try-out-the-original-arm-desktop-experience-today-with-a-raspberry-pi
======
mcv
My brother had an Archimedes. At the time they were introduced, they were
amazing things: 4 million colours, blazingly fast. I was convinced back then
that this would quickly replace Intel-based PCs. That never happened, of
course.

Acorn had a tendency to be ahead of its time. Our first home computer, the
Acorn Electron, could be expanded with a 3.5" floppy drive. Later, when we got
a PC, we moved from 3.5" to 5.25" floppies. That was my first disappointment
in having to make a step backwards in tech.

The Archimedes' successor, the RISC PC, could even take an Intel processor as
a co-processor. Or, instead, a card with 4 or 5 more ARM processors. At the
time it was probably the cheapest way for a consumer to get a multi-processor
machine.

~~~
m0xte
Acorn were indeed way ahead of their time. Especially when it came to flunking
software projects. They had the hardware done but the OS was a quick hack job
because their proper OS (ARX) was late. So they ported MOS from the BBC series
and threw a GUI on top. And it initially sucked terribly. I was there for it.

I bought a fully stacked A440 when they were released, thanks to the kindness
of a deceased relative and much to the dismay of my mother, who said I should
have bought history books and art materials instead.

Really, if they had shipped something of RISC OS 2 quality on day one, which
would have been a monumental feat like the hardware was, they'd have done
better. I enjoyed every moment of using the machine after RISC OS 2 came out.
Before that I was slightly scared I'd burned a monumental amount of money.

~~~
rahoulb
One of the things I loved about RiscOS was a small UI feature. There were no
open or save dialogs. Instead, when you saved a file, it minimised to its icon
and then you dragged it to the relevant folder. To open a file, you located it
and double-clicked.

~~~
blywi
For me drag-and-drop is not complete without also being able to save by drag-
and-drop. On RISC OS you can save your files by dragging the file icon to the
place you want to save your file. This can be a directory window, or it could
also be another app icon on the taskbar, or an open app window.

This way it is possible to create workflows where you have small specialized
apps that complement each other. You would open a file by dragging it to the
first app, edit it, drag the result to a 2nd app do some more editing, then
drag the result to the next app and so on and save only the final result back
to disk. This is basically supporting the Unix philosophy of having many small
programs that do one thing well, and combining them to achieve the desired
result.
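The command-line analogue of that workflow is a pipeline: each small tool does one job and hands its output to the next, with nothing saved to disk along the way. A minimal sketch of the same composition idea (not RISC OS specific; it just drives standard Unix tools from Python):

```python
import subprocess

def word_frequencies(text: str) -> str:
    # tr splits the text into one word per line, sort groups identical words,
    # uniq -c counts each group, and sort -rn ranks them by frequency.
    # Each tool does one thing well; the pipeline composes them.
    result = subprocess.run(
        "tr -cs '[:alpha:]' '\\n' | sort | uniq -c | sort -rn",
        input=text, capture_output=True, text=True, shell=True,
    )
    return result.stdout

print(word_frequencies("to be or not to be"))
```

Only the final result ever reaches a file, exactly like dragging the output icon to a directory window at the end.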

Instead, what we now mostly have on all platforms are huge monolithic programs
that try to cram every possible feature into a single package. One reason for
this, in my opinion, is that a workflow similar to the RISC OS way of dragging
from app to app is just too cumbersome on systems without drag and drop
between apps: at every step one has to go through the filing system, saving a
file by browsing through a directory tree and browsing the tree again when
loading the file in the next app.

~~~
m0xte
iOS (particularly iPadOS) seems to get this about right. Copy / paste /
sharing seems to work very well for various work pipelines I use. I rarely if
ever touch the filesystem.

~~~
rahoulb
Good point.

I tend to use my iPad as my primary computer - and once you get your head
around the fact that everything runs off the share menu it's basically the
same concept, without the physical dragging.

~~~
m0xte
Yeah same here on the primary computer front. It's certainly the best hassle
vs results computing device I've ever owned.

------
aaronbrethorst
The acronym "ARM" originally stood for "Acorn RISC Machine."
[https://en.wikipedia.org/wiki/ARM_architecture](https://en.wikipedia.org/wiki/ARM_architecture)

~~~
unwind
And present-day ARM cores are documented in (among many other) ARM's
Architecture Reference Manuals, i.e. the ARM ARMs. They're so funny.

~~~
klelatti
And the architectures are Cortex-A, Cortex-R and Cortex-M!

~~~
unwind
Ah yes, that too. So much funs.

------
klelatti
I remember attending a lecture given by Steve Furber in 1985 in Cambridge (UK)
on the new chip that he and Sophie Wilson had developed. The emphasis then was
on having an affordable 32-bit machine rather than on low power. It seemed
incredible that such a small team could have developed, at their first
attempt, a CPU that was faster than offerings from their much bigger
competitors.

Anyway much of the commercial success of ARM must be down to Robin Saxby who
was ARM's first CEO. His interview with Charbax is fascinating [1]. He
inherited a company with two customers, one of whom was failing, some
interesting IP but not much else. He seems to have had a relentless focus on
providing customers with what they needed - for example providing the Thumb 16
bit instruction set to reduce code size when potential customers such as Nokia
were reluctant to move to 32 bit designs.

[1]
[https://www.youtube.com/watch?v=FO5PsAY5aaI](https://www.youtube.com/watch?v=FO5PsAY5aaI)

~~~
guenthert
> He inherited a company with two customers, one of whom was failing, some
> interesting IP but not much else.

Among the _else_ were those extraordinarily talented and dedicated employees,
were they not?

~~~
klelatti
You're absolutely right, and no offense intended to a pretty remarkable team
of 12 who moved from Acorn. For balance, here is Mike Muller from that team,
who just retired as CTO of ARM last year.

[https://www.youtube.com/watch?v=ljbdhICqETE](https://www.youtube.com/watch?v=ljbdhICqETE)

and also Mike Muller actually presenting to Apple Computer staff in 1992!

[https://www.youtube.com/watch?v=ZV1NdS_w4As](https://www.youtube.com/watch?v=ZV1NdS_w4As)

Edit: added Apple presentation

------
Cadwhisker
I fired up my Armbook to write this comment. It's just an original Pinebook
with a special SD card to boot into RiscOS, but it's quite neat and very fast.

It's also the first true ARM/RiscOS laptop that has been made since the
original Acorn A4. That machine was the first portable ARM device ever made.

Operating systems have been born and died between the arrivals of these two
machines.

~~~
DennisAleynikov
I still use Samsung DeX despite it falling out of popularity after the initial
bump. They even had virtualization, with a Samsung- and Canonical-provided
Ubuntu image that you could run in an app called Linux on DeX.

It has clear advantages for video editing, especially on export, given the ARM
chips often just spend time decoding video, and encoding is quick, cheap, and
streamlined.

Can't speak for other industries, but as a dev most stuff in apt is arm64
compatible and therefore runs fine inside a container on Android (this was
disabled with Android 10, unfortunately).

~~~
CameronNemo
I have been using a pinebook pro for web development. Some docker images do
not work for arm64 and RAM is constrained (setting up zram helps), but
otherwise it is a great device.

------
beagle3
Title is factually correct, but is somewhat misleading - it implies ARM arch
existed and then the Archimedes was designed around it (and was the first
desktop to have that property);

In fact, the ARM was designed for the Acorn Archimedes, and that’s why no
desktop, or server, or anything, used it before the Archimedes - ARM stands
for “Acorn RISC Machine”.

~~~
jecel
VLSI made the ARM2 and chipset available in early 1987. I seriously considered
using that in a machine I was designing instead of the 68020. But I had been
burned by Motorola announcing chips and then cancelling them when almost
nobody used them (an update of the TRS-80 Color Computer chipset that could be
used with the 6809 or 68000) so I decided to continue with the 68020 until
somebody else came out with an ARM product. Acorn launched the Archimedes at
the end of 1987 and I bought the chips and the manuals and wrote an ARM
assembler and simulator. But the project was frozen and I only actually built
the machine in 1992 though the chips were obsolete by then.

[http://www.smalltalk.org.br/fotos/casa3.jpg](http://www.smalltalk.org.br/fotos/casa3.jpg)

While the chips were indeed created specifically for the Archimedes, it would
have been possible for some other company to have made a machine with them
before Acorn.

~~~
mattlondon
Your story sounds interesting - is there more we can read about it?

~~~
jecel
Some pictures from 1983 to 2008:

[http://www.merlintec.com/swiki/hardware/28.html](http://www.merlintec.com/swiki/hardware/28.html)

A talk about Smalltalk computers includes my projects, but I didn't make it
very clear in the slides which ones are mine (13-15, 22, 24, 28, 29, 35, 36,
40, 42-54):

[http://www.merlintec.com/download/2019_slides_jecel_fast1v2....](http://www.merlintec.com/download/2019_slides_jecel_fast1v2.pdf)

[https://www.youtube.com/watch?v=tATpzsyC6OA](https://www.youtube.com/watch?v=tATpzsyC6OA)

------
utopcell
I'm very curious to see this transition to ARM play out. There is a lot of
software that will now be compiled for ARM that can potentially be used in
open-source projects on much more cost-effective boards. Music (VST) plugins
come to mind.

~~~
rrmm
If I were a user in the music/film industry I would be worried if I depended
on Apple.

Apple has shown pretty clearly they see themselves as a consumer electronics
and services company above all. If all the "professional" users who rely on
Logic or FC or Apple in general jumped ship, Apple likely wouldn't notice it
in their bottom line (loss of mindshare notwithstanding). They have no vested
interest in catering to these users and have produced hardware intended for
them as an afterthought. The high cost of hardware and absence of middle tier
products that make sense for these users is just icing on the cake.

"Professional" users of Logic or FC have little leverage to affect the course
of Apple one way or another (unlike a more focused company whose business
model is catering to these groups).

On the other hand Apple pro users do have a long history of being abused by
the company so they might be alright with it in the end.

Anyway, I'd be looking for an out if I were them.

~~~
samatman
I think this is completely backward.

To a first approximation, Apple _only_ cares about video producers, software
developers, publishers, and musicians, in their Mac strategy. In approximately
that order.

The only reason developers are on that list is that Xcode is how software gets
written for the rest of their platforms; the dominance of Macs as personal
workstations for Valley-style development is only a side effect of that.

These users provide a halo for everyone else. A college student might spend an
extra $1000 on their laptop because they want to moonlight on some beats, or
have a YouTube channel, that sort of thing.

I remember seeing this disconnect when the new Mac Pro was released and many
here dismissed it as too expensive. Someone who works in CGI came around to
say that, no, $50,000 is a normal amount of money to spend on their
workstations and that their company was ecstatic to be able to keep working
with macOS.

~~~
fiddlerwoaroof
Yeah, unless you’ve actually priced out business-class setups, the Mac Pro’s
pricing seems crazy.

I’ve repeatedly found that the “Apple Tax” is at most something like $500:
other companies tend to advertise configurations that I wouldn’t buy, and when
I tweak them to match my requirements, their price is not notably different
from a Mac.

~~~
rrmm
If you price like for like, sure. My issue is that I almost never do. I
usually fall into needing something akin to a mac mini but slightly bigger.
The last mac pro I bought was a dual-G4 model when the gap between the mac pro
and lower end models didn't seem so severe. (Although I obviously do not
represent a sizeable demographic of the market).

For Apple, not catering to every particular consumer whim/need out there is
just smart business. But for the consumer, that is just another weakness of
the mac ecosystem.

~~~
fiddlerwoaroof
If you include the iMacs (e.g. the iMac Pro), I think there aren’t really all
that many gaps in the product line, spec or price wise. It’s mostly that you
can’t necessarily get the machine you want in the form factor you’d like.

~~~
hakfoo
That's sort of a deal-breaker for a lot of use cases. Think of all those small
corporate desktops out there which are beefier than a Mac Mini and wired to
cheapo VGA-only 19" LCD monitors because finance isn't going to pay to replace
a working screen. The iMac form factor is never gonna compete there.

And of course the neglected "I just want internal drives or expansion cards"
demographic. I'm not sure if the cheese grater design fixes it, but with the
trashcan Mac Pro, there were plenty of configurations where the equivalent
Dell/HP/Lenovo workstation was one clean self-contained box stuffed with PCI-e
cards and SATA/SAS drives, and the Mac Pro was an angry squid of Thunderbolt,
USB, and power cables to feed an array of external drives and devices.

I understand that their brand is based on seamless design and we-make-the-
decisions-for-you presentation, but it feels like there'd be an opportunity
for them to use a small-scale clone program as a market research tool.

Have it sell the form factors that Apple won't. The long-whined-for xMac.
Units with serviceable/cleanable designs for embedded markets. A mini-ITX Mac
mainboard you can fit into existing kiosk/appliance designs. A rebranded
Toughbook running MacOS. Something in a huge rackmount/server cube case that
you can fit with a dozen internal drives. Frankly, I'd envision it as a
wholly-owned operation, that charges over-the-odds prices. If people are still
willing to put their money where their mouth is, they can claim epiphany and
make an Apple version of the same design. If not, they can declare the
business a failure, shutter it in a year, and start over next time they want
to trial a product.

------
rbanffy
You can also buy an ARM-based six-core laptop for $200.

[https://store.pine64.org/?product=14%e2%80%b3-pinebook-
pro-l...](https://store.pine64.org/?product=14%e2%80%b3-pinebook-pro-linux-
laptop-64gb-emmc-iso-keyboard-estimated-dispatch-in-october-2019)

~~~
dleslie
If only it had more RAM.

~~~
fit2rule
It would be a dream machine with 32 gigs of RAM and I would abandon my current
MacBook subscription instantly.

------
kalessin
Really good talk about this computer from CCC last year for those who haven't
seen yet:
[https://media.ccc.de/v/36c3-10703-the_ultimate_acorn_archime...](https://media.ccc.de/v/36c3-10703-the_ultimate_acorn_archimedes_talk).

------
ochrist
I have an Acorn Archimedes in my basement. Even though I don't use it anymore,
I won't throw it out. I did a lot of things on it back then. It was just miles
ahead of anything Microsoft did at that time. Good times.

~~~
russdill
Might want to keep an eye on the battery if it's an A540, A3000, A5000, A3010,
A3020 or A4000.

~~~
ochrist
It's an A440, but actually I also have an A3000. Are there any specific
problems with these batteries?

~~~
snvzz
They need to be removed. Cutting them out is alright.

If they haven't leaked already, they are about to.

~~~
ochrist
Thanks. The A3000 might end up in a museum, but I'll look right into it.

------
jonplackett
I made my first ever computer program on an Archimedes.

It was a ‘compatibility calculator’: you put in the names of two classmates
and it gives the percentage of love between them. We used to do it on paper,
counting all the instances of l, o, v, e in each name, then adding each number
to the one adjacent until you have just one number left.
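A guess at those paper rules in code (a sketch; playground versions varied, and the exact reduction rule here is assumed from the description above):

```python
def love_score(name_a: str, name_b: str) -> int:
    # Count the instances of l, o, v, e across both names, giving a row of
    # numbers, then repeatedly add adjacent numbers until one number remains.
    combined = (name_a + name_b).lower()
    row = [combined.count(letter) for letter in "love"]
    while len(row) > 1:
        row = [row[i] + row[i + 1] for i in range(len(row) - 1)]
    return row[0]
```

Some schoolyard variants split multi-digit numbers back into digits to land on a two-digit "percentage"; this sketch just reduces to a single number as described.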

It went pretty viral within our class of 20.

~~~
spacedcowboy
I wrote my first ever (and last) virus for the Archimedes...

Some history:

Waaay back in the mists of time (1988) I was a 1st-year undergrad in Physics.
Together with a couple of friends, I wrote a virus, just to see if we could
(having read through the Advanced User Guide and the Econet System User
Guide), then let it loose on just one of the networked Archimedes machines in
the year-1 lab.

I guess I should say that the virus was completely harmless; it just prepended
'Copyright (c) 1988 The Virus' to the start of directory listings. It was
written for the Acorn Archimedes (the lab hadn't got onto PCs by this time,
and the Acorn range had loads of ports, which physics labs like :-)

It spread like wildfire. People would come in, log into the network, and
become infected because the last person to use their current computer was
infected. It would then infect their account, so wherever they logged on in
future would also infect the computer they were using then. A couple of hours
later, and most of the lab was infected.

You have to remember that viruses in those days weren't really networked. They
came on floppy disks for Atari STs and Amigas. I witnessed people logging
onto the same computer "to see if they were infected too". Of course, the act
of logging in would infect them...

Of course "authority" was not amused. Actually they were seriously unamused,
not that they caught us. They shut down the year-1,2,3 network and disinfected
all the accounts on the network server by hand. Ouch.

There were basically 3 ways the virus could be activated:

- Typing any 'star' command (e.g. "*.", which gave you a directory listing).
Sneaky, I thought, since the virus announced itself when you did a '*.': when
you thought you'd beaten it, you'd do a '*.' to see if it was still there :-)

- The events (keypress, network, disk etc.) all activated the virus if
inactive, and also re-enabled the interrupts, if they had been disabled

- The interrupts (NMI, VBI, ...) all activated the virus if inactive, and also
re-enabled the events, if they had been deactivated.

On activation, the virus would replicate itself to the current mass-storage
media. This was to cause problems because we hadn't really counted on just how
effective this would be. Within a few days of the virus being cleansed (and
everyone settling back to normal), it suddenly made a re-appearance again,
racing through the network once more within an hour or two. Someone had put
the virus onto their floppy disk (by typing *. on the floppy when saving their
work, rather than the network) and had then brought the disk back into college
and re-infected the network.

If we thought authority was unamused last time, this time they held a meeting
for the entire department, and calmly said the culprit when found would be
expelled. Excrement and fans came to mind. Of course, they thought we'd just
re-released it, but in fact it was just too successful for comfort...

Since we had "shot our bolt", owning up didn't seem like a good idea. The only
solution we came up with was to write another (silent, this time :-) virus
which would disable any copy of the old one, whilst hiding itself from the
users. We built in a time-to-die of a couple of months, let it go, and
prayed...

We had actually built in a kill-switch to the original virus, which would
disable and remove it - we didn't want to be infected ourselves (at the
start). Of course, it became a matter of self-preservation to be infected
later on in the saga - 3 accounts unaccountably (pun intended :-)
uninfected... It wasn't too hard to destroy the original by having the new
virus "press" the key combination that deleted the old one.

So, everyone was happy. Infected with the counter-virus for a while, but
happy. "Authority" thought they'd laid down the law, and been taken seriously
(oh if they knew...) and we'd not been expelled. Everyone else lost their
infections within a few months ...

Anyway. I've never written anything remotely like a virus since [grin]

~~~
eythian
Did a sorta similar thing on RiscOS, in BASIC, on our high school computers.
It would hide in application directories and put itself in as !Boot (so it'd
run when the icon was viewed) and then copy itself into any other application
that was run/viewed (I forget which.)

Being the smartarse I was, I also unlocked the hard drive of some of the
computers (hold some key on boot), put it into an installed application, and
then relocked it behind me. This way anyone using that computer was guaranteed
to get it on their disk.

My fall came when I left a disk with the raw source on it with my name on it
in the lab and the teacher found it and figured out what it was.

Luckily the teacher was a good sort, I got a talking to from the principal,
and banned from the lab for the rest of the school year (but it was September,
so only a couple of months.) The next year they started giving me a lot more
access to our various school machines so that I could do whatever projects I
wanted with them, and once they start to do that you don't want to piss them
off and have them take it away, so I behaved :)

------
codeulike
Yay Acorn Archimedes. Programming it was such fun after a BBC Model B, like
stepping from a cupboard into a cathedral.

~~~
zem
I'm sad I missed out on it; I jumped from a BBC B to an "IBM compatible".

~~~
ochrist
I moved on from a BBC B to an Archimedes. Still have both machines in my
basement. As I had to use a PC for my IT studies back in those days, I used a
PC emulator. This allowed me to use Pascal and other stuff when needed. So I
didn't get an 'IBM compatible' until much later.

------
brainwipe
The Acorn Archimedes was where I learnt to code properly. Our teacher (Bruce
Dickson, legend) said that we weren't allowed to play games on the Archimedes
machines unless we coded them. A completely transparent ploy; we knew we were
being played, but it worked. I made a game called Stick Fighter and a tanks
game.

My favourite app was called Euclid, which was a 3D app like Blender. I built
cities, an American semi truck and all manner of stuff. My love of gaming and
3D continues today.

------
lsllc
RISCOS still lives! You can easily run it on a RPi (amongst other things):

[https://www.riscosopen.org/](https://www.riscosopen.org/)

~~~
self_awareness
I tried RISC OS on an RPi years ago, but it didn't work that well. Sometimes
it dropped mouse clicks for some reason, and there were one-second pauses at
seemingly random times (though this might have been caused by a slow SD card).

------
panpanna
First of all, the article is very Apple-centric, as if the author is
completely unaware of anything outside his bubble.

Back to the article: what the author kind of missed is that ARM was Acorn's
spinoff of its CPU business after their computer flopped.

As an owner of one, I found it interesting, but the OS was really odd (for
example, '.' was the directory separator, which caused "interesting" problems
for C compilers). They also promised high-performance x86 emulation using
their RISC magic but failed to deliver.

Transmeta was eventually able to deliver what Acorn promised, but that came
5-10 years later.

Anyway, what finally saved Acorn was Nokia and Ericsson picking ARM up for
phones due to low power usage.

~~~
regularfry
The x86 daughter-boards in the RiscPC were a neat trick.

~~~
panpanna
Yeah, but that's all it was, a neat trick. Not really well thought out from a
business POV, and technically they had some issues, especially since Acorn
hardware was so different from a standard PC.

------
Avshalom
[http://www.marutan.net/rpcemu/easystart.html](http://www.marutan.net/rpcemu/easystart.html)

------
bencollier49
It suddenly occurs to me that RISC OS will now be a viable alternate operating
system for modern PCs.

_And I'll be able to run BBC BASIC natively on a Mac_.

~~~
pavlov
Apple has said they’re not going to support booting non-Apple operating
systems on the ARM Macs:

“We’re not direct booting an alternate operating system,” says Craig
Federighi, Apple’s senior vice president of software engineering.
[https://www.theverge.com/2020/6/24/21302213/apple-silicon-
ma...](https://www.theverge.com/2020/6/24/21302213/apple-silicon-mac-arm-
windows-support-boot-camp)

~~~
mhh__
It really frustrates me.

Apple put so much effort into very beautiful hardware only to completely lock
in their users. I understand why, but they have one of the only market
positions that could charge a premium for a computer ecosystem genuinely free
from the cancer of the "free" business model of Google and Facebook, and they
seem to genuinely care about avoiding that model while simultaneously locking
people in to an almost Soviet degree.

I'd love to own a beautiful ARM laptop, but I find Apple software just dull -
not as "big" as Windows but not as fresh as Linux.

------
JoachimS
"Now it’s back to competing on performance, and its ability to succeed will
likely depend on how well Apple has matched its surrounding silicon hardware
accelerators to the needs of developers and consumers. "

One thing the author seems to have missed is that what sets Apple apart from
most ARM-based SoC developers is that Apple has an architecture license. They
don't use the cores provided by ARM; they implement the ISA themselves. And
they are really good at it. Look at the A12 chips, for example.

The ARM A-7x cores are quite powerful, but we can be sure Apple will take full
advantage of the higher TDP and increase performance in their cores beyond
what ARM provides.

Also, expect more CPU cores/device than what Intel provides.

The thing about the cores is also true. Tighter integration with coprocessors
for AI/ML and crypto (and improved ability to shut them down,
increase/decrease clock speed, etc.) will also be a boost compared to x86,
where it is either done in SW or using an internal or external GPU.

~~~
panpanna
Apple is far from the only company with that kind of license (Qualcomm,
Nvidia, Samsung, even Intel).

~~~
JoachimS
I hope you noticed "most". I did not say "the only". There are many, many ARM
SoC developers. Very few of them have the right to modify or develop their own
ARM core.

It is also a question of what you do with your license. The cores in Qualcomm
Snapdragon chipsets, for example, differ quite little from the cores from ARM.

Similarly with Samsung Exynos. They are basically A7x cores with some other
big.LITTLE combination.

Apple, on the other hand, have repeatedly shown that they are both willing and
capable of doing their own microarchitecture. I would like to credit the
acquisition of P.A. Semi for starting this, but they of course are not the
only people Apple have added to build competence and capability in this area.

[https://en.wikipedia.org/wiki/P.A._Semi](https://en.wikipedia.org/wiki/P.A._Semi)

~~~
panpanna
Not trying to start an argument with you, but this is incorrect.

Essentially all major vendors have this license, and essentially all major
vendors do their own heavily modified designs. They have different goals,
priorities, budgets and abilities, but if you buy a COTS SoC from a major
vendor you can bet your neck it's not an original ARM hard macro or even a
lightly modified custom version.

In fact, the only major 64-bit pure-ARM SoC I can think of right now is the
HiSilicon (Huawei) Kirin 960.

------
johannes1234321
Interesting history piece.

But to answer the question from the title, since people wonder about the macOS
experience on ARM: as the article rightfully says, phones (thus probably the
most-used end-user computers) are on ARM as well and are being used for many
things desktops were used for before ... :)

~~~
rbanffy
Also, even the humblest Raspberry Pi board is orders of magnitude faster than
the fastest Acorn box.

------
jlarcombe
Apple's investment in ARM when it was originally spun out from Acorn was a
very far-sighted decision, in retrospect.

Another curio, for a while Intel made the fastest ARM processors, after they
bought StrongARM from DEC...

------
wenc
Raspberry Pis don't provide desktop-class performance. I've owned several and
they could never be my daily drivers (they're just not very fast, even when
running Raspbian).

These ARM servers however.... (32-core and 48-core ARM processors)

[https://www.asacomputers.com/Cavium-
ThunderX.html](https://www.asacomputers.com/Cavium-ThunderX.html)

------
anomaloustho
I had naively thought the term “Apps” was coined by Apple during the initial
iPhone release, as it had been called “Applications” on the Mac ever since the
time I had switched (from XP to OSX Tiger).

Seeing the Archimedes screenshot with “Apps” in the taskbar made me realize I
was sorely wrong. Does anyone have any interesting history on who was first to
start using “Apps”?

~~~
fiddlerwoaroof
“App” is the obvious abbreviation of application, I think the iPhone sort of
publicized it, but I’ve always used phrases like “spinning up a new app” even
with things like Delphi in the 90s

~~~
ncmncm
Calling computer programs "applications" is a neologism. Once that catches on,
shortening to "apps" is automatic and happens everywhere at once, somewhat the
way programs became "proggies" in places.

Some of us are still cheesed at having our old, comfortable programs re-
branded to dodgy new applications:

"Programs run, applications crash."

~~~
fiddlerwoaroof
“Applications” doesn’t seem to be particularly new to me: e.g., Rapid
Application Development (RAD) tools like Delphi and Visual Basic were fairly
popular in the 90s. And, the jargon file attests to the use of the term to
signify programs for non-developers going as far back as 1990:
[http://www.catb.org/jargon/oldversions/jarg211.txt](http://www.catb.org/jargon/oldversions/jarg211.txt)

~~~
ncmncm
The suits tried to get people calling what they sold "applications" right from
the beginning. "Nobody will pay for a program, but they will pay anything for
an application." Back then, an application was, literally, what you were using
the computer to get done: a spreadsheet would be a program, budgeting an
application. The suits bought and sold "applications"; users ran programs.

Then Eternal September happened, there were too few to push back, and all the
other stuff paying customers used to demand--training, printed manuals, _not
crashing all the damn time_--fell away, and finally the program became,
itself, the whole application.

Then the users needed an abbreviation for what it was that was crashing: the
"app".

------
rbanffy
Someone should port RISC iX to the Raspberry Pi.

~~~
LeoPanthera
RISC iX was just BSD ported to Acorn hardware, so there's very little to
distinguish it from any ARM port of a BSD OS.

~~~
rbanffy
It'd still be cool.

~~~
jlarcombe
I think you can emulate it, but it's a bit awkward.

~~~
rbanffy
The interesting aspect is that it had IXI's X.desktop. It ended up in SCO
Unixware, but it's an interesting historical artifact.

~~~
jlarcombe
Yeah, also an Econet implementation and some novel on-the-fly executable
compression in the file system, I think? The whole thing was slightly
hamstrung by the comparatively huge page sizes of the MEMC...

------
Crosseye_Jack
My first paid gigs were software for the Arc while I was still in school (BBCs
and Arcs were the computers of choice in many U.K. schools at the time). My
first HTML was written on an Arc.

Fond memories of the platform. Lander is still hard as fuck to this day...

------
easygenes
I would really love to see the macOS 11 beta running on something other than
DTK as a hackintosh.

~~~
brycesub
Will hackintosh still be a thing once the ARM transition is complete? Surely
Apple will bake some security into their silicon that will prevent OS 11 from
running on anything but Genuine™ Apple Bionic.

~~~
rbanffy
Almost certainly not.

Apple can add all sorts of logic to their chips that competitors won't be able
to replicate.

~~~
mxcrossb
Agreed. People need to stop thinking about this as a change in ISA. The
important part comes when Apple starts loading their silicon with custom,
specialized components aimed at improving OSX.

~~~
AceJohnny2
> _when Apple starts loading their silicon with custom, specialized components
> aimed at improving OSX._

That's already the case, and was advertised. See the keynote picture that's
used (for example) in this article:

[https://appleinsider.com/articles/20/06/23/why-the-macs-migration-to-apple-silicon-is-bigger-than-arm](https://appleinsider.com/articles/20/06/23/why-the-macs-migration-to-apple-silicon-is-bigger-than-arm)

"Secure Enclave", "Machine Learning Accelerators", "HDR Display support",
"Neural Engine", it's all already in there.

~~~
easygenes
Apple is unlikely to ever have any major SoC features which are generally
useful to end users that are exclusively available in their chips for more
than a year or two. Even if Win10 ARM vendors or whatever’s coming for PC are
playing catch-up with Apple, we’re unlikely to see major divergence. People
still need to get roughly the same things done whether they prefer Apple or
otherwise.

~~~
rbanffy
It's not enough to have an equivalent capability - if you want a Hackintosh,
you'll need to present those capabilities in a way that fools macOS into
believing it's running on supported hardware. For now, ARM has been limited to
the low-power desktop/mobile and high-end server niches. Apple Silicon will go
for the space between as well.

OTOH, Apple is smart enough not to tightly marry its software to a given
hardware architecture - they were compiling Rhapsody for Intel from day one
and a lot of iOS is directly lifted from macOS. Its lineage started running on
68K, then added x86, PA-RISC, and SPARC, then went mostly PPC, then mostly x86
again, and now it'll switch to mostly ARM.

------
opless
The first desktop to use the ARM architecture was the BBC Micro: while the ARM
was being developed, it ran as a second processor attached to one.

The Archimedes was the first ARM desktop computer.

An important distinction I feel.

~~~
joosters
...which is specifically explained in the article! Did it get something wrong?

~~~
opless
You missed my point ...

The (current?) title of the HN news item is:

The Acorn Archimedes was the first desktop to use the ARM architecture

Which is technically incorrect.

It doesn’t even vaguely match the article’s title of:

“Why Wait For Apple? Try Out The Original ARM Desktop Experience Today With A
Raspberry Pi”

~~~
lproven
OP here. As I commented elsewhere: this is not the title that I submitted to
HN, and it's not the title on the target page. Someone edited it and I am not
sure why.

~~~
opless
Agreed, it would be nice if HN was less opaque about such things.

Thanks for the clarification though, it’s much appreciated! As was the
article!

------
physicsguy
I would have started infant school in the UK in 1995 and we still had one of
these in my classroom until I was in Year 2. Great educational computers.

------
whywhywhywhy
These felt so futuristic at the time: anti-aliased fonts, nice smooth
scrolling and window dragging, all while PCs were stuck on Windows 3.1.

------
guidoism
This is kind of a dumb article. It's not like those of us in the computer
industry don't already know about ARM. Even before Apple's announcement you
could run RISC OS on a Raspberry Pi; it is one of the more prominent install
images for the RPi. And it's not like RISC OS is what macOS on ARM is going to
look and feel like.

Ugh. These kinds of articles annoy me.

~~~
asveikau
I had heard of RISC OS but never used it. The screenshot looks pretty ahead of
its time for 1989. Nicer-looking than NeXT, for example, though with a similar
aesthetic.

[Edit: After some digging around it seems like perhaps the screenshot features
the visual style introduced in the 1999 release or thereabouts?]

Seems like on HN of late there's some interest in 1990s UI revival. I'm
thinking RISC OS looks like a good candidate for that.

~~~
gindely
I also didn't use RISC OS - I came close in school, but the ancient machines
always got replaced the year before me.

However, I have used ROX for a long time. Short for "RISC OS on X", it is/was
basically a file manager modelled on RISC OS, along with some associated
desktop technologies (like application directories). I think it's fair to
describe it as dead as a doornail (I just upgraded to Ubuntu 20.04, and since
it no longer ships pygtk2, various ROX tools are dead). Some of its
innovations found their way into other Linux desktop technologies, such as
shared-mime-info.

But if you want to see what once was, it's a place to begin.

(ROX was actually a lot nicer than RISC OS, when I finally installed it on my
RPi.)

~~~
regularfry
ROX was so good. I had it on my first Linux machine - a Mandrake install that
I only ended up with because it was the only distro I could find with an
installer that would give me working video card settings. I _lived_ in that
UI.

Time for a resurrection, maybe?

~~~
UncleSlacky
It's still part of the default desktop for the antiX distro (ROX-IceWM).

~~~
lproven
Only the file manager -- not the whole desktop environment.

The desktop manager is called ROX-Session and provides an icon bar, a pinboard
and so on as well as the filer windows.

------
moonbug
It was a magical machine.

------
tdy721
Photoshop; Adobe Creative Suite. I also missed Source Tree, but that one
didn’t change the kind of work I could take.

Also I think I’m waiting for apple to put a freaking touch screen and keyboard
on the same product. (Touch Bar does NOT count)

Then again, I’m an ARM desktop user already!

Why wait for Apple? idk, let’s wait and see.

