Instead of typing a sequence of commands on the computer keyboard, the user merely points to tiny "icons" or commands on the screen by sliding the "mouse" (a plastic control box the size of a cigarette pack) on the desktop beside the computer. As the mouse rolls, an arrow called a cursor moves across the screen. To erase obsolete information, for example, the user moves the mouse to point first at whatever is to be thrown away, and then at an icon in the shape of a tiny trash can...
I've heard them referred to as "rats", but that term seems to have disappeared into obscurity.
The call went something like this:
[Call center introduction and customer response]
Employee: "So what seems to be the problem, sir?"
Customer: "I can't find where to change the background of my page. I've looked at all the buttons but I don't see a 'background' or 'color' button."
Employee: "I completely understand. Sometimes these types of things seem like they're hidden but it's pretty easy."
Customer: "It can't be that easy. I've been looking for over an hour."
Employee: "Well, it's in a context menu so they're kind of hidden. You can access those by going anywhere on an empty area of the screen with the cursor and then just right click on the mouse. It should pop up a little menu where the cursor is."
Customer: "Ok. Just give me one second. I want to make sure I do this right. I'm going to go get a pencil."
Employee: "Of course, sir. I'll repeat it again when we've done it once so you can make sure you write it down correctly."
Customer: "Thanks." [A few moments of silence] "Ok... so I move the cursor to a spot on the screen... [shoulder brushes receiver as he writes] ...C-L-I-C-K and then the computer will understand that as a command and give me a menu? Is that right?"
Employee: "That's correct. After you right click, you should see a little box pop up next to the cursor with a list of items. One of them will be 'Properties'."
Customer: "Do I have to wait? I did that and nothing has come up yet. How long do I need to wait for this menu to come up?"
Employee: "It should come up right away after you right click. Did you write click on the mouse already?"
Customer: "Yes, I did. I mean... my handwriting is not the greatest but hopefully the computer can still understand it."
The customer had used the pencil to literally write the word "CLICK" on the mouse. He didn't know that the left and right mouse buttons did something different. He thought they were for left-handed or right-handed use.
There was also a major bug in the M68000 - instruction backout didn't work. That was fixed in the 68010. But on the 68000, a page fault couldn't be handled properly if the faulting instruction used auto-increment addressing on a register. So the Lisa compiler had to be dumbed down to avoid that feature, slowing down execution somewhat.
If Motorola had fixed those problems sooner, the history of personal computing might have been very different. Intel's x86 machines, with their 16-bit address spaces, might have gone nowhere on the desktop.
Hence the cost-reduced Macintosh - no MMU, no memory protection, no CPU dispatcher. Also no hard drive. The original 128K Mac was a flop commercially. Not until memory cost came down and Apple got a hard drive into the product did it sell successfully. The IBM PC had a hard drive earlier, which got them going in business use. The floppy-only Macs were incredibly slow.
The way it's achieved may not matter much with a 4 GHz multi-core CPU running a multitasking OS, but having to deal with 16-bit pointers and segmented memory in a 4.77 MHz 8086/8 was a huge pain I felt in the flesh.
Most compilers wouldn't even deal with that mess.
Now had Compaq not been so lucky, then the PCs might have indeed gone nowhere.
Of course these were after the Apple 2 so maybe you're correct but I don't think even I saw my first Apple computer until 1989, and that wasn't even in the UK (I emigrated to the US for a short while that year).
My secondary school had a bunch of Research Machines, until they were replaced by BBC micros.
Edit - oh yes, and about the ZX Spectrum... I can't imagine anyone willingly changing from something else to that rubber keyboard! Yuck.
The ZX Spectrum had tons of games (many of them quite creative, many weird), all kinds of software (limited, of course, but even a Lisp) and several magazines dedicated to it. Many of my friends owned one, so it was much more fun than owning a more obscure computer, no matter how much better its keyboard was. Remember, there was no internet.
I still have nostalgic conversations with friends from that time about how magic the ZX Spectrum world was.
For reference: south east of England.
Had the 'PC' not won, I suspect some of the DOS compatibles that the UK produced would have been more popular - specifically, Apricots were doing fairly well with the F1 and XEN in the 83-87 era, when the PC hadn't quite dominated yet.
I do remember reading about the Lisa and thinking it looked amazing.
The BBC Micro didn't really get into mass production until 1982.
If you mean early eighties, yes. If you mean the whole of eighties, I'd say no. I remember lots of businesses with Macs in the late eighties/early nineties, especially for print businesses.
Around here print businesses were using Amigas and Ataris.
I used Apple ][s and had a demo of the Lisa when it was being launched.
At the time though, the 68K was regarded as the best general purpose CPU you could get and was sort of a "default choice" for building any reasonably high performance systems. Unix workstation manufacturers like Sun and SGI originally built their systems on the M68K platforms before their respective RISC architectures developed and matured.
With that said, it's totally reasonable to make the hindsight argument that Apple should have developed the IIgs line further, with its 16-bit 65C816, and then reevaluated the CPU landscape in the mid 1980s, where you had more mature 68K chips, 32-bit x86 CPUs, and a plethora of RISC options. In that world, I could very easily see the Macintosh making its debut on a RISC or 386/486 platform.
Speaking of the IIgs: It's rumored that Steve Jobs requested they downclock the CPU (it only ran at 2.8 MHz) to prevent it from seeming faster than the Macintosh.
At this point, I'm thinking the x86 memory segmentation sucked less than a 32-bit CPU with no MMU and bugs requiring the compiler to pull the parking brake.
I am familiar with how bad it was in the x86 world, but maybe that explains why Mac OS before X was so bad.
This would have been around 1988-1989, and the Mac had been out for a while; this seemed like a quaint old computer at the time. Since the Lisa wasn't quite Mac compatible, there wasn't really anything you could do with it...the software it had (early variants of stuff like Mac Write and Paint and such, called, I think Lisa Write, etc.) was what you got.
When I unboxed everything I found the receipts from when the original owner bought it all new. He'd spent something like $20,000+ on the whole setup. Computer was $10k, and each of the hard disks was several thousand dollars.
I sold the whole setup for about twice what I paid for it (I seem to recall about $300, but it's been a long time, and it wasn't super memorable...I bought and sold a lot of weird old computers back then), after cleaning it up and testing everything and tinkering with it until I was bored (I was a Commodore kid with a C128D and saving up for my first Amiga...Apple stuff was just a curiosity, not anything I wanted for myself).
Though I wasn't super into it at the time, it's one of the few things I kinda wish I still had all these years later. It has real historical significance that I didn't really appreciate at the time.
Now, buying a new computer results in marginal productivity gains, because it's likely just replacing another computer and maybe improving things, or maybe just adding a Touch Bar.
There are a few things you can do by adding expensive hardware, but they're generally limited to advanced media processing, like video editing and raytracing. Maybe a few medical tricks.
(Gaming does not count, it scales to generally available hardware.)
Apple Lisa 1983 $9,995 ($25,143 in 2018 dollars)
Apple Macintosh 128k 1/24/84 $2,495 ($6,000 in 2018 dollars)
Macintosh II 3/2/87 $5,498 ($12,125 in 2018)
NeXT Cube 9/18/90 $10,000 ($19,177 in 2018 dollars)
In hindsight I should have bought Apple stock. Oh well.
I'd say it's now $3,750 for the whole setup.
$3,000 for the tower/laptop plus docking station, $650 for dual monitors, and $100 for peripherals.
And that's with Amazon + non-Mac.
Don't get the most expensive RAM or CPU, you lose 10% real performance but you halve your budget.
Windows is fantastically supported: Microsoft Office, Adobe suite, AAA games, Active Directory, databases, browser and browser plugins, and a thousand others.
Today, a RasPi costs $5.
Holy freakin' cow.
When I was in college in the mid 2000s, my senior project needed a single-board computer. We spent $500 on a small PC104 board with a 100MHz National Semiconductor Geode (x86-compatible) processor. That didn't include any RAM or storage. I really wish the Raspberry Pi had been available back then - especially the Raspberry Pi ecosystem, which makes interfacing with stuff so much easier. We interfaced a small LCD I found at a surplus shop and it took us weeks to get it working. Now you can buy a touch-screen LCD for $30, and there are libraries to get it working in about 20 minutes.
By the way, it turns out people think "automation is replacing jobs in manufacturing" because they've misread the data showing them just how much cheaper computers have gotten since then, and therefore how much more computer one factory employee can make.
But Lisa's pricing made the Mac more appealing. It wasn't exactly affordable, but you were getting maybe a third of a Lisa - including the new GUI, which was obviously the future [tm] compared to the Apple II - at less than a third of the price.
Even at the prices being charged, the Mac and PC were competitive with previous standards, and affordable enough to have a sizeable market among affluent middle class users.
They offered more than the old S-100 boxes did, for the same or less money. And they were much cheaper and more "personal" than industrial minis like the PDP-11 and the VAX.
Regarding the price, mind that the Lisa came with an integrated office suite (long before the success of MS Office) and was targeted at offices and professionals (think dentists). Which rendered it a somewhat curious "workstation for office work", at least from today's perspective, where eventually office machines became the epitome of cheap, bare-bones boxes.
The MRD mentions as potential users secretaries, managers, and executives (of Fortune 1500 businesses) as well as bookkeepers in general.
I grew up with the ZX Spectrum, like so many others, and it is what started me programming/developing/being interested in computers.
You're probably looking at it from a biased point of view though. When I was growing up everyone had a Snes. Of course it was just all teenage boys that had one.
Everyone loved the C64 though, so fond memories are worth something.
I did buy a dual floppy PC clone in about 1983. I don't remember how much it was--wish I still had my receipts from that far back--but it was a big purchase for me at the time. [ADDED: I probably dithered over it for something like a year, during which time it became obvious that PC clones were the future rather than the S100 etc. systems running CP/M.] Based on ads from the time, it was probably about $2500 but a printer and software would have added to that.
And when I went to business school about a year later, I was one of very few people in my class who had their own computer. (There was a small computer lab in the school--that actually had a Lisa among other things as I recall. At some point when I was there they added a bigger lab with a bunch of IBM AT clones (80286s)).
I remember one of my friends had a Tandy; the 386 ran rings around it (it supported VGA graphics, for one - I can remember playing the original Civilization and Prince of Persia on the 386).
When I think about that investment as a ratio to my disposable income at the time, I was crazy! But, as a retired IT Exec, things did work out in-the-end.
The software side also helped. Windows (mostly?) ran a fixed palette, while the Palette Manager on the Mac made it easy for applications to pick whatever color (out of 16M) they wanted.
On the hardware side, most PCs still had monochrome monitors, and even those with color monitors didn’t typically have ones as good as the Trinitrons that Macintosh IIs often had.
You're right though on the color palette. I never understood why neither Windows nor its applications changed it to something less garish than the default CGA palette.
If you saw a PC with a color monitor, chances were that monitor was moved over from a DOS machine, and dated from the 320x200 era.
Your comparison is hardly apt.
As a kid I took it all for granted, but we never would have been able to afford such hardware on my dad's engineering salary at the time. The price for a new one adjusted for inflation today is jaw dropping.
According to the article Lisa was a $50 million gamble.
I just read somewhere about the 2000 engineers Google took on for the Pixel phone from HTC. Keeping the lights on for the buildings they are in is a $50 million gamble, that's without paying them or allowing for inflation. But you get the idea, $50 million was cheap for the product compared to what hardware tech costs to develop today, particularly if it has an operating system to write from the ground up.
Incidentally, regarding the threat from IBM mentioned in the article: Peanuts turned out to be the ill-fated PCjr, which had good graphics and a bad price point for the home market. Popcorn turned out to be the first PC-based luggable computer from IBM.
And a really lousy "chiclet" keyboard. I don't remember all the details of the PCjr flop but, as I recall, it was still quite a bit of money for a system that had a whole big bunch of compromises.
Combined with his emotional issues, rage, temper - it resulted in a manic back & forth. When he wanted to be magnanimous or kind, he could be, and it always had to be strictly on his terms and only if he didn't feel forced into it in any manner (to the far end of that spectrum, like he'd do it only when it was least expected, as an amplification device). If anything attempted to force his hand, or he perceived such, he very aggressively rebelled against it in all cases (you see that pattern over and over again throughout his history with people and business). It had to be on his terms, or there would be no terms at all.

When he could reorient the context of Lisa (his daughter) to his terms, as and when he saw fit, then it became acceptable. You can see some of that behavior in action in Lisa's description of what it was like to live with Steve when she was younger (my way or the highway atmosphere; he had to feel in control).

He seemingly struggled to control his emotions for most of his life, which must have been wildly frustrating for someone like him. That lack of personal emotional control might explain a control over-compensation directed at other things in his life, the need to control everything else (and perhaps a behavior where, if he felt in control of everything else around him, he could keep those other things from setting off his emotions, which he couldn't control properly).
That's my read on it anyway.
'20 something' is well into adultland.
You want a good job? You want to vote? You want to be treated as an adult?
Then there's no such thing as 'out of his control', really.
For someone who's ostensibly responsible enough to run an entire company, being responsible for one's children is well within reason.
Everyone has challenges in their personal lives, it's 'never a cakewalk' - ok - but there are really no excuses for Jobs here. Point blank.
This is stuff Lisa herself has written about.
(also 20somethings can act very immature, as can 30somethings, 40somethings, etc and the opposite where teenagers act more mature than expected - it is that "expected" part that sometimes fails with people, not everyone is the same)
Contrast this to the IBM PC where people could easily get started programming in BASIC (included in ROM!) or Asm (MS-DOS DEBUG), for which many magazines of the time had listings. Of course not every user did, but certainly a lot of them started and eventually helped greatly grow the amount of software available.
The PC had a learning curve but the user was in full control, whereas the Lisa didn't have much of one but had many impediments that prevented users from becoming developers. This attitude persists in Apple today.
Considering you needed a Lisa to develop for the Mac for the first few years, it’s good that they made it.
Check folklore.org for more stories of that time.
Although AFAIK the original Mac was also programmable with Pascal (in folklore it mentions that the calculator, etc were initially written in Pascal as examples for the API).
A while back I was able to find recreated source of the game
Spreadsheet of game maps, etc.
Wizardry III re-engineered
Lisa Desktop Library: https://www.apple.asimov.net/documentation/applelisa/AppleLi...
As an example, the Lisa Desktop Library has
PROCEDURE HiLiteMenu(menuId: INTEGER);
The Xcode 11 beta that just came out this year still has <Menus.h> which includes...
HiliteMenu(MenuID menuID) AVAILABLE_MAC_OS_X_VERSION_10_0_AND_LATER;
I think AppleScript/Apple Events is the oldest technology still standing, but maybe printing and scanning is older.
But -- if we want to be technical, the function still actually exists, since macOS menus are still driven by Carbon code.
$ nm /System/Library/Frameworks/Carbon.framework/Frameworks/HIToolbox.framework/HIToolbox | grep HiliteMenu
0000000000077d86 T _HiliteMenu
That's kind of amazing to me!
I recommend reading Walter Isaacson's book on Steve Jobs, or watching the movie "Pirates of Silicon Valley" for a good take on it.
V1 Jobs (post 97) was much the same, but with a bit of humility that allowed him to empathize enough to be able to make successful products.
I'm trying to imagine most readers of a 1983 Newsweek magazine reading this and being turned off from computers for life.
> Jobs personally heads the MacIntosh development team
I don't know at what point Apple settled on its non-standard spelling for the product, but it might well have been after this article was written.
Edited to add from wikipedia:
“Apple Inc. employee Jef Raskin named the Macintosh line of personal computers after the McIntosh. He deliberately misspelled the name to avoid conflict with the hi-fi equipment manufacturer McIntosh Laboratory. Apple's attempt in 1982 to trademark the name Macintosh was nevertheless denied due to the phonetic similarity between Apple's product and the name of the hi-fi manufacturer. Apple licensed the rights to the name in 1983, and bought the trademark in 1986.”
Unsurprisingly the price doesn't seem to be listed on their site.
They'll probably program it in "C", and then in the 1990s move on to JAVA, with some software in PERL.
Then it was mostly in C++.
Apple platforms never were big C fans.
Ah, but were they "C" fans?
Never mind. My joke apparently went over everyone's head.
No, it wasn't, but it gets called that often enough for me to have made a joke about it.
Feels bittersweet to recall the age of "friendly" computing in these dark days of "antisocial" computing.
It debuted at $9995, which is about $26,500 in today's money.
Edit: Guess HN thinks we're going to be carrying screens in our pockets forever.
AR still seems like a solution looking for a problem.
If you've ever had to use a hardware "simulator" to train on Big Hardware like a plane or submarine, using VR as a replacement is an "obvious win." Rather than dedicated rooms with all sorts of custom fake hardware that only a few people can use at a time, you can buy one classroom full of VR equipment, and then every student can do their simulator runs in parallel, allowing each student far more total simulator-time. As well, to switch to a simulation of newer-model hardware, you just need a new piece of VR software, rather than entire new rooms full of molded plastic and slapdash wiring.
I think it is used for pornography a fair bit though
As with the iPod, iPad, and iPhone, I fully expect Apple to wait for things like smart glasses to evolve a bit before jumping into the fray with a mass-market application. Being one of the first-to-market has never been their thing.
I think I'd feel sick if I had to type anything of any length, and I don't even get motion sickness usually.