Why did Mac OS 7 perform poorly with virtual memory enabled? (retrocomputing.stackexchange.com)
83 points by bane on May 2, 2017 | 31 comments



> all memory writes are also to disk

I'm pretty sure that is wrong. What most likely slowed things down with VM was that, as soon as a page operation had to be performed, the OS would block and do nothing else until the disk access was done.


How bad is that? If I'm actively using one program, and that program is waiting on a page from disk, what useful thing can you do in the meantime?


IIRC, OS 7 was only cooperatively multitasked. If that were the case, and the program was blocked by the VMM so that it couldn't release the CPU to the next program, I'd imagine the whole system would hang until the read was finished.
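
To make that concrete, here's a toy sketch in Python (obviously not Apple's actual code; generators stand in for cooperatively scheduled apps, and time.sleep() stands in for the synchronous page-in from disk):

    import time

    def app_a():
        while True:
            print("app A: handling an event")
            yield  # cooperative yield, like calling WaitNextEvent()

    def app_b():
        while True:
            print("app B: handling an event")
            # Simulate the VM manager servicing a page fault with a
            # synchronous disk read: nothing else runs until it returns.
            time.sleep(2)
            yield

    tasks = [app_a(), app_b()]
    for _ in range(3):
        for task in tasks:
            next(task)  # app A stalls for 2s every time app B "pages in"

Since no app can be preempted, a page fault taken by one app freezes every other app (and the UI) until the disk read completes.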

I worked with a web developer in the mid 90's who served some image files off his desktop Mac. We noticed intermittent hangs and slowdowns during file transfers. After much head scratching, we pieced together that every time he held down his mouse button, networking stopped until he let go. This was 100% reproducible, and 100% hilarious to us Linux folks.

It eventually got better.


Sounds like this web developer's computer may have been one of the infamous Performa/Power Macintosh x200 series[1]. These are ridiculously designed machines.

Choice quote from http://lowendmac.com/1997/performa-and-power-mac-x200-issues...:

> One of the biggest complaints about the x200 series is slow Internet handling. For one thing, looking at the chart above, all data from either the ports or the ethernet controller must pass through the processor to get to memory, then be processed, sent to the IDE controller for cache saving, and then interpreted for graphics display.

> There are symptoms to notice because of this. While a web page is loading, typed characters will be lost. When dealing with high IDE access, the graphics controller will seem to freeze. When copying to a network or downloading a file, the monitor will rarely update and will have redraw problems. Spooled print jobs will take forever if lots of processor resizing is necessary.

[1] http://lowendmac.com/2014/power-mac-and-performa-x200-road-a...


When I was a kid we had a Performa 5200, and I suffered from this. We also got a surplus Quadra 800 with a 66 MHz PowerPC upgrade from my dad's work, which outperformed the 75 MHz Performa in everything aside from MP3 decoding.

We also ran into the serial port issues the article describes; we got better performance hooking our 56k modem up to the Quadra and sharing that connection to the 5200 over LocalTalk via IPNetRouter than using the modem directly.


Didn't the Performa use the PPC 603? ISTR it did. The 603 was the 'cheap-n-cheerful' chip in the family; I suspect the Quadra upgrade would have had a 601 or 604, both of which had more oomph.


Yeah, doing the research now: the 5200 had a 603 @ 75 MHz, and the PowerMac PDS Card had a 601 @ 66 MHz. I was also reminded the 603 had too small an L1 cache to hold the 68K emulator, reducing performance even further...

Now I'm curious what made the 5200 better at MP3 playback. My recollection is that the only way to get the upgraded Quadra to play MP3s in realtime was with SoundApp monopolizing the CPU, so you couldn't even click anything else. On the Performa, you could run MacAmp in the background while browsing the web, with stuttering "only" during whole-page repaints. This was 20 years ago, so my recollection could be poor... I soon got my own Quadra 660AV (video digitizing!) and eventually a PowerMac 7500 (which I upgraded to a G3; it could even run early versions of OS X with XPostFacto).


I had a 5200, and I had AOL Instant Messenger installed. I'd be working away in some other program, and then the whole thing would freeze for a second, and repaint, every time I got a new IM.

After a few days of that crap I just installed YellowDog Linux.


Try telling iTunes to download a large number of files from iTunes in the Cloud, and then popping open the little downloads popover and continuously scrolling it up and down. Even as of iTunes 12.x, the downloads would fail to progress! (Not just, like, the UI failing to update; the download tasks themselves would fail to read(2) from their sockets, so the TCP streams would get blocked, causing the server to stop sending.)

I think this is a rather uniquely-bad behavior among modern macOS apps, though; I have a feeling it's because it's the only one that also runs on Windows, and the required portability layer is "a bit" legacy.
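
For anyone who hasn't seen TCP flow control bite like this, here's a self-contained Python toy (not iTunes' code; just a localhost sender/receiver) showing the mechanism: if the receiver simply stops calling recv(), kernel buffers fill, the TCP window closes, and the sender's send() stops making progress:

    import socket, threading, time

    def sender(conn):
        chunk = b"x" * 65536
        sent = 0
        conn.settimeout(2.0)  # so the stall surfaces as a timeout, not a hang
        try:
            while True:
                conn.send(chunk)
                sent += len(chunk)
        except socket.timeout:
            print(f"sender stalled after ~{sent} bytes: receiver's window closed")

    listener = socket.socket()
    listener.bind(("127.0.0.1", 0))
    listener.listen(1)
    client = socket.create_connection(listener.getsockname())
    server_conn, _ = listener.accept()

    threading.Thread(target=sender, args=(server_conn,)).start()

    # The "frozen download": the client never calls client.recv(), so data
    # piles up in kernel buffers until the sender can make no more progress.
    time.sleep(5)

The fix on the receiving side is correspondingly simple: keep draining the socket on a thread the UI can't block.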


> After much head scratching, we pieced together that every time he held down his mouse button, networking stopped until he let go.

This was back when holding the mouse button down was required to keep a menu open, right?


That's correct. So as he wove his way through Photoshop, our downloads would sputter and stall.


I was just remembering that you could basically "freeze" those machines by holding down the mouse button. I can't tell if I'm vindicated in my memory, or horrified that that was how things worked. Possibly both.


It was roughly like that. Windows versions prior to 95 also worked on the same principle (1.0 to 3.11).

That was why, when BeOS came out around that time, it was utterly mind-blowing: often on the same hardware you used MacOS 7 on, you could drag windows around with video playing in them, with 3D objects spinning in another window, with video being wrapped onto 3D objects in real time. The user interface never quit on you, it never beachballed, it just kept going.

I have to chuckle when I see Windows 10 struggle to do what BeOS did effortlessly on a 603e machine with a video card so primitive it couldn't even run any real 3D games.


This is where some of us mention "Amiga", then duck ;)

Same difference, but 10 years prior to BeOS...


The Amiga was the first consumer OS with pre-emptive multi-tasking, absolutely, but BeOS was the first with such aggressive multi-threading and exceptional support for multiple CPUs.

On BeOS the menu UI had its own thread. Even if the app was busy thinking it still wouldn't jam. That wasn't the case on other operating systems that simply had multi-process support.
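
A minimal Python illustration of why that keeps menus alive (toy code, not the BeOS kits; threads stand in for BeOS's per-window UI threads):

    import threading, time

    def menu_loop(stop):
        while not stop.is_set():
            print("menu: still responsive")
            time.sleep(0.5)

    def busy_app():
        time.sleep(3)  # stand-in for a long computation that never yields

    stop = threading.Event()
    ui = threading.Thread(target=menu_loop, args=(stop,))
    ui.start()
    busy_app()   # the app hogs its own thread...
    stop.set()
    ui.join()    # ...but the menu thread kept printing the whole time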


Yep, if you held the mouse button down on the menu bar, no other applications would get any time or events, except for interrupt-time callbacks like VBL tasks and async I/O completion routines. Any application that actually did something on classic MacOS while the menu bar was held down was doing it through those (tricky to get right) mechanisms.


I remember installing an INIT called "Menutasking enabler" that let processes continue in the background. The downside was that any app that redrew a window would end up drawing all over the open menu.


Yeah, everything pre-OS X was cooperative, including the last one (9.2.2).


> If the OS and apps you needed to run required 4MB RAM, then you really should have at least 4MB RAM. [...] Set your VM page size to only 1MB more than your physical memory size.

Ahhh... I'm fond of the days when you could run whole applications in all of 4MiB, GUIs included.


Yeah. I had a colleague who used Photoshop a lot. She got management to spring for 30MB of RAM for her machine. I thought that was the most outrageous thing I'd ever heard.


My father used to run a desktop publishing bureau that output people's print jobs onto film, which was then sent to printing presses for printing. This was for magazines, artwork for packaging, etc.

He had a Quadra 950 with 128MB of RAM, which required some pretty exotic SIMMs. That would be a pretty expensive machine in today's money. This was for handling Adobe Illustrator files with large embedded images.


Ah, 16MB 30-pin SIMMs. One of the reasons they were not popular was that the first 16Mbit chips were 400 mil, making them pretty tall.


Designers doing print-quality photography, especially posters, have always been crying out for more memory.

In the G3 and G4 days with easy-open computers it wasn't uncommon for one designer to put out a call for more memory to do something intensive, like edit a poster, and other designers would sacrifice a stick or two to the cause. When the job was done they'd put the memory back in the other machines.


Rathole alert, but I worked at a Forestry Commission research station here in the UK in the 90s. One of the mathematicians was running an environmental model on a Sun workstation; it was running out of RAM and swapping, and was going to take 2 days to run at that rate.

The department had another identical workstation that wasn't being used just then, and I pointed out I could swap half of its RAM into his machine for the day. He and his boss looked at me as though I was insane. An hour later they called me and asked me to do it; the model run completed in a couple of hours, and I swapped the RAM back.


I wonder which year that was.


Probably around 1991.


I blew a thousand dollars (Canadian) to upgrade my new Mac Plus to 4 megabytes, and the sales guy asked (slightly tongue in cheek, but only slightly) if I was doing CAD for NASA.


I expanded the memory in my Mac Plus to 4MB for about $5 a few months ago. Nowadays, when anything marked with the "retrocomputing" label can be ridiculously expensive, this kind of 30-pin SIMM memory is still considered obsolete, not classic.


BTW, anyone remember the thick 1MB SIMMs based on DIP chips that were common before DRAM prices declined in 1989?


In what year? The DRAM market changed between 1987, 1988, and 1989, for example.


I remember when you could do that with 512-640 KB, using DESQview, ViewMAX, or Turbo Vision-based applications on MS-DOS, or my beloved Amiga 500.

So just imagine the possibilities for ESP32 applications.



