This guy was incredibly lucky to find a mostly uncorrupted floppy for sale. In just a few years the few remaining working original floppies will fail, and MacApp will end up as just another lost piece of software - unless pirate copies are made.
You can say and think what you want about software piracy, but it's undeniable that it has saved a lot of ancient software and games from surviving only in our memories. Illegal software/ROM archives have probably done more to keep the revolutionary era of early computing and gaming history alive than all the legal, old-fashioned libraries combined.
Thankfully there is very little enforcement of copyright law on ancient software - so few developers/publishers are bothered by someone illegally enjoying, archiving and sharing their 10-30 year old software that takedown notices are rare - and so it is effectively "legal", or at least tolerated in the same way cannabis is in the Netherlands. The fact that it won't actually become legal until 2040-2060 is absurd, and a powerful testament to how outdated and irrelevant current copyright law is.
I had a magtape of PDP-10 games, but that sadly was lost.
I still think that PDP-11 assembler code is a thing of beauty.
Here's the 1977 version of Empire written in PDP-10 FORTRAN, which I did manage to save:
Copyright was extended further and further at the request of powerful interests. It used to be much shorter.
Originally US copyright lasted only 14 years, with a possible extension of another 14 years if the author was still alive and requested it.
Nowadays US copyright lasts until 70 years after the original author's death.
You are correct that the original US copyright term was a reasonable maximum of 14+14 years; however, that was already extended to 28+28 years in 1909, so this is not exactly a recent development.
In 1976, 35 years ago, it was further extended to 75 years after publication or 50 years after death of author. In 1998 it was extended further to an insane 120 years after creation or life + 70 years.
Even if the copyright-length insanity really were a recent phenomenon dating from 1998, I don't see why that would stop it from being called "outdated".
I make an effort to buy old software on eBay when I come across it, archive it in a stable format, scan the media, and then store the originals.
It would be nice to see a project (would archive.org host it?) that applied standard archival rigor (http://en.wikipedia.org/wiki/Archivist) to preserving our computing history.
And FYI the correct expression is "Hear, Hear!" (http://en.wikipedia.org/wiki/Hear,_hear)
Emulators are also valuable here; the work done on bsnes is amazing, and MAME does a pretty solid job of emulating hardware (including old Mac hardware) accurately.
There's also still quite a lot of retro hardware and books for sale through used markets; enough that people make it their (if not particularly lucrative) business to collect and resell 'vintage' hardware.
History tends to filter out the bad ideas, so that many of the best ideas survive today. But our cultural filters are not perfect: there are lots of good ideas and intellectual context that can be exceedingly valuable for understanding how things became the way they are, or for demonstrating alternative paths (which may even be better than what we have today) that did not win out for market, financial, or cultural reasons.
I really find historical technology to be edifying in the same way political, musical, and broader cultural history is edifying. I think it's a shame that we don't spend as much time teaching technology/design history to software engineers as we spend training 6th graders to understand their local political/cultural history.
> I really find historical technology to be edifying in the same way political, musical, and broader cultural history is edifying. I think it's a shame that we don't spend as much time teaching technology/design history to software engineers as we spend training 6th graders to understand their local political/cultural history.
I couldn't agree more. My local Science Center is currently running an exhibit on video games of the past 60 years http://www.ontariosciencecentre.ca/GameOn/, and while I was familiar with most of the 90s-and-onward ones, it was incredibly cool to walk by and see all the different hardware and games from before Win 95, which was the first computer I ever used. I would love to go through a curated presentation of the history of other types of software and see them evolve over the years - see what concepts stuck and what concepts were dropped in time. It would be even better if, alongside the actual demos, I could read about some of the quirks they had to deal with and anything else to give a young one like me more context about those times. EDIT: It was even cooler going there with my younger sister, and reliving some of the fun times we had as kids.
Now that I think about it, a physical exhibit where they can recreate the hardware (even if only on a superficial level of aesthetics), and the rooms and settings where that hardware would have existed, and the software it ran would be incredibly interesting. It would essentially be as close as we could get to a time machine for the foreseeable future. That kind of exhibit would probably become one of my favourites very quickly.
I agree, that would be fantastic. You see hardware (especially Apple's) show up in fairly prestigious design museums (eg, http://www.moma.org/collection/browse_results.php?object_id=... ), but it's never functioning, much less available to actually use.
I often think about how incredibly inefficient modern software is, given that productivity software has had roughly the same capabilities (with varying levels of usability) for a few decades now. I distinctly remember my NT4 system of the late 90s, and how it was roughly as productive as my current machine, and I wonder just how much cruft is inside modern operating systems and applications.
As a fellow young person (probably even younger than you), I remember watching this video http://www.youtube.com/watch?v=vPnehDhGa14 two years ago when it came out, and gaining quite an appreciation for all the work put into Windows over time and the complexity that it probably brings.
When spreadsheet software first came to market, it produced an order-of-magnitude improvement in the ability to do accounting and calculations on a computer. When digital imaging tools like Photoshop were introduced, they produced an order-of-magnitude improvement in the ability to do image editing. You can go down the list of software categories and pinpoint roughly where the order-of-magnitude jump was made.
We've stopped jumping in the productivity software category. There are no order of magnitude improvements in productivity software. Each year's software release is the equivalent of city sprawl in urban planning. It grows bigger, and contains more, but it doesn't fundamentally improve the "capability".
There are still huge leaps being made in software, but not on the desktop.
Using C for Mac development was a bit of a pain in the early days because of the calling-convention differences between Pascal and C (before the declarations were built into all the compiler libraries), the conversion between Pascal strings and C strings, and the workarounds needed to handle Pascal variant records (unions) in C.
Does anyone have a decent mirror?
Ah, c'mon. WebKit, Linux kernel?
Your average reader will find them inscrutable.
That's just how it is. Some works are simple, elegant and great. Others are hard, deep and also great. It has nothing to do with literature or code.
You picked some very specific authors then pointed at readers. Either you missed the point of the argument or are trying to create a straw man (or just don't understand how analogies work).
He was talking about writers though -- not average readers. And your average writer has certainly read Joyce, Pound, Eliot and Faulkner.
The argument was writers vs developers (not average readers vs average users).
The argument is that most developers don't read code, as opposed to writers. My point is that serious professionals try to expose themselves to the best there is, even if it's damn hard stuff.
I seriously doubt it. I even doubt that your average lit major has read them.
I read a lot, and I often read the author blurbs; my second anecdote is that a large number of writers seem to have studied English or literature. In addition, authors often drop hints of having read the above, listing those works as recommendations or influences, or dropping quotes and allusions.
Still I'm sure there are plenty who haven't read those authors, but perhaps they have read Shakespeare, Melville, Tolstoy etc.
Still, I follow the literary world in 2-3 countries quite closely, and all those are old staples. Your average writer with literary aspirations has read all or most of them, and tons of others. A few decades ago there was no published poet who hadn't read Eliot and Pound.
Heck, even university students in most fields had read those authors back in the day. It was just something you did. Even rock musicians - you can find references to Eliot, for example, in the Beatles, Bob Dylan, and lots of other bands.
In essence, reading other writers (and a lot of times, criticizing them, often in literary magazines) is what writers did/do.
Let me know if you run into any issues.
(2nd EDIT): I just updated the Pastebin cache with the original author's e-mail address, as it was in the blog. Don't mind me, I'm just the schmuck who HN-ed his server's bandwidth.