These simple sites show us something profound: If you want something to last, don't base it on something that won't last. There are some technologies that will never allow somebody to build a site and leave it unchanged for 20 or 25 years. ColdFusion comes to mind; for one thing, almost nobody hosts it anymore. Can you imagine running the same WordPress version for 25 years? The version of PHP it runs on will be EOL long before.
I guess what I'm saying is that if you want to build a site to last 25 years without numerous redesigns, build a static HTML page.
Looks like Web 1.0 got something right after all :)
While this website still works fine, the actual HTML that Frontpage generated isn't exactly easy to maintain if Frontpage stops working for whatever reason.
The author of this website is basically stuck using whatever version of Frontpage supports the markup of his website. And I bet there have been plenty of people who used <some other WYSIWYG webpage editor> who are no longer able to maintain their website because their editor no longer runs on their system.
Microsoft has been fairly good at allowing older binaries to run on newer systems.
Apple is pretty annoying in this regard. There’s a lot of software that doesn’t work on versions maybe only 5 years old.
A lot of software doesn't need to change, to be honest. Microsoft Word, for example. Word processing: you sit down and type stuff, maybe change the font once or twice. I guess the collaborative features are nice, though, being able to edit the same document with others.
It would be fun to use an older machine and see how productive you can be with the old software too!
I've been hit with that: I needed to run Chromium v49 to be able to remote-debug some TVs with old Opera TV SDKs. The version I had stopped working, and several versions that I tried crashed when using the Chromium devtools. I ended up having to use a Windows virtual machine.
I wish I could find the movie on YouTube again, a demonstration of a collaborative text editor from the 1960s. I've been looking for it a number of times just this year.
It's a very interesting show: how they solved the displays by filming the displays in the lab with commercial cameras and sending the picture to the users' screens, how the mouse worked, the five-button keyboard, and so on.
Not to be snarky, but if there is a real need to do this, it's pretty trivial with VirtualBox or DOSBox.
Those applications from 20 years ago running in emulators will work far better in 20 more years than Apps from today that stop working due to remote service dependencies to force vendor lock-in.
It is endlessly amusing to me that the more tightly integrated the cloud services get to conventional computing tasks, the more likely we will end up with Vernor Vinge style programmer archaeologists from A Deepness in the Sky...
When I worked for a NASA contractor doing sounding rocket telemetry, the main telemetry stack programming software was a Turbo C program from 1987-1990 (TDP502.exe on the odd chance that the maybe 50 other people on the planet who have ever used it see this). Works just fine in DOSBox, at least to create files. Still needed an actual older PC with an ISA slot to handle the hardware that TDP knew how to control. But for configuration tasks, Windows + DOSBox + a USB 3.5" floppy drive = I could do things on an actual modern system.
So yeah, you're right, emulation saves the day in many cases. And I felt like a programmer-archaeologist using DOS to launch something into space in the 2010s...
Can you imagine how massive the field of software archaeology will actually need to be to capture an understanding of the human experience of software development in the "distant past"?
Take any given product or system: what percentage of the software involved in implementing the system was written x years in the past? What's gonna happen to that distribution in 10 or 100 years?
I wonder how inevitable it is that this percentage of ancient software in virtually every system will just keep growing and growing over time — until the systems of the future are tiny layers built on top of 1000-year-old, impenetrable old-growth forests which might as well have been written by aliens for all their understandability ...
Virtualizing Windows isn't very hard, even back to something like Windows 95.
On the other hand, only OSX 10.7+ is really easy to run in a VM, 10.5 and 10.6 only work for the server versions, and anything before 10.5 isn't really going to be compatible with virtualization. That's 2007, so OSX lets you virtualize back about 13 years, while with Windows you can go back almost 30 years. People even have Win 3.1 running in VMware.
This is probably due to the fact that there isn't PowerPC virtualization software, but if you need to run OSX software from before 2007, you're basically out of luck.
You can also virtualize windows from just about any OS you can imagine, Mac, Linux, Windows etc, while OSX virtualization has a hard requirement for running on Mac hardware.
There seems to be a misconception that you can only run 10.5 and later in a VM, but you can actually run OSX 10.4 Tiger fairly easily. This is the non server version. [1]
I was able to import almost everything from my old PPC computers. It's not completely virtualized because it's using Rosetta and can't use Classic OS apps. But it's still extremely useful, and way faster than my PPC computers ever were.
>OSX virtualization has a hard requirement for running on Mac hardware.
If you aren't a stickler for Apple's terms of service (if you're doing this for business purposes, I suggest you should be), you can use a tool called macOS unlocker to patch VMWare Workstation to run macOS VMs. Runs great, though all VMWare products can only render display output for macOS in software mode.
Run a shady binary that doesn't seem to have a known author or website, as administrator, so that it modifies the VMWare binaries? A rather... curious approach, but for some reason common on Windows among e.g. gamers.
I've run MacOS in VirtualBox, iirc, without shady patches, though it probably was on Linux.
I have literally never heard of a "gamer" running shady binaries with administrator privilege in my entire life. Maybe you're thinking of the hacker culture of the 80s, but gamers today use launchers to manage downloading, installation and setup of software. Maybe you're thinking of software pirates using scene software as keygens or DRM-defeaters. I suppose that's common among kids who don't buy things (but I don't believe those tools run as admin).
It may be more common in Windows, but I would challenge that: since Windows is basically free and runs on anything from a Raspberry Pi up, the vast majority of "hacky" stuff happens in Windows and Linux. Mac users buy very, very expensive hardware to do very specific tasks, and "hacking around" is often not a good enough justification for the most expensive personal computers money buys.
I would also suggest that it is in the Linux world where running random binaries as root is most common. Found some random repo that claims it's a fork of a good one with a bug fix? Build it and run it!
If the current version of OS X were backwards compatible with 10.0 - 10.4, it would still need both a PPC emulator and a 68K emulator, since OS 9 still had 68K code.
So if Apple had kept "25 years" of backwards compatibility, would they have been better off bundling a 68K and a PPC emulator? Why stop there? Should they have kept compatibility with the Apple //e and bundled a 6502 emulator too?
Someone else was complaining that they didn’t keep FireWire. Should modern Macs come with ADB ports?
Obviously not, but that doesn't prove that there isn't value to having backwards compatibility. Sometimes you just want something to run and not have to touch or change it for a long time.
A 20-year old machine that's critical to a factory can run off a serial cable plugged in to an expansion card running software written in the 90's that will still run on Windows 10. Nobody in their right mind would decide to write that same software on a Mac.
Well, given where all of the PC manufacturers that were around in 1990 are now, compared to the revenue and profit of just the Mac division, it seems like Apple didn't make a bad business decision by not prioritizing backwards compatibility.
If you compare where Apple is and where Microsoft is also, it doesn’t seem like chasing enterprise PC sales was as good of a long term bet as going after the consumer market....
> So if Apple had kept "25 years" of backwards compatibility, would they have been better off bundling a 68K and a PPC emulator? Why stop there? Should they have kept compatibility with the Apple //e and bundled a 6502 emulator too?
I don't think it's unreasonable that Apple hasn't done so, but neither do I think doing so would be unreasonable. Archive.org can emulate Apple II's in your browser, I'm sure Apple could add an equivalent feature to MacOS if that were something they cared to do. They obviously don't, and that's their prerogative.
I have a Windows 10 PC with a PCI (not express) slot that I installed a Firewire card in last year to use 15 year old software still available from Sony's website to rip a stack of Digital8 home movies.
I tackled that project about two years ago. Asked around and a friend had an old laptop with FW port, so I installed Ubuntu on it and copied all my old Video8 and Digital8 Tapes.
In the face of what Apple does for privacy, comparatively, nobody else is doing a damn thing. Privacy is by a very significant margin the most important metric.
True enough. Though to be fair the last new version of a Win16 OS shipped 26 years ago, and Win32 became the standard API in consumer products 24 years ago. There are degrees of worry here. Software of the vintage you're talking about was contemporary with System 7, and the closest ancestor to current OS X was called "NextStep 3.3".
The point upthread was that genuinely useful stuff gets retired just a few years after release in the Apple world, and I think that's broadly true. It's true with hardware too -- professional audio people are stuck with truckloads of firewire hardware that they can't use with their new laptops, for example.
Apple shipped the last 32-bit Mac in 2006, over 10 years before 32-bit software stopped being supported. There were plenty of FireWire to Thunderbolt adapters.
No, the closest ancestor to MacOS X is System 7. There were Carbon APIs until last year. A poster upthread said they could use an emulator; there are 68K Mac emulators available too.
AppleScript for instance is a System 7 technology - not a NextStep technology.
No, MacOS X, when it was originally released, had parts from NextStep and parts ported from Classic MacOS, including QuickDraw, AppleScript, QuickTime, some audio frameworks, etc.
The entire Carbon API was a port of classic MacOS APIs to make porting from classic MacOS to OS X easier.
MacOS X was a combination of both. That was the whole brouhaha of why Apple ported the Carbon APIs to OS X: major developers like Adobe and Microsoft insisted on it.
That's not to mention that the first 5 versions of MacOS X had an entire OS 9 emulator built in.
To take the analogy to the extreme: MacOS had two parents - Classic MacOS and NextStep.
I would disagree; most of what was brought over from Classic OS was ported and adapted out of necessity and was short-lived. OS X was an entirely new operating system that ported some frameworks and software but wasn't backward compatible. Were it so, they wouldn't have provided an emulator.
I think you're just supporting the original assertion that Apple does not support things for very long. Does Software written for OS X v10.1 run on Catalina today without using 3rd party tools or emulators? Software written for Windows 95 still runs on Windows 10.
Sounds to me more like the ported programs were the short-lived part - and IMO, they are not entirely wrong about that.
Sure, Carbon and Rosetta certainly were no mean feat, and the drastic PPC/x86 break is something Microsoft never really had to deal with (heh, the biggest problem trying to run a PPC/MIPS/Alpha based NT application today is actually finding one :) ).
But Apple never went to the same lengths as Microsoft regarding backwards compatibility, and while Carbon and Rosetta immensely eased the transition, the continuity definitely wasn't comparable and it was never transparent to the developers (and in Apple's defense, this was never their intention and they always were quite open about it.)
For one, Rosetta (and thus PPC compatibility) was dropped with Lion in 2011, so no amount of Carbon would help 10.1 applications after that.
And even with Rosetta, each release, especially after Tiger, came with quite a list of API changes and deprecations (with the whole of Carbon declared obsolete in 2012) - and an increasingly long list of high-profile software that would not run anymore and required an update or upgrade. And while Microsoft even did a lot to prevent and/or work around issues with notorious software (hello Adobe! :) ), Apple was far less willing to do so.
I mean, just as an example - I can run Photoshop 6.0 (from 2000) on Windows 10 (certainly no thanks to Adobe), but no chance for PS 7.0 even on Leopard...
PPC to x86 was possibly the smoothest transition I've seen in my lifetime; for most it was just a recompile, and I'm convinced it was only as smooth as it was because of the shit-show transition to OS X.
Apple announced its plans to move to OS X in 1997 and said they'd ship an emulator, Blue Box, to run classic apps. That was met with a resounding "no" from the community.
Carbon was never supposed to exist; the Classic APIs were not memory safe, didn't support threads, and had a lot of other issues. Apple wanted a clean break in the form of Cocoa, but the community said no. So Apple came up with Carbon, which was sort of a port of the Classic APIs to OS X, but because the two operating systems were so different, it wasn't anywhere close to a 1:1 copy and required developers to port to it.
Since its inception, Apple wanted Carbon dead: it required them to rewrite core parts of OpenStep in C, and they had to maintain those parts alongside their Obj-C equivalents. It took them 12 years to get to the point where they felt comfortable killing it off and almost 20 years before they actually could.
> Can you run the PPC version of any Windows NT apps?
Developing for PPC on NT was much like targeting x86 and PPC on OS X: it was mostly a recompile unless the app used assembly. You can't run the PPC version of an NT app on modern hardware, just as you can't run the PPC version of an OSX app on modern MacOS.
The difference though is that PPC on NT never took off, so there's something like 4 or 5 apps for NT versus the thousands or hundreds of thousands for OSX.
I haven't forgotten anything, I just fail to see the relevance to this discussion. (68k? Really? That one's been dead for 14 years. And what is with you and NT on PPC? You really want to start comparing a 25 year old, short-lived, ultra-niche side version no one bought or even wrote software for with the "mainline"?)
I think you missed the entire point of my posting, i.e. that even outside the architecture changes, long-term compatibility was never even near the same level (and a different arch was often not even the culprit). Carbon being available doesn't help you at all when old software still doesn't work.
If you are complaining that you can’t run 25 year old Mac software on an x86 Mac, the only option is for Apple to ship MacOS with a 68K emulator and a PPC emulator. The first version of MacOS that ran natively on x86 came out in 2006.
Yes I realize that PPC Macs came out in 1994. But they required a 68K emulator because even parts of MacOS were 68K.
There were three major breaking-change epochs in MacOS history:
- If you bought the x86 version of software in 2006, it would potentially work until 2019, when Apple dropped 32-bit support.
- If you bought the first version of OS X PPC software in 2001, it could potentially run until July 2011, with the release of 10.7.
- If you bought a classic MacOS app, it could run, pessimistically, from 1992 with the release of System 7 to 2006 with the introduction of the first x86 Macs.
"Carbon was an important part of Apple's strategy for bringing Mac OS X to market, offering a path for quick porting of existing software applications, as well as a means of shipping applications that would run on either Mac OS X or the classic Mac OS. As the market has increasingly moved to the Cocoa-based frameworks, especially after the release of iOS, the need for a porting library was diluted. Apple did not create a 64-bit version of Carbon while updating their other frameworks in the 2007 time-frame, and eventually deprecated the entire API in OS X 10.8 Mountain Lion, which was released on July 24, 2012. Carbon was officially discontinued and removed entirely with the release of macOS 10.15 Catalina."
I think you are confusing "supported" with EoL. Adobe was pissed because there was originally talk of doing a 64-bit Carbon and Apple never shipped it, so they had to move their entire app over.
The main point is that Windows would never stop that API from "existing" in some manner, unlike Apple.
This is just a difference in how both companies view themselves. While Apple claims "it just works", that isn't quite true in some of the cases we have seen. Microsoft has actually done a far better job of this.
I know someone who worked on the Visual Studio team. They literally had 100-200 servers that would run overnight with each build, guaranteeing that the software would install and run on every single permutation of Windows on an array of hardware.
I've only heard complaints from Silverlight and Windows Phone/Mobile developers anecdotally.
From a web perspective (and my experience), .NET Framework 2/4 -> Core is actually not a big changeover outside of the views (probably better if you switched to MVC).
The Windows Phone apps I built are dead now, but that isn't a matter of APIs no longer being supported, but an entire platform going under.
As a macOS user, I had one operating system update kill external GPU w/ Nvidia cards (that sucked) and another update kill 32 bit apps (that one isn't a big one for me personally). All on the same computer.
The entire ASP.Net Core and Entity Framework architecture was changed and is not compatible. Not to mention all of the legacy third-party packages that are .Net Framework only and don't work.
Microsoft also completely abandoned Windows CE/Compact Framework while there were plenty of companies that had deployed thousands of $1200-$2000 ruggedized devices for field services work.
> The entire ASP.Net Core and Entity Framework architecture was changed and is not compatible.
There's been a lot of confusion, due in no small part to Microsoft's branding and communication, but what you said is not at all accurate if not intentionally misleading.
What's been known as .NET for the last 20 years is now called ".NET Framework"; this is not unlike how OS X is now retroactively called MacOS. ".NET Core" is an entirely new framework that just happened to be compatible with ".NET Framework", but as time goes on the two have diverged.
> Not to mention all of the legacy third-party packages that are .Net Framework only and don't work.
".NET Framework" and ".NET Core" are similar to Cocoa and Cocoa Touch in the sense that you can write code that will compile under both AND you can write code for either that will be incompatible with the other. In fact I maintain a half dozen packages that are compatible with both.
> Microsoft also completely abandoned Windows CE/Compact Framework while there were plenty of companies that had deployed thousands of $1200-$2000 ruggedized devices for field services work.
Microsoft didn't "abandoned" Windows CE, it stopped development for it 6 years ago as it was largely dead and Microsoft offers many pathways off of Windows CE. The CF actually runs on platforms other than CE intentionally such that any apps written for the CF will just work elsewhere. AND they still support CE and CF to this day, they just don't maintain or develop new versions of them.
> What's been known as .NET for the last 20 years is now called ".NET Framework"; this is not unlike how OS X is now retroactively called MacOS. ".NET Core" is an entirely new framework that just happened to be compatible with ".NET Framework", but as time goes on the two have diverged.
The two weren't initially slated to diverge at all. .Net Framework and .Net Core were supposed to be separate implementations of ".Net Standard". In fact, you could originally create ASP.Net Core and EF Core apps that ran on top of .Net Framework.
NET Framework" and ".NET Core" are similar to Cocoa and Cocoa Touch in the sense that you can write code that will compile under both AND you can write code for either that will be incompatible with the other. In fact I maintain a half dozen packages that are compatible with both.
Which will not be the case for long since MS has stated that no new features will come to .Net Framework.
Microsoft didn't "abandoned" Windows CE, it stopped development for it 6 years ago as it was largely dead and Microsoft offers many pathways off of Windows CE. The CF actually runs on platforms other than CE intentionally such that any apps written for the CF will just work elsewhere. AND they still support CE and CF to this day, they just don't maintain or develop new versions of them.
Which is also not true. The last version of Visual Studio that supported the Compact Framework was VS 2008. It was far from dead in the enterprise by 2010 or even 2012. Companies were still relying on CF to run on their $1200-$2000 ruggedized field service devices. They had deployed literally thousands of devices in the field. I know, I was developing on VS 2008 until 2011 just to support them.
I mean devices like these that cost $1300 each. I deployed software for a few companies that had thousands of Intermec and ruggedized Motorola devices.
> The two weren't initially slated to diverge at all. .Net Framework and .Net Core were supposed to be separate implementations of ".Net Standard".
Uh... no. Hard fucking no. .NET Standard is the commonalities between Core and Framework. Core and Framework were NEVER the same or intended to be the same.
Framework is all of the legacy Windows specific Libraries for things like the File System, Active Directory, etc.
Core is intended to be platform agnostic and cross platform.
And yet you can still run .NET 1.0 apps on Win10, and this isn't changing in the foreseeable future.
Hell, you can run VB6 apps on Win10 - it even ships the runtime! - and the remaining hold-outs in that developer community have been complaining about abandonment for two whole decades now.
"The .NET Framework 1.1 is not supported on the Windows 8, Windows 8.1, Windows Server 2012, Windows Server 2012 R2, or the Windows 10 operating systems. In some cases, the .NET Framework 1.1 is specifically identified as required for an app to run. In those cases, you should contact your independent software vendor (ISV) to have the app upgraded to run on the .NET Framework 3.5 SP1 or later version. For additional information, see Migrating from the .NET Framework 1.1."
Which would be no different from a macOS app hard-coding a check for 10.3 and not working if you have anything newer. Neither says that the app _couldn't_ run, just that a badly thought-out gate prevents it.
> There must be enough apps that don’t run that MS thought to call it out.
The callout exists because Microsoft takes a different approach to support than Apple. Microsoft provides support material for all of its legacy and deprecated software, as well as the ability to download and install it. So it's important to identify and track incompatibilities between them.
When Apple moves on, the past is whitewashed over, and when support stops they forget it ever happened.
And so the mystery of why a 32bit version of Windows 10 still exists is solved.
What's mildly annoying is that much of the early 32bit Windows software came packaged in 16 bit installers. Office 97 would be such a breeze on modern hardware.
Office 97 can be installed on 64-bit Windows 10 with the original installer. I did it just last month and it runs without any problems... and it is fast.
There are special workarounds: many of the old installers run a small piece of 16-bit code which doesn't work in 64-bit Windows, but because it's so common, Windows just runs a replacement version.
It failed the last time I tried, which must have been on 7. It would be a strong example of Microsoft dedicating resources to compatibility if they added it for 10 or in a patch update (both their resources and the users': there's a crazy amount of checking for necessary compatibility hacks going on whenever an executable is started).
There are some third-party implementations of NTVDM that allow running 16-bit DOS and Win16 apps directly on Win64. Although DosBox is still the easiest route, and "good enough" in practice.
> The author of this website is basically stuck using whatever version of Frontpage supports the markup of his website.
But at least getting it done largely depends only on them, and it's not too hard. I have friends who swear by ProTracker and still use it, even though it's thirty years old and the platform it's running on has been dead for more than twenty. They don't have an Amiga but it's trivial to get it running in an emulator today.
You can run Windows 98 in a browser, and your web editor in it. It's certainly less complicated than hosting a WebObjects application today.
I know a person who is maintaining a few sites she built around 2005 with a version of Dreamweaver a little older than that, so she never dares to upgrade her Dreamweaver version.
The whole thing is terrifying and horrific to me, but they keep paying her to do the work so she's fine with it.
I actually just finished redesigning my site with static HTML using Dreamweaver 2004 on an iBook G4. Why? Why not? My little brother passed away a couple of years ago and I inherited his iBook, and I have decided it's going to be my personal laptop from here on out, even if all I use it for is VNC to one of my other computers. Plus, as mentioned above, it can still run all that delicious old Mac stuff from System 7 through OSX 10.4.x, and it's all "abandonware" now, yet in many cases still VERY usable.
I cut my teeth using Dreamweaver for tripod and geocities sites way back in the last century and have fond memories of it. It was great for templating headers and footers before I discovered PHP, which I have less fond memories of, but that's another story.
With all the churn in certain areas of tech, it's easy to forget how much stays the same.
I was using Macromedia Fireworks MX from the early 2000s right through to about 2013 to do graphics for sites I was building for people on the side.
I used it while it got several version updates, Adobe took it over and updated it for five or six years, and then discontinued it.
Meanwhile I was still using the old version to make beer money.
I only quit using it because I finally admitted I kind of suck at graphic design. Besides, there doesn't seem to be much need for graphic design in much of the modern mobile-first, material design world anyway.
I bailed on the whole thing and have been sticking to back end at the day job these days. Things move at a slightly less hectic pace back here for me most of the time.
You would just edit the HTML pages, lol. I still use Dreamweaver for the visual editor if I need to copy and paste from a PDF and want perfect HTML. No one has made anything like it. No current editor has a quick SFTP that lets you connect, edit, and move on.
I mean .. it's just a static HTML editor at that point (maybe it does some includes/builds to simplify things). If you're just pushing out static content, you don't have to worry too much about outdated libraries and security issues, so long as the web server it's being served from is maintained and up to date.
"Dreamweaver Templates" was basically an early static site generator that made it really easy to design and include site-wide or section-wide elements.
Yeah you could always edit the individual files that it outputted, but in some cases people were using this system to manage sites with hundreds or thousands of pages. As recently as a couple years ago it was how the natural history museum in DC managed their site content.
Presumably the author used a WYSIWYG editor in the first place because they are not a technical person, so for them to not only learn enough HTML/CSS/JavaScript to turn to hand editing but also to understand Frontpage's noisy output would probably take enough effort that they might rather decide to shut down the site if they're not able to continue using Frontpage. Hiring a dev to redo the site is another option, but that presumes they have enough money to invest in a hobby site...
Nah, I don't think this is such a big deal. Adding a row to a table is much easier than creating a table from scratch. And Frontpage's output isn't that noisy -- I had to go through that experience myself. That said, my old Frontpage from 2005 (which I copied from Win XP probably) still works fine except for a warning it throws at start about not finding some registry value. I wouldn't want to use it any more (it doesn't understand CSS and screws it up), but if I wanted, I could.
Ironically, I bet the author has learned more technical skills by maintaining a system that can continue to run their version of Frontpage than they would have if they had just taught themselves HTML from the start.
I properly own several versions of Office all the way back to 95, so I can say this, as I am covered :)
Years ago I found a "Portable Frontpage", which of course I downloaded and still have zipped somewhere. I know that MS wouldn't like this much, but life is life and Portable Frontpage exists. So as long as there is Windows, Frontpage will work!
OTOH, I guess that maintaining a Windows VM for use with Frontpage would be a lot simpler and safer than maintaining an old software stack server-side.
Actually, the browser makers are shouldering the burden of supporting the dreck output by FrontPage. Remember that the intention was to make it work in Internet Explorer and crash in Netscape.
> If you want something to last, don't base it on something that won't last.
and
> I guess what I'm saying is that if you want to build a site to last 25 years without numerous redesigns, build a static HTML page.
While simplicity is a great way to future proof things, I'm not convinced that this argument in general would work nearly as well without the benefit of hindsight. One could be forgiven for confusing it with "guess the future correctly". Plenty of relatively safe bets from 10, 20, 30 years ago haven't panned out that well. It's an interesting line of thinking though: exactly what properties of HTML make it so long lived?
My text files still work. I have MUD design documents from when I was in high school (mid to late 90s). Org mode and Markdown are kind of eternal formats. Even if all the tooling dies, they still look decent. Basic HTML still works well enough as well. You can write a parser for XML pretty easily. HTML can also be processed and rendered trivially. I think we could collectively find some other technologies that are likely to be around in another 20 years. The simpler the file format the more likely it is to be around :)
edit: A few more popped into my head. CSV. SQL schema + Data dumps (text format). The common theme to everything here is plain text. SQLite, although binary, is probably close to eternal. Git is eternal enough (recent HN post showed even POSIX shell is good enough to write a basic git client). JSON is easy to write a parser for as well. YAML.
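A quick illustration of how little that bet depends on any particular tool - a minimal sketch using only Python's standard library (Python is just an example here; the point is that these formats need no special tooling to stay readable):

import csv, json, sqlite3, xml.etree.ElementTree as ET
from io import StringIO

# JSON, CSV and XML: one standard-library call each, no third-party tooling.
print(json.loads('{"title": "MUD design notes", "year": 1997}')["year"])
print(list(csv.reader(StringIO("title,year\nMUD design notes,1997"))))
print(ET.fromstring("<doc><title>MUD design notes</title></doc>").findtext("title"))

# SQLite is binary, but a reader ships with the language runtime itself.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE docs (title TEXT)")
db.execute("INSERT INTO docs VALUES ('MUD design notes')")
print(db.execute("SELECT title FROM docs").fetchone()[0])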
I think you can generalise the advice: remove as many processing steps as you can.
It's not so much that you needed to guess that HTML was going to be as long-lived as it is; it's that HTML is the final product that actually loads on the user's computer, and those tend to stick around for a long time (or at least be emulated). The code that lives on a backend server somewhere, not so much.
For what it's worth, I don't think this example is necessarily bulletproof: it requires a working copy of Frontpage. If Microsoft behaved more like Apple it might have been deprecated away long ago!
Google controls the major web engine. I don't trust Google not to deprecate parts of HTML over time because the new shiny is "better". I would rather maintain Markdown generators, which I can update to emit whatever markup the latest Google insists on, than rewrite all my documents.
I'm currently tasked with writing a UI for a machine that has a 25 year expected lifespan before wear means it is replaced. This is a real concern - think about where computers were 25 years ago and try to find something you are sure will work and look nice.
Google wields too much power. To an extent, they can dictate to website owners what HTML is allowed and not allowed thanks to their dominance in search. This is compounded by the fact that their browser marketshare via Chrome and now Microsoft Edge basically allows them to do what they want with HTML.
Matters are even worse. Last year, the W3C became the "yes-man" of Google. They decided to stop developing the HTML standards and just start rubber stamping whatever WHATWG produces. WHATWG is run by Apple, Google, Microsoft, and Mozilla. And who has the most power in that relationship? Yep, Google.
Even if Google did do that (which they’ve shown no signs of, and they are still far from a browser monopoly when you look at iPhone etc) it wouldn’t stop HTML from being read. Translating from HTML -> GoogleHTML wouldn’t be meaningfully different to translating it from Markdown.
It could be argued that AMP was that attempt, and the only reason AMP gained traction was that Google started using it in the carousel of their SERPs.
While Safari, when mobile is included, has ~17% of the market, that's not enough of a counterweight when you combine Google's browser share with their search engine share.
Is the machine connected to the network/internet? Are you planning on any software updates? I'm curious how you plan on handling https root certificate updates.
HTTPS is something I haven't figured out. If anybody has a good answer to this, please let me know.
The only thing I can come up with is http (no s!) and firewall rules that limit connections to 192.168.1.xxx - or otherwise not allowing connections from outside of the local subnet. I don't like it, but I don't have a better plan.
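For what it's worth, a rough sketch of belt-and-suspenders for that plan (Python; the subnet, port, and handler are placeholders, not the parent's actual setup): serve the UI over plain HTTP but also refuse requests from outside the subnet at the application level, so a misconfigured firewall doesn't silently expose it.

import ipaddress
from http.server import HTTPServer, SimpleHTTPRequestHandler

ALLOWED = ipaddress.ip_network("192.168.1.0/24")   # hypothetical local subnet

class LanOnlyServer(HTTPServer):
    def verify_request(self, request, client_address):
        # Drop anything that isn't on the local subnet before handling it.
        return ipaddress.ip_address(client_address[0]) in ALLOWED

# Plain HTTP on purpose: no certificates to expire over a 25-year lifespan.
LanOnlyServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler).serve_forever()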
This is one reason why the static site generator I use for my personal website uses HTML rather than something like Markdown.
I don't think Markdown is going anywhere, incidentally, or that it would be hard to process on my own if I needed to. But the HTML I use is simple enough, and Markdown only decreases the probability the site will last a long time.
Markdown and/or markdown processors are known to change.
Since there's no single Markdown spec, determining just how a page will render, or what will break, is a bit of a crapshoot. And since Markdown treats nonparsable markup as ... plain text, you don't even get errors or other indicators of failure. You've got to view and validate the output manually or by some other means.
With formal tag-based markup languages (HTML, SGML, LaTeX, DocBook, etc.) you've at least got 1) an actual markup spec and 2) something that will or won't validate (though whether or not the processor actually gives a damn about that is another question, hello, HTML, I'm looking at your "The Web is an error condition": https://deirdre.net/programming-sucks-why-i-quit/)
I can't find the post at the moment, but someone recently wrote a cogent rant on the fact that a change in their hosting provider (GitHub via a static site generator IIRC) had swapped out markdown processors, with changed behaviours, rendering (literally) all their previously-authored content broken.
Which is indeed a pain.
I personally like Markdown, and find it hugely convenient. For major projects though, I suspect what I'll end up doing is starting in Markdown, and eventually switching to a more stable markup format, which probably means LaTeX (HTML has ... proved less robustly stable over the 25+ years I've worked with it).
Though for simple-to-modestly-complex documents, Markdown is generally satisfactory, stable, and close enough to unadorned ASCII that fixing what breaks is not a horribly complicated task.
I appreciate your reply. Seems Markdown is more complex than I recognized and this just makes me want to avoid it more. If you do find the rant you mentioned, let me know.
> HTML has ... proved less robustly stable over the 25+ years I've worked with it
The first website I made in 2002 still views fine in a modern browser. I didn't do anything fancy, though. I would be interested in what has been unstable as it might give me ideas on what to avoid in HTML.
I don't find HTML to be that much harder than plain text or Markdown so I think I'll keep using it for smaller projects. LaTeX is worth considering as well, particularly given that I will have math on some of my webpages. One issue is that the stability of LaTeX depends strongly on which packages you use. I need to take a closer look at the health of every package I use. I think avoiding external dependencies is easier with HTML.
My sense is that Markdown is probably pretty safe for most uses, particularly if you control the processing. If not, then yes, it can bite. For me that means pandoc to generate endpoints such as HTML, PDF, etc. I'm fairly confident that most of that toolchain should continue to work (provided computers and electricity exist) for another 2-4 decades.
For certain more complex formatting, Markdown has limitations and features are more likely to change. But I've used Markdown to format novel-length works (from ASCII sources, for my own use) with very modest formatting needs (chapters, some italic or bold text, possibly blockquotes or lists), and it excels at that.
For HTML, it's a combination of factors:
- Previous features which have been dropped, most to thunderous applause. (<blink>, <marquee>, etc.)
- Previous conventions which have largely been superseded: table layouts most especially. CSS really has been ... in some respects ... a blessing.
- Nagging omissions. The fact that there's no HTML-native footnoting / endnoting convention ... bothers me. You can tool that into a page. But you can't simply do something like:
<p>Lorem ipsum dolor sit amet.
<note>Consectetur adipiscing elit</note>
Nulla malesuada, mauris ac tincidunt faucibus</p>
... and have the contents of <note> then appear by some mechanism in the rendered text. A numbered note, a typographical mark ( * † ‡ ...), a sidenote, a callout, a hovercard, say.
In Markdown you accomplish this by:
Lorem ipsum dolor sit amet.[^consectetur] Nulla malesuada, mauris ac tincidunt faucibus
[^consectetur]: Consectetur adipiscing elit.
Which then generates the HTML to create a superscript reference, and a numbered note (when generating HTML). Or footnotes according to other conventions (e.g., LaTeX / PDF) for other document formats.
- Similarly, no native equation support.
Maybe I'm just overly fond of footnotes and equations....
But HTML and WWW originated, literally, from the world's leading particle physics laboratory. You'd think it might include such capabilities.
- Scripting and preprocessors. I remember server-side includes, there's PHP, and JS. Some browsers supported other languages -- I believe Tcl and Lua are among those that have been used. Interactivity and dependency on other moving parts reduces reliability.
The expression "complexity is the enemy of reliabilty" dates to an Economist article in 1958. It remains very, very true.
HTML is for me more fiddly than Markdown (though I've coded massive amounts of both by hand), so on balance, I prefer writing Markdown (it's become very nearly completely natural to me). OTOH, LaTeX isn't much more complex than HTML, and in many cases (simple paragraphs) far simpler, so if I had to make a switch, that's the direction I'd more likely go.
I agree with you entirely on the abandoning of conventions with HTML. I haven't paid much attention to multi-column layouts in CSS over the years but my impression is that it's gone from tables to CSS floats to whatever CSS does now that I'm not familiar with. Browsers are typically backwards compatible so this isn't that big of a deal to me. But I have no idea if what's regarded as the best practice today will be seen as primitive in 15 years.
> The fact that there's no HTML-native footnoting / endnoting convention ... bothers me.
I've seen people use the HTML5 <aside> element for sidenotes, styled with CSS. Some even make them responsive, folding neatly into the text as the viewport shrinks. I'm not sure if this is the intended use for <aside> but the result is reasonable and I intend to do the same. If you're set on footnotes, though, yes, I don't know a native implementation.
Equation support with MathML is okay in principle but not in practice. I'd like to have equations without external dependencies (MathJax's JS alone is like 750 kB!), but that's not possible until Chrome decides to catch up with Firefox and Safari on MathML. I've been thinking about just using MathML as-is (no external math renderer), and if Chrome users complain, I'll tell them to get a better browser. ;-) Maybe that'll help some Chrome users understand why they should test their websites in other browsers.
Semi-relatedly, I think even the linear form of UnicodeMath [1] is very readable, and it would be great if there was more support for building it up into nicer presentation forms in the browser wild (MathJax has had it on the backlog since at least 2015, for instance), as that seems to me to be a better "fallback" situation than raw MathML given its readability when not built up.
> I haven't paid much attention to multi-column layouts in CSS over the years but my impression is that it's gone from tables to CSS floats to whatever CSS does now that I'm not familiar with.
CSS Grid [2] is the happiest path today. It's a really happy path (I want these columns, this wide, done). CSS Flexbox [3] is a bit older and nearly as happy a path. Some really powerful things can be used with the combination of both, especially in responsive design (a dense two dimensional grid on large widescreen displays collapsing to a simple flexbox "one dimensional" flow, for example).
Flexbox may be seen as primitive in a few years, but Grid finally seems exactly where things should have always been (and what people were trying to accomplish way back when with tables or worse framesets). Even then, Flexbox may be mostly seen as primitive from the sense of "simple lego/duplo tool" compared to Grid's more precise/powerful/capable tools.
Thanks for mentioning UnicodeMath. That does seems like a better fallback solution than raw MathML. It appears there's a newer version of the document you linked to that was posted on HN, by the way: https://news.ycombinator.com/item?id=14687936
Thanks for mentioning grid, as that's a tool I've not looked at myself.
CSS Columns and Grid are not entirely substitutable, though they share some properties.
I see Columns as a way of flowing text within some bounding box, whilst Grid is preferred for arranging textual components on a page, more akin to paste-up in Aldus PageMaker (am I dating myself?), though on the rubber sheet of the HTML viewport rather than on fixed paper sizes.
Yeah, they are very different things. One is for text/inline flow and the other block flow. As a fan of CSS Columns (multicol) I hope that the interaction between Columns and Grid gets better standardized. (In my case I wanted better support for embedding Grids in columns; my tests worked in everything but Firefox. So it is interesting to me that Firefox seems the most interested in pushing multicol forward as a standard [1], since it stopped being a Trident/Spartan priority when Windows 8.1/10 abandoned multicol as a key UX principle of Windows 8 apps.)
My preference is to use them with @media queries to create more or fewer columns within auxiliary elements (headers, footers, asides), usually to pretty good effect.
Multi-column body text is largely an abomination.
For images, I'm still largely sticking to floats.
I've done some sidenote styling that I ... think I like. I don't remember how responsive this CodePen is or isn't though I've created some pretty responsive layouts based on it:
I feel like the difference with Markdown is that it's not meant to be a hidden source format. It's meant to take an existing WYSIWYG styled-text format—the one people use when trying to style text in plaintext e-mail or IM systems—and to give it a secondary rendering semantics corresponding to what people conventionally think their ASCII-art styling "means."
If a Markdown parser breaks down, it's quite correct for it to just spit out the raw source document—because the raw document is already a readable document with clear (cultural/conventional) semantics. All a Markdown parser does is make a Markdown-styled text prettier; it was already a readable final document.
Whether or not it's intended to be a hidden source format, the fact remains that if it does not render reliably and repeatably, it's failing to do its job.
Markdown's job is to be a human-readable, lightweight, unobtrusive way of communicating to software how to structure and format a document.
It's one thing for a freshly-entered document to fail -- errors in markup occur and need to be corrected. It's another to change the behaviour and output of an unchanged text, which is what Markdown implementations have done.
(I've run into this myself on Ello where, For Mysterious and Diverse Reasons, posts and comments which I'd previously entered change their rendering even when I've not touched the content myself. This is compounded by an idiotic editor which literally won't leave plain ASCII text alone and insists on inserting hidden characters or control codes. Among the reasons for my eventual disenchantment with what would otherwise be an excellent long-form text-publishing platform.)
No, that’s a misunderstanding. Markdown is, as I said, a formalization of existing practice. Nobody’s supposed to be “writing Markdown” (except computers that generate it.) You’re supposed to be writing plaintext styled text the same way you always have been in plaintext text inputs. Markdown is supposed to come along and pick up the pieces and turn them into rich text to the best of its ability. Where it fails, it leaves behind the original styled text, which retains the same communication semantics to other humans that the post-transformation rich text would.
The ideal Markdown parser isn’t a grammar/ruleset, but an ML system that understands, learns, and evolves over time with how humans use ASCII art to style plaintext. It’s an autoencoder between streams of ASCII-art text and the production of an AST. (In training such a system, it’d probably also learn—whether you’d like it to or not—to encode ASCII-art smilies as emoji; to encode entirely-parenthetical paragraphs as floating sidebars; to generate tables of contents; etc. These are all “in scope” for the concept of Markdown.)
In short: you aren’t supposed to learn Markdown; Markdown is supposed to learn you (the general “you”, i.e. humans who write in plaintext) and your way of expressing styles.
If there's any required syntax in Markdown that a human unversed in Markdown wouldn't understand at first glance as part of a plaintext e-mail, then Markdown as a project has failed. (This is partly why Markdown doesn't cover every potential kind of formatting: some rich-text formatting tags just don't have any ASCII-art-styled plaintext conventions that people will recognize, so Markdown cannot include them. That's where Markdown expects you to just write HTML instead, because at that point you've left the domain of "things non-computer people reading will understand", so you may as well use a powerful explicit formal language, rather than a conventional one.)
Interesting viewpoint, though not one that persuades me.
At least not today ;-)
Human expression is ultimately ambiguous. In creating some typographic output, you've got to ultimately resolve or remove that ambiguity. Preferably in some consistent fashion.
There's an inherent tension there. And either you live with the ambiguity or you resolve it. I lean on the "deambiguate" side. Maybe that means using Markdown as a starting point and translating it ultimately to some less-ambiguous (but also less convenient) format, as I've noted.
But that means that the "authoritative source" (Markdown manuscript) is not authoritative, at least as regards formatting guidelines. Whether or not this is actually a more accurate reflection of the status quo ante in previous, print-based, typographic practice, in which an author submits a text but a typesetter translates that into a typographic projection, making interpretations where necessary to resolve ambiguities or approximate initial intent, I don't know.
Interesting from a philosophical intent/instantiation perspective though.
It is exactly the "guess the future" problem that static sites avoid.
The vast bulk of software goes unsupported in less than 25 years. If you want to depend on something that long, you can guess which package will survive that long, or you can store your data in formats that the widest array of tooling supports.
If you drop into a coma after uploading your static HTML and wake up in 25 years, you might have to use whatever fills the text-manipulation-scripting niche then to beat it into the right shape to import into whatever kids these days are using.
If you used Wordpress, well, maybe it takes over the world, maybe it ends up a Wikipedia entry. (Putting aside, of course, that your site began hosting cryptominers a week after you slipped into that coma because you missed an update.)
> It's an interesting line of thinking though: exactly what properties of HTML make it so long lived?
I've thought about this on and off for a few years. Here's what I've come up with:
1. Popularity. You can't really display anything in a web browser without it, blank pages with one AJAX script notwithstanding.
2. Ease of use. Open a text editor, type some markup, save the file with .html, and open in a browser. When you're done, transfer to a server to show the world. That's a pretty straightforward process.
3. Well-defined, open standard. Every important piece of the web is defined, from the markup to the protocol to transfer it. I think that reasonably bug-free implementations of those standards help.
It's not that there's a well-defined open standard.
It's that browsers will eat any old crap that's thrown at them and turn it into something plausible, if not precisely what the author intended or reader really wants.
Yes, there's a standard, and yes it's open. It's observed far more in the breach, as a few minutes with a validator on well-known sites will demonstrate.
Your comment alone (prior to my response to it) returns:
> It's that browsers will eat any old crap that's thrown at them and turn it into something plausible, if not precisely what the author intended or reader really wants.
Your comment merges well with one slightly above from TheFlyingFish: it's that browsers are pretty good at displaying stuff even when the HTML is not to spec.
And really, it's that everyone uses browsers that still display text and such on the screen even if it's broken in several places.
This could change if Google decided to stop showing pages with broken HTML - like how they killed Flash, big cuts at a time.
I have turned several WordPress-based sites into static HTML with one of the static-HTML-making plugins - and that turned those tools into the right ones for those jobs. I think most WP sites can be converted and be just fine; most people don't add new posts to them regularly, from what I've seen.
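The plugin route is the easy one, but the underlying idea is small enough to sketch. A rough Python outline (example.com and the output folder are placeholders, and a real conversion would also need assets, redirects, and error handling):

import os, urllib.parse, urllib.request
from html.parser import HTMLParser

START = "https://example.com/"   # placeholder for the WordPress site
OUT = "static_copy"

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [v for k, v in attrs if k == "href" and v]

seen, queue = set(), [START]
while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    page = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    # Save each page as a plain .html file mirroring the URL path.
    path = urllib.parse.urlparse(url).path.strip("/") or "index"
    dest = os.path.join(OUT, path + ".html")
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    with open(dest, "w", encoding="utf-8") as f:
        f.write(page)
    # Queue same-site links for the next pass.
    collector = LinkCollector()
    collector.feed(page)
    for link in collector.links:
        absolute = urllib.parse.urljoin(url, link).split("#")[0]
        if absolute.startswith(START):
            queue.append(absolute)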
> exactly what properties of HTML make it so long lived?
I think we're thinking about this backwards. It's not anything inherent to HTML that makes it long-lived; it's that the code to parse static HTML is simple, it's more or less standardized, and it has stuck around for a long time.
Back in 2001 I redid the UWA computer club website (https://ucc.asn.au/) using XSLT with a custom doctype ('grahame').
In the early 2000s XML was the cool shiny thing. They're still using it, in fact I found out recently that someone wrote a Markdown to 'doctype grahame' converter to 'modernise' the site.
I guess what I actually built back then was an early static site generator, but it's still kind of cool they're using it 19 years later, hacky as it was / is :)
I still use my old XML doctype with xslt to produce some websites I maintain. Whenever, if ever, xslt is removed from browsers, converting it to a static site generator will be easy.
I regret nothing. Editing simple xml using Emacs is a breeze.
Last time I worked with XSLT was in 2014, redesigning a major Brazilian airline reservation and ticketing system. At the time their passenger service system (Navitaire New Skies [1]) had already switched their white label front-end app from a home grown XSLT web framework to ASP .NET MVC 5, but the company I was working for wasn't particularly interested in paying the (higher) fee for using the "new" front-end framework.
There is an obscure search engine called wiby.me that only indexes pages like what is posted here. I used to design websites in the late 90's, and very much miss the simple HTML pages of yore.
I don't think this is really unique to Web 1.0; certainly something that still works from the Web 1.0 days seems "impressive" just because of the passage of time, but there's probably some element of survivorship bias there. You mention ColdFusion as an example, but this guy's site is made using FrontPage. He didn't know in 2001 that he'd still be able to run FrontPage in 2020. He made a bet, and it paid off. Other people made similar bets, on other technologies, and unfortunately got it wrong.
My personal website uses Jekyll, and while there's always the possibility it would become abandoned and stop working (I've definitely found some upgrades to be a pain, and Ruby tooling in general doesn't help either), I'll always have the simple, readable Markdown files the site is based on. While this wouldn't be an option for a non-technical website author, if I really had to, I'm sure I could write a simple markdown->html renderer over a weekend (or a converter to transform it into the future format-du-jour).
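For what it's worth, a toy sketch of what such a weekend renderer might look like (handling only headings, paragraphs, emphasis, and links; a real one would need lists, code blocks, and HTML escaping):

import re

def inline(text):
    text = re.sub(r"\[(.+?)\]\((.+?)\)", r'<a href="\2">\1</a>', text)
    text = re.sub(r"\*\*(.+?)\*\*", r"<strong>\1</strong>", text)
    text = re.sub(r"\*(.+?)\*", r"<em>\1</em>", text)
    return text

def md_to_html(source):
    chunks = []
    for block in source.strip().split("\n\n"):
        block = " ".join(block.split())          # collapse soft line breaks
        heading = re.match(r"(#{1,6}) (.*)", block)
        if heading:
            level = len(heading.group(1))
            chunks.append(f"<h{level}>{inline(heading.group(2))}</h{level}>")
        else:
            chunks.append(f"<p>{inline(block)}</p>")
    return "\n".join(chunks)

print(md_to_html("# Hello\n\nPlain *readable* text that will still [parse](https://example.com) in 25 years."))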
I've been using ColdFusion for over 20 years. The HTML it creates can be as simple or complex as the developer intends. The output can last decades without updating, so I don't understand your comparison.
I recently changed jobs to a shop with 15-20 year old ColdFusion+SQL instances that were originally HP3000 Image databases and COBOL screens. At first I laughed; now, after a couple of years, I agree ColdFusion is pretty robust and its HTML isn't bad at all. It's easy to do fairly complex forms, file operations, email generation, document generation, database in-out things, and it just keeps running and running. It's easy to read and understand, and even though we're using MX-era code, the server still installs and runs on recent (Ubuntu 16.04 LTS) Linux with no issues.
For a blog or personal site (that didn't have any functionality that really needed a back end) I suppose you could just scrape the generated pages and push them up to any host, but CF seems like a fairly awkward static site generator compared to the usual suspects like Jekyll, Hugo, etc.
> Looks like Web 1.0 got something right after all :)
The secret is creating a standard early on that thousands of different pieces of software depend on, so that changing it would be expensive and require a phenomenal amount of decentralized coordination.
Don't worry about making it good -- just make it good enough that people won't want to tear their hair out and unanimously agree to never touch it again. Make the short term cost of applying hacks on top of it low, and the cost of throwing everything out high.
Trac has been end of life for some time. It doesn't run on python 3. There are open bug tickets about it that have been stale for years.
Maybe it will be upgraded now that python 2 is officially dead, but given it wasn't so far and there was no effort in that direction, I wouldn't bet on it.
I find it odd too that they did some minor releases, yet python 3 was not on the radar.
End of life is correct. It is end of life since it doesn't run on current platforms.
I am not sure if the latest distributions (Ubuntu, Debian, RedHat) have all removed python 2 packages. If not, it will be gone with the next major release. You're going to be in trouble running software with no available interpreter, plus all the libraries in use are effectively abandoned.
> I am not sure if the latest distributions (Ubuntu, Debian, RedHat) have all removed python 2 packages. If not, it will be gone with the next major release
Red Hat has not. Ubuntu has not in its most recent stable release. Debian "unstable" is still using Python 2, so I don't think your statement holds up.
I think I should have said "don't base it on something that CAN'T last". This requires no future knowledge. We know that a WordPress version and its supported PHP will be obsolete.
If you haven't been keeping the WP back end up to date, it's not functionality that's a problem, it's security. Unpatched WordPress installs account for a huge portion of malware distribution. There are a number of exploits that allow attackers to upload files to your server, so they upload malicious payloads that exploits elsewhere then download onto infected systems.
Most of those exploits are from plugins. If they aren't using those, they can also change the default login URL. Also, WordPress lets you export and reimport to current versions without coding. I think it's one of the best future-proof platforms; most of the web still runs on it.
Many WordPress exploits are in plug-ins but there's still plenty in the base install (over multiple versions).
Also suggesting that "most of the web" runs on WordPress is a bit absurd. WordPress accounts for a huge portion of spam-y SEO blogs and other outright noise on the web. It's popular no doubt but definitely not most of the web.
Its popularity and porous security are a big problem, as it's such a huge malware delivery vector. Everything from worm payloads to JavaScript crypto miners is served up from millions of exploited WordPress installs.
Oh man, don't I know it. I work for a small business whose long-neglected Wordpress site (nothing e-commerce-ey, almost no plugins, just a glorified billboard/contact-info type site for a non-tech company that no-one had updated in literally years) had been exploited in uncountable ways. It had probably been owned long before I was even hired a year ago. A few months ago it just broke, it was too riddled with problems to salvage.
I was able to convince the bosses to let me take on the fixing-the-site project solo, even though my job has little do with IT. I replaced it all with a static site generator I wrote in Go. No logins, no PHP, no database, nothing to exploit in the first place. Anyone in the office can update it by copying images into arbitrary subfolders in the generator's images folders, and double-clicking the update executable. It builds and uploads a fresh site in a couple of minutes with nice gallery carousels. And as a bonus it loads basically instantly on even the bargain-basement shared hosting we're on.
I do wish that IE compatibility wasn't one of the bosses' firm requirements, due to a lot of our clients not being tech people and still using IE on decade-old computers. Life would be so much simpler if I could just use CSS grids for layout. I f'ing love grids.
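For anyone curious, the core of a generator like that is surprisingly small. A toy sketch of the same shape (in Python rather than Go, with made-up folder names; the real one described above obviously does more, like carousels and uploading):

import html, os

IMAGES_DIR = "images"    # anyone in the office drops pictures into subfolders here
SITE_DIR = "public"      # the generated static site lands here

for folder, _, files in os.walk(IMAGES_DIR):
    pictures = sorted(f for f in files if f.lower().endswith((".jpg", ".png", ".gif")))
    if not pictures:
        continue
    title = html.escape(os.path.basename(folder))
    gallery = "\n".join(
        f'<img src="/{folder}/{html.escape(name)}" alt="{html.escape(name)}">'
        for name in pictures
    )
    out_dir = os.path.join(SITE_DIR, folder)
    os.makedirs(out_dir, exist_ok=True)
    # One plain HTML page per folder: nothing to patch, nothing to exploit.
    with open(os.path.join(out_dir, "index.html"), "w", encoding="utf-8") as f:
        f.write(f"<!DOCTYPE html>\n<title>{title}</title>\n<h1>{title}</h1>\n{gallery}\n")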
Wow, TIL that "36%" is most. Your own quote tells you that their measurements are only sites they scan and can determine the CMS used. As I said, WordPress is extremely popular in the SEO spam community and powers thousands of dead blogs, but it's a far cry from powering "most of the web".
None of that is material to the original point that thousands upon thousands of unpatched WordPress sites might work but also deliver tons of malware. WordPress' popularity is problematic because it has had and will keep having serious security problems. WordPress exploits are entirely automated and performed constantly by zombie networks.
I've been using my Wordpress site for 12 years now. Sure I upgrade versions from time to time, but the original post is still there and works perfectly.
It's had around 8 million page views in that time.
No way in hell today’s HTML will survive 25 years now that google owns it, browsers will literally crash due to lack of user tracking. Best just host a static txt file.
Nah, the next iteration would be called GHTML, it will include fact checking by Google AI and Google Analytics by default, all for free. Everyone will use it otherwise Chrome will give you strange security errors and after all you wouldn't want to use the web that is full of fake news and other content that can be offensive to someone. /s
What you said made me glad that I've just developed DocxManager (https://docxmanager.com) - its concept is like WordPress (document-focused editor, themes and templates) but it generates standard html/css/js, and use Word as the document editor.
Fortunately there are plenty of static site generators. Frontpage is out of support and likely the currently popular generators will meet their end one day as well. But even then you can still run them in the future, and their output should not have any major issues (unlike a CMS which might get hacked if it's not kept up to date).
> Can you imagine running the same WordPress version for 25 years?
If you keep active on your Wordpress install, the regular updates will be no issue for you and will (almost) never break your website. Not sure why you would expect a regular Wordpress user to run the initial install without recommended/mandatory upgrades over a long period of time.