Is anybody still using Windows 95? (quora.com)
125 points by vishnuharidas 46 days ago | 339 comments



When I was still working at Arecibo Observatory, there were definitely a couple of systems running Win95 that were considered data-critical. And they were still running on ~486 hardware, IIRC.

They hadn't been replaced as they were running some custom ISA interface boards developed by a research group in the 90s, and the community was still using the data output by the machine. And since it had been trucking along for ~20 years, convincing them they needed to develop a replacement was hard.

When I left, it was still there, chugging along. And once a day, a tech would wander in with a floppy and copy some data off and wander back to the control room where a USB floppy drive was attached to a workstation specifically to read this floppy and let the tech copy the data files off to a network location.


Sounds like their system works. If that 486 hardware lasted 20 years already, seems like there's a pretty good chance it'll outlast most of the modern stuff it would be replaced with.


It works, yes, but they were running out of spare parts when I left. That system's parts were the subject of frequent eBay searches, as it was becoming quite fragile. I think it was on its third power supply and a second mobo or processor, I can't quite remember.

It was on a redundant online UPS, because the last time it went completely off and was booted from cold, it ended up needing parts. So no, I'd say it's not likely to outlast modern replacements; it's just had lots of support and organ transplants.

Systems like that ultimately end up being a resource drain: unlike modern hardware, keeping them alive isn't a matter of wandering over to a retailer and buying standard replacement parts, but of sourcing used or NOS parts and hoping they work, which often takes far more time and effort from someone who could be doing some other important task.

The problem is that in open use observatories like Arecibo, a lot of equipment like that is put in place by third parties, but becomes our responsibility to maintain. But the science is important, so we maintain it as best we can.


Eventually, yes, it becomes a resource drain. If you want to know when, ask the finance people.


> If that 486 hardware lasted 20 years already, seems like there's a pretty good chance it'll outlast most of the modern stuff it would be replaced with.

Unlikely. Good-quality 'modern stuff' is built far better than what was available back then. Power supplies have gotten much better (and have more protections), as have motherboards with solid-state capacitors. Older motherboards loved their electrolytic capacitors.

Processors had fewer protections. Thermal monitoring was in its infancy.

Memory sticks back then were very unreliable. As were hard drives.

Then there's just a matter of age. If it has lasted 20 years, chances are it won't last much longer.

It has probably lasted 20 years because it wasn't being powered on and off constantly. It most likely had a stable power source and - most importantly - a climate-controlled environment with little dust.


A 486 system was built before the capacitor plague. Quality electrolytics may not last forever, but they can last a long time. It was also built before RoHS and the poorly formulated or poorly applied lead-free solders, common for many years, that reduced longevity.

Processors had fewer protections, and thermal monitoring was in its infancy, because almost no 486 used enough power to cook itself to death, even with just a passive heatsink.

I don't know about memory sticks and hard drives from that period. It's quite possible they've replaced the hard drive with compact flash or something too, which should be pretty reliable given they apparently only make a floppy disk's worth of data every day.


This! Got a used AMD-K5 PR133 in a big-tower a long time ago, integrated it into my home lab, and used it to experiment.

About a year later it went through my cleaning routine: vacuuming dust out, cleaning the fans, etc. I took the CPU cooler off to check for bad thermal grease and discovered it had none at all! No pad either; someone had just clipped the thing on, and instead of direct contact there was a small air gap you could look through from the sides.

Unbelievable.

When I got it, it must have been in use for years already!


I'd love to hear more about what "the capacitor plague" is


https://en.wikipedia.org/wiki/Capacitor_plague

It's why computers from the 80s can last until now without needing any recapping but computers from 2002 or so are ticking time bombs.


Um... all electrolytics eventually go bad as the electrolyte solution dries out. Recapping retro computers and radio gear from the 70s through the 90s is very much a thing.

That being said, electrolytics were somewhat less common than they are now. But the Capacitor Plague is not the only reason electronics are recapped, especially older/vintage/retro electronics.


Yeah, but the 70s-90s stuff may die /eventually/, not fairly consistently after 2-3 years; that was the point.


But not after 2007?


Well, for some years after 2007, you still have old stock bad capacitors floating around. And you're into lead-free solder problems.


Modern components are run much closer to their limits, simply due to engineering optimisations.

A 486 doesn't even need a heatsink; it's passively cooled and perfectly fine with that, even though software of the time (mainly DOS) didn't "idle" the CPU at all: it was drawing full power all the time.

That also means if you manage to boot DOS or similar era software on a modern machine (especially a laptop) that was designed to have the CPU idled for most of its life, it is likely to go into thermal throttling immediately.


Is that really true? I booted FreeDOS (had it installed for a while just for kicks) on my Thinkpad T420 (i5-2540M) and I didn't notice any heatup.


Yes and no. The 8086 had a HLT instruction that was meant to replace the busy loop waiting for an interrupt. It was soon understood that you could also save power by turning off the clock. I'm not sure if the "SL" versions of the 486 turned off the clock with the HLT instruction, but later versions did. Use of it was apparently introduced in MS-DOS 6.0, so FreeDOS, Windows 95 and OS/2 2.1(?) would have probably used the instruction to effectively idle the processor. Not sure if Windows 3.1 made use of it, but it would make sense as the basis of the system's event loop.

https://en.wikipedia.org/wiki/HLT_(x86_instruction)
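
The pattern is roughly the following; a minimal sketch in C, assuming a real-mode DOS target and GCC-style inline assembly (older DOS compilers spell it _asm { hlt } instead), with the event counter standing in for whatever an interrupt handler would set:

    /* Idle loop using HLT: instead of spinning, the CPU stops until the
       next hardware interrupt (timer tick, keypress, ...) wakes it up. */
    static volatile int pending_events = 0;   /* set by an interrupt handler elsewhere */

    void main_loop(void)
    {
        for (;;) {
            while (pending_events > 0) {
                --pending_events;
                /* handle one event here */
            }
            __asm__ volatile ("hlt");         /* sleep until the next interrupt */
        }
    }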


I recently installed DOS 6.22 (and Windows 3.11) in a VM and it causes one core to sit at 100% all the time.

I don't know if the VM changes anything, but it definitely seems as though it does not halt while waiting for activity.


I can recall there were programs like "Waterfall" and "Rain" which were supposed to force Win9x to HLT when idle.


Besides what the sibling reply mentioned about capacitors and processors, WRT memory that machine has survivorship bias. The fact it's been running for thirty years tells you the RAM in the machine is good and is unlikely to have a problem barring damage. The same is true for all the other still working components.


True, but people have backported modern hardware to that era. You can take a modern power supply and drop it into that PC, I'm almost sure.


I worked in an office with a similar setup that "worked", but it was very fragile. When the HDD started failing on the old 486 machine, we scrambled to migrate as much as possible (including a few rewrites and ports of utility programs). We were lucky that our problem was the software not running on modern systems (for reasons), not a dependency on the hardware. Mostly we communicated with hardware systems via serial, so any computer with a serial port (or a USB-to-serial adapter, though those are often unreliable) could be made to work if the software existed.


After 20 years, might they be in territory beyond the far side of the 'bathtub curve'?



> Sounds like their system works. If that 486 hardware lasted 20 years already, seems like there's a pretty good chance it'll outlast most of the modern stuff it would be replaced with.

Surely it makes some sense, but think about the hardware suddenly breaking down and not being able to find anything to replace it with. Even with spare parts it could be a pain as the old generation retires.


Just get an industrial board which provides ISA-slots and has something like http://www.vortex86.com/ on it.


I actually was surprised to see these are made, as I considered getting one for a few projects. The ones I saw were pretty expensive, with some being resold on eBay for just under a thousand dollars. Reading some reviews, though, it seems you have to be careful, as not all the boards may support what you want. Apparently a good many don't support DMA.


I thought about this, but let it stand.

It can be tricky. I had to work with something like this a few years ago and managed. As I remember, I had to get SciTech Display Doctor/UniVBE from some abandonware sites to make use of the weird onboard VGA under Windows 95. The ISA card for the CNC machine worked; it used DMA, among other things. It all worked, and was even more fluid to operate/program.

The cost was just over 400EUR. (for the board)

Anyway, it's industrial; the form factor often does not fit existing cases and back panels, but it's doable. More so than hunting eBay or similar for spares of dubious provenance.

Only exception would be if the "App" and the board/card were designed so sloppily that they really only run on exactly that same hardware.


This is due to the rule of antifragility.

https://en.wikipedia.org/wiki/Antifragility


I suppose it's fine as long as it's not connected to the internet.


IIRC, the machine was on an isolated equipment network with no external access for some time, but the network drivers would often cause some kind of conflict with the ISA interface card (and custom driver) so the network interface was disabled. So yeah, should be safe (and is likely no longer hardwired to the network).


Do you suspect there are any 0day/worms still targeting Windows 95 systems?

I wonder if an unpatched Win95 box is actually safer than an unpatched modern Windows box.


Win95 runs everything as root by default. Any process can open another's memory. It has the same registry entry as Windows NT for running a program on startup. The keylogging API is basically the same. The IShellFolder and IWebBrowser interfaces are identical, so as long as the malware doesn't request specific classes or features outside the realm of Windows 95, it can probably do pretty well. Luckily, Internet Explorer 3 doesn't support TLS.
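
For illustration, the autostart location really is the same Run key on both families; here's a minimal, read-only sketch (assuming a modern Win32 toolchain, linked against advapi32) that just lists whatever is registered there:

    /* List autostart entries under the classic Run key.
       The same ANSI APIs and key path exist on Win9x and the NT family. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        HKEY key;
        if (RegOpenKeyExA(HKEY_LOCAL_MACHINE,
                "Software\\Microsoft\\Windows\\CurrentVersion\\Run",
                0, KEY_READ, &key) != ERROR_SUCCESS)
            return 1;

        char name[256];
        BYTE data[512];
        for (DWORD i = 0;; i++) {
            DWORD nlen = sizeof(name), dlen = sizeof(data), type;
            if (RegEnumValueA(key, i, name, &nlen, NULL, &type,
                              data, &dlen) != ERROR_SUCCESS)
                break;
            printf("%s = %s\n", name,
                   type == REG_SZ ? (char *)data : "<non-string>");
        }
        RegCloseKey(key);
        return 0;
    }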


Except for the Ex functions. I wonder if a malware can be run under KernelEx.


That floppy disk controller is the only way into the box, and it isn't nearly as vulnerable as a modern USB controller since there are no smarts to it! The Windows 95 system is going to be safer than a patched Windows 10 on the network for their use case.


For the common-case en masse compromises, not a concern. For targeted attacks it can still be an issue. Stuxnet is an example of this where they compromised an air-gapped system. Stuxnet was obviously extremely sophisticated, technically and for the overall operation, but the principles still apply. Also people frequently misjudge how big of a target they may be & what their risk profile really is (not saying for this use-case specifically, just in general).


>Do you suspect there are any 0day/worms still targeting Windows 95 systems?

You'd be surprised how much malware is "legacy" 32-bit and works without issue on WinXP. A lot of malware even shies away from Unicode APIs and specifically calls the *A Win32 API functions.

Considering the Win32 API is backwards compatible to a fault, I don't see why modern malware couldn't absolutely wreck a Win9x box.


Yes, but Windows XP was in widespread use until recently and is still in use in a lot of places. I don't know where "netmarketshare.com" gets its user statistics (Google just surfaced them when I searched for that), but it reports 1.26% for Windows XP, which I think is still a large number. There are still so many users of Windows XP that Microsoft released a critical security patch for it last year, despite XP being officially EOL for years now.


Sure – once the malware lands. But I doubt anyone is going after any Win95 vulnerabilities, so how does it get on the box to start chomping?


Loaded from the boot sector of any floppy that happens to be present when the system is restarted, unless the boot order has been changed from factory defaults and the (possibly very old) coin cell powering NVRAM holds sufficient charge to retain these changes when (if) the system is ever powered off.


In general, attacks (e.g. most ransomware campaigns) are not automated malware spreading on its own, they involve people. If it's accessible on a network (which is the big 'if'), then it's obvious on any scan that it's a Win95 machine, and then you can try to look up a vulnerability for any exposed service. For example, if file sharing is running, then all the SMB vulnerabilities (e.g. EternalBlue and friends) apply also to Win95 systems, and I'm not sure if Windows 95 had a patch made, backported and issued for them.


Eh, no. Not a chance. I have some old-as-heck games which don't work even under W2k with the w9x compat mode (it's hidden, you need a command to enable it), so a total nope under XP.


Windows XP (NT kernel) and 9x are totally different OSes, using totally different kernels. They share some API, but not really the interesting parts.

I work on software that still supports XP with only a little pain. Visual Studio 2019 will still install the necessary toolchain if you ask nicely. Targeting Windows 95/98 in 2020 sounds like a nightmare.


> Visual Studio 2019 will still install the necessary toolchain if you ask nicely

Could I get a link to how to do this and/or web search terms? Is this just something targeting visual C runtime 20xx or older?


The risk is hardware-related: finding replacement parts. But yes, the software part seems to work fine.


I wonder if it would be just as easy to run some of that on VirtualBox or some other hypervisor running Windows 95; that way they'd have access to modern replacement hardware.


But there’s also a fair chance something will break and you can’t buy spare parts for museum pieces like this.


How are they doing backups of the software and data?


The entire disk (all 10s of GBs of it) was cloned and backed up as a disk image, and can be re-imaged whenever necessary.

The data is copied off daily via floppy (don't get me started...) and is backed up on local network shares. The specifics of that weren't in my need-to-know and were managed by the IT group.

The system was essentially stateless, so spinning it up from the image wasn't an issue, but the PC itself is essentially part of the "equipment", as it ran the custom drivers, which were specifically designed for that PC and a one-of-a-kind ISA card.


Reading this... my first thought was "But newer x86 can.... oh right ISA..." then I googled for USB ISA slots... and they are apparently a thing. That doesn't mean someone wouldn't need to write a new driver but at least they could still use the cards.


Many years ago I was using an MCA (multichannel analyzer) ISA card connected to a surface barrier detector to measure beta decay spectra. This thing basically counts the number of pulses from the detector, binning them according to the height of the signal. It had its own acquisition software meant for DOS (the manufacturer provided the sources as well).

The thing is, you could operate it under Windows (98) in protected mode, but then you lost statistics. How come? Well, Windows scheduling meant that the MCA only took data while it was allotted time slices; as a result, the dead time of the whole setup increased, and you couldn't tell by how much.

Under real-mode DOS it took advantage of DOS being almost an RTOS, with the program running as a single task, and you could be pretty confident about your data.

I don't think using adapters with this sort of card is a good idea. It's not just a matter of drivers; sometimes the old stuff simply doesn't work well when plugged into newer systems, because everything has changed and you'd need to redesign everything. If the old stuff does the job, the OS being unsupported only becomes an insurmountable problem for the folks who consider using an unsupported OS some sort of crime in itself. Well, there's usually more than just an abstract computer being used in such circumstances.


Exactly. I believe someone investigated using a modern system with some kind of passthrough to a Win95 VM, or some such magic, and it ultimately came down to needing to re-implement either the driver or the software, which is hard to do without funding and time. Or any good documentation of what the original function and behavior was.


Yeah my first thought was just running a 16bit OS on modern hardware (it will do it if you're careful and do the right things). But then the issue of it being an ISA card popped up... that's not really a solvable problem with modern hardware unless you can convince an IHV to make you a custom motherboard. Although apparently some exist ( https://www.ibase.com.tw/english/ProductDetail/EmbeddedCompu... ) probably for industrial applications. While I'm very sure you could get DOS running on that... I'm less sure of Windows 95. The google search did show other options that are plausible but I doubt many of them would arrive working as is without needing capacitor replacement at a minimum.


There are all sorts of USB-to-Foo interfaces, the vast majority of which implement whatever Foo is just barely well enough to say it works with a straight face. Like the USB-serial adapters... most of them work OK at 9600,8,N, but try 5-bit with 1.5 stop bits (something a 'real' PC serial port can be persuaded to do) and good luck.
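
For reference, on a legacy 16550-style UART that framing is a single register write; a sketch assuming real-mode DOS access to COM1 at I/O base 0x3F8 and a Borland-style outportb() (the names and the DOS target are assumptions), which is exactly the kind of thing a USB bridge's fixed feature set won't let you do:

    #include <dos.h>   /* outportb() as in Borland/Turbo C style compilers */

    #define COM1_BASE 0x3F8
    #define COM1_LCR  (COM1_BASE + 3)   /* Line Control Register */

    void set_5_data_bits_1_5_stop(void)
    {
        /* LCR bits 1:0 = 00 -> 5 data bits
           LCR bit  2   = 1  -> 1.5 stop bits (only valid with 5-bit words)
           LCR bit  3   = 0  -> no parity */
        outportb(COM1_LCR, 0x04);
    }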

So, yeah, I'm sure that USB-ISA adapter probably works with that old 3Com ethernet card or that Soundblaster card that 'definitely' sounds better than anything else, but a custom card with custom software and probably unknown tolerance to variations from 'real' ISA? Good luck with that.


Oh, probably, and even if you did find one that works "well", you'd have to buy a ton of them to ensure you have backups. I'm well aware of the issues. But without an alternative or a grant to update the hardware, what do you do? At some point period hardware with ISA slots is going to be prohibitively expensive.


You can still buy new-manufacture motherboards with ISA slots and modern processors. Intel still makes (or until recently made) ISA bridge chips for modern southbridges. And it's something that a smallish FPGA could handle. There is a whole ecosystem of companies that recreate old-style motherboards for exactly this scenario.

My brother's company does IT stuff for the manufacturing sector. Very conservative. I've worked with a number of his customers to source exactly this sort of thing. Years ago they asked my thoughts on an Italian-made cloth dyeing machine the size of a bus with a DOS PC controller and a minimally documented ISA controller. A mid-7-figure US$ setup; the company that built it is long gone. We found a shop that supplied new motherboards, modern (at the time... P3/P4 era, I think) with ISA slots, which the vendor guaranteed would be every bit as slow and weird as a PC/AT. Migrated everything over, the company bought a stack of spare motherboards new, and I think they're still using them.

Now...you want a challenge? Migrating the proprietary, mostly undocumented software, all in Italian (we aren't native speakers...though my high school Latin occasionally helped) from an old ESDI disk to something, anything, else. That was much more interesting.


Just like most USB-to-parallel adapters will only work with parallel port printers and not much else.


To save others a search: http://arstech.com/install/ecom-prodshow/usb2isar.html - about $145 each.


Currently I have a related task: a telescope is being operated using a set of old hardware (late 90s to early 00s) and software of the same age. Due to the current situation, observations aren't performed during lockdowns, so the idea is to migrate everything into virtual machines with PCI passthrough, so that VNC or other remote desktop software can allow remote operation. Does anyone have experience with this? What are the pitfalls? I plan to use Xen or KVM.


I did something similar for an organisation that had a strange music licensing setup for pub jukebox machines. A significant portion of their business hinged on this one ancient DOS pc with custom hardware and a licensing dongle.

Ultimately it comes down to whether you can get the inputs and outputs you need replicated/attached to the virtual machine, and an exact copy of the machine itself into virtual form - the latter challenge can include having to do risky stuff like physically removing the source HDD to plug it into various interface converters and then taking an image from it using an intermediate machine that can read it.

Depending on how your setup works you may run into issues with things like time precision within a VM - if high time accuracy is important in the work it's doing, you'll need to pay attention to ensuring VM time matches real-world time.


Some suggestions off the top of my head, based on our experience:

- Use Proxmox. Even the free version can go a long way (because backing up VMs periodically is a good thing, and Proxmox can do it fine).

- If you're going to run Linux, run XFCE desktops with X2Go server. X2Go can work almost in any network condition.

- Test PCI forwarding extremely well before taking the plunge. Especially if you're going to virtualize specially designed equipment. If it works, it's magic. If it doesn't then you're in big trouble.

- If the PCI cards will be migrated from older equipment to newer, test whether they work well with spread spectrum clocking. If they don't, make sure the server you use allows disabling it.


The amount of questions I have about this would mean we’d be here for days but:

How long ago did you work at Arecibo?

What was it like?

What did you do there?

Did you ever... find.. anything that you can’t talk about? Blink twice if yes!!


If you're really curious, hit my profile and email me. Always happy to chat about AO and the fun stuff there.

I was there for about 3 years and left around 2016. It was one of the best and most interesting places I've worked at so far.

The people were great, the work was great, and I personally felt the work came with a huge feeling of accomplishment and made me feel like I was contributing to a better world by facilitating scientific progress.

I was part of the Electronics Dept. which was ultimately responsible for the scientific hardware, the signal paths, and everything between. There were some people in Physical that were more responsible for the electro-mechanical stuff like motors and such. I was primarily responsible for assisting with the analog signal path and was considered a Receiver Specialist, but the latter half of my time there was spent writing a lot of digital control stuff for newer projects as I was the only one on staff at the time who had the skills and time to take those on.

As for finding anything..... that's classified :p

We did manage to talk to an old NASA satellite from the late 70s. The ISEE-3/ICE reboot project. Some details on Wikipedia here: https://en.wikipedia.org/wiki/International_Cometary_Explore...


What did the ISA boards do?

Also the name sounded familiar, this was the radio telescope which incurred damage recently.


The ISA board was a custom interface to a number of GPS receivers and some IO chips for time sync calibration and tracking for Pulsar astronomy.

And yes, this is the one that recently suffered a cable failure that damaged the reflector.


Curious, how long ago was that? I visited last year and got a tour from Angel. Yea, they had a lot of old hardware.


When I left? Around 2016.

I'm almost certain that '95 machine is still sitting in the Clock Room to this day.

Man, I miss Angel, gotta shoot him an email sometime soon. Miss sitting around and chatting about ham radio with him!


I help to maintain an isolated network of desktops running Windows 95. It's a complex EHR system with multiple clients.

It's a family business, and the software works really well (a medical niche), so the investment to update the stack doesn't make any sense. (And there's no need to bother clinicians with things that don't add value.) I have a VM for tweaking and working with it. I have multiple ETL scripts that extract the data to a modern ERP.

I don't have access to the source code, but I have tried to reverse engineer it from multiple angles. Essentially it's a 16-bit client written in Delphi with a database (dBase) that doesn't support multiple readers or writers, shared via a network drive. Clients read files from the shared drive and create locks to avoid multiple clients working on the same data. That's the main source of pain, but the clinicians adapt to it in a week or so.
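
The locking scheme sounds like the classic lock-file dance over the share; a rough sketch of the idea (POSIX flavor for brevity, with a made-up .LCK naming convention - the actual 16-bit Delphi client obviously has its own implementation):

    /* Advisory lock-file pattern over a shared drive:
       O_CREAT|O_EXCL only succeeds for one client at a time. */
    #include <fcntl.h>
    #include <unistd.h>
    #include <stdio.h>

    int acquire_lock(const char *lockpath)      /* e.g. a PAT0042.LCK next to the tables */
    {
        int fd = open(lockpath, O_CREAT | O_EXCL | O_WRONLY, 0644);
        if (fd < 0) {
            fprintf(stderr, "record already locked by another client\n");
            return -1;
        }
        return fd;
    }

    void release_lock(int fd, const char *lockpath)
    {
        close(fd);
        unlink(lockpath);   /* deleting the lock file releases the record */
    }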

I haven't yet seen a modern system as complete as this one. There are multiple clinical centers in the area (25ish) using the same software, and the 2-3 that moved to newer systems greatly regretted it.

The market is too small for any new developments. But the guy who built it in the 90's maintains it and earns enough to live happily.


How do you deal with operating system security?


An isolated LAN and no USB peripherals. There's a wireless network with internet but no device has access to both.


I have industrial machines that run the user interface on Windows 3.1.

There was a fad for a while of not using PLCs, and instead using 'regular computers', at least for a portion of the system (usually a PLC still handled the most performance sensitive parts).

When you have a production line that's 100 feet of steel, motors, ovens, and other equipment, and the cost of redoing all the control systems is quoted at hundreds of thousands of dollars, then why change it? It still works.


I guess it would come down to a combination of the company's risk tolerance and the cost of the worst-case scenario. Eventually the system is going to blow without possibility of repair, and then you have the expense of downtime plus replacement. So one argument in favor of replacing "prematurely" is to prevent just that. Like how you might sell a car for more toward the end of its life instead of waiting until it's EOL, when you'd get less money for the car and suddenly need to buy a new one. Granted, a replacement system could be worse than the original.


How do you find parts?


There are literally millions of old computers sitting around in warehouses and storage closets all over the place. That's besides the mounds of new old stock sitting around.

If you're looking for replacement parts for old PC equipment, you don't need a million units, you just need a couple units. Finding a handful of units for replacement parts is pretty easy.


And it's also environmentally friendly to keep old parts going instead of sending them straight to the dump.


Reminds me of George RR Martin (author of the Game of Thrones books) who uses DOS for writing books.

[1] https://www.bbc.com/news/technology-27407502


I use WordPerfect 6.2 for DOS, not for any nostalgia or legacy reasons, just because it's a full-featured and highly configurable word processor that I can use in a terminal. I only use it for writing letters and so on, nothing too serious, but I prefer to stay in the terminal if I can.

It works beautifully under dosemu2, which has a terminal mode that can translate various VGA modes into S-Lang calls (S-Lang is like ncurses, so no X11 required). I find this technically impressive, and it makes a lot of old DOS software indistinguishable from native Linux software: stdin/stdout, parameters, host filesystem access, etc. all work transparently.

Here's a screenshot: https://twitter.com/taviso/status/1272670107043368960/photo/...

It can import TTF fonts and print to PostScript, which I just pipe into ps2pdf and then handle on the host.

I'm not aware of any other full-featured modern word processor that can run in an xterm. I know about wordgrinder but it's very very basic. You could use a text editor, but it's not ideal for layout because it doesn't understand things like proportional font geometries - you need that to know how lines/glyphs will fit on the physical page when it's printed. You could write it in some form of markup, html, TeX, markdown, whatever, but if I'm just trying to format a document I prefer a word processor.

(Note: dosemu2 doesn't require virtual 8086 mode, so it works fine on x86-64)


Brilliant - I've long observed that almost everything we are doing with our GHz computers, we were doing previously with old MHz systems. We have more pixels (but use them for the same result), more memory (but everything has grown to fill it), more bandwidth (but websites are packed with junk bytes).

The only "new" thing enabled by our bigger systems is "big data", which is largely a process of finding patterns that will fool the user/purchaser/customer.


> I've long observed that almost everything we are doing with our GHz computers, we were doing previously with old MHz systems.

I do also, but then I have to pinch myself and remember some of the cutting-edge software that really does make use of the hardware - e.g. games, video editing, neural networks, etc. I know we could already do these things on lesser machines, but there is no doubt that the level at which they are currently done could not be replicated on a lesser machine. And also remember the efficiency of systems such as web servers; today's average usage would have been a DoS attack in the past.

I do look in disappointment, though, at text editors, window managers, file viewers, etc. that, despite having much more computing power available, offer not many more features but still eat tonnes of resources.

I'm currently (slowly) working on a window manager for X11 which tries to bring back some of these ideas, but for modern devices: https://github.com/danielbarry/oakwm/ It's built on top of wm2 (which itself has roots going back to Bell Labs' Plan 9). A lot of the work is in ripping out unnecessary features and making it touch-friendly. The idea is to run it on the PineTab Linux tablet.


I'd like a browser that is just a nice browser. It should fit easily into a few tens of MB and start instantly. It wouldn't waste hundreds of MB on implementing dangerous/pointless/wasteful JavaScript APIs, like the battery state or physical screen APIs; it wouldn't support (literally) 4300 options; it wouldn't save 200 MB of config data; and it wouldn't load many MB of data just to start and display a blank page!


Dillo/NetSurf may be of interest to you. They fit the size requirement as well as the "lack of JS" requirement (i.e. no JS support at all). "appsites" will definitely not work in them, but the document-oriented web and things like HN are fine.


I've used Dillo, but unfortunately most web pages are seriously malformed. Would be nice if it was possible to get some better CSS support.

I would like to see better JS support, but the scope of JS is simply insane for such a small browser. It's unfortunate that much of the web is completely unusable without running JS. Perhaps it's possible to first run the page through a larger browser engine and then send the processed content to the small browser (such as Dillo); that would massively widen the scope of what it could display.


Use dillo from HG, it has better CSS support.

>But the scope of JS is simply insane for such a small browser.

Edbrowse has JS support thanks to duktape.


> Use dillo from HG, it has better CSS support.

Here? https://hg.dillo.org/dillo/

Seems to be still pretty old...

> Edbrowse has JS support thanks to duktape.

Huh, very interesting: https://duktape.org/

I initially assumed by ducktape you meant "barely held together"!


For the record, Chrome on Android is ~60MB, less than half the size of desktop Chrome (~150MB). Android Chrome supports the same set of JavaScript APIs. Clearly that isn't the whole picture.


I'm preeettty sure that's just because it relies on the separate WebView app. Using WebView is also how it's possible for people to make sub-megabyte Android browsers, such as [Naked Browser](https://play.google.com/store/apps/details?id=com.fevdev.nak...) and [Via](https://play.google.com/store/apps/details?id=mark.via.gp&hl...)

*Correction: according to https://developer.chrome.com/multidevice/webview/overview they are still separate; it references the Android L dev preview, so it's not the latest source, though...


Actually, you can select which Chrome to use for the WebView in Android developer settings if you have installed multiple versions of Chrome. Chrome stands on its own.


I measured Trichrome, which is the separate APK where the actual rendering engine is stored.


So Chrome for Android is 10MB larger than the entirety of Damn Small Linux, which includes a desktop environment with three web browsers (firefox, dillo and netrik).

While 60MB is still humongous, it does make sense that Google would at least care slightly on Android, as the majority of the Android market are low-to-mid-tier smartphones with reasonable performance instead of the "desktop-in-a-pocket" that are the current flagship phones.

But while they care slightly on Android, they don't care at all on desktops.

Remember kids: Large binaries are slow binaries.


Firefox on my mac is over 200MB. Is it smaller on Damn Small Linux?

And yes, there is clearly only an incentive to care on Android. The automated tooling to track Chrome binary size only works on the Android build.


>Large binaries are slow binaries.

False. Loop unrolling is always faster.


> False. Loop unrolling is always faster.

Blatantly false, which is why heuristics decide when to unroll.

Large code trashes instruction caches, and cache misses are very expensive. More code, more misses.

For very small loops, the unroll can pay off, but these are often too small to have significant size detriment.
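
To make the trade-off concrete, here's the same reduction rolled and hand-unrolled; illustrative only, since whether the bigger version wins depends on loop size, the instruction/uop caches and the compiler, which is exactly why it's left to heuristics:

    /* Same sum, two shapes. The unrolled body has fewer branch checks but a
       larger code footprint; past a point that footprint costs more than it saves. */
    double sum_rolled(const double *a, int n)
    {
        double s = 0.0;
        for (int i = 0; i < n; i++)
            s += a[i];
        return s;
    }

    double sum_unrolled4(const double *a, int n)
    {
        double s0 = 0, s1 = 0, s2 = 0, s3 = 0;
        int i = 0;
        for (; i + 3 < n; i += 4) {
            s0 += a[i];
            s1 += a[i + 1];
            s2 += a[i + 2];
            s3 += a[i + 3];
        }
        double s = s0 + s1 + s2 + s3;
        for (; i < n; i++)      /* leftover elements */
            s += a[i];
        return s;
    }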


Depends largely on the state of code caches, microcode and bus structure. Loop unrolling can be the worst thing you can do, on some architectures.

See, the hardware folks have listened in on the compiler people and their problems. They've done things like identifying loops and rewriting them in microcode for optimization. If a short loop can fit entirely within the CPU's code buffer, speed goes way up. Unroll the loop and blow the CPU's code buffer, and you defeat the optimization and lose all that.


What you want isn't possible, with the web being like MTV and having moved far beyond its original intention. The best you can do is get one of the small browsers and live with broken pages. Some people do just that. I think you should think in relative terms though. Think about how much a "10 MB" browser took up in main memory back in the day.


> I think you should think in relative terms though. Think about how much a "10 MB" browser took up in main memory back in the day.

This is exactly the point: software has swelled to use the resources available to it, so with each new iteration your machine gets faster but what it runs gets slower. It doesn't feel like this is something we should settle for.


Well, that and high-quality video. The one and only tangible benefit of this new web is better-quality video. Everything else sucks as much as or worse than in the days of AltaVista and punch-the-monkey ads.


And browsers that don't crash all the time.


And search is good now.

But apart from the HD video, the search, the aqueducts, the roads, the education, the browsers that don't crash all the time, and the wine... what have the Romans ever done for us?


> And search is good now.

I disagree. Searching for keywords is practically impossible nowadays, with all the smart AI-powered™ search engines that unhelpfully return results that are not relevant to your query. And SEO spam, which has killed the niche, high signal-to-noise handcrafted websites in favor of content aggregators.


Over the last 2-3 years I've seen a dramatic deterioration in search. Whenever I'm looking for anything that isn't very mainstream, I get pages and pages of irrelevant results and SEO-hacking sites.


Oh, and XMLHttpRequest and Websockets, so we don't have to rely on META REFRESH in hidden frames...


Sorry, search used to be good, around ~15 years ago.


I never once had a browser crash on me in the nineties.


> I never once had a browser crash on me in the nineties.

IE5-6 were a crashfest.


I used Netscape on a Mac in the 90s.

I lost track of how many times I saw the spinning beachball of doom. The only solution for which was the power off and on again.


Remember "popup spam" and then how enough windows would open that none would respond?

I'm guessing anyone who didn't get browser crashes in the 90s was only browsing a couple of sites or something. Certainly anything before IE5.5 was crash city - in particular Netscape 2.0!


Chrome still crashes all the time for me - in fact, it seems to be crashing more than it used to. It's just that now the crash is isolated to a particular tab.


And that you can resize without the browser reloading (very slowly) the URL.


I forgot about those! Hahaha good times


I've been a developer for a while, starting from QBasic, then Pascal, Delphi, C/C++, Flash, Qt, etc. happily switching from one technology to the next when it makes sense.

Today I use Electron and React Native, which needless to say are not very popular, but for sure I could never have developed the kind of cross-platform software I write today with the technologies I used many years ago. Partly for lack of skills, but also of time (cross-platform development was way more difficult), or simply because computers back then were not powerful enough.

I don't have any special nostalgia for old technologies, some stuff were good, some not so much. And as OP is showing you're still free to use old software if you don't like what's being done today.


> ... or simply because computers back then were not powerful enough.

They still aren't.


MHz level stuff works for 'work' that normal people do - writing documents, emailing, etc. I dreamed for a while about creating a MHz-level processor from the ground up with an open hardware specification and creating a simple kernel + OS to run on top of it, with new secure protocols for networking (websites, not reinventing TCP/IP) and communications (email). Right now every single computer is backdoored by the governments of the most powerful countries: we have Intel MEs and AMD PSPs in the hardware, and the frameworks everything uses - Linux, OpenSSL, etc. - are almost all huge complex codebases that the NSA has tons of zero days for. If you could get open MHz-level hardware and get OpenBSD or something to run on it, and develop applications with a focus on secure code, it would be pretty great.

But ultimately this would all fail. Aside from the network effects - nobody's going to use it - the fast processors and tons of memory are necessary. We just don't think about it unless we're running out of it.

- high-resolution photo and video - photo editing programs like GIMP and Darktable can spend some time processing photos; these days, even with complex effects, you usually never experience lag of more than a few hundred ms on many-megapixel photos, because our hardware is fast. Same for video; the memory, storage space and bandwidth are hard requirements, and going back to 360p is not really acceptable.

- high-resolution monitors - no point in having great-looking 4K 10-bit color video without a 4K monitor, and now you're stuck having to push 20 Gbps through your DisplayPort cable on a 500 MHz processor. And text also looks much better at 4K. Also, you could say this is a bit unnecessary, but compositing window managers are pretty great and I would say a core requirement of modern GUIs, and having eye candy like wobbly windows, window shadows (those are actually very helpful, try turning them off), etc. is expensive. My Dell XPS laptop from 2017 couldn't handle wobbly windows without visible stutter at 4K60; my desktop with a $400 GPU from 2019 can keep up at 4K@144Hz.

- new technologies - VR/AR, fast voice recognition and neural networks - this is all cutting edge stuff but the use cases are obvious and they have started to be applied more commonly. Also the failures like eye tracking, Kinect - they may have failed commercially but they were good ideas and a valid use of fast computers. Also online meetings with many participants, each with their own video streams that require decompression.

- obviously, video games - not that you can't have fun games with shitty graphics, but good graphics are nice.

I thought of a few more but can't remember them at the moment. Oh, also, Bluetooth audio - 192 kbps audio, but you need to decode it since it takes compression to get it to that level, and then you have the additional overhead of sending it over a digital protocol instead of just having your DAC do the work. You would need expansion cards or a dedicated audio core in your CPU to accomplish this if your computer wasn't fast.

And of course, science needs big data and big hardware to process all of that data.

If you're only working with text then slow computers are fine, but as soon as media gets involved they are not practical.


Don't get me wrong - I like hi-res images, and Bluetooth (when it works), and I'm still amazed that my phone has more grunt than a Cray.

But the stuff we do day-to-day (excepting graphics) is appallingly inefficient. And most of the look-at-this stuff (speech recognition, and you mention VR and eye-tracking) is not actually used for anything day-to-day; we don't routinely talk to our computers ("Hello Computer" https://youtu.be/v9kTVZiJ3Uc?t=10 ).


> we don't routinely talk to our computers

That's perhaps because doing so in an office or cafe environment isn't great.

Kids are using speech-to-text more while doing their homework. Word has been beefing up its transcription feature for a while now[1].

I got to see this happen during lockdown, where teachers I know recommended transcription for younger kids who don't have touch typing skills and were still expected to turn in work on a computer. Talking to your computer is much more natural while doing your homework in your room.

[1] https://www.theverge.com/2020/8/25/21400623/microsoft-transc...


When I was a young person with executive function issues (late 90s/early 2000s) voice to text was a lifesaver for my ability to write coherently. I found it much more natural (even then) to "write" by speaking my mind to a computer than to struggle to focus while typing slowly. It wasn't until I was older and using IM heavily as a social outlet (which was much more in the "bursty" configuration that works well for my brain) that my typing ability caught up with my pace of dictation.


The saddest part of speech recognition is that, with all the gigahertz and storage we have, we send it out to the internet to be processed by something else and then get the result back.


> Right now every single computer is backdoored by the governments of the most powerful countries: we have Intel MEs and AMD PSPs in the hardware, and the frameworks everything uses - Linux, OpenSSL, etc. - are almost all huge complex codebases that the NSA has tons of zero days for.

You're giving the gubment too much credit. It's easy to throw up your hands and say "aww geez, I guess everything is backdoored anyway, why bother?" -- this is exactly what they want. The truth is a lot more complex, and as a lot of leaks have revealed, their capabilities are far from the supernatural omnipotence you seem to be implying.


We all think our own use case is all that matters and everything else is bloat. I certainly do.


We use Big Data for science too...


Also, the new systems have higher perceived latency than the old.


Not perceived: actual, effective latency is higher. Today's hardware is capable of better latencies, but most software guys don't care about latency all the way from firmware to OS to apps to websites. It does matter to some degree in games and vehicles but it's hard to find an industry that values latency outside of toys or industrial applications. Practitioners say this is because "99% of users don't care" but that's only because 99% of users have either (1) never experienced fast latencies or (2) have forgotten them because latencies have gotten gradually slower over the last three decades.


It’s more than “software guys don’t care”

Think of all the abstractions in a modern system. From the USB Bus, to the Network Stack and everything in between.

We have just duct taped on duct tape and we should be happy it all works this well!


All large systems of systems have "duct tape"; I'd go as far as to say it's an emergent property of systems of systems.

It comes about because unless you design every sub-component system in lockstep with every other (which is impossible; the Romans didn't lay out London for cars, and the Victorians didn't put in sewers with respect to where we'd want to run fiber), you end up with an impedance mismatch at the boundaries.

It's why backhoe operators from a gas network rip up fiber depressingly often, why the UK transport system has choke points between road, rail and air, and on and on.

Modern computers aren't a unified system; they are lots of separate systems that talk to each other, and frankly, having some (minimal) understanding of what has to happen for GNOME to appear on my screen when I press the power button, I'm amazed that it ever works, never mind mostly without fuss.


Exactly. Even if you do care, you can only exert so much influence on the platform you're developing for, unless you're working for a company like Apple or Sony. Even then, there are many, many layers to the latency problem.


Military/aerospace systems and high frequency trading care about latency as well.


Our systems are doing so much now, and all that "extra" lets us do numerous things at once without blinking an eye, and it allows newcomers to step up to a computer and almost immediately start using it after some exploring - not clicking something and waiting 20-30 seconds to see what the program is, or staring at a blinking green cursor wondering what to do next.


Few things that spring to mind which aren't possible on MHz systems:

* Graphics design

* Video editing

* Animation

* Monero mining

* 3D CAD

* High end (physics/etc) simulations

* 3D Rendering

* Multitasking

Sure you might be able to do some of those things at a very low resolution on a Mhz system, but not in any meaningful way.


> Few things that spring to mind which aren't possible on MHz systems.

You must be really young.

>* Graphics design

We did

>* Video editing

Ditto

>* Animation

Ditto

>* 3D CAD

Older than you think

>* 3D Rendering

3D rendering was born here, and we were playing 3D games perfectly in 1996-2002

>* Multitasking

Linux, KDE3.


Grandparent also said

> Sure you might be able to do some of those things at a very low resolution on a Mhz system, but not in any meaningful way.

> You must be really young.

Maybe they are, maybe they aren't, but that's an unfounded assumption. There are plenty of middle-aged people who haven't got a solid idea of the limits of pre-Ghz-scale hardware for use cases that weren't all that mainstream back then, and I'm sure there will also be young people who've tinkered with some old Windows 98 box and could give us a pretty solid opinion.

At any rate, I've used a succession of Mhz-scale systems (starting with a 386 that struggled valiantly to run a copy of Windows 3.1 that someone had installed on it) and I'm definitely not very young.

> >* Graphics design

> We did

With rough and simple tools, limited to low-res designs, very limited brushes, super crude brush simulations (if any). Procreate on the iPad blows all of it out of the water effortlessly with its worst brushes, using a finger for a stylus, and that realtime brush simulation has been taxing for earlier iPads, which have always been orders of magnitude above MHz systems. I used Photoshop CS and CS2 a lot back in the day; they were resource hogs and still very crude compared to current entry-level apps. We've really gained a huge amount in that department.

> >* Video editing

> Ditto

At what, 640x480, 15fps? I remember having annoying render times for things that would hardly even count as filters nowadays, at such resolutions, maybe 800x600, but I'm sure that ball-shaped Logitech webcam could do no more than that. Snapchat replaces whole faces in realtime with eerie accuracy on what must be much higher res.

Color grading Full HD, 30fps? As if. I guess you couldn't even play that without stutters unless with dedicated silicon to begin with.

> >* Animation

> Ditto

Low-res, crude, and anything halfway detailed and interactive: Low-fps. Manipulating anatomy models like those at https://www.zygotebody.com/ in realtime? I doubt my Core 2 Duo laptop would have been up to that. I had similar software for Windows 98, and it was absolutely primitive and still taxing the CPU and GPU.

> >* 3D CAD

> Older than you think

With less precision, much, much lower model complexity, and much cruder tooling. A large number of components, complex shapes, joints, ... that's going to hit hard limits very quickly. I'm hitting hard limits with that sort of thing nowadays, but I'm hitting them somewhat later than even a few years ago.

> >* 3D Rendering

> 3D rendering was born here, and we were playing 3D games perfectly in 1996-2002

I don't play a lot, but between Gothic 1 and The Witcher 3, graphics have improved by an incredible amount, it's night and day, and I can't even go to full detail in The Witcher due to my aging GPU. Technically, those systems could do it, sure – but only with super short visibility, very crude models, extremely limited shaders and animations, crude collision detection, ... Gothic required at least a 700 MHz Pentium III, so it's pretty representative, I think. Of course, those limitations work better for some games than for others, but they were still brutal limitations.

> >* Multitasking

> Linux, KDE3.

Also quite limited, though. Just what fits on a single MHz-scale core. On my Windows 98 PC, I had to close CPU-intensive applications all the time because things would start to stutter; I believe Windows 95 would sometimes just bluescreen under such conditions. Things got a lot better on Windows 2000, but I think that may have been on my 1 GHz Athlon already. Those early Linux desktops were pretty unstable with lots of multitasking as well; I faintly remember lots of freezing. Things did slow down perceptibly, at any rate, when doing multiple resource-intensive things. Technically possible, sure – but it absolutely helps to have lots of fast cores.

Of course, a lot of the legwork to make those use cases perform is nowadays done by GPUs or huge amounts of RAM, and they profit a lot from multiple cores, but I'd say a MHz-scale system should have a period video card, too, and MB-scale RAM, and be single-core, otherwise it's kinda pointless. And under those conditions, all of the above things were technically possible, but really severely limited – still are, in some cases (CAD...), but wayyy less than back then.

Does that qualify as "not in any meaningful way"? I guess it depends. It was meaningful for me back then, and it made possible things that hadn't been possible before, and of course we were always content with what we had, not like those young'uns nowadays, and we walked uphill to school, in the snow, both ways, every day – but looking back, the capabilities of my MHz computers feel incredibly crude and primitive by today's standards, and even a lower-end gaming PC has little problem running Autodesk Fusion 360 (for which there's even a free hobbyist license) with models of surprising complexity, and I'm sure that enables many, many more things that wouldn't have been feasible on MHz hardware.


TBH, Win98 was a joke at multitasking. You can't even compare NT4 or W2000 (not to mention Linux) with Windows 98, which struggled even on a Pentium 4.

On freezing, Mandrake was a joke, but Slackware and Debian were rock solid.


My parents got their first computer in the 90s so my mom could do freelance work editing medical texts on nights and weekends after her day job. She used Word Perfect and she taught me to type with it, I still remember the exact shade of blue and the white text. She’d use Reveal Codes and I thought all the special characters were so cool. I can still picture her sitting at the desk working on it while I’d play video games. I was maybe 7 or 8 years old. They’re vivid memories that are kind of meaningful that I haven’t thought about in decades, tied directly to that application. Weird how that works.


Somehow WordPerfect/MultiMate with reveal codes is still a more intuitive text formatting tool than anything I've used since. Or maybe it's just nostalgia. There's nothing I hate more these days than trying to format text in WYSIWYG editors, in-browser confluence and jira, notepads, markdown, etc.


I wish there was a mandate that companies had to register sourcecode, e.g. with the Library of Congress, so that it could be released as Public Domain after a couple of decades. We are losing quite a lot of our software heritage when companies bury projects like WordPerfect or milk older games for a few more bucks. We could have a thriving, open-source ecosystem around the sometimes excellent software that is currently locked away.


We've decided 20 years is long enough for drug patents, and in my opinion, it should be good enough for almost all intellectual property, really.

Create something new? Great! You're officially granted a monopoly on that thing for 2 decades - plenty of time to monetize it, and use the money to create some other new things. If you can't figure out a way to monetize it in that timeframe, then we'll allow others to compete with you and take a crack at it.


Given that Disney made its fortune by adapting public domain stories and then abusing the copyright protection of their adaptation of these stories, I think there are more loopholes to patch than just the expiration date.


Take a look at : https://www.softwareheritage.org/ They've made it their mission to collect and preserve source code.


Very cool! I couldn't find a trove where I could browse by category or other metadata, although I did find the search interface. Is there a categorised trove?


I had no idea dosemu2 could translate VGA to S-Lang. That's very nice, and I will look into it.

WordPerfect was a great word processor. I remember using version 5.1 for years because it was wicked fast, rock stable and, most importantly, predictable. By contrast, MS Word -- even today -- seems to have a mind of its own. You move an image, and suddenly all your numbered bullet points appear in a different font except for the last one, and nothing short of retyping the whole thing in another document seems to fix it.


That’s a problem that no WYSIWYG has been able to solve yet. It’s probably an unavoidable abstraction leakage.

Hence the unbelievable popularity of markdown.

Imagine telling your 90s self your current machine configuration and how awesome it is. Then explain that we, with all this tech, voluntarily chose to type in plaintext format, with notations that kind of work as a more readable form of markup language, because we got fed up trying to make WYSIWYG work.

Oh yes, and GUIs are for noobs, pros use terminal emulators, but that’s another topic :)


My 90s self had already been using GUIs since the 80s courtesy of systems like Acorn RiscOS, Amiga, MacOS and even Windows 2.

The thing is I still enjoyed using quality DOS and Unix terminal software.

GUIs are not the one true way to represent a UI. They’re not even new technology any more.

They’re just different and better suited to certain tasks but not others.

It’s like asking why are we still using steering wheels to drive instead of speech recognition?


I miss Impression Style on RiscOS.

I also miss extending the supervisor prompt with BBC BASIC and arm26 ASM though :D


Totally, I loved the combination of Impression Style and Artworks.


> That’s a problem that no WYSIWYG has been able to solve yet. It’s probably an unavoidable abstraction leakage.

I think it comes from conflating layout with writing. In QuarkXPress or InDesign it really isn't an issue. Those tools are quite uncomfortable for sitting to write in, though.

LyX does a decent job of this, I think, though it's not exactly WYSIWYG.


You’re probably onto something.

There’s also an issue with black box language vs something like HTML, which has semantic value besides pure presentational function.

I don’t care if the auto generated PostScript is ugly as long as the printed work is perfect.


FrameMaker had this stuff figured out.


You don’t deal with PostScript directly. PostScript is akin to assembly.


My 90s self had experienced both LaTeX and Microsoft Word. I fled from Visual Basic to vi and Perl.

I would have been easy to convince.


Have you actually used the GUI version of WordPerfect?

I was forced to use it heavily for a year of pure writing for a previous job. It's incredibly unpredictable. It's like whack-a-mole. You make text in one place bold, and all of a sudden some of your footnote text on a different page becomes bold.

The people at that job who were really good knew all of the tricks, digging into the codes that WordPerfect inserts to address various issues. But even then, it was an extra step, and I never became as productive in WordPerfect as I had been in Word.

Plus, WordPerfect has been in maintenance mode at Corel for decades at this point, accumulating lots of bugs and half-implemented features.


Replying to say you are correct and GUI/Windows WordPerfect was completely 100% broken when dealing with 'closing tags'. Some people knew & liked to use the codes, but the basic GUI editing was just defective and caused your formatting to spill out randomly. (at least in vs 5 & 6)

Obviously I get this with HTML, but the Mac & MS Word approach of 'object oriented formatting' was just a much better execution for mouse operation.


The only version of WordPerfect I ever used was 5.1 for MS-DOS, and my comments apply to that one only. I heard the GUI version left a lot to be desired, so I never upgraded.


5.1 was the magic version of word processors.

Word 5.1 on the Mac was amazing. Fast, powerful, probably everything one would need even today, aside from track changes.


> You make text in one place bold, and all the sudden some of your footnote text on a different page becomes bold.

Tbqh, this is my experience in just about every rich text editor. Well, maybe not quite that bad, but I’ll always bold a word, and then while editing I have to go back and change the word after it, and it will suddenly be bold.

I wish bold/italics/etc acted like caps lock. It’s either on or it’s off, and the computer doesn’t try to guess for me.


> I wish bold/italics/etc acted like caps lock. It’s either on or it’s off, and the computer doesn’t try to guess for me.

It's starting. Outlook is doing some kind of "autocapitalisation" to me and I have no idea how to turn it off.


Outlook has a setting to automatically capitalize the beginnings of sentences. I turned that off. But it was long ago, and I forget where that setting is. I recall it was near the setting that enables smart quotes.


They even do it on the web version of Outlook. The setting to change it is 4 levels deep and two of the options are at the bottom of lists and labelled "more options".


On display? I eventually had to go down to the cellar to find them... It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying "Beware of the Leopard."


Well this has been better tech support than I usually get.


I haven't seen it with Word, which I use quite a lot, but there is one similar and extremely long-standing bug. When you use a keyboard shortcut to turn bold/italics on, then type a word immediately before a non-whitespace character, then use the shortcut to turn it back off, it will un-bold the thing you just bolded. So annoying.


I am having flashbacks to late 90s Linux. One of the distros back then was Caldera OpenLinux. It had a Linux port of WordPerfect. It got a lot of criticism at the time for being nonfree.


The port was so perfect! Not the Corel ones, but the one from SD Corp. I'd love to have that running again.


Can it be run on modern Linux?


I looked into it just now.

Somebody has WordPerfect 8 binaries around.

It requires libc5, the C library that was standard on Linux before glibc took over at the end of the 90s, and then needs X libraries that are compiled against libc5.

Probably you could find an old Red Hat 5 or something and pull out those libraries to run on a recent kernel.


I think WordPerfect 8 was where Corel rewrote it. It didn't work nearly as well. I vaguely recall they switched from native to some kind of compatibility layer (maybe Wine?) which just wasn't very good.


For everyone that bitches about Docker and Ubuntu snaps, this is a perfect application of them.


I wonder if libc5 is so old that it has problems with Docker. But I remember Loki games being statically linked to avoid dependency problems. You only needed Linux syscalls and X11 for them to work.


I don't see why it would have an issue.

Docker has no concept of the binaries that actually run; it just sets up namespaces, cgroups and a userspace. Worst case scenario, you have to run the container as privileged, which is still no different from running on the host (at least with a properly constructed image, where the binaries wouldn't run as root).
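
Something along these lines ought to be enough; the image name, rootfs tarball, and WP binary path here are just placeholders, not a tested recipe:

    # rough sketch: wrap an old libc5 userland (e.g. extracted from a Red Hat 5.x
    # install) in an image, then forward the host X socket into the container
    docker import oldlinux-rootfs.tar libc5-userland
    docker run --rm -it \
        -e DISPLAY=$DISPLAY \
        -v /tmp/.X11-unix:/tmp/.X11-unix \
        libc5-userland /usr/local/wp8/wpbin/xwp

Old X clients can also be fussy about authentication, so some xhost/Xauthority juggling may still be needed.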


libc5 was just a Linux-specific fork of glibc; glibc 2 == libc6.


> […] just because it's a full-featured and highly configurable word processor that I can use in a terminal.

The 'reveal codes' functionality is something that I always liked about WP, and that no other word processor seems to have implemented.


Nota Bene is another text-based word processor from that era. It defaults to showing embedded formatting commands. The primary means of operation is via an integrated DOS-like command prompt.


Nota Bene, although no longer text-based, still exists and is actively developed. It used the XyWrite word processor engine; XyWrite was very popular in newsrooms back in the day.


That's a description that also matches Arnor's Protext. The command window was the bottom half of the screen, and one embedded rulers and suchlike into the document text with ">" lines.


This is testing my memory but I think WordStar from the early to mid 80s had something similar. Codes all started with a ^ (again from memory) and their visibility could be toggled on and off.


This week I built a home-made computer, using an ESP32 instead of a Z80 and a hodge-podge of 74xx logic chips, to run CP/M, and started playing with WordStar 3 and 4. I used it briefly many decades ago on the Amiga, where it was more graphical, but I'm only just getting to grips with it now.

Obligatory screenshot here: https://mastodon.social/@stevelord/104824411022687268


I have known lawyers and legal secretaries who just would not give up on WordPerfect, I think because of the footnotes. The two that I most recently had dealings with have since retired, but I'd be anything but surprised to learn that there are recent .wpd (is it?) files on the General Counsel's Office's shared drive.

I liked WordPerfect well enough from 4.2 through 5.1. Version 6 on Windows was not good. Even a slow typist (me) could get ahead of the cursor. But at some point my employer switched to Word and I haven't been back.


During the 4.X and 5.X era at WP, the company paid very close attention to the needs of the legal market and WP could do things with a couple of keystrokes that would take 10 minutes with MS Word.

WP also had a potent macro capability and a good macro editor as well. I was practicing law at this time and had created 80-99 WP macros that made legal docs appear like magic.

There was no e-filing and court clerks and judges could be more than a little picky about the formatting of the paper documents they received. If a doc wasn’t formatted in the proper manner, some clerks would refuse to file it and return the doc to the lawyer’s office for a re-do.


The acquisition of WP by Novell was a disaster. Way different corporate cultures and leadership styles.

Ashton and Bastian had been generous with stock grants to key people, particularly on the technical side. Most of the key people below the two founders had enough money that they didn’t have to put up with the sharp elbows and confrontational styles that predominated at Novell.

Combine a brain drain with a Microsoft push for Windows and built-in integration with the first passable version of Word and you had a recipe for disaster. Some of the WP people who remained claimed that MS had misled them about the Windows vs DOS roadmap and been slow in providing the Windows info necessary for WP engineers to build a decent first release of WP for the new release of Windows, but I could never determine if this was what had really happened or just sour grapes because WP stumbled badly after the ownership change.


It was real. They built their own language and compiler to compile to undocumented Windows API calls, because MS wouldn't share in the early days, while they established a WYSIWYG foothold.

In the Corel days they finally decided to port to actual Win32 code and put aside the custom transpilation.


According to this history written by one of the founders they were given a pre-release copy of Windows to get an early heads up on porting, but the engineers generally preferred to work with DOS:

http://www.wordplace.com/ap/ap_0intro.shtml


Lawyer here. Still miss WP. I could do things in WP in the late 90's that I still can't do in Word.


Word counts were a major part of this, if memory serves.


Can confirm: I worked for a company producing an online version of their legal books, and we could export pages to Word and... WordPerfect!


I had that version of WordPerfect on my 286! I remember how you could install the Trident video drivers and get some pretty high console resolutions. F11/Reveal Codes was super helpful too. I wish modern word processors had something similar. It basically shows you the equivalent of formatting tags.

I have the Win3.1 version of WP from that era in a DosBox. I've been meaning to compare its grammar check to modern word processors. It was way ahead of its time.

I know Corel bought it, but it looks like it spun off again. I wonder if the current release still has any of the old source code, or if it was rewritten and only keeps the brand name: https://www.wordperfect.com/en/


Still owned by Corel (they have a weird "Corel Advantage" logo at the bottom of that marketing site and the Contact Us link goes to a bunch of Corel info).

The Corel of 2020 seems a fascinating Enterprise Licensing zombie. They appear to make all their revenue from (very) legacy Enterprise Licenses and government contracts and all of the software seems stuck in a time warp stasis with only the bare minimum of maintenance (presumably just enough to keep the Enterprise Licenses alive and well). It's almost sad to see how many classic computing brands they own and let languish in this state.


> I'm not aware of any other full-featured modern word processor that can run in an xterm

Word Perfect was available for UNIX as well. The SCO binaries might run with iBCS. I don't know if that would provide a significantly different experience than running the DOS version in dosemu though.


I tried running WP for SCO under iBCS 10-15 years ago. I couldn't get it to work. Don't remember all the details but it seemed they cheated with some of the video routines and I mostly got a corrupted screen. I didn't spend a lot of time on it, though, and I very well might have just not hit on the right incantations.


Version 4 got me through the first couple years of college. My freshman year I actually had a steady stream of people coming into my dorm room to use it on my PS/2. Eventually I switched from English to Math and moved on to vi and the various roff markup tools and then LaTeX, but I think what ultimately killed the magic of that generation of WordPerfect was the change in keyboard layouts that moved the function keys from the left side to the top. Before that when combined with Ctrl, Alt, and Shift there were 40 commands available just a pinky reach away, and it came with a cheat sheet card that fit neatly around them.


Versions 4.1 and 4.2 got me through the first couple of years of college also. I briefly updated to 5.0 but quickly reverted to version 4.2. I have never been as productive in a word processor as I was using WordPerfect and a Model F XT keyboard.


I guess I'm not the only one who misses those keyboards; they go for around $300 on eBay[1]

[1] https://www.ebay.com/b/IBM-Model-F/175690/bn_55189929


IBM spun out their keyboard division decades back when they spun out printers (into Lexmark). The current inheritor of the keyboards is Unicomp (having bought it from Lexmark/IBM): https://www.pckeyboard.com/

It's a bit cheaper to buy from them than to pay for an antique on eBay.

Also, there's a wave of mechanical options from more recent companies like Das Keyboard or the various DIY kits with choose your Cherry switch adventures.


There is a modern-ish clone of WordStar around as well, called WordTsar: http://wordtsar.ca


Woah. Apparently the guy who made it also uses it to write his novels.

http://geraldbrandt.com/


Doesn't RR Martin use the original WordStar? I remember reading about him using an old-school physical word processor and a USB floppy drive when he needs to back up/transfer his work.


Came here just for the GRRM comment :)

But from what I can see, he is NOT really using it...


Reveal codes should be part of any rich text editor. Ever sat wondering why pressing return added a new bullet to a list rather than a gap to the text preceding it? I see it all the time while watching engineers and managers using tools like Jira, Confluence etc. Reveal codes would make it plainly obvious - the cursor was after the list start but before the first bullet.


Nothing more delightful to work with than a requirements management tool whose WYSIWYG editor changed the text of the bullet points when the indentation level changed. Never figured out what was going on, but there were somehow two different texts occupying the same line, and if you pressed tab to make the line deeper in the hierarchy it would display the other text. You start wondering how bad DOORS really is if this is how the "modern" web app replacement works.


I am finding that semantic HTML and CSS Grid where you style the elements works nicely for documents. This goes against what Tim Berners Lee imagined in that I am typing in the HTML tags and happy to navigate them in text.

I prefer the tags as they describe my work. Visually there is no evidence that something is a 'section' or an 'aside' but I make sure my words do have document structure, so a 'section' will probably have a 'h3' at the start of it.

I wish there was a 'what you see is what you mean' editor that was visual but only allowed you to add HTML semantic elements. WYSIWYG editors tend to just add local styles and 'br' elements, resulting in HTML that is useless for my style of HTML.

I can do diagrams in various ways including SVG that I just write out manually. Or I can use the wonders of CSS Grid and a 'dl' list with first of type/last of type CSS selectors to style up a simple linear flow diagram.

I really wish there was a super neat word processor or HTML editor that only wrote lean HTML with semantic markup for document structure. But here I am in 2020 using my IDE or Vim to write documents and even craft diagrams.

I am impressed that I am not alone in spurning drag-and-drop wonder gadgets. As much as I would like to give WordPerfect a go, I feel the world has moved on from documents printed out as memos and put in pigeon holes. Stuff has to look good in a browser, and with stripped-down HTML that uses the semantic elements and a simple stylesheet that loads a font and sets up some CSS Grid layout, I feel I have the tools I need.


I really don't want a word processor, per se, so much as I want really robust (realtime) text wrapping/unwrapping as I edit paragraphs. I've tried to get Vim and Emacs to do reflows but both were very flakey compared to something like WordPerfect for DOS.


I think I know what you mean. I try to wrap at 80 col. and I often need to go back and edit a paragraph, which breaks all my wrapping.
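
For what it's worth, in Vim something along these lines is supposed to get most of the way there; the 'a' flag makes it rewrap the paragraph automatically as you edit (just a sketch, adjust to taste):

    " keep paragraphs wrapped at 80 columns while editing
    :set textwidth=80
    :set formatoptions+=a
    " or rewrap only the current paragraph on demand
    gwip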


Have you tried soft line wrapping in GNU nano (M-S)? As long as you're running nano in an 80 column wide window, that should work fine.
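
You can also just start it with the flag (file name here is only an example):

    nano --softwrap notes.txt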


BBEdit might be just right for you.


My apologies, I was specifically referring to text-based/console-based applications. All the GUI text editors seem to wrap and unwrap just fine.


Not exactly a word processor, but it seems to me you could try (if you can find them) TSE or QEdit:

https://en.wikipedia.org/wiki/The_SemWare_Editor

What QEdit (often renamed simply to Q.exe) was capable of doing back in the (good) old days was incredible.


WordPerfect 5 and 6 are in many respects still vastly superior to any version of Word, especially when it comes to aligning text left and right, which is properly treated as line flow instead of as a property of a tab stop. Without leaving the keyboard I could easily make text on the same line left-aligned, centered and right-aligned.

For typing / journaling, the non-WYSIWYG approach was a feature. I honestly think WordPerfect 5.1 is a big reason why Sublime is so popular now.


Also, I'd use WP 6.1 for DOS in Spanish if it weren't for a "little" inconvenience: we don't use pesetas as a currency anymore, though I think there's a Euro patch somewhere.

EDIT:

On graphics, that's a solved problem. Everyone uses a framebuffer today; even on obsolete chipsets it's doable. Just map the VGA calls to the framebuffer and everything should be fast enough, I think.


I use WordStar when programming in CP/M. I use 'jstar' (one of the 'joe' alter-egos) which uses the same commands when programming in Linux.


You'll love Wordtsar then.


I used WordPerfect in my first job as a tech writer. It was beneficial to have a separation between writing and the layout done by desktop publishing systems.

As a programmer building websites, I do the same thing, writing in vim. Learning how to make your editor work for you is a great investment.


I use troff (groff) under Linux. The really freaky thing is that you already have groff on your Mac -- it's installed by default.


What macros do you use? I have some simple programs that generate documents to print out, and I use LaTeX and a bunch of packages. It takes hundreds of megabytes of software with all the packages. I've long toyed with using the MOM macros but can't get motivated to rewrite my programs and learn troff.

It used to bother me that groff doesn't support Unicode (as far as I can tell) but then realized that all I write is English so why am I fetishizing Unicode? groff will get me any accented character that I would realistically need.


You can have UTF-8 with groff using the preconv [1] tool; just pass the "-k" or "-Kutf8" parameter. It will preprocess your source and replace Unicode symbols with their groff special character forms. Of course, the built-in standard fonts do not cover many glyphs, but it is no big thing to use a more Unicode-friendly font like DejaVu Sans or Noto; check [2].

So this is my setup using MOM macros for nice typesetting.

But I fully agree on your last point, in the end I seldom need the Unicode universe for most of my documents.

[1] https://www.man7.org/linux/man-pages/man1/preconv.1.html [2] http://www.schaffter.ca/mom/momdoc/appendices.html#fonts
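
Concretely, the invocation is just something like the following (file name is a placeholder):

    # -k makes groff run the input through preconv before troff sees it
    groff -k -mom -Tpdf letter.mom > letter.pdf
    # equivalently, run preconv by hand
    preconv letter.mom | groff -mom -Tpdf > letter.pdf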



Get heirloom-doctools. It supports Unicode just fine.

https://n-t-roff.github.io/heirloom/doctools.html


Just the plain .ms macros. Plus "pic", I like that a lot.
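
A tiny self-contained example of that combination, with made-up content, would look roughly like:

    # a throwaway .ms document with an embedded pic diagram
    cat > demo.ms <<'EOF'
    .TL
    A Tiny Demo
    .PP
    The flow is simple:
    .PS
    box "input"; arrow; box "process"; arrow; box "output"
    .PE
    EOF
    # -p runs the pic preprocessor before troff
    groff -p -ms -Tpdf demo.ms > demo.pdf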


> The really freaky thing is that you already have groff on your Mac -- it's installed by default.

No way, you're right. And of course it has a very detailed and well formatted man page. The one for troff even has tables.


Little-known fact: groff (more particularly grotty) does colour, albeit that people have secretly turned this off for you. grotty's own man page is in fact in colour.

* http://jdebp.uk./Softwares/nosh/italics-in-manuals.html


groff isn't really comparable, though, because it isn't a word processor but rather a typesetting system.


Using WordPerfect on the Amiga: https://www.amigalove.com/viewtopic.php?f=7&t=101

This whole site is wonderful, btw


I believe George R. R. Martin has said he uses some kind of ancient word processing software to write the GoT novels.


DOS machine, disconnected from the internet, running WordStar 4.0

https://www.cnet.com/news/george-r-r-martin-writes-with-a-do...


Might want to use the past tense of "to write", otherwise you're giving GRRM the benefit of the doubt when you say he is writing GoT novels...


WP6.2, fond memories. What is this mode called where you can see and edit every control sequence you entered, like bold and italics formatting?



While I've seen its real name now, for me this will always be the Under Water Screen.


What about Protext from Arnor? Another text only word processor now available for free for the Atari ST, Amiga, and PC.

https://web.archive.org/web/20140825104412/http://www.tigert...


I have very fond memories of WP5.1.... I still have some Model-M keyboards, so I could really recreate the "glory days" if I wanted. I could print out the little F-key cheat sheet for the keyboard as well.

But...

What about Unicode? Lack of Unicode seems like a dealbreaker for a lot of use cases. Even if you're not using Unicode, seems like you'd eventually want to work with Unicode text.

As an engineer, a lot of the "word processing" I do involves cutting and pasting text from other sources (citations, code samples, names, etc) that tend to be chock full of non-ASCII characters.

Or has this been solved somehow? I see that there are various hacks involved in getting support for the Euro symbol. What about a general-purpose solution? I know that WP5.1 had support for international character sets. Perhaps somebody's cooked up an emulation layer that translates Unicode into whatever it is that WP needs.


Great idea, but how can one get their hands on this software nowadays? I can only assume it's not sold anywhere?


You can still purchase it, but it's a lot of hoops to jump through!

http://www.columbia.edu/~em36/wpdos/links.html#obtain

There is still a community (apparently mostly paralegals) who actively use it and answer questions, e.g. http://wpdos.org and https://www.wpuniverse.com/


> http://www.columbia.edu/~em36/wpdos/links.html#obtain

Looks like they deleted the file ... now 404 not found...


Odd, it works for me.

It's also an odd little page. It includes directions to buy it, if you already have a copy. If you don't have a copy, it includes some details on how to use eBay.



http not https


Oops, thanks... I had the "encrypt all sites" setting set to ON in HTTPS Everywhere.


Who is Carol Reese? And how is she related to Corel and WordPerfect?!


I still use Word 97 for writing. I've tried many WPs, but haven't found anything better.

I run it inside a VM and it is still faster and better than LibreOffice by a long shot.

When I'm done writing, I paste the text into one of my blogs with IE6, which my sites still support.


I like your style on this. I'm contemplating a similar setup but going a bit more old-school with the Word 6.0 for Windows 3.1 edition; I've got to work out the smoothest way of integrating it with my Linux (Mint) desktop, though. Integration is key for me, and I have to say my prior experience with Wine has been less than comforting (working, but only just, without a long-term functional feel).


Wine doesn't work great with Microsoft software, because they would take advantage of undocumented APIs, but W6W31 may be old and basic enough to work...

I would still take the VM route.


A former employer had an inventory system based on DOS 6, but the hardware (the PC and the barcode scanner) was getting really old. So I bought a new PC, installed FreeDOS and a new barcode scanner with DOS compatibility, and it works perfectly. DOS, especially FreeDOS, is just great.


Is there a DOS that works on a UEFI computer? If not, there must be a very limited choice of hardware.


UEFI systems generally can boot in compatibility BIOS mode. Macs can't, but they are the only exception I know.


Most modern computers have legacy BIOS support.


Having disposed of a PowerMac G3 running System 9 this week, this strikes a chord with me. I'm already regretting it.

Despite the age and 'underpoweredness' of it, it was still quicker, still more responsive, and still subjectively nicer to use than anything modern.


Wow. I think WordPerfect may be the last time I enjoyed using a word processor. It was also maybe the first time I realized how important good design is. I couldn't tell you a specific reason it was better than Word, just that it always did what I expected.


Nice that you are using WP on dosemu2 on Ubuntu on Windows. I once spoke to someone active in research into keeping data (formats) and programs available for a very long time (like centuries), and I always thought it was a no-brainer because one could just run X on Y on Z on A on B on C to get ancient software D working.

Although I have to admit that this approach didn't work for my favorite game [0], now that I'm googling it does seem to work! So someone got DirectX 9 or something working?

[0] https://www.dosbox.com/comp_list.php?showID=3140&letter=A


Cryo Engine? Then ScummVM will implement that soon.

Compile ScummVM with

        sh configure --enable-all-engines --prefix=/opt/scummvm
        make
        sudo make install
        /opt/scummvm/bin/scummvm
https://wiki.scummvm.org/index.php/Atlantis:_The_Lost_Tales

https://github.com/scummvm/scummvm/tree/master/engines/cryom...

Nightly builds are available too (but they may not have the "--enable-all-engines" switch passed to configure).


My dad still uses Framework at times on an old machine. It just works and is perfect for his use case of large spreadsheets.

https://framework.com/


> …into S-Lang calls

So like SLRN and Jed? Amazing. I used DOSEmu back in the day for Redneck Rampage on an AMD Athlon (dear DOSBox users: DOSEmu was zillions of times faster than DOSBox; I ran it at native speeds). This is impressive.


While I support your travels with WordPerfect (I wrote many term papers on that puppy), I can't support the quote from the person you responded to on Twitter. He just showed arrogance and silliness when he said:

"All that was ever really needed by an OS or Office apps was already there 15 years ago. Everything "invented" beyond that was just vanity changes with negligible long term added value, and constant moving around of UI to appear new and better. Change my mind."


I'd almost try this, but I don't write documents enough - auto-correct in modern word processors infuriates me. It almost never does what you want.


Amazing. I used WP6.2 to write papers in high school and remember not only the tables and line drawings but also that you could place images and annotate them. It's hard to remember how that worked, but in those days it seemed those things were only possible on Macs. As I recall, there was a big reference binder that documented the features.


I was a huge fan of Wordstar 6! Wrote so much with that thing ... way better than Word, Pages etc. even still. It's no wonder I prefer Markdown for writing / documents etc.

I should try to see if I can get Wordstar running under dosemu2 (I probably have a 3.5" floppy somewhere with it on, but I don't think I have the disk drive to read it).


Most of what I do in a word processor today, I could have done in Wordstar 3 under CP/M. Perhaps only the printing (no postscript) would have been a problem. Certainly most of what I do in Excel could be done in SuperCalc.


I remember using WordPerfect on VAX/VMS (on a DECterm, no less), and I recall them having some flavor of Unix-based binary, but I can't find references to either out there anymore.


This reminds me that WordPerfect 6.2 for Windows 3.1 could possibly work under Wine. That could be interesting.


Those diagrams look powerful; it might even be better than modern text editors for that :v


I am just very curious how to get a copy of WordPerfect 6.2 for DOS.

Can it be downloaded from the Internet Archive?



George R.R. Martin? Is that you?


There is WordGrinder[0], which can export to LaTeX for printing. In classic Unix tradition, it looks way plainer than the screenshots of WordPerfect you posted :-P.

[0] http://cowlark.com/wordgrinder/screenshots.html


Why did you update from 5.1?


Would you be so kind as to post an example PostScript file generated by WordPerfect?


Sure, I normally have this in my .dosemurc:

$_lpt1 = "dosprint"

Which just pipes the printer port output into a bash script that runs ps2pdf, but I changed it to save to a file instead.

https://lock.cmpxchg8b.com/wppostscript.ps.txt
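
A minimal script of that shape could be as simple as this (a sketch only; the output path is arbitrary):

    #!/bin/sh
    # dosemu pipes whatever WP sends to LPT1 into stdin here;
    # ps2pdf reads the PostScript from stdin and writes a timestamped PDF
    exec ps2pdf - "$HOME/wp-$(date +%Y%m%d-%H%M%S).pdf"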


Oh, nicely formatted and readable PostScript. I wish the Windows drivers could do that.


Could you link to an example of an output file?


It really is a beautiful program, in its own way.


Nostalgia: Reveal Codes!


Microsoft Word today, with "OneDrive" and "server-connected DRM", sucks so badly when you are offline just trying to use YOUR @#$23 PC that it is a travesty.

Just wait till the exact same UX comes to your automobile.

And after that...


I had Office decide that checking the license at the start of an Internet-free 6hr flight was a good idea. At least I had brought a good book with me.


...Cool?


Surprised nobody has mentioned the main advantages of web-based apps:

* Collaboration (including real-time collaboration)

* Ease of installation (or rather the lack of an installation process)

* Always up to date


>* Collaboration (including real-time collaboration)

Abiword did that

>* Ease (or more like lack of it) installation process

You need an ISP. And an Internet connection.

>* Always up to date

It can be worse for your specs.


If you look at what happened to Word Perfect when it shifted to Windows then you might consider always being up to date a disadvantage.

