They hadn't been replaced as they were running some custom ISA interface boards developed by a research group in the 90s, and the community was still using the data output by the machine. And since it had been trucking along for ~20 years, convincing them they needed to develop a replacement was hard.
When I left, it was still there, chugging along. And once a day, a tech would wander in with a floppy and copy some data off and wander back to the control room where a USB floppy drive was attached to a workstation specifically to read this floppy and let the tech copy the data files off to a network location.
It was on a redundant on-line UPS as the last time it went completely off and was booted from cold, it ended up needing parts. So no, I'd say it's not likely to outlast modern replacements. Just had lots of support and organ transplants.
Systems like that ultimately end up being a resource drain: unlike modern hardware, replacing a part isn't a matter of wandering over to a retailer and buying a standard component, but of sourcing used or NOS parts and hoping they work, often eating far more time and effort from someone who could be doing some other important task.
The problem is that in open use observatories like Arecibo, a lot of equipment like that is put in place by third parties, but becomes our responsibility to maintain. But the science is important, so we maintain it as best we can.
Unlikely. Good 'modern stuff' is of far better quality than what was available back then. Power supplies have gotten much better (and have more protections), as have motherboards with solid-state capacitors. Older motherboards loved their electrolytic capacitors.
Processors had fewer protections, and thermal monitoring was in its infancy.
Memory sticks back then were very unreliable. As were hard drives.
Then there's just a matter of age. If it has lasted 20 years, chances are it won't last much longer.
It has probably lasted 20 years because it wasn't being powered on and off constantly. It most likely had stable power sources and - most important - a climate controlled environment with low amounts of dust.
Processors had fewer protections, and thermal monitoring was in its infancy, because almost no 486 would use enough power to cook itself to death, even with only a passive heatsink.
I don't know about memory sticks and hard drives from that period. It's quite possible they've replaced the hard drive with compact flash or something too, which should be pretty reliable given they apparently only make a floppy disk's worth of data every day.
About one year later it underwent my cleaning routine: vacuuming dust out, cleaning the fans, etc. I took the CPU cooler off to check for bad thermal grease and discovered it had none at all! No pad either; someone had just clipped the thing on, and instead of direct contact there was a small air gap you could look through from the sides.
When I got it, it must have been in use for years already!
It's why computers from the 80s can last until now without needing any recapping but computers from 2002 or so are ticking time bombs.
That being said, electrolytics were somewhat less common than they are now. But the Capacitor Plague is not the only reason electronics are recapped, especially older/vintage/retro electronics.
A 486 doesn't even need a heatsink, it's passively cooled and perfectly fine with that, as software of the time (mainly DOS) didn't even "idle" the CPU --- it was drawing full power all the time.
That also means if you manage to boot DOS or similar era software on a modern machine (especially a laptop) that was designed to have the CPU idled for most of its life, it is likely to go into thermal throttling immediately.
I don't know if the VM changes anything, but it definitely seems as though it does not halt while waiting for activity.
Surely it makes some sense, but think about the hardware suddenly breaking down and there being nothing to replace it with. Even with spare parts it could be a pain as the old generation retires.
It can be tricky. I had to work with something like this a few years ago and managed. As I remember, I had to get SciTech Display Doctor/UniVBE from some abandonware sites to make use of the weird onboard VGA under Windows 95. The ISA card for the CNC machine worked; it used DMA, among other things. It all worked, and was even more fluid to operate/program.
The cost (for the board) was just over 400 EUR.
Anyway, it's industrial, so the form factor often doesn't fit existing cases and back panels, but it's doable. More so than hunting eBay or similar for spares of dubious provenance.
Only exception would be if the "App" and the board/card were designed so sloppily that they really only run on exactly that same hardware.
I wonder if an unpatched Win95 box is actually safer than an unpatched modern Windows box.
You'd be surprised how much malware is "legacy" 32-bit and works without issue on WinXP. A lot of malware even shies away from Unicode APIs and specifically calls the *A Win32 API functions.
Considering the Win32 API is backwards compatible to a fault, I don't see why modern malware couldn't absolutely wreck a Win9x box.
I work on software that still supports XP with only a little pain. Visual Studio 2019 will still install the necessary toolchain if you ask nicely. Targeting Windows 95/98 in 2020 sounds like a nightmare.
Could I get a link to how to do this and/or web search terms? Is this just something targeting visual C runtime 20xx or older?
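For what it's worth, a minimal sketch of the setting in question, as I understand it (the toolset name and installer component are from memory, so verify before relying on them): the VS 2019 installer offers an optional "C++ Windows XP Support for VS 2017 (v141) tools" component, and the project then opts into the XP-compatible toolset in its .vcxproj:

```xml
<!-- Hypothetical .vcxproj fragment; configuration/platform names are
     placeholders. The v141_xp toolset targets the XP-era SDK and CRT. -->
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
  <PlatformToolset>v141_xp</PlatformToolset>
</PropertyGroup>
```

So it's the platform toolset selection doing the work, not merely linking an old Visual C runtime.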
The data is copied off daily via floppy (don't get me started...) and is backed up on local network shares. The specifics of that weren't in my need-to-know and were managed by the IT group.
The system was essentially stateless, so spinning it up from the image wasn't an issue, but the PC itself was essentially part of the "equipment", as it ran the custom drivers, which were specifically designed for that PC and a one-of-a-kind ISA card.
The thing is, you could operate it under Windows (98) in protected mode, but then you lost statistics. How come? Well, Windows scheduling meant that the MCA (multichannel analyzer) only took data while it was allotted time slices; as a result, the dead time of the whole setup increased, and you couldn't tell by how much.
Under real-mode DOS it took advantage of DOS being almost an RTOS: the program ran as a single task, and you could be pretty confident about your data.
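A toy model (my own illustration, not the original setup) of why scheduler time slices silently inflate dead time: counts that arrive while the acquisition program isn't scheduled are simply lost, so the apparent rate scales with the live fraction, and nothing in the data tells you by how much.

```python
# Toy dead-time model: if the DAQ process only collects during its time
# slices, the live fraction is slice_ms / period_ms, and every count that
# arrives outside a slice is lost.
def apparent_rate(true_rate_hz, slice_ms, period_ms):
    live_fraction = slice_ms / period_ms
    return true_rate_hz * live_fraction

# Real-mode DOS: the program owns the CPU, so the live fraction is ~1.
print(apparent_rate(1000, 10, 10))   # 1000.0 counts/s
# A preemptive OS giving the DAQ, say, 6 ms out of every 10 ms:
print(apparent_rate(1000, 6, 10))    # 600.0 counts/s, silently
```

The second number is the point: the measurement still "works", it's just quietly wrong by an unknown factor.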
I don't think using adapters with this sort of card is a good idea. It's not a matter of drivers; sometimes the old stuff simply doesn't work well when plugged into newer systems, because everything has changed and you'd need to redesign everything. If the old stuff does the job, the OS being unsupported only becomes an insurmountable problem for the folks who consider using an unsupported OS some sort of crime in itself. Well, there's usually more than just an abstract computer being used in such circumstances.
So, yeah, I'm sure that USB-ISA adapter probably works with that old 3Com ethernet card or that Soundblaster card that 'definitely' sounds better than anything else, but a custom card with custom software and probably unknown tolerance to variations from 'real' ISA? Good luck with that.
My brother's company does IT stuff for the manufacturing sector. Very conservative. I've worked with a number of his customers to source exactly this sort of thing. Years ago they asked my thoughts on an Italian-made cloth dyeing machine the size of a bus with a DOS PC controller and a minimally documented ISA controller. A mid 7-figure US$ setup; the company that built it is long gone. We found a shop that supplied new motherboards, modern (at the time... P3/P4 era, I think) with ISA slots, that the vendor guaranteed would be every bit as slow and weird as a PC/AT. We migrated everything over, the company bought a stack of motherboard spares new, and I think they're still using them.
Now...you want a challenge? Migrating the proprietary, mostly undocumented software, all in Italian (we aren't native speakers...though my high school Latin occasionally helped) from an old ESDI disk to something, anything, else. That was much more interesting.
Ultimately it comes down to whether you can get the inputs and outputs you need replicated/attached to the virtual machine, and an exact copy of the machine itself into virtual form - the latter challenge can include having to do risky stuff like physically removing the source HDD to plug it into various interface converters and then taking an image from it using an intermediate machine that can read it.
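The imaging step itself is conceptually simple; here's a rough Python sketch of the idea (mine, equivalent in spirit to `dd conv=noerror,sync`): copy in fixed chunks, and zero-fill any unreadable chunk so offsets in the image still line up with the source. In real use the source path would be a raw device node like /dev/sdX; here it's just a file so the sketch runs anywhere.

```python
import os

def image_disk(source_path, image_path, chunk=64 * 1024):
    """Copy source to image in fixed chunks; zero-fill unreadable chunks
    so the image stays offset-aligned with the original disk."""
    size = os.path.getsize(source_path)
    bad_chunks = 0
    with open(source_path, "rb") as src, open(image_path, "wb") as img:
        offset = 0
        while offset < size:
            want = min(chunk, size - offset)
            try:
                src.seek(offset)
                data = src.read(want)
            except OSError:            # bad sector: pad and carry on
                bad_chunks += 1
                data = b"\x00" * want
            img.write(data)
            offset += want
    return bad_chunks
```

Always verify the image against the source (checksums) before powering the original machine down for good.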
Depending on how your setup works you may run into issues with things like time precision within a VM - if high time accuracy is important in the work it's doing, you'll need to pay attention to ensuring VM time = real-world time.
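One cheap sanity check (my own sketch, not a substitute for proper NTP or guest-agent time sync) is to compare the guest's wall clock against its monotonic clock over an interval; in a VM with poor timekeeping the two can visibly diverge.

```python
import time

def clock_drift(interval=1.0):
    """Return how far the wall clock drifted relative to the monotonic
    clock over `interval` seconds (near 0.0 means they agree)."""
    wall0, mono0 = time.time(), time.monotonic()
    time.sleep(interval)
    return (time.time() - wall0) - (time.monotonic() - mono0)
```

Run it periodically inside the guest and alert if the drift exceeds whatever tolerance your workload can stand.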
- Use Proxmox. Even the free version can go a long way (because backing up VMs periodically is a good thing, and Proxmox does it fine).
- If you're going to run Linux, run XFCE desktops with the X2Go server. X2Go works in almost any network condition.
- Test PCI passthrough extremely well before taking the plunge, especially if you're going to virtualize specially designed equipment. If it works, it's magic. If it doesn't, you're in big trouble.
- If the PCI cards will be migrated from older equipment to newer, test whether they work well with spread spectrum clocking. If they don't, make sure the server you use allows disabling it.
How long ago did you work at Arecibo?
What was it like?
What did you do there?
Did you ever... find.. anything that you can’t talk about? Blink twice if yes!!
I was there for about 3 years and left around 2016. It was one of the best and most interesting places I've worked at so far.
The people were great, the work was great, and I personally felt the work came with a huge feeling of accomplishment and made me feel like I was contributing to a better world by facilitating scientific progress.
I was part of the Electronics Dept. which was ultimately responsible for the scientific hardware, the signal paths, and everything between. There were some people in Physical that were more responsible for the electro-mechanical stuff like motors and such. I was primarily responsible for assisting with the analog signal path and was considered a Receiver Specialist, but the latter half of my time there was spent writing a lot of digital control stuff for newer projects as I was the only one on staff at the time who had the skills and time to take those on.
As for finding anything..... that's classified :p
We did manage to talk to an old NASA satellite from the late 70s. The ISEE-3/ICE reboot project. Some details on Wikipedia here: https://en.wikipedia.org/wiki/International_Cometary_Explore...
Also, the name sounded familiar; this was the radio telescope that incurred damage recently.
And yes, this is the one that recently suffered a cable failure that damaged the reflector.
I'm almost certain that '95 machine is still sitting in the Clock Room to this day.
Man, I miss Angel, gotta shoot him an email sometime soon. Miss sitting around and chatting about ham radio with him!
It's a family business, and the software works really well (a medical niche), so the investment to update the stack doesn't make any sense. (And there's no need to bother clinicians with things that don't add value, either.) I have a VM for tweaking and working with it. I have multiple ETL scripts that extract the data to a modern ERP.
I don't have access to the source code, but I have tried to reverse engineer it from multiple angles. Essentially it's a 16-bit client written in Delphi with a database (dBase) that doesn't support multiple readers or writers, shared via a network drive. Clients read files from the shared drive and create locks to avoid multiple clients working on the same data. That's the main source of pain, but the clinicians adapt to it in a week or so.
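For illustration (my own reconstruction, not the vendor's actual code), a lock-file scheme like that usually boils down to atomic exclusive file creation on the share, which fails if another client got there first:

```python
import os

def acquire_lock(record_path):
    """Claim a record by creating <record>.lck; O_CREAT | O_EXCL makes the
    create fail atomically if another client already holds the lock."""
    try:
        fd = os.open(record_path + ".lck",
                     os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True        # we own the record now
    except FileExistsError:
        return False       # someone else is editing it

def release_lock(record_path):
    os.remove(record_path + ".lck")
```

On a network drive this is only as atomic as the file server makes it, and a crashed client leaves a stale .lck behind - which is presumably where some of the day-to-day pain comes from.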
I haven't yet seen a modern system as complete as this one. There are multiple clinical centers in the area using the same software (25ish), and the 2-3 that moved to newer systems regretted it greatly.
The market is too small for any new developments. But the guy who built it in the 90's maintains it and earns enough to live happily.
There was a fad for a while of not using PLCs, and instead using 'regular computers', at least for a portion of the system (usually a PLC still handled the most performance sensitive parts).
When you have a production line that's 100 feet of steel, motors, ovens, and other equipment, and the cost of redoing all the control systems is quoted at hundreds of thousands of dollars, then why change it? It still works.
If you're looking for replacement parts for old PC equipment, you don't need a million units, you just need a couple units. Finding a handful of units for replacement parts is pretty easy.
It works beautifully under dosemu2, which has a terminal mode that can translate various VGA modes into S-Lang calls (S-Lang is like ncurses, so no X11 required). I find this technically impressive and makes a lot of old DOS software indistinguishable from native linux software; stdin/stdout, parameters, host filesystem access, etc all work transparently.
Here's a screenshot: https://twitter.com/taviso/status/1272670107043368960/photo/...
It can import TTF fonts and print to PostScript, which I just pipe into ps2pdf and then handle on the host.
I'm not aware of any other full-featured modern word processor that can run in an xterm. I know about wordgrinder but it's very very basic. You could use a text editor, but it's not ideal for layout because it doesn't understand things like proportional font geometries - you need that to know how lines/glyphs will fit on the physical page when it's printed. You could write it in some form of markup, html, TeX, markdown, whatever, but if I'm just trying to format a document I prefer a word processor.
(Note: dosemu2 doesn't require virtual 8086 mode, so it works fine on x86-64)
The only "new" thing enabled by our bigger systems is "big data", which is largely a process of finding patterns that will fool the user/purchaser/customer.
> our GHz computers, we were doing previously with old MHz
I do also, but then I have to pinch myself and remember some of the cutting-edge software that really does make use of the hardware - i.e. games, video editing, neural networks, etc. I know we could already do these things on lesser machines, but there is no doubt that the level at which they are currently done could not be replicated on a lesser machine. And also remember the efficiency of systems such as web servers: today's average usage would have been a DoS attack in the past.
I do look in disappointment, though, at text editors, window managers, file viewers, etc. - things that, despite having much more computing power available, offer not many more features but still eat tonnes of resources.
I'm currently (slowly) working on a Window Manager for X11 which tries to bring back some of these ideas but for modern devices: https://github.com/danielbarry/oakwm/ It's built on top of wm2 (which itself has roots going back to Bell Labs' Plan 9). A lot of the work is in ripping out the unnecessary features and making it touch-friendly. The idea is to run it on the PineTab Linux tablet.
I would like to see better JS support, but the scope of JS is simply insane for such a small browser. It's unfortunate much of the web is completely unusable without running JS. Perhaps it's possible to first run the page through a larger browser engine and then send the processed content to the small browser (such as Dillo), that would massively widen the scope of what it could display.
>But the scope of JS is simply insane for such a small browser.
Edbrowse has JS support thanks to duktape.
Seems to be still pretty old...
> Edbrowse has JS support thanks to duktape.
Huh, very interesting: https://duktape.org/
I initially assumed by ducktape you meant "barely held together"!
*correction, according to https://developer.chrome.com/multidevice/webview/overview they are still separate, it references android L dev preview so not the latest source tho...
While 60MB is still humongous, it does make sense that Google would at least care slightly on Android, as the majority of the Android market are low-to-mid-tier smartphones with reasonable performance instead of the "desktop-in-a-pocket" that are the current flagship phones.
But while they care slightly on Android, they don't care at all on desktops.
Remember kids: Large binaries are slow binaries.
And yes, there is clearly only an incentive to care on Android. The automated tooling to track Chrome binary size only works on the Android build.
False. Loop unrolling is always faster.
Blatantly false, which is why heuristics decide when to unroll.
Large code trashes instruction caches, and cache misses are very expensive. More code, more misses.
For very small loops, the unroll can pay off, but these are often too small to have significant size detriment.
See, the hardware folks have listened in on the compiler people and their problems. They've done things like identify loops and rewritten them in microcode for optimization. If a short loop can fit entirely within the CPU code buffer, speed goes way up. Unroll the loop and blow the CPU code buffer, defeat the optimization and lose all that.
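To make the tradeoff concrete, here's the shape of the transformation sketched in Python (my illustration; in reality the compiler does this to machine code, where the cost is the larger instruction-cache/loop-buffer footprint the comments above describe):

```python
def sum_rolled(xs):
    # One loop-control check and branch per element.
    s = 0
    for x in xs:
        s += x
    return s

def sum_unrolled4(xs):
    """4x-unrolled version: fewer loop-control operations per element,
    at the cost of a larger body, plus a remainder loop for leftovers."""
    s, i, n = 0, 0, len(xs)
    while i + 4 <= n:
        s += xs[i] + xs[i + 1] + xs[i + 2] + xs[i + 3]
        i += 4
    while i < n:        # handle n not divisible by 4
        s += xs[i]
        i += 1
    return s
```

Both compute the same sum; whether the unrolled shape is faster in compiled code depends on exactly the cache effects discussed above, which is why compilers gate it behind heuristics.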
> Think about how much a "10 MB" only browser took up in main
> memory back in the day.
This is exactly the point, software has swelled to use the resources available to it, so with each new iteration your machine gets faster but what it runs gets slower. It doesn't feel like this is something we should settle for.
But apart from the HD video, the search, the aqueducts, the roads, the education, the browsers that don't crash all the time, and the wine... what have the Romans ever done for us?
I disagree. Searching for keywords is practically impossible nowadays, with all smart AI-powered™ search engines that unhelpfully return results that are not relevant to your query. And SEO spam, which has killed the niche high signal-to-noise handcrafted websites for content aggregators.
IE5-6 were a crashfest.
I lost track of how many times I saw the spinning beachball of doom, the only solution for which was to power off and on again.
I'm guessing anyone who didn't get browser crashes in the 90s was only browsing a couple of sites or something. Certainly anything before IE5.5 was crash city - in particular Netscape 2.0!
Today I use Electron and React Native, which needless to say are not very popular, but for sure I could never have developed the kind of cross-platform software I write today with the technologies I used many years ago. Partly for lack of skills, but also of time (cross-platform development was way more difficult), or simply because computers back then were not powerful enough.
I don't have any special nostalgia for old technologies; some stuff was good, some not so much. And as OP is showing, you're still free to use old software if you don't like what's being done today.
They still aren't.
But ultimately this would all fail. Aside from the network effects - nobody's going to use it - the fast processors and tons of memory are necessary. We just don't think about it unless we're running out of it.
- high resolution photo and video - photo editing programs like GIMP and Darktable can spend some time processing photos; these days, even with complex effects, you usually never experience more than a few hundred ms of lag on many-megapixel photos, because our hardware is fast. Same for video; the memory, storage space, and bandwidth are a hard requirement, and going back to 360p is not really acceptable.
- high resolution monitors - no point in having great looking 4k 10 bit color video without a 4k monitor, and now you're stuck having to push 20gbps through your displayport cable on a 500 MHz processor. And text also looks much better at 4k. Also you could say this is a bit unnecessary but compositing window managers are pretty great and I would say a core requirement of modern GUIs, and having eye candy like wobbly windows, window shadows (those are actually very helpful, try turning them off), etc. is expensive. My Dell XPS laptop from 2017 couldn't handle wobbly windows without visible stutter at 4k60; my desktop with a $400 GPU from 2019 can keep up at 4k@144hz.
- new technologies - VR/AR, fast voice recognition and neural networks - this is all cutting edge stuff but the use cases are obvious and they have started to be applied more commonly. Also the failures like eye tracking, Kinect - they may have failed commercially but they were good ideas and a valid use of fast computers. Also online meetings with many participants, each with their own video streams that require decompression.
- obviously, video games - not that you can't have fun games with shitty graphics, but good graphics are nice.
I thought of a few more but can't remember them at the moment. Oh, also, Bluetooth audio - 192 kbps audio, but you need to decode it, since it takes compression to get it to that level, and then you have the additional overhead of sending it over a digital protocol instead of just having your DAC do the work. You would need expansion cards or a dedicated audio core in your CPU to accomplish this if your computer weren't fast.
And of course, science needs big data and big hardware to process all of that data.
If you're only working with text then slow computers are fine, but as soon as media gets involved they are not practical.
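The display-bandwidth point in that list survives a back-of-envelope check (my arithmetic, counting raw pixel data only; real links add blanking and encoding overhead on top, which pushes the 60 Hz figure toward the ~20 Gbps quoted above):

```python
# 4K at 10 bits per channel, 3 channels.
width, height = 3840, 2160
bits_per_pixel = 3 * 10

def raw_gbps(fps):
    """Raw uncompressed pixel bandwidth in Gbps at the given refresh rate."""
    return width * height * bits_per_pixel * fps / 1e9

print(raw_gbps(60))    # ~14.9 Gbps at 60 Hz
print(raw_gbps(144))   # ~35.8 Gbps at 144 Hz
```

The 144 Hz figure exceeds even DisplayPort 1.4's 32.4 Gbps total link rate, which is why stream compression exists for high-refresh 4K.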
But the stuff we do day-to-day (excepting graphics) is appallingly inefficient. And most of the look-at-this stuff (speech recognition, and you mention VR and eye-tracking) is not actually used for anything day-to-day; we don't routinely talk to our computers ("Hello Computer" https://youtu.be/v9kTVZiJ3Uc?t=10 ).
That's perhaps because doing so in an office or cafe environment isn't great.
Kids are using text-to-speech more while doing their homework. Word has been beefing up its transcription feature for a while now.
I got to see this happen during lockdown, where teachers I know recommended transcription for younger kids who don't have touch typing skills and were still expected to turn in work on a computer. Talking to your computer is much more natural while doing your homework in your room.
You're giving the gubment too much credit. It's easy to throw up your hands and say "aww geez, I guess everything is backdoored anyway, why bother?" -- this is exactly what they want. The truth is a lot more complex, and as a lot of leaks have revealed, their capabilities are far from the supernatural omnipotence you seem to be implying.
Think of all the abstractions in a modern system. From the USB Bus, to the Network Stack and everything in between.
We have just duct taped on duct tape and we should be happy it all works this well!
It comes about because unless you design every sub-component system in lockstep with every other (which is impossible; the Romans didn't lay out London's streets for cars, the Victorians didn't put in sewers with respect to where we'd want to run fiber), you end up with an impedance mismatch at the boundaries.
It's why backhoe operators on a gas network rip up fiber depressingly often, why (in the UK) the transport system has choke points between road, rail and air, and on and on.
Modern computers aren't a unified system; they are lots of separate systems that talk to each other, and frankly, having some (minimal) understanding of what has to happen for GNOME to appear on my screen when I press my power button, I'm amazed that it ever works, never mind mostly without fuss.
* Graphics design
* Video editing
* Monero mining
* 3D CAD
* High end (physics/etc) simulations
* 3D Rendering
Sure you might be able to do some of those things at a very low resolution on a Mhz system, but not in any meaningful way.
You must be really young.
>* Graphics design
>* Video editing
>* 3D CAD
Older than you think
>* 3D Rendering
3D rendering was born back then, and we were playing 3D games perfectly well in 1996-2002
> Sure you might be able to do some of those things at a very low resolution on a Mhz system, but not in any meaningful way.
> You must be really young.
Maybe they are, maybe they aren't, but that's an unfounded assumption. There are plenty of middle-aged people who haven't got a solid idea of the limits of pre-Ghz-scale hardware for use cases that weren't all that mainstream back then, and I'm sure there will also be young people who've tinkered with some old Windows 98 box and could give us a pretty solid opinion.
At any rate, I've used a succession of Mhz-scale systems (starting with a 386 that struggled valiantly to run a copy of Windows 3.1 that someone had installed on it) and I'm definitely not very young.
> >* Graphics design
> We did
With rough and simple tools, limited to low-res designs, very limited brushes, super crude brush simulations (if any). Procreate on the iPad blows all of it out of the water effortlessly with its worst brushes using a finger for a stylus, and that realtime brush simulation has been taxing for earlier iPads, that have always been orders of magnitude above Mhz systems. I used Photoshop CS and CS2 a lot back in the day; they were resource hogs and still very crude compared to current entry-level apps. We've really gained a huge lot in that department.
> >* Video editing
At what, 640x480, 15 fps? I remember having annoying render times at such resolutions, maybe 800x600, for things that would hardly even count as filters nowadays, and I'm sure that ball-shaped Logitech webcam could do no more than that. Snapchat replaces whole faces in realtime with eerie accuracy at what must be much higher resolutions.
Color grading Full HD, 30fps? As if. I guess you couldn't even play that without stutters unless with dedicated silicon to begin with.
> >* Animation
Low-res, crude, and anything halfway detailed and interactive: Low-fps. Manipulating anatomy models like those at https://www.zygotebody.com/ in realtime? I doubt my Core 2 Duo laptop would have been up to that. I had similar software for Windows 98, and it was absolutely primitive and still taxing the CPU and GPU.
> >* 3D CAD
> Older than you think
With less precision, much, much lower model complexity, and much cruder tooling. A large number of components, complex shapes, joints, ... that's going to hit hard limits very quickly. I'm hitting hard limits with that sort of thing nowadays, but I'm hitting them somewhat later than even a few years ago.
> >* 3D Rendering
> 3D rendering was born here, and we were playing 3D games perfectly in 1996-2002
I don't play a lot, but between Gothic 1 and Witcher 3, graphics have improved by an incredible amount, it's day and night, and I can't even go to full details in Witcher due to my aging GPU. Technically, those systems could do it, sure – but only with super short visibility, very crude models and extremely limited shaders and animations, crude collision detection, ... Gothic required at least a 700mhz Pentium 3, so it's pretty representative I think. Of course, those limitations work better for some games than for others, but they still were brutal limitations.
> >* Multitasking
> Linux, KDE3.
Also quite limited, though. Just what fits on a single Mhz-scale core. On my Windows 98 PC, I had to close CPU-intensive applications all the time because things would start to stutter; I believe Windows 95 would sometimes just bluescreen under such conditions. Things got a lot better on Windows 2000, but I think that may have been on my 1Ghz Athlon already. Those early Linux desktops were pretty unstable with lots of multitasking as well, I faintly remember lots of freezing. Things did slow down perceptibly at any rate when doing multiple resource-intensive things. Technically possible, sure – but it absolutely helps to have lots of fast cores.
Of course, a lot of the legwork to make those use cases perform is nowadays done by GPUs or huge amounts of RAM, and profit lots from multiple cores, but I'd say a Mhz-scale system should have a period video card, too, and MB-scale RAM, and be single-core, otherwise it's kinda pointless. And under those conditions, all of above things were technically possible, but really severely limited – still are, in some cases (CAD...), but wayyy less than back then.
Does that qualify as "not in any meaningful way"? I guess it depends. It was meaningful for me back then, and it made possible things that hadn't been possible before, and of course we were always content with what we had, not like those young'uns nowadays, and walked uphill to school, in the snow, both ways, every day – but looking back, the capabilities of my Mhz computers feel incredibly crude and primitive by today's standards, and even a lower-end gaming PC has little problem running Autodesk Fusion 360 (for which there's even a free hobbyist license) with models of surprising complexity, and I'm sure that enables many, many more things that wouldn't have been feasible on Mhz hardware.
On freezing, Mandrake was a joke, but Slackware and Debian were rock solid.
Create something new? Great! You're officially granted a monopoly on that thing for 2 decades - plenty of time to monetize it, and use the money to create some other new things. If you can't figure out a way to monetize it in that timeframe, then we'll allow others to compete with you and take a crack at it.
WordPerfect was a great word processor, I remember using version 5.1 for years because it was wicked fast, rock stable, and most importantly, it was predictable. By contrast, MS Word -- even today -- seems to have a mind of its own. You move an image, and suddenly all your numbered bullet points appear in a different font except for the last one, and nothing short of retyping the whole thing in another document seems to fix it.
Hence the unbelievable popularity of markdown.
Imagine telling your 90s self your current machine configuration and how awesome it is. Then explain that we, with all this tech, voluntarily chose to type in plaintext format, with notations that kind of work as a more readable form of markup language, because we got fed up trying to make WYSIWYG work.
Oh yes, and GUIs are for noobs, pros use terminal emulators, but that’s another topic :)
The thing is I still enjoyed using quality DOS and Unix terminal software.
GUIs are not the one true way to represent a UI. They’re not even new technology any more.
They’re just different and better suited to certain tasks but not others.
It’s like asking why are we still using steering wheels to drive instead of speech recognition?
I also miss extending the supervisor prompt with BBC BASIC and arm26 ASM though :D
I think it comes from conflating layout with writing. In QuarkXPress or InDesign it really isn't an issue. Those tools are quite uncomfortable for sitting to write in, though.
LyX does a decent job of this, I think, though it's not exactly WYSIWYG.
There’s also an issue with black box language vs something like HTML, which has semantic value besides pure presentational function.
I don’t care if the auto generated PostScript is ugly as long as the printed work is perfect.
I would have been easy to convince.
I was forced to use it heavily for a year of pure writing for a previous job. It's incredibly unpredictable. It's like whack-a-mole. You make text in one place bold, and all of a sudden some of your footnote text on a different page becomes bold.
The people at that job who were really good knew all of the tricks, digging into the codes that WordPerfect inserts to address various issues. But even then, it was an extra step, and I never became as productive in WordPerfect as I had been in Word.
Plus, WordPerfect has been in maintenance mode with Corel for decades at this point, accumulating lots of bugs and half-implemented features.
Obviously I get this with HTML, but the Mac & MS Word approach of 'object oriented formatting' was just a much better execution for mouse operation.
Word 5.1 on the Mac was amazing. Fast, powerful, probably everything one would need even today, aside from track changes.
Tbqh, this is my experience in just about every rich text editor. Well, maybe not quite that bad, but I’ll always bold a word, and then while editing I have to go back and change the word after it, and it will suddenly be bold.
I wish bold/italics/etc acted like caps lock. It’s either on or it’s off, and the computer doesn’t try to guess for me.
It's starting. Outlook is doing some kind of "autocapitalisation" to me and I have no idea how to turn it off.
Somebody has WordPerfect 8 binaries around.
It requires libc5 - the c library that was common before glibc became common at the end of the 90s. And then needs X libraries that are compiled against libc5.
Probably you could find an old Red Hat 5 or something and pull out those libraries to run on a recent kernel.
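A rough sketch of what that would look like (the package names, versions, and install paths below are all illustrative, not verified): unpack libc5 and the matching X libraries from Red Hat 5-era RPMs into a private directory, then launch the WP8 binary with only that directory on the loader search path, so nothing on the modern host is touched.

```shell
# Hypothetical recipe; package names and paths are illustrative.
run_wp8() {
    libdir="$HOME/wp8-libs"
    mkdir -p "$libdir" && cd "$libdir" || return 1
    # rpm2cpio still reads these ancient RPM formats:
    rpm2cpio libc-5.3.12.i386.rpm      | cpio -idm   # -> ./lib/libc.so.5*
    rpm2cpio XFree86-libs-3.3.i386.rpm | cpio -idm   # -> ./usr/X11R6/lib/*
    # Point the dynamic loader only at the private directory:
    LD_LIBRARY_PATH="$libdir/lib:$libdir/usr/X11R6/lib" \
        "$HOME/wp8/wpbin/xwp"
}
```

The nice part of the LD_LIBRARY_PATH approach is that the old libraries never leak into the rest of the system; delete the directory and the experiment is gone.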
Docker has no concept of the binaries that actually run; it just sets up namespaces, cgroups, and a userspace. Worst case scenario, you have to run the container as privileged, which is still no different from running on the host (at least with a properly constructed image, where the binaries wouldn't run as root).
The 'reveal codes' functionality is something that I always liked with WP, and that no other word processors seems to have implemented.
Obligatory screenshot here: https://mastodon.social/@stevelord/104824411022687268
I liked WordPerfect well enough from 4.2 through 5.1. Version 6 on Windows was not good. Even a slow typist (me) could get ahead of the cursor. But at some point my employer switched to Word and I haven't been back.
WP also had a potent macro capability and a good macro editor as well. I was practicing law at this time and had created 80-99 WP macros that made legal docs appear like magic.
There was no e-filing and court clerks and judges could be more than a little picky about the formatting of the paper documents they received. If a doc wasn’t formatted in the proper manner, some clerks would refuse to file it and return the doc to the lawyer’s office for a re-do.
Ashton and Bastian had been generous with stock grants to key people, particularly on the technical side. Most of the key people below the two founders had enough money that they didn’t have to put up with the sharp elbows and confrontational styles that predominated at Novell.
Combine a brain drain with a Microsoft push for Windows and built-in integration with the first passable version of Word and you had a recipe for disaster. Some of the WP people who remained claimed that MS had misled them about the Windows vs. DOS roadmap and been slow in providing the Windows info necessary for WP engineers to build a decent first release of WP for the new release of Windows, but I could never determine if this was what had really happened or just sour grapes because WP stumbled badly after the ownership change.
In the Corel days they finally decided to port to actual Win32 code and set aside the custom transpilation.
I have the Win3.1 version of WP from that era in a DosBox. I've been meaning to compare its grammar check to modern word processors. It was way ahead of its time.
I know Corel bought it, but it looks like it spun off again. I wonder if the current release still has any of the old source code, or if it was just rewritten and keeps the name for the brand: https://www.wordperfect.com/en/
The Corel of 2020 seems a fascinating Enterprise Licensing zombie. They appear to make all their revenue from (very) legacy Enterprise Licenses and government contracts and all of the software seems stuck in a time warp stasis with only the bare minimum of maintenance (presumably just enough to keep the Enterprise Licenses alive and well). It's almost sad to see how many classic computing brands they own and let languish in this state.
WordPerfect was available for UNIX as well. The SCO binaries might run with iBCS. I don't know if that would provide a significantly different experience than running the DOS version in dosemu though.
It's a bit cheaper to buy from them than to pay for an antique on eBay.
Also, there's a wave of mechanical options from more recent companies like Das Keyboard or the various DIY kits with choose your Cherry switch adventures.
But from what I can see, he is NOT really using it...
I prefer the tags as they describe my work. Visually there is no evidence that something is a 'section' or an 'aside' but I make sure my words do have document structure, so a 'section' will probably have a 'h3' at the start of it.
I wish there was a 'what you see is what you mean' editor that was visual but only allowed you to add HTML semantic elements. WYSIWYG editors tend to just add local styles and 'br' elements, resulting in HTML that is useless for my style of HTML.
I can do diagrams in various ways including SVG that I just write out manually. Or I can use the wonders of CSS Grid and a 'dl' list with first of type/last of type CSS selectors to style up a simple linear flow diagram.
I really wish there was a super neat word processor or HTML editor that only wrote lean HTML with semantic markup for document structure. But here I am in 2020 using my IDE or Vim to write documents and even craft diagrams.
I am impressed that I am not alone in spurning drag and drop wonder gadgets. As much as I would like to give WordPerfect a go, I feel the world has moved on from documents printed out as memos and put in pigeon holes. Stuff has to look good in a browser, and with stripped-down HTML that uses the elements and a simple stylesheet that loads a font and puts in some CSS Grid layout I feel I have the tools I need.
What Qedit (often renamed simply to Q.exe) was capable of doing in the (good) old times was incredible.
For typing / journaling the non-WYSIWYG was a feature. I honestly think WordPerfect 5.1 is a big reason why Sublime is so popular now.
On graphics, that's a solved problem. Everyone uses framebuffer today, even on obsolete chipsets it's doable. Just map VGA calls to FB and everything would be fast enough, I think.
As a programmer building websites, I do the same thing, writing in vim. Learning how to make your editor work for you is a great investment.
It used to bother me that groff doesn't support Unicode (as far as I can tell) but then I realized that all I write is English, so why am I fetishizing Unicode? groff will get me any accented character that I would realistically need.
So this is my setup using MOM macros for nice typesetting.
But I fully agree on your last point, in the end I seldom need the Unicode universe for most of my documents.
No way, you're right. And of course it has a very detailed and well formatted man page. The one for troff even has tables.
This whole site is wonderful, btw
What about Unicode? Lack of Unicode seems like a dealbreaker for a lot of use cases. Even if you're not using Unicode, seems like you'd eventually want to work with Unicode text.
As an engineer, a lot of the "word processing" I do involves cutting and pasting text from other sources (citations, code samples, names, etc) that tend to be chock full of non-ASCII characters.
Or has this been solved somehow? I see that there are various hacks involved in getting support for the Euro symbol. What about a general purposes solution? I know that WP5.1 had support for international character sets. Perhaps somebody's cooked up an emulation layer that translates Unicode into whatever it is that WP needs.
There is still a community (apparently mostly paralegals) who actively use it and answer questions, e.g. http://wpdos.org and https://www.wpuniverse.com/
Looks like they deleted the file ... now 404 not found...
It's also an odd little page. It includes directions to buy it, if you already have a copy. If you don't have a copy, it includes some details on how to use eBay.
I run it inside a VM and it is still faster and better than LibreOffice by a long shot.
When I'm done writing, I paste the text into one of my blogs with IE6, which my sites still support.
I would still take the VM route.
Despite the age and 'underpoweredness' of it, it was still quicker, still more responsive, and still subjectively nicer to use than anything modern.
Although I also have to admit that this approach didn't work for my favorite game. But now that I'm googling, it does seem to work! So someone got DirectX 9 or something working?
Compile ScummVM with
sh configure --enable-all-engines --prefix=/opt
make
sudo make install
There are also nightly builds (but they may not pass the "--enable-all-engines" switch to configure).
So like SLRN and Jed? Amazing. I used DOSEmu back in the day for Redneck Rampage under an AMD Athlon (Dear DOSBox users, DOSEmu was zillions times faster than DOSBox, I ran it at native speeds). This is impressive.
"All that was ever really needed by an OS or Office apps was already there 15 years ago. Everything "invented" beyond that was just vanity changes with negligible long term added value, and constant moving around of UI to appear new and better. Change my mind."
I should try to see if I can get Wordstar running under dosemu2 (I probably have a 3.5" floppy somewhere with it on, but I don't think I have the disk drive to read it).
Can it be downloaded from Internet Archive?
$_lpt1 = "dosprint"
Which just pipes the printer port output into a bash script that runs ps2pdf, but I changed it to save to a file instead.
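A sketch of what such a printer script might look like (the function name, directory, and dosemu wiring are assumptions on my part): capture the PostScript stream that dosemu pipes to LPT1 on stdin, save it, then convert it to PDF with ghostscript's ps2pdf.

```shell
# Hypothetical LPT1 handler for dosemu's $_lpt1 hook.
dosprint() {
    outdir="${DOSPRINT_DIR:-$HOME/dosemu-prints}"
    mkdir -p "$outdir" || return 1
    out="$outdir/print-$(date +%Y%m%d-%H%M%S)"
    cat > "$out.ps"              # raw printer-port output arrives on stdin
    ps2pdf "$out.ps" "$out.pdf"  # requires ghostscript's ps2pdf
    echo "$out.pdf"
}
```

With something like this on the PATH, every DOS print job quietly becomes a timestamped PDF instead of heading to a physical printer.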
just wait til the exact same UX comes to your automobile.
and after that...
* Collaboration (including real-time collaboration)
* Ease (or rather the lack of it) of the installation process
* Always up to date
Abiword did that
>* Ease (or rather the lack of it) of the installation process
You need an ISP. And an Internet connection.
>* Always up to date
It can be worse for your specs.