It works beautifully under dosemu2, which has a terminal mode that can translate various VGA modes into S-Lang calls (S-Lang is like ncurses, so no X11 required). I find this technically impressive, and it makes a lot of old DOS software indistinguishable from native Linux software; stdin/stdout, parameters, host filesystem access, etc. all work transparently.
Here's a screenshot: https://twitter.com/taviso/status/1272670107043368960/photo/...
It can import TTF fonts and print to PostScript, which I just pipe into ps2pdf and then handle on the host.
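The host-side half of that pipeline is just Ghostscript's ps2pdf wrapper (filenames here are illustrative):

    ps2pdf wp-output.ps wp-output.pdf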
I'm not aware of any other full-featured modern word processor that can run in an xterm. I know about wordgrinder but it's very very basic. You could use a text editor, but it's not ideal for layout because it doesn't understand things like proportional font geometries - you need that to know how lines/glyphs will fit on the physical page when it's printed. You could write it in some form of markup, html, TeX, markdown, whatever, but if I'm just trying to format a document I prefer a word processor.
(Note: dosemu2 doesn't require virtual 8086 mode, so it works fine on x86-64)
The only "new" thing enabled by our bigger systems is "big data", which is largely a process of finding patterns that will fool the user/purchaser/customer.
> our GHz computers, we were doing previously with old MHz
I do also, but then I have to pinch myself and remember some of the cutting-edge software that really does make use of the hardware - e.g. games, video editing, neural networks, etc. I know we could already do these things on lesser machines, but there is no doubt that the level at which they are currently done could not be replicated on a lesser machine. And also remember the efficiency of systems such as web servers: today's average usage would have been a DoS attack in the past.
I do look in disappointment, though, at text editors, window managers, file viewers, etc. - despite having much more computing power available, they offer not many more features but still eat tonnes of resources.
I'm currently (slowly) working on a Window Manager for X11 which tries to bring back some of these ideas but for modern devices: https://github.com/danielbarry/oakwm/ It's built on top of wm2 (which itself has roots going back to Bell Labs' Plan 9). A lot of the work is in ripping out the unnecessary features and making it touch friendly. The idea is to run it on the PineTab Linux tablet.
I would like to see better JS support, but the scope of JS is simply insane for such a small browser. It's unfortunate that much of the web is completely unusable without running JS. Perhaps it's possible to first run the page through a larger browser engine and then send the processed content to the small browser (such as Dillo); that would massively widen the scope of what it could display.
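One way to prototype that idea with stock tools (the Chromium flags are real; the hand-off to Dillo is the speculative part):

    chromium --headless --dump-dom 'https://example.com' > rendered.html
    dillo rendered.html    # view the post-JS DOM in the small browser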
>But the scope of JS is simply insane for such a small browser.
Edbrowse has JS support thanks to duktape.
Seems to be still pretty old...
> Edbrowse has JS support thanks to duktape.
Huh, very interesting: https://duktape.org/
I initially assumed by ducktape you meant "barely held together"!
*Correction: according to https://developer.chrome.com/multidevice/webview/overview they are still separate. It references the Android L dev preview though, so not the latest source...
While 60MB is still humongous, it does make sense that Google would at least care slightly on Android, as the majority of the Android market is low-to-mid-tier smartphones with reasonable performance rather than the "desktop-in-a-pocket" that current flagship phones are.
But while they care slightly on Android, they don't care at all on desktops.
Remember kids: Large binaries are slow binaries.
And yes, there is clearly only an incentive to care on Android. The automated tooling to track Chrome binary size only works on the Android build.
False. Loop unrolling is always faster.
Blatantly false, which is why heuristics decide when to unroll.
Large code trashes instruction caches, and cache misses are very expensive. More code, more misses.
For very small loops, the unroll can pay off, but these are often too small to have significant size detriment.
See, the hardware folks have listened in on the compiler people and their problems. They've done things like identifying loops and rewriting them in microcode for optimization. If a short loop can fit entirely within the CPU code buffer, speed goes way up. Unroll the loop and blow past the CPU code buffer, and you defeat the optimization and lose all of that.
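You can observe the size half of this trade-off with nothing but standard GCC flags (the source filename is illustrative):

    gcc -O2 -c hot_loop.c -o default.o                   # heuristic unrolling only
    gcc -O2 -funroll-loops -c hot_loop.c -o unrolled.o   # force aggressive unrolling
    gcc -Os -c hot_loop.c -o small.o                     # optimize for size instead
    size default.o unrolled.o small.o                    # compare the .text sections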
> Think about how much a "10 MB" only browser took up in main
> memory back in the day.
This is exactly the point, software has swelled to use the resources available to it, so with each new iteration your machine gets faster but what it runs gets slower. It doesn't feel like this is something we should settle for.
But apart from the HD video, the search, the aqueducts, the roads, the education, the browsers that don't crash all the time, and the wine... what have the Romans ever done for us?
I disagree. Searching for keywords is practically impossible nowadays, with all the smart AI-powered™ search engines that unhelpfully return results that are not relevant to your query. And SEO spam has killed the niche, high signal-to-noise handcrafted websites in favor of content aggregators.
IE5-6 were a crashfest.
I lost track of how many times I saw the spinning beachball of doom, the only solution for which was to power off and on again.
I'm guessing anyone who didn't get browser crashes in the 90s was only browsing a couple of sites or something. Certainly anything before IE5.5 was crash city - in particular Netscape 2.0!
Today I use Electron and React Native, which needless to say are not very popular, but for sure I could never have developed the kind of cross-platform software I write today with the technologies I used many years ago. Partly for lack of skills, but also of time (cross-platform development was way more difficult), or simply because computers back then were not powerful enough.
I don't have any special nostalgia for old technologies; some stuff was good, some not so much. And as OP is showing, you're still free to use old software if you don't like what's being done today.
They still aren't.
But ultimately this would all fail. Aside from the network effects - nobody's going to use it - the fast processors and tons of memory are necessary. We just don't think about it unless we're running out of it.
- high resolution photo and video - photo editing programs like GIMP and Darktable can spend some time processing photos; these days, even with complex effects, you usually never experience more than a few hundred ms of lag on many-megapixel photos, because our hardware is fast. Same for video: the memory and storage space and bandwidth is a hard requirement, and going back to 360p is not really acceptable.
- high resolution monitors - no point in having great-looking 4k 10-bit color video without a 4k monitor, and now you're stuck having to push 20 Gbps through your DisplayPort cable (quick arithmetic after this list) on a 500 MHz processor. And text also looks much better at 4k. You could say this is a bit unnecessary, but compositing window managers are pretty great and I would say a core requirement of modern GUIs, and eye candy like wobbly windows and window shadows (those are actually very helpful, try turning them off) is expensive. My Dell XPS laptop from 2017 couldn't handle wobbly windows without visible stutter at 4k60; my desktop with a $400 GPU from 2019 can keep up at 4k@144hz.
- new technologies - VR/AR, fast voice recognition and neural networks - this is all cutting edge stuff but the use cases are obvious and they have started to be applied more commonly. Also the failures like eye tracking, Kinect - they may have failed commercially but they were good ideas and a valid use of fast computers. Also online meetings with many participants, each with their own video streams that require decompression.
- obviously, video games - not that you can't have fun games with shitty graphics, but good graphics are nice.
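Back-of-the-envelope for that 20 Gbps figure above (bash arithmetic; uncompressed 4k, 10 bits per channel, 60 Hz):

    echo $((3840 * 2160 * 30 * 60))   # = 14929920000 bits/s of raw pixel data;
                                      # blanking and encoding overhead push the
                                      # actual cable rate toward 20 Gbps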
I thought of a few more but can't remember them at the moment. Oh, also, Bluetooth audio - 192 kbps audio, but you need to decode it, since it takes compression to get it down to that level, and then you have the additional overhead of sending it over a digital protocol instead of just having your DAC do the work. You would need expansion cards or a dedicated audio core in your CPU to accomplish this if your computer wasn't fast.
And of course, science needs big data and big hardware to process all of that data.
If you're only working with text then slow computers are fine, but as soon as media gets involved they are not practical.
But the stuff we do day-to-day (excepting graphics) is appallingly inefficient. And most of the look-at-this stuff (speech recognition, and you mention VR and eye-tracking) is not actually used for anything day-to-day; we don't routinely talk to our computers ("Hello Computer" https://youtu.be/v9kTVZiJ3Uc?t=10 ).
That's perhaps because doing so in an office or cafe environment isn't great.
Kids are using speech-to-text more while doing their homework. Word has been beefing up its transcription feature for a while now.
I got to see this happen during lockdown, where teachers I know recommended transcription for younger kids who don't have touch typing skills and were still expected to turn in work on a computer. Talking to your computer is much more natural while doing your homework in your room.
You're giving the gubment too much credit. It's easy to throw up your hands and say "aww geez I guess everything is backdoored anyway, why bother?" -- this is exactly what they want. The truth is a lot more complex, and as a lot of leaks have revealed, their capabilities are far from the supernatural omnipotence you seem to be implying.
Think of all the abstractions in a modern system. From the USB Bus, to the Network Stack and everything in between.
We have just duct taped on duct tape and we should be happy it all works this well!
It comes about because unless you design every sub-component system in lockstep with every other (which is impossible - the Romans didn't lay out London for cars, the Victorians didn't put in sewers with respect to where we'd want to run fiber), you end up with an impedance mismatch at the boundaries.
It's why back-hoe operators working on a gas network rip up fiber depressingly often, and why (in the UK) the transport system has choke points between road, rail and air, and on and on.
Modern computers aren't a unified system; they are lots of separate systems that talk to each other, and frankly, having some (minimal) understanding of what has to happen for Gnome to appear on my screen when I press my power button, I'm amazed that it ever works, never mind mostly without fuss.
* Graphics design
* Video editing
* Monero mining
* 3D CAD
* High end (physics/etc) simulations
* 3D Rendering
Sure you might be able to do some of those things at a very low resolution on a Mhz system, but not in any meaningful way.
You must be really young.
>* Graphics design
>* Video editing
>* 3D CAD
Older than you think
>* 3D Rendering
3D rendering was born here, and we were playing 3D games perfectly in 1996-2002
> Sure you might be able to do some of those things at a very low resolution on a Mhz system, but not in any meaningful way.
> You must be really young.
Maybe they are, maybe they aren't, but that's an unfounded assumption. There are plenty of middle-aged people who haven't got a solid idea of the limits of pre-Ghz-scale hardware for use cases that weren't all that mainstream back then, and I'm sure there will also be young people who've tinkered with some old Windows 98 box and could give us a pretty solid opinion.
At any rate, I've used a succession of Mhz-scale systems (starting with a 386 that struggled valiantly to run a copy of Windows 3.1 that someone had installed on it) and I'm definitely not very young.
> >* Graphics design
> We did
With rough and simple tools, limited to low-res designs, very limited brushes, super crude brush simulations (if any). Procreate on the iPad blows all of it out of the water effortlessly with its worst brushes and a finger for a stylus, and that realtime brush simulation has been taxing even for earlier iPads, which have always been orders of magnitude above Mhz systems. I used Photoshop CS and CS2 a lot back in the day; they were resource hogs and still very crude compared to current entry-level apps. We've really gained a huge lot in that department.
> >* Video editing
At what, 640x480, 15fps? I remember having annoying render times at such resolutions, maybe 800x600, for things that would hardly even count as filters nowadays, but I'm sure that ball-shaped Logitech webcam could do no more than that. Snapchat replaces whole faces in realtime with eerie accuracy at what must be a much higher resolution.
Color grading Full HD, 30fps? As if. I guess you couldn't even play that back without stutters unless you had dedicated silicon to begin with.
> >* Animation
Low-res, crude, and anything halfway detailed and interactive: Low-fps. Manipulating anatomy models like those at https://www.zygotebody.com/ in realtime? I doubt my Core 2 Duo laptop would have been up to that. I had similar software for Windows 98, and it was absolutely primitive and still taxing the CPU and GPU.
> >* 3D CAD
> Older than you think
With less precision, much, much lower model complexity, and much cruder tooling. A large number of components, complex shapes, joints, ... that's going to hit hard limits very quickly. I'm hitting hard limits with that sort of thing nowadays, but I'm hitting them somewhat later than even a few years ago.
> >* 3D Rendering
> 3D rendering was born here, and we were playing 3D games perfectly in 1996-2002
I don't play a lot, but between Gothic 1 and Witcher 3, graphics have improved by an incredible amount, it's day and night, and I can't even go to full details in Witcher due to my aging GPU. Technically, those systems could do it, sure – but only with super short visibility, very crude models and extremely limited shaders and animations, crude collision detection, ... Gothic required at least a 700mhz Pentium 3, so it's pretty representative I think. Of course, those limitations work better for some games than for others, but they still were brutal limitations.
> >* Multitasking
> Linux, KDE3.
Also quite limited, though. Just what fits on a single Mhz-scale core. On my Windows 98 PC, I had to close CPU-intensive applications all the time because things would start to stutter; I believe Windows 95 would sometimes just bluescreen under such conditions. Things got a lot better on Windows 2000, but I think that may have been on my 1Ghz Athlon already. Those early Linux desktops were pretty unstable with lots of multitasking as well, I faintly remember lots of freezing. Things did slow down perceptibly at any rate when doing multiple resource-intensive things. Technically possible, sure – but it absolutely helps to have lots of fast cores.
Of course, a lot of the legwork to make those use cases perform is nowadays done by GPUs or huge amounts of RAM, and they benefit a lot from multiple cores; but I'd say a Mhz-scale system should have a period video card, too, and MB-scale RAM, and be single-core, otherwise it's kinda pointless. And under those conditions, all of the above things were technically possible, but really severely limited - still are, in some cases (CAD...), but wayyy less than back then.
Does that qualify as "not in any meaningful way"? I guess it depends. It was meaningful for me back then, and it made possible things that hadn't been possible before, and of course we were always content with what we had, not like those young'uns nowadays, and walked uphill to school, in the snow, both ways, every day - but looking back, the capabilities of my Mhz computers feel incredibly crude and primitive by today's standards, and even a lower-end gaming PC has little problem running Autodesk Fusion 360 (for which there's even a free hobbyist license) with models of surprising complexity, and I'm sure that enables many many more things that wouldn't have been feasible on Mhz hardware.
On freezing, Mandrake was a joke, but Slackware and Debian were rock solid.
Create something new? Great! You're officially granted a monopoly on that thing for 2 decades - plenty of time to monetize it, and use the money to create some other new things. If you can't figure out a way to monetize it in that timeframe, then we'll allow others to compete with you and take a crack at it.
WordPerfect was a great word processor, I remember using version 5.1 for years because it was wicked fast, rock stable, and most importantly, it was predictable. By contrast, MS Word -- even today -- seems to have a mind of its own. You move an image, and suddenly all your numbered bullet points appear in a different font except for the last one, and nothing short of retyping the whole thing in another document seems to fix it.
Hence the unbelievable popularity of markdown.
Imagine telling your 90s self your current machine configuration and how awesome it is. Then explain that we, with all this tech, voluntarily chose to type in plaintext format, with notations that kind of work as a more readable form of markup language, because we got fed up trying to make WYSIWYG work.
Oh yes, and GUIs are for noobs, pros use terminal emulators, but that’s another topic :)
The thing is I still enjoyed using quality DOS and Unix terminal software.
GUIs are not the one true way to represent a UI. They’re not even new technology any more.
They’re just different and better suited to certain tasks but not others.
It’s like asking why are we still using steering wheels to drive instead of speech recognition?
I also miss extending the supervisor prompt with BBC BASIC and arm26 ASM though :D
I think it comes from conflating layout with writing. In QuarkXPress or InDesign it really isn't an issue. Those tools are quite uncomfortable for sitting to write in, though.
LyX does a decent job of this, I think, though it's not exactly WYSIWYG.
There’s also an issue with black box language vs something like HTML, which has semantic value besides pure presentational function.
I don’t care if the auto generated PostScript is ugly as long as the printed work is perfect.
I would have been easy to convince.
I was forced to use it heavily for a year of pure writing for a previous job. It's incredibly unpredictable. It's like whack-a-mole: you make text in one place bold, and all of a sudden some of your footnote text on a different page becomes bold.
The people at that job who were really good knew all of the tricks, digging into the codes that WordPerfect inserts to address various issues. But even then, it was an extra step, and I never became as productive in WordPerfect as I had been in Word.
Plus WordPerfect has been in maintenance mode at Corel for decades at this point, accumulating lots of bugs and half-implemented features.
Obviously I get this with HTML, but the Mac & MS Word approach of 'object oriented formatting' was just a much better execution for mouse operation.
Word 5.1 on the Mac was amazing. Fast, powerful, probably everything one would need even today, aside from track changes.
Tbqh, this is my experience in just about every rich text editor. Well, maybe not quite that bad, but I’ll always bold a word, and then while editing I have to go back and change the word after it, and it will suddenly be bold.
I wish bold/italics/etc acted like caps lock. It’s either on or it’s off, and the computer doesn’t try to guess for me.
It's starting. Outlook is doing some kind of "autocapitalisation" to me and I have no idea how to turn it off.
Somebody has WordPerfect 8 binaries around.
It requires libc5 - the C library that was standard before glibc took over at the end of the 90s. And it then needs X libraries that are compiled against libc5.
Probably you could find an old Red Hat 5 or something and pull out those libraries to run on a recent kernel.
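If you go that route, the mechanics are roughly this (the package and binary names are assumptions, not verified against a real Red Hat 5 mirror):

    rpm2cpio libc-5.3.12-31.i386.rpm | cpio -idmv    # unpack without installing
    mkdir -p ~/wp8-libs
    cp lib/libc.so.5* lib/ld.so* ~/wp8-libs/
    # note: the binary's hardcoded ELF interpreter (/lib/ld-linux.so.1)
    # may also need to be present on the host
    LD_LIBRARY_PATH=~/wp8-libs ./xwp                 # point the WP8 binary at the old libs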
Docker has no concept of the binaries that actually run; it just sets up namespaces, cgroups and a userspace. Worst case scenario, you have to run the container as privileged - still no different from running on the host (at least with a properly constructed image, where the binaries wouldn't run as root).
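A sketch of what that looks like (the image name and paths are hypothetical; any image shipping an old libc5 userland would do):

    docker run --rm -it -v "$PWD/wp8:/opt/wp8" oldlinux/libc5 /opt/wp8/xwp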
The 'reveal codes' functionality is something that I always liked in WP, and that no other word processor seems to have implemented.
Obligatory screenshot here: https://mastodon.social/@stevelord/104824411022687268
I liked WordPerfect well enough from 4.2 through 5.1. Version 6 on Windows was not good. Even a slow typist (me) could get ahead of the cursor. But at some point my employer switched to Word and I haven't been back.
WP also had a potent macro capability and a good macro editor as well. I was practicing law at this time and had created 80-99 WP macros that made legal docs appear like magic.
There was no e-filing and court clerks and judges could be more than a little picky about the formatting of the paper documents they received. If a doc wasn’t formatted in the proper manner, some clerks would refuse to file it and return the doc to the lawyer’s office for a re-do.
Ashton and Bastian had been generous with stock grants to key people, particularly on the technical side. Most of the key people below the two founders had enough money that they didn’t have to put up with the sharp elbows and confrontational styles that predominated at Novell.
Combine a brain drain with a Microsoft push for Windows and built-in integration with the first passable version of Word, and you had a recipe for disaster. Some of the WP people who remained claimed that MS had misled them about the Windows vs DOS roadmap and been slow in providing the Windows info necessary for WP engineers to build a decent first release of WP for the new release of Windows, but I could never determine if this was what had really happened or just sour grapes because WP stumbled badly after the ownership change.
In the Corel days they finally decided to port to actual Win32 code and put aside the custom transpilation.
Word Perfect was available for UNIX as well. The SCO binaries might run with iBCS. I don't know if that would provide a significantly different experience than running the DOS version in dosemu though.
But from what I can see, he is NOT really using it...
It's a bit cheaper to buy from them than to pay for an antique on eBay.
Also, there's a wave of mechanical options from more recent companies like Das Keyboard or the various DIY kits with choose your Cherry switch adventures.
What Qedit (often renamed simply to Q.exe) was capable of doing in the (good) old times was incredible.
I prefer the tags as they describe my work. Visually there is no evidence that something is a 'section' or an 'aside' but I make sure my words do have document structure, so a 'section' will probably have a 'h3' at the start of it.
I wish there was a 'what you see is what you mean' editor that was visual but only allowed you to add HTML semantic elements. WYSIWYG editors tend to just add local styles and 'br' elements, resulting in HTML that is useless for my style of HTML.
I can do diagrams in various ways including SVG that I just write out manually. Or I can use the wonders of CSS Grid and a 'dl' list with first of type/last of type CSS selectors to style up a simple linear flow diagram.
I really wish there was a super neat word processor or HTML editor that only wrote lean HTML with semantic markup for document structure. But here I am in 2020 using my IDE or Vim to write documents and even craft diagrams.
I am impressed that I am not alone in spurning drag-and-drop wonder gadgets. As much as I would like to give WordPerfect a go, I feel the world has moved on from documents printed out as memos and put in pigeon holes. Stuff has to look good in a browser, and with stripped-down HTML that uses the semantic elements and a simple stylesheet that loads a font and puts in some CSS Grid layout, I feel I have the tools I need.
For typing / journaling the non-WYSIWYG was a feature. I honestly think WordPerfect 5.1 is a big reason why Sublime is so popular now.
On graphics, that's a solved problem. Everyone uses a framebuffer today; even on obsolete chipsets it's doable. Just map the VGA calls to the FB and everything should be fast enough, I think.
As a programmer building websites, I do the same thing, writing in vim. Learning how to make your editor work for you is a great investment.
It used to bother me that groff doesn't support Unicode (as far as I can tell) but then realized that all I write is English so why am I fetishizing Unicode? groff will get me any accented character that I would realistically need.
So this is my setup using MOM macros for nice typesetting.
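For reference, the whole pipeline for a setup like that is two commands (doc.mom is a placeholder name; -k runs the input through preconv, which is how newer groff ingests UTF-8 if it's ever needed):

    groff -k -mom -Tps doc.mom > doc.ps    # typeset with the MOM macros
    ps2pdf doc.ps doc.pdf                  # convert for everyone else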
But I fully agree on your last point, in the end I seldom need the Unicode universe for most of my documents.
No way, you're right. And of course it has a very detailed and well formatted man page. The one for troff even has tables.
This whole site is wonderful, btw
What about Unicode? Lack of Unicode seems like a dealbreaker for a lot of use cases. Even if you're not using Unicode, seems like you'd eventually want to work with Unicode text.
As an engineer, a lot of the "word processing" I do involves cutting and pasting text from other sources (citations, code samples, names, etc) that tend to be chock full of non-ASCII characters.
Or has this been solved somehow? I see that there are various hacks involved in getting support for the Euro symbol. What about a general-purpose solution? I know that WP5.1 had support for international character sets. Perhaps somebody's cooked up an emulation layer that translates Unicode into whatever it is that WP needs.
There is still a community (apparently mostly paralegals) who actively use it and answer questions, e.g. http://wpdos.org and https://www.wpuniverse.com/
Looks like they deleted the file ... now 404 not found...
It's also an odd little page. It includes directions to buy it, if you already have a copy. If you don't have a copy, it includes some details on how to use eBay.
Despite the age and 'underpoweredness' of it, it was still quicker, still more responsive, and still subjectively nicer to use than anything modern.
I run it inside a VM and it is still faster and better than LibreOffice by a long shot.
When I'm done writing, I paste the text into one of my blogs with IE6, which my sites still support.
I would still take the VM route.
Although I also have to admit that this approach didn't work for my favorite game, but now that I'm googling, it does seem to work! So someone got DirectX 9 or something working?
Compile ScummVM with:

    ./configure --enable-all-engines --prefix=/opt
    make
    sudo make install
There are also nightly builds (but they may not have been built with the "--enable-all-engines" configure switch).
So like SLRN and Jed? Amazing. I used DOSEmu back in the day for Redneck Rampage on an AMD Athlon (dear DOSBox users, DOSEmu was zillions of times faster than DOSBox; I ran it at native speeds). This is impressive.
"All that was ever really needed by an OS or Office apps was already there 15 years ago. Everything "invented" beyond that was just vanity changes with negligible long term added value, and constant moving around of UI to appear new and better. Change my mind."
I should try to see if I can get Wordstar running under dosemu2 (I probably have a 3.5" floppy somewhere with it on, but I don't think I have the disk drive to read it).
Can it be downloaded from Internet Archive?
$_lpt1 = "dosprint"
Which just pipes the printer port output into a bash script that runs ps2pdf, but I changed it to save to a file instead.
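For the curious, a minimal version of such a hook script might look like this (the name and output path are illustrative; dosemu feeds the raw printer stream to it on stdin):

    #!/bin/sh
    # dosprint: receive PostScript from the emulated LPT1, save as PDF
    ps2pdf - "$HOME/wp-print-$(date +%s).pdf"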
just wait til the exact same UX comes to your automobile.
and after that...
I have the Win3.1 version of WP from that era in a DOSBox. I've been meaning to compare its grammar check to modern word processors. It was way ahead of its time.
I know Corel bought it, but it looks like it spun off again. I wonder if the current release still has any of the old source code, or if it was just rewritten and keeps the name for the brand: https://www.wordperfect.com/en/
The Corel of 2020 seems a fascinating Enterprise Licensing zombie. They appear to make all their revenue from (very) legacy Enterprise Licenses and government contracts and all of the software seems stuck in a time warp stasis with only the bare minimum of maintenance (presumably just enough to keep the Enterprise Licenses alive and well). It's almost sad to see how many classic computing brands they own and let languish in this state.
* Collaboration (including real-time collaboration)
* Ease of the installation process (or rather, the lack of one)
* Always up to date
Abiword did that
>* Ease of the installation process (or rather, the lack of one)
You need an ISP. And an Internet connection.
>* Always up to date
It can be worse for your specs.