That's an early version of the system. I've seen pictures of a later version, which was an IBM 3270 display with a phone handset, but no keyboard. The idea was that the executive would pick up the phone and be connected to someone in a call center who would then do spreadsheet-type operations for them. Don't know if that was deployed much.
I'm trying to find a reference for this. I remember it from some ancient IBM ad.
The system in the article sounds fancier but more like a one-off demo. The later system was just a second remote display plus a voice line; more deployable.
The concept comes from NASA's Apollo Mission Control in the 1960s. These screens on the consoles were all just TV receivers. All the display data went onto a cable TV network. Any console could view any source. The network was remoted out, and displays outside the control room could look, too. Any display could be routed to the big screens, too.
The same technology was still in use in some USAF facilities well into the 1980s. (Long story. Short version: the 1970s upgrade project failed.)
That kind of switching remains a feature of military command and control centers. Some display may suddenly become important, and others need to look at it.
I work at a TV station and we use something like these [1]. Basically hundreds of video ins and outs that you can mix and match with the press of a button.
Many high energy physics/accelerator institutes have publicly accessible status dashboards online, if one wants to see this concept in action.
For example the one for CERN: https://op-webtools.web.cern.ch/vistar/
I think Asciinema live streaming is a fairly new feature. I used it this summer to share terminals from one of my jumphosts. For the security part I only did "anyone can watch," which was usable but felt a bit lacking when I tried to use it. There are lots of UX gotchas with screen recording of terminals; the worst one is that a good broadcast is always a small screen, but when I work I want lots of data.
Kubernetes is also awful about exposing secrets when you're sharing a live terminal to show ops work.
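To make the risk concrete (the value below is made up): Kubernetes Secrets are only base64-encoded, so anything that scrolls past in a shared or streamed terminal is effectively plaintext to whoever is watching:

    import base64

    # Kubernetes stores Secret values base64-encoded, not encrypted, so a
    # value that appears in a streamed terminal (e.g. via `kubectl get
    # secret -o yaml`) can be recovered by any viewer. Hypothetical value:
    leaked = "cGFzc3dvcmQxMjM="
    print(base64.b64decode(leaked).decode())  # -> password123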
In the early 1990s, I was working on GUI email software. Not much different than the email software at university, just the pixel resolution was higher with GUI.
The dev team went out "into the field" to help roll out the software to the company. This also allowed us to see how others used the software.
At the end of the day, one of the devs reported back that one personal assistant would maximize the email app's window (back when 17" CRT monitors were large), and after each email was processed, she'd print out the email and file it in the appropriate spot in a filing cabinet.
All the devs were, "But... But... she can just file the email in an email folder in the program. Why does she need hardcopy? Email was supposed to save trees!"
Right after graduating college my wife was looking for work and ended up taking a job as a secretary shared between two chairs at our local university. They thought it was super important that their secretary had a bachelor's degree for some reason.
One of the chairs would read emails on his iMac, then would handwrite a return message and give it to my wife who would type it into email and send it as him. He didn’t want to type anything. This was around 2008 to give you an idea of timing. My wife didn’t stay for long, but my understanding is he was doing this until he retired sometime in the 20 teens.
But I do remember, going back to the '90s, that there was at least one senior exec at a computer company I worked for who, as I understand it, basically didn't touch his terminal. His admin printed out and typed everything.
This attitude is still present among doctors, and is one reason why electronic medical records still suck, and why Obama's "Affordable Care Act" has made American healthcare simultaneously the most expensive in the world and among the worst in the world. Doctors consider their time too valuable to be spent on slow and fiddly data entry, so they offload it to additional staff.
They're not entirely wrong in this regard - modern EMR web UIs are arguably inferior in many ways to some light-pen-driven systems of the 1970s-80s. I'm thinking especially of the old TDS system, which nurses (and the few docs who used it) loved because it was so easy and quick; replacing or "upgrading" it was like pulling teeth, and the nurses fought hard to keep it in every case I ever saw.
Physician time is valuable. There is essentially a fixed supply and other bottlenecks in the healthcare system make adding more doctors a very slow process. That's why forward-thinking health systems employ medical scribes to offload data entry.
The TDS Health Care System had some unique advantages, but unfortunately it was tied to obsolete technology and ultimately a dead end. Web UIs aren't necessarily a problem; some of the most popular EHRs, such as Epic, use native thick-client applications. The fundamental issue is that healthcare is inherently more complex than almost any other business domain: every medical specialty needs a different workflow, and beyond the clinical stuff there are extensive documentation requirements imposed by payers and government regulators. Sometimes clinicians and administrators insist on certain functionality even when it makes no sense, out of ego or ignorance. EHRs can be improved, but I know from painful experience how expensive and time consuming it is to get everything right.
The younger docs seem more amenable but there still seems to be a ton of electronic paperwork for the benefit. That said, my "community hospital" got bought by one of the two big systems in my area and, from a patient standpoint, things like prescriptions and labs especially seem much more automated than in the past.
The amount of electronic paperwork seems to be much more than when it was all on paper.
When I was a kid my medical chart was paper. When I was around 13 years old the pediatrician’s office moved to an EMR.
It was more or less a digital version of the same chart.
As I have grown older, and with the benefit of having medical professionals in my family, I’ve seen how EMRs have changed from a distance. From an anecdotal perspective it seems like charting is more time consuming than it used to be. I’ve witnessed many different medical professionals using many different EMR platforms, and poor design seems to be a factor there.
They also deal with more information on a patient and in an aggregate form than paper charts ever did. From what I’ve observed I would venture a guess that more than a little of that is the result of neuroses and anal tendencies on the part of healthcare executives rather than quality improvement initiatives or research oriented objectives. There are other externalities like bad vendor implementation for CMMS requirements, or the continued granulation of conditions into ever more ICD codes, which then need crosswalk databases and interfaces and cross checks.
On the patient side, I’ve only ever truly been impressed by Epic’s portal. Every other one I’ve used is comparative garbage. I have recently been having a conversation with a manager at my doctor’s office trying to understand why and what changed so that chart data that used to be visible to me are now only visible to them, and why they can’t change that. It seems like the vendor implemented a forced change and I may just have to live with having ambiguously incomplete access to data I used to have access to, with no insight into what’s incomplete unless I already know.
With all of that said, at least there’s some access to one’s own health data. And comparing that to my birth records, which are functionally illegible (likely forever), at least what records are kept will be decipherable twenty years from now. Presuming they’re not mangled by a migration, which I’ve seen happen several times.
In the early 90s (maybe '92 or '93) my elementary school had a program where we'd go to the computer room and email kids in another school. There was nothing else to do on those computers that involved the internet (no web browsers), these were (relatively) state of the art 386s running DOS.
Anyway I remember we used to write our weekly emails on paper first and then type them into the computer- your quote reminded me of that!
I do work in public sector archiving, mainly retirement of software systems that have been replaced but hold information that needs to be stored for archival purposes.
The archiving software in this area is quite obnoxious and user unfriendly, so it happens every now and then that counties or government agencies decide to just print the lot of it on paper and put it in physical archives.
I once met with NEC, who wanted to hire some consultants to help them on their cloud journey. They wanted to become a cloud managed services and hosting provider, but had never done anything in 'cloud' before. This seemed odd to me, and as I dug deeper things got weirder.
They demanded that their 'engineers' be able to build out and manage both their own and their managed infra on AWS but never write any code - in fact they thought automation was outright dangerous. They said their engineers would never write any Terraform, CloudFormation or similar, and that they wanted to become an MSP of cloud services, preferring to write everything down in runbooks... and print those runbooks out.
The managers would turn up to meetings with huge stacks of paper that were just AWS documentation converted to pdf and printed.
We refused to work with them and essentially walked out. I'm sure this is something that someone like an Accenture or Deloitte would and probably did jump on.
In 2012 I was at a company that entered data into a custom program backed by SQL. The user would then take a screenshot of the main card after saving it. They would then print the screenshot, hole-punch it, hand-write names and reference numbers, and then file it in cabinets in the file room.
> Why does she need hardcopy? Email was supposed to save trees!
Old habits take a while to change. Managers and executives were used to reports and memos on paper. So when email arrived, it was very common for secretaries to print emails for their bosses to read. Even at one of my early jobs in the 1990s, changes deployed to production had to be documented in memo form, and a copy of the memo printed, along with diffs of the code changes, and filed in a filing cabinet.
We got there eventually. I'd say that for all but the oldest generation still working, printing any kind of document to hardcopy has become pretty rare, at least where I'm working.
Paper is a lot easier to read than a screen; even a modern 4K monitor is harder on the eyes than paper (I have not tried e-paper displays). Paper also provides a lot more resolution. Sometimes, when the code is tricky, the only sane option is to print out all 3 chains' worth of that class (you can turn that into sensible measurements via your favorite unit converter to get a sense of scale, but I think you will agree chains is the correct measure), spread it out on the floor with a pen, and start reading and cross-referencing things.
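For a sense of scale, a quick back-of-the-envelope conversion (a surveyor's chain is 66 feet; the 3-chain figure is the one from the joke above):

    # A surveyor's chain is 66 feet (20.1168 m), so 3 chains of printout is
    # roughly 60 m of paper spread across the floor.
    FEET_PER_CHAIN = 66
    METERS_PER_FOOT = 0.3048

    chains = 3
    print(chains * FEET_PER_CHAIN)                              # 198 feet
    print(round(chains * FEET_PER_CHAIN * METERS_PER_FOOT, 1))  # ~60.4 m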
I worked at a large retail store, massaging Excel files for the sales dept. One day I got to their floor, only to see their A3 printers working all day long. They turned all the Excel sheets into paper because the screens weren't large enough, then wrote down fixes with a pencil and later updated the spreadsheet on the computer. 2010. (Learned about cultural inertia and corporate "efficiency".)
In the early days at my company, we had an arrangement where the CEO had 3 monitors on his desk. Two faced him, and one faced the other direction and mirrored one of his screens. There was a second keyboard and mouse attached so that someone could sit on the other side of the desk and collaborate. He could keep private data like email on one monitor and other applications on the shared screens. Somewhat comically, these were enormous 21" CRTs, so by "desk" I actually mean "dining table".
I really enjoyed working this way and kind of wish the same experience could be replicated between multiple machines and with more than 2 people. I would like it if anyone could drag an application onto a shared screen where multiple people could control/interact with it, while still having the option to take a window from the shared display back to a private display, i.e. passing an application from one system to another.
A lot of time and money was wasted over the years on communications concepts. It wasn't until the mainstream collaborative editing tools and PC-based video conferencing (as accelerated by COVID) that, for now at least, everything sort of came together.
I find it interesting how Dunlop was trying to solve the same kinds of problems Engelbart was, with the added constraint of preserving the shifgrethor of the top IBM executives. The fact that late-20th-century businessmen viewed things like typing as subordinates' work has had a more profound effect on the adoption of computer technologies, their development, and their marketing than we in modern times could guess without having known.
I'm also reminded of the Ashton-Tate software package Framework, which is one of my favorites from the 1980s. It's what they used to call "integrated software", which was a package of several productivity applications: word processor, spreadsheet, maybe a communications program or database or graphing capability, bundled together and sold as a unit. Unlike, say, Microsoft Works or DeskMate, Framework featured powerful versions of these tools and the ability to create composite documents, as well as a programming language with Lisp-like semantics to automate workflows. Because of this, Ashton-Tate pitched Framework as an executive decision-making tool, which was quite a bit different from how competitor programs like Lotus 1-2-3 were marketed:
Near as I can tell, shifgrethor means something like personal dignity, prestige among peers, legitimacy, autonomy, and authority -- all at once. King Argaven considers Genly Ai's existence (and his offer of union with the Ekumen) a threat to his sovereignty as king, and through that to Karhide's sovereignty as a nation, and through that to his own worth as an individual. He can't separate these concepts because they are all one, they are all shifgrethor.
This insight helped me understand the mindset of the IBM executives, which I wouldn't have before; I'd have just dismissed it as wrongheaded pre-boomer silliness. The executives saw demeaning themselves with the scutwork of looking things up for themselves as an attack on their position, their dignity and worth as individuals, and the organization as a whole - perhaps even society as a whole. Those filthy hippies with their (sissy voice) "collaborative work environments" and their "interactive terminals". They're working for the Reds, I tell ya, trying to unravel the nation from the inside!
I owe LeGuin a profound debt for opening my mind to mentalities vastly different to my own, yet still essential to the history of the computing world I live in.
Have you read Stranger In A Strange Land? The alien word "grok" from that book has a similar way of being useful, and that one actually managed to make it into general speech somehow - at least by hacker types. In the book it's an alien word that literally means "to drink", though it really means something like "attain a real understanding of."
Yes, I read Stranger in a Strange Land, but I grokked "grok" before actually reading the book: the Jargon File has an entry for it and uses it liberally.
Lotus Symphony was a (later?) incarnation. Basically things evolved to more loosely coupled incarnations of office suites which ended up being pretty much from Microsoft (OK, LibreOffice) and then Microsoft and Google online. The end result was pretty much the same. If you weren't in a dominant office suite, you pretty much didn't exist except for specialized users.
Executives still consider being able to tell subordinates what to do in person more important than the work itself. See Back to Office vs Work from Home.
I visited the Computer History Museum this year during Vintage Computer Festival West, when not only can you tour the museum but the upstairs rooms are crammed full of hundreds of amazing personal collections of vintage computing hardware, all powered up and usable. It was a religious experience.
It will be interesting to see the durability of print vs digital content over time.
Many web properties are no longer accessible due to M&A activity and small/solo publishers unable or unwilling to maintain their assets. Archives like the Wayback Machine mitigate some of the loss of digital content, so long as the archives themselves are maintained.
Not sure how long microfiche lasts, but someone posted a link here not too long ago about how record companies had embraced magnetic hard drives in the 1990s to store music masters and are now finding that the drives are no longer readable.
It depends a lot on the humidity and heat or light in the environment where the microfiche are being stored. But they should be able to retain their data for 500 or so years.
CDs and LaserDiscs are also seeing bitrot. The layer of material that is etched does degrade over time. Error correction helps some, but if it's a writable CD or DVD it's only likely to last a decade or two. M-DISC media are optical discs designed to retain their data for about 1000 years and can be written by specific consumer drives. Not sure how long professionally pressed CDs last, but it's not that long.
Ah, thanks for catching the typo - it was getting late for me. I should have pulled up a link or something, because I haven't worked with these discs in a decade or so...
yeah those are the ones I'm referring to -- if you're archiving something like family history or data that needs to be good for centuries (without having to re-copy and juggle), those are a better choice than just about anything else.
Nothing comes to mind that you can interface with a computer, but when I wrote the phrase I was thinking of projects on the scale of Long Now [0], requiring physical etching on materials and very careful storage.
Alternatively, tell people that they can't store something and you're likely to find it robustly mirrored by many.
As photography was largely switching to digital, I sometimes wondered, whatever the preservation possibilities digital offered, to what degree photos would really be preserved in practice relative to prints and slides.
Most photo prints hold up terribly. Colors can start fading in as little as 10 years if the print was hanging on your wall that long. B&W can last longer, but will still fade. Of course there are different processes; if you use the best process, photos will last longer, but they are still not very stable.
Digital makes it cheap and easy to keep multiple copies in many locations. While any one medium may fail, you still have a copy - I have on this computer all the data from whatever computer I was using 15 years ago. (Most of it I have not looked at in 20 years and could safely delete, but it is still here, and on other backup systems I have.)
Kodachrome is an amazing archival color film that when stored properly will last centuries. b&w negative film is even more stable.
You make a good point about the lack of durability and instability of many types of chemical photo processes (especially color negative and print processing). I do think many digital formats will be lost to time when a color transparency or b&w negative will still be viewable without much aid into the future.
One of my favorite photo books is the re-photographic survey project by Mark Klett. He went around re-capturing the exact locations (and camera position) of notable images of the American West from the early days of the US geological survey when they had a plate photographer on the team. We are talking about a time period just after the US Civil War. So we see a landscape captured in time 10 decades or more after the original.
I've been a pro photographer for over 30 years. All my earliest digital work is archived in RAW so I have the original shooting data. It's all triple backed up, and I have a friend who allows me to store one of my backups at his home. I've been amazed at how many photographers lost track of or threw away their older work. I'm still licensing my work hundreds of times a year, and some of this older material is becoming even more valuable simply due to scarcity. The redundancy of digital is great if you take archiving seriously.
Yet I still have drawers of original film from the late '80s to early 2000s. I'm scanning a few but will probably let many be disposed of...
My point was there's the capability to do all this backup preservation but it doesn't just happen. And it's less visible in many cases than the proverbial shoebox full of photos will be.
What is the difference between photos on a crashed hard drive and photos in a shoebox that just burned in a house fire? Photos are vulnerable to many different attacks, just like digital data.
These days your photos are probably backed up by Facebook, Google, or other such major players. (There are a lot of privacy concerns with the above, but they do tend to have good backups.)
There is a lot of serendipitous backing up with social media. There was also a lot of serendipitous passing on to relatives of physical media. Not sure which better stands the test of time. (And I'm sure it varies.)
Often passing on to relatives is done with the only copy (well you retain the negative). School pictures come in packages of many, but otherwise you typically only print one copy.
Huge "Control" vibes from this article. If you like the aesthetics, action gaming, and the paranormal...yet for some reason have not played this game yet, definitely give it a try.
I didn't know that song before playing the game (honestly before reading this), but it's exactly the game's aesthetic indeed. Thanks for mentioning it!
Oh, there's a LOT more where that came from. One of my favourite bands. I saw them play live last year and Porcupine Tree have so much great material (and yes, they played Fear of a Blank Planet).
‘If you like action gaming’ would actually be a contraindication here. Control’s gameplay is utterly unremarkable, and I suspect you’d actually recommend it to someone who’s never played a 3rd person shooter before, so the pedestrian shooting isn’t so noticeable.
Hmm... consider instead the possibility that it is actually remarkable but just isn't your cup of tea, as reviews for the combat system have been overwhelmingly positive.
I will concede that the aiming specifically isn't S-tier, but then again this is an "action-adventure" game, not a shooter, and everything else in the combat system more than makes up for that one less-than-perfect feature. Not to mention the fact that the game is much more than just the combat system. "Action" was just one of the characteristics I listed. The aesthetics and paranormal lore are reason enough to play it regardless of any combat.
It's incredibly satisfying to destroy the environment, throw objects and enemies around, levitate, dash in mid air... just thinking about it has me wanting to replay the whole thing even if I already know the mystery.
I think you may be thinking of a different game? There's nothing that really plays like Control. You can even nerf the shooting difficulty entirely if you want; I don't personally like the combat particularly, but it isn't the primary thing happening in the game.
Very interesting contrast between IBM's hierarchical approach and Engelbart's Mother of All Demos. The IBM vision isn't really even computing-based; that's obvious from the video, which shows an on-demand point-to-point analog video link between a senior executive's office and a central reference library. The video is only from the library to the executive, but the audio is bi-directional, allowing the researcher to receive requests, assemble materials (which could include documents placed on a video camera stand, transparencies, microfilm, or the display output of a video terminal), and then display them on the video feed using a video source switch box. It's really more a demo of a dedicated corporate video calling system.
> Dunlop’s 1968 video demonstration of the Executive Terminal and the Information Center proceeds in three acts.
The article doesn't make this clear, but the linked videos are not a video demonstration; they're unedited B-roll shots without audio, probably captured as cut-aways to be edited into a narrated video demonstration. Unfortunately, that video demonstration isn't part of this collection (or was never created).
Uppercase characters are represented using a bar/macron over the top - I was a bit slow to work that out and I don't remember seeing that convention before.
Edit: pulvinar said "It's clearly a vector display". You can see a graph drawn with vector lines at 24:13, zooming at 20:50, and there are graphic lines mixed with text at 28:36.
IIRC, it was a vector display in front of a raster camera. The same arrangement was used throughout the Gemini and Apollo mission control and up to the early shuttle program - images would be rendered in the RTCC (real-time computer complex(?)) and piped to the slow-scan CRTs in the panels. At the panel the operator could select which video stream they wanted to see. One of the streams was a "channel guide".
I love that people properly document important stuff like that. My grandma died last year aged 94 or so, and her inheritance included a load of stuff that she wrote tiny notes on. I've got a plastic ibex head with a barometer on my wall now, from the '60s or whatever, with a tiny handwritten note taped to the back saying when and where it was bought. I mean, it's worthless in both collectability and sentimental value, but the little note gives it a bit more personality.
I should do the same with anything I think is collectible / not trash / may end up in someone else's hands. For example, I've bought some LPs over time; I should at least document when and where I bought them. Maybe print out some information about the band / artist and include it, as the music itself is only part of the "product".
I'm not disagreeing, but I'm not necessarily convinced there's real value "for future generations". I love nostalgia, but it seems pretty useless beyond the entertainment value.
What benefits do you, or others, see in looking back at these computer systems?
This is no different to me than other historical artifacts. Old furniture, cars, clothes, books and so on tell a lot about the time they were created, and the people that lived during those times. It is not just about nostalgia. It is about knowing about the past. History and archeology are scientific disciplines where this is crucial.
Agreed on the first sentence... I like history too (now I'm middle-aged). I see some benefit, but mostly that seems to be entertainment too. One perhaps can't separate the useful bits from the other bits.
Like, those who don't study history are doomed to repeat it. But, those who do are mostly doomed to watch from the sidelines as other people repeat it. And even the things that are possibly obviously bad ideas without historical analogues get done...
I think history is worth mining for future productivity-software ideas - especially when we finish mining everything LLMs and RAG can do, we might go back to past experiments in information retrieval. We might know the history now that we're reading this thread... but who's to say a developer in 2030 who's never read HN does?
Interesting video. It seems like they imagined some sort of pair programming but with the boss sitting behind you.
I wonder if it failed in practice because no boss had the patience to watch a programmer slowly writing out a program. The video reminds me more of sci-fi computer interaction than actual programming. The boss's voice sounds like the robot cops when they beat the protagonist in THX 1138.
> Once the results were assembled, the information specialist conveyed all this information to the executive, cutting from one video feed to another, guided by the executive’s interest and direction.
One very small correction: QUIKTRAN wasn't a “mathematical utility”, but an early timesharing system, I think running on a 7044 (coincidentally, my first mainframe). It offered an interactive Fortran system, with editing and debugging facilities. IBM's later CALL/360 system was a successor to this, adding PL/I and Basic.
Interesting UX fact: IBM researchers looked at user satisfaction on this system. They found that it wasn't poor response time that bothered people, but variability of response times; if users couldn't predict how long an operation would take, that bothered them. So they inserted delays so that average response times were perhaps longer, but variance was lower. And users were happier.
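The mechanism is simple enough to sketch; a minimal illustration of the idea (not IBM's actual implementation, and the 2-second floor is made up):

    import time

    TARGET_SECONDS = 2.0  # hypothetical floor; pick something near the slow tail

    def run_with_padded_latency(operation):
        """Run `operation` but never return faster than TARGET_SECONDS,
        trading a slightly worse average for much lower variance."""
        start = time.monotonic()
        result = operation()
        elapsed = time.monotonic() - start
        if elapsed < TARGET_SECONDS:
            time.sleep(TARGET_SECONDS - elapsed)  # deliberate padding delay
        return result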
> Dunlop saw the opportunity to run another experiment in 1967-68, which he called the “Executive Terminal.”
Accessing Dunlop's archives on the Xerox Star that would not have been a stand-alone system ended up requiring a Memorex machine that was accessed through multiple time-sharing CRT models. Piecing together the original audio in archival footage moved restoring the tape in an information management system to Engelbart's accelerated NLS database.
I think the biggest "shock" is how quickly these things got normalised, but that's partly down to how we used to see this stuff. Back in the '90s I first saw stuff on TV about video calls and computers and the like (though it turns out that was decades after that kind of thing was first presented, and probably a hundred years after it was used in sci-fi), and the way it was presented was all marketing fashion: very intentionally sitting at a desk, dialing a number for a very formal conversation.
"real" video calling sort of snuck in through the back doors once people got webcams and MSN / Skype, and became mainstream / common in the 2000's with always-on internet, remote work, etc. And at one point the smartphone and mobile internet got in people's hands and (video) calls became casual.
I think the other part there is that it's normal people using them. What I mean by that is that in these videos, it's all very formal corporate people. And then the first people that really get interested in this kind of technology or who have an interest in futuristic stuff are / were the "nerdy" types. (I am probably living in a bubble though). But it was the average joe that normalised this technology.
This was what engineers were barely capable of with the technology that did exist, but even most executives - the type of user it was envisioned for - never knew anything about it, much less ever had anything like this much desktop technology.
For everybody else, in the non-executive category, it was even more of a complete fantasy.
IOW the difference between what you see there vs now is minor, compared to the real "backward" state back then.
Even though things like transistor radios were already common, you have to realize that in a huge percentage of dwellings in the US, and way more in the rest of the world, there was still not yet a single transistorized product.
I was a young math & electronics geek and was aware of more stuff like this than average.
Along with all the much more mature people, like the extremely rare engineering students who might want to work for IBM or something, this was exactly the kind of thing that was inspiring the movie "2001: A Space Odyssey" which came out the next year.
Anyone who had any clue something like this was already possible, could basically agree how cool it would be and was really looking forward to the 21st century when it would be here.
If the world was not destroyed by nuclear war before the 21st century got here :\
Most of what we do with computers (maybe with the exception of the current AI and ML stuff) was invented or prototyped in the 1970s or earlier. It's just gotten faster, a bit more polished, and a lot more affordable.
It's clearly a vector display, and my guess is that the beam is being turned off a little too early at the end of each character's final stroke, leaving it lopsided.
The bar over a letter must mean that it's true upper-case. Cheesy, but it's what we did when characters were expensive.