No React, no flexbox, no REST API, no microservices, no Docker, no Kubernetes... but the reason it didn't need any of those things is because it had a terrible UI, didn't do all that much, and only needed to support trivial numbers of users whose modems were probably the main bottleneck anyway.
Trying to make modern users happy with '90s era tech would be impossible and deeply painful.
But it was nice, just for a while, to have a world where people were thrilled that even a super-basic web application was a thing that existed.
In my experience, users can be delighted in 2019 by a well-designed, lightweight UI built with just CSS and vanilla JS, instead of an overweight, buggy SPA (usually built by engineers more interested in the code itself than in solving the business problem in the leanest way). Source: I made that my business and I couldn't be happier.
You're exceeding their expectations by providing less than they expected. That's easy to do when the user expects to be accosted by a slow-loading page that renders janky because of deferred stylesheets, four different Google fonts, and 1.6 MB worth of ads.
The thing that consistently kills site performance is ads and trackers. Of the sites I work on, GTM, Optimizely, FullStory, and other similar crap routinely are responsible for >75% of the JS weight and >90% of the HTTP requests. It doesn’t matter how much I prune our npm dependencies and tweak React to make sure it’s rendering fast—marketing is going to make the page suck and I can’t do anything about it besides generate page speed reports that the client will ignore.
Brutalism for President 2020.
> No React
True, but when I remember how we used a 1x1px Java applet to reload data without refreshing the page, I would not call that simpler... And don't even get me started on Flash.
> no REST API
True, but when I remember fumbling about with SOAP I would hardly call that simpler...
> no microservices
True, but what we had instead -- monolithic PHP "applications" with insane hacks to integrate with other monolithic services, Microsoft's and Adobe's attempts to abstract the backend/frontend separation away, and so on -- I would not call that simpler...
> no Docker, no Kubernetes
True, but when I remember the deployment mechanisms (upload via FTP), the problems with scaling once one server was no longer enough (usual approach: set everything up on a separate machine and have downtime until it was back up) and so on, I would certainly not call that simpler...
There are always trade-offs, and you continuously have to learn new technologies and unlearn others...
I just wanted information density and scannability, not hamburger buttons and pages and pages of responsive cards.
Similarly, I'm no fan of Reddit's redesign or the huge lag when loading IH and yet more and more content sites are going the same direction.
XP was a sane and useful thing before the sc(r)?umbags came along and shat in the Agile pool.
You could ctrl-u any web page, understand it, and do all the cool things you actually cared about yourself.
Nobody played Matryoshka dolls with VMs and containers; they did real work that mattered instead.
Scheme/Lispy lambdas just worked decades before these half-arsed modern languages came up with dozens of borked, half-baked versions of them.
GUIs were an almighty colossal pain in the proverbial to write, so almost nobody did that, and as a result people were about 20 times more productive than they are today.
Threading implementations were borked, so nobody used threads. They used processes, and everything was much better. If you have any sense you will _still_ be using processes and not threads, and your life will still be much better.
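To make the processes-over-threads point concrete, here is a minimal sketch (the function and workload are mine, not the poster's) of farming work out to separate processes, each with its own address space, so there is no shared mutable state to corrupt:

```python
# Illustrative only: process-based parallelism with the stdlib.
# Each worker runs in its own process; nothing is shared, so there
# are no locks to get wrong.
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(square, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

In CPython this also happens to sidestep the GIL, which is one modern reason the old advice still holds.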
Ah, and oh yes, pen plotters were fun to watch. Laser printers are absolutely wonderful... but it's like the difference between a wood fire and a panel heater... Which would you rather sit beside, whisky in hand, and just watch?
I might be an old curmudgeon, but I do have a very long list of things that are much better than they were in the Bad Old Days, if anyone cares.
Now GET OFF MY LAWN!
Laser printers are absolutely wonderful.
I miss programming in assembly. Slapping together frameworks with some sample code from Stack Overflow just doesn't provide the same mental satisfaction.
I miss interviewing in the 90s. No whiteboard algorithm puzzles or such nonsense. Just a conversation to see if your interests matched the role, and if they did, you were hired.
(I guess I don't miss the low salaries, which were on par with those of office workers in non-tech fields -- nothing like today. But the good side was that everyone was in tech for the sheer love of it, not to make a buck.)
But what I truly miss (nostalgia aside) is when Silicon Valley used to be about technology. The big names in Silicon Valley were all actual technology companies: Sun, SGI, HP (the good original HP, not the ink maker). The product was the technology and the culture reflected that. Engineers were in charge and the goal was to make better technology.
Today the product is advertising and the goal is just to drive more eyeballs. Engineers have become commoditized and micromanaged (agile) by PMs who are driven by advertising goals. (PMs as we know them today did not exist.) Except for Apple, none of the big names today (FAANG) are tech companies anymore.
Does anyone know any tech companies or industries that aren't like this in the US?
I've been considering saying goodbye to the American market and going to Europe for a few years and see how things are there first hand. The one thing that would make me stay is the discovery of an industry or large company that doesn't micromanage and commoditize programmers.
We were much more positive in the olden days. Back in the 90s you could post an idea in a developer-oriented Usenet group or discussion forum and get pretty decent feedback. If you do that today, there's a good chance you'll either get nothing back or be flamed for using an "anti-pattern". People are far too quick to dismiss things now.
It's probably a function of how everything gets marked with a score now and every post is social proof, but it's quite annoying regardless.
There was a genteel, supportive quality of having a participation base comprised mostly of sincere, interested readers and posters. There were trolls even then, for sure, but they were few and were generally embarrassed into an inert mumbling.
I still remember fondly the debates in comp.lang.c about the first ANSI C standard proposals. Even Henry Spencer's rants against NOALIAS were entertaining.
And where user participation got unproductively unruly, moderated groups served well, like rec.humor.funny.
There was also an open, transparent process for forking groups, generally along natural fracture lines, and for creating new subgroups.
What ultimately ruined Usenet (IMHO) was a falling signal-to-noise ratio, and not just because of spam. The opening of the AOL gateway and the resulting Eternal September proved to be mortal wounds.
Things have been dumbed down considerably.
> any question can be answered within seconds
In the Usenet days, you'd find a FAQ with answers curated by the combined reviews of dozens of participants over time.
Learnt a lot of interesting stuff.
Most of their code was utter shit to read and maintain... but then so is that of most recent comp sci graduates...
They must have been well-written; I didn't really have Internet access, so there was nowhere else to go if I got stuck. Though I was probably a much more determined learner back then.
Someone here mentioned StackOverflow, but really, a handful of manuals, maybe a good book or two, and some persistence, /usually/ gave you all you needed to figure out a problem, and do something magical.
Now, those resources (if they're available) will likely only help you understand a very small subsystem.
Also things were not as "professional". When I started out my boss told me to figure something out and I would report back weeks or months later. I had the time to make mistakes and learn from them. Today the young devs are often very micromanaged and have no freedom.
On the other hand we can do a lot of cool stuff today and it's amazing how much info is out there. But I think the 90s were more creative. Stuff like StackOverflow is great. Documentation was MUCH better in the 90s.
But I lived through that era, and being forced to write efficient code was a god-damn nightmare. There were so many ideas flying around that you simply couldn't pursue because the computing power just didn't exist.
Today people revere things like "vi" but when you were forced to use such basic utilities because your human/machine interface was a 300 baud modem, or even a paper teletype, life wasn't so good.
Nostalgia isn't what it used to be.
* Is trivial to bootstrap
* Is totally serverless/scalable
* Can be run effectively for free with low levels of traffic, like an S3 static website
* Uses a non-broken PHP-like templating language (maybe jinja?)
* Uses Postgres instead of MySQL (ideally without infrastructure to manage, a la Aurora Serverless)
* Somehow intelligently figures out how to split between server-side and client-side rendering - i.e., you write templates like PHP and it seamlessly figures out which bits can be AJAX-ified to avoid full page loads
If someone could just, like, make that, web development could be as fun as it was in the days of yore!
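The commenter floats jinja for the templating piece; as a stdlib-only stand-in, Python's `string.Template` shows the same "PHP-like" idea of substituting values into markup server-side (template text and variable names here are made up for illustration; Jinja adds the loops and conditionals this lacks):

```python
# Sketch of server-side "PHP-like" templating with only the stdlib.
# $title and $count are placeholder variables invented for this example.
from string import Template

page = Template("<h1>$title</h1><p>Signed by $count visitors.</p>")
html = page.substitute(title="Guestbook", count=3)
print(html)  # <h1>Guestbook</h1><p>Signed by 3 visitors.</p>
```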
Nowadays, such esoterica is relegated to the tiny niches of systems and embedded development.
I am really enjoying working inside of Turbo Macro Pro. It is an interesting challenge to make the assembly code efficient and easy to understand.
I was pro-closed source in Turbo Pascal days.... once the licences turned to complete shit I went pure opensource.
Also, being able to trivially modify almost anything I used (because it was largely open source, and simple).
Lots of things were worse, but less abstraction was pretty nice.
I miss "the front panel" where I could single-step my program and read the octal / hex off the front panel lights.
I miss debugging my program with a plastic block and a wire to hand-punch patches into the binary paper tape.
I miss the "user manual" that had circuit diagrams as part of the documentation.
I miss doing "machine vision" on boxes of punched-card images.
I miss sitting "at the console" of an auditorium-sized "machine room" with a sea of DASD, watching the PSW flicker as the program counter changed.
I miss analog computers.
I miss coordinating 24 IBM Selectric consoles all "typing out" classical music (each one tapped out an orchestra instrument). Beethoven's fifth on selectrics....
I miss playing music by holding my radio next to the mainframe while my program was running, adjusting the program so it played a song.
I miss hand-designing a 16x16 multiply chip in MOS.
I miss programming plated-wire memory in binary switches to drive a Unimate robot.
I miss "scoring a complete copy" of the listing of Lisp 1.5 and reading the source code.
I miss running "the Hadoop algorithm" on a room full of punched card equipment (Google didn't invent it).
I miss "real programmers" who could solder. And could replace a failing memory address chip on your core memory board.
But now I'm proving a computer algebra system correct. So it's all still good.
That sounds like an interesting project! Can you expand on it? When was this? How many TTL chips did it take? What other components did you use? What did you do with the end result?
There's even a kit for you to build one.
There's an entire series on YT of him building it, very informative!
I don't get the same sense that stringing lines of code together can change the world, largely because, where it could, it mostly already has (I am not a believer in the latest wave of AI hype).
What we did used to be magic and was met with gasps and now, it's mostly just expected and complained about if it breaks :/
Whippersnappers these days are so gosh darn polite.
Writing C code on a SPARCstation and Green Hills compiler/debugger as an undergrad in 1989, I got used to step-forward, and step-back, in a GUI environment.
I had no idea that I'd never see that magic "step-back" button again, in C, C++, MFC, Java, Scala, or a litany of scripting languages and development environments.
Other people have touched on it, but you could be proficient in literally all the programming languages back then. There were fewer than 20. The later explosion of languages and techniques in common use can be overwhelming, but it has stretched the spectrum available to our craft.
As an aside, reading and understanding the entire disassembled "kernal" and BASIC rom for the C-64. All of it taking up less space than just about anything on just about any site nowadays - in that respect we took a wrong turn somewhere IMnsHO. Yes, storage is cheap and bandwidth abundant but the same is true for clean water yet nobody drinks 50 litres of the stuff per day 'just because it is cheap and abundant'...
I mean, obviously, why do you need more than 300? You can't read text faster than 300 baud.
Now I'm not so sure... sometimes I think the rate of interesting new stuff to _read_ is slowing down again and I may yet catch up.
Also, I miss the joy of learning all the deep features and idiosyncrasies of a computer - its instruction set, I/O routines, timing and interrupt tricks, etc. These days I am focused more on Big Hairy Goals and complex systems. Fun of a different sort.
In comparison, modern software development is almost all about composing or stitching together a bunch of different services. There are other complexities around orchestration and breadth of the systems involved, but very rarely one goes deep into a particular technical problem.
After a really long time, I'm getting a kick out of working on a highly technical problem: an open source search engine from scratch (https://github.com/typesense/typesense). It has been deeply rewarding and definitely something that's missing in a typical web oriented software development career today.
My experience is that it was much better for concentration, creativity and productivity...
I can still do it better today, but the stack is so deep now, I just don’t have the time.
Would I go back if I could? Nah.
Hacking is a blast, but never forget it is to achieve a greater purpose. A greater purpose for our software's users, for ourselves, and for our subroutines.
Old hackers never die— we merely gosub without return.
The dump reading was done with a cardboard pamphlet that listed the machine instructions, arguments, address offsets.
You'd find the hex abend on the greenbar, mark it up with a pencil, and use the machine instructions to figure out what went wrong (and the record that caused it).
In my business (banking) this was done sometime around 2 a.m. after you got called back to work by the night operators.
Ah, the good old days.
I quickly hardened the shit out of our interfaces so that we (a) wouldn't break and (b) had automatic workarounds for when feeding systems blew up. Oh, how my project leader loved me.
enscript --landscape --line-numbers --media=A4 --font=Courier7 --highlight-bars --verbose --highlight=c myFileName.c
For example: On an IBM PC, running DOS. You could know all there was to know about the BIOS, hardware ISRs, DOS ISRs, DMA, how a file system works, and maybe a bit about the graphics system, and you could make the machine do anything you wanted.
Today, if my kid wanted to learn all those things, I wouldn't even know where to tell him to start. You simply can't get that low (easily) anymore, or know that much about numerous subsystems. And the higher level stuff, like you say, is buried in layers of abstraction.
Of course, now we have an unprecedented access to cheap, powerful microcontrollers and SBCs, which make up for it to some degree - but it's still harder to translate that knowledge into something of perceived usefulness on a 'proper' computer.
"We build our computers the way we build our cities -- over time, without a plan, on top of ruins."
You know, it’s pretty interesting that IBM was painted as the devil, but that they made this standard, interoperable computer system that you could touch at such a low level. Now it’s all Windows 10 and MacBooks and Linux userland, and even if it were possible, good luck finding documentation for any of that.
—Sent from my iPad.
I remember all I used to need was a C compiler and a task and basically alone I would be able to complete it with the standard stuff that came with the compiler.
During the era of shipped software (long cycle times) a hardware / software co-design model worked well to work out the customer requirements, dependencies, documentation, training materials, marketing, sales sheets, software, installation tools, and all manner of support modifications. Some of that cross-team communications takes more time and coordination than may be available in a two week sprint.
There were also the constraints of the release schedule, burnt into everyone's brain. So many things depended on your shipping date that you dared not miss it. Sure, agile/scrum/whatever lets you ship something a bit later, but there was nothing like the pressure cooker of everyone working as a team to make their delivery date. Requirements, in my rosy recollection, became very clear -- things were in or out, not maybe.
Moving from large shipped software to web applications in 1999 offered any number of advantages, but it was quite difficult to keep the non-coders up to date with features and capabilities. Understanding doesn't always move at the same speed for all people.
What I'm going to say next may not be everyone's experience, but the two week sprint has started to seem like a convenient way to set and miss objectives; i.e., we can put it into the next sprint. Sure, I understand that planning is a black art in some cases, but always missing your commitments and making the same statement indicates, to me, that there's a problem with the absorption of the agile development model.
Product development and execution is a very hard process for all involved. Brooks had it right: there are no silver bullets, and the web seems to propagate the belief that such things actually do exist.
I miss the TOPS-20 COMND JSYS sometimes.
I don't miss VAX/VMS.
SunOS was pretty darned scrappy at the beginning.
Usenet prior to the 1994 Apocalypse On Line (AOL).
I miss Fujitsu mechanical keyboards (so much noisy awesomeness).
I miss having an office with two or three people, we had so much fun and worked so darned hard together. We remain terrific friends 30+ years later.
Since this is phrased as "old guys" I'll also add that I miss working with more women. Remember when I mentioned the early days of SMP? In those days we were still working closely with the people who were figuring out how to physically connect the pieces, keep caches coherent, what kinds of lock instructions (let alone higher-level patterns) would be useful, etc. Hard core stuff, and maybe a third of the people I was working with were women. That's still not a majority, but it's down to less than half that in most teams I've been on for the past decade. Things have gotten worse, and that really makes me sad. In so many ways, I feel this industry has been going backward.
Yeah, there are some excellent, well-documented OSS libraries out there, but they are the minority, and there's no one who feels obligated to help you if you need a small improvement or a moderate bug fixed, because no one took money from you.
IMHO, the trade-off (free, but self-serve support) isn't worth it for professional development.
The transition from monochrome monitors to color.
The explosion of new peripherals that did new things, not just the same thing slightly better.
The first (consumer) hard disks.
For that matter, a computer that a person could own, rather than a corporation, was an absolutely amazing shift. If you had an idea, you could try to make it a reality - not a reality for a company, but a reality that might become your company. You could do that with just a bit of money, in your spare time (it would eat all your spare time, though, and some more besides.)
I miss the 68000 series of chips. They were great to work on. (Others have said that the 32000 were even better, but I never worked on them.)
I miss SGI workstations. For their day, they were awesome.
I worked on both. The NS32K had a more complete instruction set, but I wouldn't say that made it more fun. Then again, maybe I'm a bit weird. The chip I most enjoyed working on was the MC88K, which everyone else hated because of its exposed pipeline. Part of me would like to tinker around with a 6809 or Z80 some day, not because I'd expect anything useful to come out of it but just because fitting within those constraints is a fun kind of puzzle for me.
A real, end-to-end development environment, with the ability to do the database, logic, forms and reports just one second after install. No package managers required!
My current dream is to build a spiritual successor (http://tablam.org), and in doing so I've started to appreciate how hard and how awesome it was. Also, the current me can't shake off the new ways (so it has already diverged from Fox in a lot of ways), and I must rely on an imprecise memory of the past. But still, I think a dBase-like environment is surely missed...
(I still do it, but now people think I'm crazy.)
Turned to shit pretty rapidly, but for a day or three I was wondrously happy.
> massive code listings or memory dumps anymore?
It took a fair number of pages to get up to speed and a largish chunk of a box of paper just to stop.
I miss the simplicity of a 320x200x256 screen. Want to access the pixel at (0, 0)? $A000:0000.
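For anyone who never poked mode 13h: real-mode segment A000h maps to linear address 0xA0000, and because the 320x200x256 framebuffer is one byte per pixel, pixel (x, y) lives at offset y*320 + x from there. A small sketch of just the arithmetic (the real thing was a byte write into video memory on a DOS box):

```python
# Models mode 13h framebuffer addressing; nothing is actually drawn.
WIDTH, HEIGHT = 320, 200
SEGMENT_BASE = 0xA000 * 16  # real-mode segment A000h -> linear 0xA0000

def pixel_address(x, y):
    """Linear address of the byte holding pixel (x, y)."""
    assert 0 <= x < WIDTH and 0 <= y < HEIGHT
    return SEGMENT_BASE + y * WIDTH + x

print(hex(pixel_address(0, 0)))      # 0xa0000  (that is, A000:0000)
print(hex(pixel_address(319, 199)))  # 0xaf9ff  (last pixel, 63,999 bytes in)
```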
I miss FIDONet.
I miss the big GOPHER network.
I miss the variety of monthly ANSi packs.
I miss ACiD.
I miss the wonder and amazement of the demo scene.
I miss the community feel.
Hop on a Telnet/SSH BBS and check it out!
Things were much simpler back then. But simple didn't mean easy. It meant you had to do every little thing manually, from scratch.
Just kidding, of course. It's a terribly bad practice. Don't do it.
However, there was something about the simplicity that came with it (in case everything went smoothly, that is) as well as the ability to make minor changes without having to go through a deployment pipeline.
And then walk down the corridor to the terminal room to turn thoughts into programs and/or ask the other guys or gals.
Strangely enough there were more ladies around in those days.
Actually had an office with a door you could close and a window you could open to actually think in.
Stuff that was a major challenge back then, like rendering a polygon or keeping the state of an HTML form, is trivial to do today. I wish I would be paid to write early 90s programs with today's tools :)
But mostly, I miss the fact that 20+ years ago, programming was still considered a professional, white collar role that engendered respect from the entire organisation. Nowadays, I feel that most coders are lumped together with "support people" and are considered nothing more than replaceable factory line blue collar workers.
IOW language advocacy and flavor of the month were rampant in the late 80s onwards from my POV.
These days at least there is enough community for all of the major languages and even the minor ones.
Also I find programmers get a lot more respect now than they did in the 80s and 90s. Programmers can make $300k+ USD at a top Bay Area company today, and starting salaries are well over $100k. Often they talk directly to users - no handlers or analysts in between to shield people from the “strange programmers”. Attitudes have shifted. Though given your experience, I guess not uniformly.
Pushing off compute to the browser because it's cheaper, never mind the cost to non-powerhouse phones.
Avoiding putting any thought into optimization because compute and memory are cheap. Never mind that you'll have to do a complete re-write when a few dozen 600+ ms microservice responses result in 20-second page loads.
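That 20-second figure is just serial latency adding up. A back-of-the-envelope check, using the comment's own rough numbers (~33 calls at 600 ms each, assumed fully sequential):

```python
# Toy latency arithmetic, nothing more: sequential calls add,
# fully parallel fan-out costs roughly one round trip.
calls = 33
latency_s = 0.6

sequential = calls * latency_s
print(f"sequential: {sequential:.1f} s")  # sequential: 19.8 s

parallel = latency_s
print(f"parallel:   {parallel:.1f} s")    # parallel:   0.6 s
```

Which is why un-thought-through service fan-out turns into rewrites: the fix is structural (batching or parallelism), not a tweak.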
Store all the things! You might need them someday, after all. GDPR what?
I work with developers who think that DB migrations are “of the devil”. As a result, they have pivoted to use the RDBMS as a rudimentary k/v store and create the relational data structures in memory. All they need to do is pull a few dozen GB of rows from the DB with every container restart.
So, yeah. I miss having fewer abstractions; having more constraints. The software seems somehow nicer through the CRT-colored lenses.
Inventing the wheel, rather than re-inventing it. There was a lot more "doing something for the first time" and less "doing something for the Nth time, slightly differently".
(Or so it felt to me. But there was still a bunch of "doing what the mainframe people did a decade ago, but doing it badly". Still, it felt different.)
Edit: Well, to be precise, not XSLT per se -- but I'm missing a similar generic transformation DSL for structured data, ideally able to deal with any form of structured data like JSON, XML, CSV etc...
The prevalence of applications over webpages. Don't get me wrong the web has massive benefits in terms of standardization, distribution, and security but I miss the power and flexibility of traditional applications. And yes, I know they still exist but I'd say they're not the primary delivery mechanism of most user-facing software today.
It was all very inefficient but what a way to learn!
It was probably like the 1930s and 1940s with radio: people really knew how it worked at the nitty-gritty level and could friggin' make an impromptu radio out of debris from a crashed army jeep (probably on YouTube now).
That’s civilization: layer upon layer upon layer of knowledge and advancements until we all roast the planet from our greed, driven by the 4% (the sociopaths).
Before about 1996 (I'm estimating) -- before network connectivity was the norm -- I understood how every processor in every system I dealt with worked. Two things happened in the mid-to-late 90s.
First -- an explosion of silicon. The days of understanding how the system worked, in its entirety, were over. Before that, I had a copy of the "80386 Systems Designer's Guide", plus -- I can't remember the exact title -- Michael Abrash's guide to VGA video cards, and PC Interrupts, and those were all you needed to master computer architecture. If you were super fancy you understood the Pentium math extensions (whose names I cannot recall) that let you do a crossbar in a few cycles, and you understood how common chips like the 16550 worked, which is in an addendum to PC Interrupts if I recall.
Second, and this is the key thing, technology started working AGAINST us. As complexity exploded, so did network connectivity. We had this era in which Operating Systems complexity exploded (win95), silicon complexity exploded, and connectivity exploded. The thing about connectivity is this-- that's when our computers went from isolated things to these things that are always online-- they started working against us. They could now talk to other computers, other people, and that changed computing fundamentally.
Computers went from these things that were our helpmates to our masters. This is what I lament the most. I don't miss bit-banging or assembly programming (well, a little). I fudging love Ruby/Python, as opposed to C++ being considered "high level." I love that I can buy a fantastic computer for $35 (RPi). RPis are so cheap I employ half a dozen just to run my 3D printers (yes, I have a problem). But I so much do not miss being scared of my computer, being scared of the network. I miss the sense of wonder at what a computer could be or do. You have to understand: technology as it exists now is beyond my wildest dreams. I watched Star Trek TNG as a child, and the devices we have now have legitimately exceeded my wildest dreams. The simplest cell phone now has as much computational power as every computer combined at the time I graduated high school.
But, I do miss simplicity. So much so, I've been considering writing an NES or Sega game; I'm precisely 40 years old, and I've finally come to understand that art and constraints are intimately connected. That they press on each other, and neither is possible without each other.
I have so few restraints on modern systems that I... am constrained. I miss the constraints of my earlier years that were, in fact, my freedom. I miss software that shipped and worked on the first day. I, like everyone, miss my childhood. Because the universe is an explosion of complexity-- and when we look back, we will always feel like things were simpler-- because they god damned actually were.
Also, I am old.
A famous Australian (well, he was born in England but spent his life here) named Paddy Pallin nutshelled it. “The best place to be is here. The best time to be here is now.”