1) We just expected single core performance to double forever every 2 years or so. Many of us were ignorant of single-core performance scaling and the memory wall issues. I want to be clear: Computer Architects were warning people about this in the early 90s but many ignored their inconvenient truth.
2) 2D GUIs would evolve into 3D GUIs - maybe VR? Maybe something else? And the CLI would be gone
3) Low/no code drag-and-drop style programming tools would take away many dev jobs
4) Microsoft taking over and Unix dying
5) All programming would be object-oriented (We are talking 90s style definition here)
I started programming as a kid in the early 90s, and went to college in the late 90s/early 00s. It might sound crazy in hindsight, but between these tools and outsourcing, everyone I talked to thought being a programmer was going to be a dead end minimum wage type job, and I was strongly advised by many to pick a different major.
Equally, tools like Wordpress have killed off/deskilled the old 'webmaster' role - I mean sure, there are still people making money doing Wordpress sites for businesses, but it's nowhere near as lucrative as it was at the start of the dot-com boom era.
Yes, now they are all Cloud admins.
He: Civil engineer, ended up working in China (stealing their jobs!)
Me: MechE, but fell into data science, and now work with a local team that's plurality 1st gen Indian, and majority 1st gen non-US.
How would I grade her prediction? ¯\_(ツ)_/¯
Don't worry, the big tech companies are spending ungodly amounts to make this happen.
In the 90s and 00s, every company with more than 1,000-2,000 employees would have an internal dev staff. I worked in some companies with 20 devs, some with half a dozen, but there was always a dev team in the IT group.
Today, companies do low code via SharePoint and Salesforce. They do BI with Tableau and Power BI instead of internal BI teams. Their external web presence is done with Wordpress instead of a custom site. Sprinkle in some SaaS, and the internal dev teams of corporations are way smaller than they used to be, with some companies not having any internal devs.
Now you can hire content writers with experience in using CMS systems and leave the security/infrastructure part of it to developers.
I expected IPv6 to be adopted quickly.
I expected 6GB to be more storage than I could ever fill.
I expected Windows to be quickly dismissed and Unix to become the standard.
I expected to have a screen built into my glasses by now.
My response then is mostly as it is now - it's just too confusing to read and write. The UIs - even at the time - telling people to put in a dotted quad - were... manageable. 4 is an easy number to understand. Numbers only is easy to understand.
I still maintain that had some intermediate format come out adding 2 (or possibly 4) more octets, and we transitioned from
187.43.250.198 to 0.0.187.43.250.198, and defaulted most UIs to just prefixing with the 0.0 and then grew from there, we'd have had much faster adoption, and still given ourselves 64k separate 4-billion address spaces.
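Rough math on what those two extra octets would have bought, as a quick sketch in Python (the addresses above are just placeholder examples):

    # back-of-the-envelope for the "two extra octets" idea above
    ipv4_space = 256 ** 4          # ~4.3 billion addresses
    extended   = 256 ** 6          # same scheme with two more octets
    print(extended // ipv4_space)  # 65536, i.e. 64k copies of the IPv4 space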
But... I'm not a network technician, nor am I on the committees... I'm just someone who's had to live with the last 20 years of "ipv6 is coming, we're running out of ipv4 addresses!!" and... the last 5-10 years of trying to mess around with ipv6 and realizing it's still mostly out of my control (home ISPs, IME, still do not support it).
tldr, I never expected ipv6 to be adopted quickly. I'm surprised it's made it this far.
It's actually simpler than IPv4 in many respects. For example, prefix delegation: My router is getting an IPv6 /56 block from my ISP using DHCPv6. It is then handing out /64 blocks on several different subnets with minor configuration.
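To illustrate the arithmetic of that delegation, a minimal sketch with Python's ipaddress module (the prefix here is from the 2001:db8::/32 documentation range, not my actual block):

    import ipaddress

    # an ISP-delegated /56 (documentation prefix used for illustration)
    delegated = ipaddress.IPv6Network("2001:db8:abcd:7200::/56")

    # it splits cleanly into 256 /64 subnets, one per LAN segment
    lans = list(delegated.subnets(new_prefix=64))
    print(len(lans))   # 256
    print(lans[0])     # 2001:db8:abcd:7200::/64
    print(lans[1])     # 2001:db8:abcd:7201::/64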
The average user doesn't care about IP addresses. They're using DNS.
How often do you watch other people browse the web?
If you count iOS and Android having a Unix core...
However, as far as I can tell every major tech company still expects that problem to be solved and for them to be the Next Big Thing. I don't know why else they'd be putting so much money and PR into consumer AR efforts, which are niche at best so long as you have to hold up a device to use them, unless they fully expect hardware development to come through in the nearish future and make AR glasses the next smartphone, in terms of becoming must-have and driving the next wave of human-computer interaction.
Problem was the only stories about it were about the “glass-holes” walking into bars with a camera on their face. I thought it was an interesting “intervention” style art piece that showed people still expected obscurity, if not privacy.
I've tried the glasses from North, which was bought by Google a few years ago. The projection of the screen on the lens looks cool, but the glasses had no traction in the market and the company tried to charge me C$1100+ for a pair of prescription smart glasses.
Starting from some point (the Pentium Pro, I suppose?) even x86 CPUs became RISC internally, CISC externally.
I remember working with Silicon Graphics gear back then and the 3D guys were quite enthusiastic about it. Of course, we had all devoured William Gibson’s novels like Neuromancer so we were naturally attracted.
We even built some 3D desktop apps as portals to the wider internet. They were too static and too hard to build so the Internet with HTML had much more pull for producers and consumers as well.
In practice, I think the 3D stuff that really worked at the time was more games like Quake and avatar-based web apps like Habbo Hotel.
It would still be fun to go back and read Jaron Lanier’s writing from that era.
When I was a kid my dad refused to teach me to program, he thought it would be a useless skill to learn because he believed in the future there would be no more programmers, as everyone would be able to program anything they wanted through drag and drop interfaces.
I don't get that. Moore's law is about transistor size shrinking, but that has an obvious end — transistors can't be smaller than one atom across. I feel like "everyone" did know that, even in the 90s. Or, at least, it was mentioned in every explanation of Moore's law itself.
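For a sense of how far away that endpoint looked from the 90s, here's a hedged back-of-the-envelope (the feature size and atom size are round, illustrative figures):

    import math

    feature_1995_nm = 350   # a typical mid-90s process (~0.35 um)
    atomic_scale_nm = 0.2   # rough size of a silicon atom

    # density scales with 1/feature^2; Moore's law doubles density every ~2 years
    doublings = 2 * math.log2(feature_1995_nm / atomic_scale_nm)
    print(round(doublings))      # ~22 density doublings left
    print(round(doublings * 2))  # ~43 years of naive runway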
> Low/no code drag-and-drop style programming tools would take away many dev jobs
I mean, they did, but they also created consultant jobs to replace the ones they took away. See: Salesforce.
The need to write source code can be taken away; but the need to do requirements analysis and deliberate architectural design cannot. So you end up with people who do 80% of a programmer's job, except for the coding part.
We were nowhere near an atom well into 2005 at least, but things were getting pipelined, there was a ton of SRAM on each die, there were improvements to speculative fetch (the stuff that later gave us Spectre and friends), and everything was happening as fast as it could with out-of-order pipelines, prefetch, and every other speedup out there, including auto-vectorization.
The "how big is an atom?" is sort of post-2010 thinking, at least the way I saw it happen.
Modems were one place where I personally saw this sort of thing happen: 9600 baud modems to 56kbps happened so fast, and it almost looked like it would keep going that way (EDGE was 115kbps; now I can't even do a page load over it), with DSL suddenly dropping in to keep the same copper moving even faster again.
For example, these techniques improve instructions per clock, at the cost of adding transistors:
* pipelining (late 80’s early 90’s) MIPS R3000, Intel 486, Motorola 68040
* superscalarity (early 90’s) Pentium, DEC Alpha 21064, MIPS R8000, Sun SuperSparc, HP PA-7100
* out of order execution (mid 90’s) Intel Pentium Pro and later, DEC Alpha 21264, Sun UltraSparc, HP PA-8000
* SIMD (vector) instructions (mid/late 90’s) Pentium MMX (integer) Pentium III (floating point), DEC Alpha 21264
* multithreading (early 2000’s) Pentium 4, planned chips from DEC and MIPS
We also got improved branch predictors and larger, more-associative caches.
But it feels like most of that progress stopped in the early 2000’s, and the only progress is slapping more cores on a die. I mean, if you can put 8 cores in a consumer level CPU, you have 8 times as many transistors (give or take) as you need to implement a CPU. Nobody seems to be building a higher IPC CPU with 2x transistors, even though they clearly have the transistors to do it.
> Nobody seems to be building a higher IPC CPU with 2x transistors
I mean, there are designs like this, but they run into problems of cache invalidation and internal bus contention.
The way to get around this is to enforce rules about how much can be shared between cores, i.e. make the NUMA cores not present a UMA abstraction to the developer but rather be truly NUMA, with each core having its own partition of main memory.
But then you lose backward compatibility with... basically everything. (You could run Erlang pretty well, but that’s basically it.)
Sure, but that's not helping the general case. Only specific types of workloads. You could argue that adding lots of special-purpose hardware doesn't hurt from a transistor count (we have plenty) or power perspective (turn them off when not needed), but it can make layout tricky and reduce clock speed (which slows down everything else).
> [...] cache invalidation and internal bus contention [...] NUMA
Sure, but the context from spamizbad was specifically single-core performance. (I probably opened up a can of worms by mentioning hyperthreading.) The problem is many real-world workloads that add business value are not embarrassingly parallel problems. If it worked like that, Thinking Machines would have swept the field since the 1980s (they had NUMA like you are describing).
The point is that, since about 2005-2010ish, single-thread performance has mostly stalled. Intel CPUs can issue slightly more instructions per clock. AMD has a slightly better branch predictor. But performance growth has mostly been the result of adding more cores (Except Apple's M1 has some magic).
The things I previously mentioned gave big IPC gains on a diverse set of real workloads. Some innovations, like pipelining and multi-issue, were responsible for 2x-4x IPC each. Pipelining, in particular, was a trick that also helped clock speeds.
All those innovations happened between the late 1980's and early 2000's. So an observer during that time might have just assumed that similar innovations would keep coming. But they haven't. A Pentium III has probably around 15x IPC compared to an i386 (maybe 60x if you include SIMD), in addition to a 40x higher clock speed (some of which came from adding more transistors).
How can you add transistors (say 2x or 3x) to a CPU to double performance on diverse, real-world problems that don't parallelize well? My point is, I don't think anyone knows, so it is irrelevant whether there is a physical limit to transistor shrinkage. We don't even know what to do with the transistors we have, so who cares if we can't have more?
Yes, but by itself that isn't a single core vs. multicore issue. It's a process node question generally.
That said, Intel in particular did try to push single-core frequency at least a beat too far. They demoed a 10GHz prototype at one IDF--and other companies were mocking them for it.
The story that a very senior Intel exec gave me at the time was something to the effect that, of course, they knew that they were going to have to go multi-core but Microsoft was really pushing back because they weren't confident in multicore desktop performance in particular.
this is still something some of us think about today
Ah, HyperCard ... RIP.
I think it's safe to say that we were close with this one, with JetBrains' Java GUI builders (AWT/Swing) and the WPF/WinForms designers in Visual Studio.
Then dot com, mobile and responsiveness arrived.
And it did kinda for a while.
However that was more of an afterthought- I remember that when it came out I initially bought into the write-once-run-anywhere hype and the idea that Java was for writing applets.
The UIs were too clunky (AWT) or foreign (Swing, the second attempt) and the Java SDKs were buggy, so in the end HTML just improved faster and took over the presentation layer, together with FutureSplash (later renamed Flash).
It only didn't happen because they messed up the UNIX subsystem on Windows NT/2000.
Those that give Apple money to develop GNU/Linux software would be just as happy with the POSIX subsystem.
During the 00s and early 10s, it felt like it was.
IMHO that didn't happen, the integration morass remained, and I'd consider much of that model a failure in retrospect.
On the infrastructure front, I seem to remember a lot of talk about Frame Relay (EDIT: actually I was thinking of ATM, thanks for correction below). And fiber installs all over the place, lots of money in getting fiber into downtown areas, etc. Also I don't think people really predicted the possibility of the dominance of the "cloud" as it is now, really. I mean, hosting services were I'm sure a growth market but "serious" companies were seen as wanting to have their own DCs, and I don't recall major talk or trend away from that until the mid 2000s ("SaaS" etc. became the buzzword around then, too).
In one sense it failed because all it ever saves you is the typing. You still have to learn the whole model, sometimes it is like learning a programming language from scratch.
On the other hand, it may have succeeded. Objects don't exist in isolation but as a library with other objects and frameworks, and APIs are basically object models. You could think of React or the stripe api as object universes.
What is interesting is that you can't sell a developer product like a commercial product. Developers need access to the code, so you either make it open source and sell consulting, or else you make it a web API and rent it out.
Open source killed it all.
He usually says something like: They showed me three things at Xerox PARC: first, the windows-and-mouse graphical interface; second, networked computing; and the third thing I didn't notice at the time, which is object-oriented software.
Then he mentions how at Apple they did the first, and the internet is the second. At NeXT, they were doing the third, and then he brings up objects.
It was during the NeXT era. Here is the lost interview:
But, as I said, I think he has also said it at other places.
I still re-read this every few years: https://www.wired.com/1996/10/atm-3/
"How do you scare a Bellhead?" he begins. "First, show them something like RealAudio or IPhone.
Haha, I forgot IPhone was a trademark before the iPhone.
Executable architectural models & designs.
Round tripping, models to code and back. (aka two-way translation?)
We even had a slogan for it: "computers are bicycles for the mind": https://www.brainpickings.org/2011/12/21/steve-jobs-bicycle-...
I was so naive...
The problem is something else. The problem is that the internet is more than one thing. It is a couch for the lazy, a distraction for the procrastinator, a massive entertainment delivery device; and most of the monetary value of the internet doesn't come from it being a bicycle for the mind, but for the other things.
Not everyone chooses to ride on the bicycle. Instead of, say, reading Wikipedia, they read Infowars. The world would be a much better place if a lot of people spent their time reading Wikipedia (with all of its faults) instead of scrolling Twitter, Facebook, and slanted news sites like Breitbart, CNN, Fox, etc. etc. etc.
Computers are information. Giving people access to information does not magically make them moral, or curious, or thoughtful. It just amplifies what they were already trying to do.
1. creates a giant, complex vector for the transfer and recombination of ideas, and attitudes: it's an influence and innovation engine
2. facilitates certain kinds of collective action and organizing, from traditional interest groups, to flash mobs, and getting ratioed.
3. facilitates certain kinds of illicit or illegal behaviors, activities, etc while making other kinds much harder. Anonymity is difficult, widespread surveillance is the norm; but you can hold entire pieces of infrastructure hostage, like oil pipelines, from your couch.
4. creates a new set of dynamics that we are only coming to recognize. for example, social media platforms like Facebook make it like we've all become more densely packed together, and we've not really developed a set of norms to accommodate getting along at that level of density.
As a 90's kid who grew up online, this was my thought at the time as well. Turns out computers were more like an op-amp for the existing inadequacies of human nature, rather than a bicycle for the mind.
Most published research is false even if it's peer-reviewed, and it may be better to leave it to people who read it full time and understand it's an ongoing conversation.
Not that the liberal "trust the experts" worked, since Fauci's policy was to only say things that made him sound trustworthy, whether or not they were true. (and he admits this)
Also, I'm not sure you can say that "Most published research is false". There are many degrees between true and false. In the physical sciences, with which I'm familiar, the papers are demonstrably 'mostly true'. For example, as a by-product of an experiment last week we observed an Ekman spiral (first published in 1905 https://en.wikipedia.org/wiki/Ekman_spiral).
I agree it depends on the field. I was thinking of medicine, where there's a well-known paper about this that's led to some improvements like pre-registered experiments, although there are some newer ones that show ongoing problems in the social sciences.
 https://journals.plos.org/plosmedicine/article?id=10.1371/jo... (Why Most Published Research Findings Are False)
 https://advances.sciencemag.org/content/7/21/eabd1705 (Nonreplicable publications are cited more than replicable ones)
I admit that I too was once a believer...
Fast-forward to today, and most of the people I know who were in electronics have switched to software.
They all had to learn to code in school and had background domain knowledge in some specialty. In contrast myself and my CS friends had, well, we had knowledge of Turing machines with infinite tapes.
As the years played out it normalized between the two groups as each filled in their own weak spots with experience. But in those early days I was pretty jealous.
Yeah. Going to school for Computer Science is probably the biggest regret in my life. I could have spent that time & money learning something useful/interesting instead, or better yet, not waste my time in University at all.
For me, CS was largely a waste. Most of the real, required-for-job programming skills I learned were self-taught anyway. I just used CS as an on-ramp to industry. Maybe nowadays CS is required for "signalling" but it definitely wasn't in the 90s.
Wish I had done physics or hell even math. Now I'm middle-aged and the only things I "know" relate to this stupid pile of silicon on my desk.
But yes. As a junior developer where my programming skills were roughly the same as my friends who had non-coding skills as well, there was some second guessing on my part
Uh, what? Apparently computer science gets the credit for, eg, the transistor? Nah.
I still don't entirely understand why it turns out to be so hard to outsource. I get some of it, but it seems like something that we should have figured out by now. I'll be interested to see what the post-pandemic shift to remote work does for that.
* It's difficult for Western businesses to time shift, and there are usually caps on work visas.
* Perhaps until recently, most Western programmers genuinely enjoy the work. What I've heard from Indians is that many Indian programmers are more interested in building a career than in the work itself. Intrinsic vs. extrinsic motivation. I think this may be declining as a factor as software eats the rest of the US economy and brain-drains talented people from other fields, though.
* Cultural differences--it turns out that English fluency is necessary but not sufficient to seamlessly slot into an Anglosphere company.
* One way out of this conundrum could be for new software companies to start in India and outcompete Western companies (at which point culture differences and time shift wouldn't matter), except I get the sense there are barriers to that.
My current best theory is that the software industry is overflowing with money, and it's still more beneficial to try to optimize for output quality and quantity (i.e. hire better software engineers no matter the cost, bring people into the same locality instead of hampering progress by splitting work into different timezones), instead of trying to cut costs to improve profits.
Once the growth finally ends, outsourcing to cheaper places might become a very attractive option.
Didn’t listen since I loved programming and computers and it worked out well for me.
I'm not sure that this ever amounted to more than an urban legend. When you actually looked into the supposed outsourcing "trend", all you heard about was TheDailyWTF-grade horror stories.
Urban legend? Entire books were written on the subject, by respected authors, too:
The book is not only pushing 30 years old, it was crap when I bought it in hardcover in '93, and whoo boy, did that one not age well.
The problem was too many people entering engineering degrees after a temporary 'IT boom' decades ago, seeing that as a way towards upwards social mobility.
Everything is already created. They would download movies, music, and games with BitTorrent and chat with ICQ and IRC. What else is there to create?
I should go into administration, because that's where the big money is.
I agree that for a tech-savvy consumer end-user, not that much has changed. But endless porting and migrations aside, there are still plenty of businesses and industries that need new software for new ideas. For example most every new concept in medicine or finance needs software.
The field to look at was cognitive science or neurology, which in fairness makes some sense given where machine learning is today.
Today, people are trying to get into tech companies, and those who take one boot camp course call themselves “software engineers.”
In the early 1990s, it sure felt like there was a limited market for robotics, machine learning, AI, etc. It was mostly industrial robotics and assembly line inspection cameras.
You are either building killer robots like Boston Dynamics (morally reprehensible) and drone-striking the Middle East, or helping people make things with production line robots.
No educated person in the late 90s could have anticipated how Foxconn would obviate "electronics."
No one knew what the internet really was and what it would become. Not Bill Gates, not anyone else.
Developers believed that processing power, RAM, storage etc. would continue to grow exponentially to the point where there would just be too much of it and we wouldn't need to care about any resource constraints when writing code.
Writing code line by line in a text editor was supposedly on its way out, to be replaced by fancy IDEs, WYSIWYG, UML etc.
All the jobs were supposed to go to India. Programming as a profession has always been on the brink of death for one reason or another, and yet here we are.
I was pretty excited about it too... And I was a kid in the 90s. I once worked with a guy who got excited about new releases of directx... Which I also admit could be a little exciting
In the 90s, not getting an expensive new PC every 2 years or so meant you were screwed if you wanted to play any of the cool new games coming out. I haven't upgraded my system in over 5 years, and I can still play most new titles with a few minor graphic settings turned down or off.
If you're using something like Visual Studio then that's a very fancy IDE and a long, long way from writing C in a text editor and dealing with arcane compile errors.
Didn't Bill Gates send a memo to MS employees in the 90s saying that the internet was going to be very important and they had to take it seriously?
1. Compiled (native) vs. interpreted (bytecode/VMs). It's this forever cycle of "performance required" that shifts to "portability required", which lacks performance, and gradually shifts back, and repeat.
2. Local processing vs. dumb terminals and shared computing. The current "dumb terminal" phase being the idea that you can buy a $100 Chromebook and working entirely online.
The 90s was peak desktop. Everything was on the local hard drive.
Now everything is back on the mainframe (cloud). I fully expect it to cycle back to the desktop in some form (we just won't call it that).
Edit: OMG! the term exists! https://en.wikipedia.org/wiki/Fog_computing
Every tech or bigco job I've had, there's at least one mainframe codger doing the King George "You'll be back" number from Hamilton. Don't think they anticipated the non-MF cloud. Almost always among the best engineers in the joint, though, IME.
I miss genuine differences in hardware. Happy hour discussions about alpha vs sparc vs intel etc. I'm hopeful that diversity will come back around as a thing.
Can anyone recommend something quite a bit different that a guy could toy with? Or is this why everyone is having so much fun with 8 bit retrocomputers?
But a key problem with alternative OSes is that if you want to use them as daily drivers, your software choices may be limited or nonexistent. Haiku, for instance, has a WebKit-based browser, but no Firefox, Chrome/Chromium, or Safari.
I thought all I ever really did was VS Code, so I got code-server installed and everything else was mostly online anyway (Office 365, Exchange, Wordpress, etc), but the biggest pain point was the lack of a native terminal and ssh.
Even though VSCode has a great terminal built in, it seemed so wasteful to load into it just to open a terminal window.
After a couple of months, I just switched to MacOS, and still use my code-server install instead of VSCode, and 100% prefer it.
Hacker culture calls this the Wheel of Samsara.
The joke was on the proponents of this, since web apps bloated to require a damn supercomputer to run them with acceptable performance, until that YC/PG-backed startup that was posted on here recently came along selling mainframe-hosted browsers-as-a-service over VNC.
And as you say, there's definitely a cycle between "everything local" and "everything central".
The second point is true as well. No matter where the pendulum is at the moment, there are always going to be people trying to push it in the other direction. Sun was again ahead of its time by pushing the JavaStation, which was trying to do what the Chromebooks are doing. Then there was the Sun Ray of course, which was really cool, but a bit too specialised.
Read more Hacker News and you'll see complaints about both languages.
The debate today is mostly AoT vs JIT compilers, I don't see anyone bring up VMs unless they're talking about how slow Python is.
I also don't see anyone bringing up "portability" anymore - compilers support all the architectures people care about, while libraries and APIs are designed to be platform agnostic as far as they can be while developers recognize there is no such thing as portable software (and where portable software is absolutely required, it shall run in the browser).
I thought the future was making that publishing easier. I wasn't wrong about that- twitter and facebook are easy. I just missed the consequences of "easy" becoming "walled garden."
Wikipedia is still up and running ;)
Nothing is preventing society from using the internet in that way, but it's sooo much easier for the average non-tech person to use something like Gmail than to self-host an email server. It's the same reason I do not own a lift for my car... sure, technically I can install a lift at my house for my car, but then I have to maintain its hydraulics and I'd only use the lift once or twice a year... who has time for that? Especially if my thing is fixing pianos and not cars. Let the car people handle cars, let the piano people handle pianos... I think we are in the same situation with tech.
You are _technically_ correct, but I think you are minimizing network effects by simply calling it "easier".
These days you are pretty much required to use those same 10 sites if you want to interact with anyone, do business, or find useful information.
Those 10 sites have effectively captured the Internet.
The GP is saying that the millions choosing the "easier" action are what cause the network effects.
E.g. another example that applies to tech-savvy programmers: most developers have the knowledge (or can Google a tutorial) to host their own personal git server, but most choose not to. The collective millions choosing not to bother with self-hosting lead to the emergence of something like Github.
Signing up for a free account on Github is easier than:
- hosting a personal repo at home on a laptop or Raspberry Pi
- buying a $10/month shared VPS server to host it
So even before Github had network effects (say, fewer than 1000 user accounts), it was still easier to create an account on an unproven Github platform than to self-host. Those early adopters lead to strong network effects later.
But that also leads to unwanted side effect of Github deleting repositories from DMCA takedowns, etc (aka "censorship").
90s: ask your ISP for a public IP, register your domain, start Apache and off you go
Nowadays: getting a public IP is iffy. All good domain names are taken. Emails from your self-hosted mail server go straight to spam/junk. Fiddling with TLS certificates is close to mandatory. The moment you start your server, you're bombarded by a flood of gratuitous requests, trying out every known vulnerability under the sun. etc. etc.
I can grab a cheap VPS within minutes from DigitalOcean or OVH, with a public IP included. Both offer images with pre-configured software, and there are ample guides for setting up just about anything. Certbot/Let's Encrypt will literally add the generated TLS cert to your nginx config if you ask it to.
I had a friend who was there at the start of IP-over-cable, and they were particularly excited about the promise that they would be offering symmetric upload/download performance. By the time it became a consumer product, that too had this major asymmetry baked in.
I did it a while ago and it seems to work fine to this day: https://blog.kronis.dev/tutorials/how-to-publicly-access-you... (disclaimer: forwarding ALL ports is usually overkill, unless you're lazy like me)
Alternatively, just ask your ISP for a static IP address, even though that could be more expensive.
Let's shut up and let the other side speak? It's usually the opposite in my experience =-P, haha. Sorry, couldn't resist, I know you probably meant to write the opposite.
Sure, you can say, even yell, whatever you want. I don’t have to listen to you though.
On a larger scale, “I” becomes “we” and “you” becomes “them.” The side that feels they are being censored doesn't like the consequences of what they're saying.
It was naive to think that just because technology progresses, society progresses too. The 20+ year Iraq and Afghanistan wars really made me realize that advanced technology meant nothing if we were going to regress morally.
We thought everything could be abstracted away into a perfect library and no duplication of functionality should ever need to occur. In the end, we could even drag and drop (or write XML) to connect all those components together.
We thought multitasking was possible and even a good thing to design for. Hence the very noisy and distracting desktop OS designs we still have today.
We thought in terms of a much smaller handful of programming languages. And anything “serious” would end up in C (or C++, maaaybe Java).
We thought Apple was kind of a weird, surprising company that barely hung on, the computer only shown in movies.
We thought democracy “won” so the future could only be one full of enlightenment, right?
We thought CLIs were so passé and every dev tool needs a GUI.
Merges were done with a diff / merge tool and often manually.
Software libraries were bought for big $ or provided by your OS/compiler vendor, not downloaded. Maybe the “reuse” vision of the past actually WAS realized. Instead of an XML file, it’s my package.json :).
I expected 250 gigabytes to cost less than $4000 by now. ;-)
I became a Perl developer at the tail end of the 90's. I learned a lot of functional development and Lisp before I had any idea what they were.
By isolation I don't mean lack of socialization of course. There was plenty of socializing back then. But the people I met on Usenet, BBS, IRC and phpBB forums were people like me. We worked on tech projects, we talked tech, we had our specific jargon and subculture.
I distinctly remember when people asked me what I wanted to do back then, I'd make up some lie, but knew that my future would be living in a single room with no social interaction but with my online friends. (I did not think this to be bad at all, this was my idea of a good life).
I thought tech would evolve to make the internet more into an alternate reality, more separate from the real world. Maybe I was reading too much cyberpunk sci-fi.
In any case, I certainly did not anticipate social media, dating apps and the digitization of traditional brick and mortar businesses. And to be honest, I'm not sure we wouldn't be better off without them.
It seems there aren't that many "pure internet" projects out there. IRC replacements are all tailored for the workplace and real-life interaction (maybe except for Discord?). It still feels very strange to me that communication platforms would ask you for your real name (especially when, at this point, there's more "reputation" or "social credit" or whatever you'd like to call it attached to my online persona than to my real-world one).
Crypto is perhaps the last bastion of such an autarkic technology (i.e. by netizens for netizens) but even that is slowly moving toward real world asset tokenization and institutional integration.
I also thought IDEs would be in 3D, with better visualizations of code running in parallel.
And, I didn't expect web applications to be the norm for most people. I thought we'd have a better "standard" cross-platform application development approach.
One thing that doesn't surprise me are application stores. After seeing how 90s applications used to take over your computer and wedge themselves everywhere, I expected that OS vendors would lock things down a bit more tightly.
Which is at least one reason why Windows kept its crown. By the time there were real alternatives (increased popularity of Macs--albeit mostly higher end of the price range--and Linux as a fairly viable option), the desktop OS just wasn't very interesting for a lot of people. And Chromebooks were a reasonable alternative for many, especially in K-12. For those who did still need a "fat client," Windows just tended to win by default.
While we're reminiscing, the thing I remember the most from the late 90's is that it seemed no one knew what they were doing (or perhaps it was just me). The Internet opened up such a frontier in computing, for effectively hobbyists, as I guess we'll never see again. Perhaps that's why I don't recall thinking about what computing would look like in the 21st century--there was too much to be done right in front of my face!
EDIT: There's a mention or two of thin clients. I think only Sun thought that would work :).
Oh ok, I can think of one that actually worked out: Linux on commodity hardware in data centers. It became clear by 98 or so that the combination of Apache (httpd) on a bunch of cheap Linux boxen behind an expensive LB would tear down the Sun/IBM scale up model fairly easily due to both cost reduction and an improved resiliency. Of course there were MySQL bottlenecks to work through, but then memcache landed and pushed that off a cycle. Then there were the 64-bit AMD chips and a large bump in RAM density that pushed it back another. And then I guess true distributed computing landed.
And I thought that conspiracy theories were harmless fun.
And in a sense, that kind of came to pass. Nobody believes _anything_ the government says anymore, and skepticism is at all-time highs. It’s just… everything.
I also bet very heavily against the widespread popularity of video on the net, not due to any technical limitations, but simply because reading is so much faster.
This never came to be, because it has one gaping flaw; it assumes governments and corporations wouldn't also figure out how to use the Internet.
* Thin clients: everybody was going to have a 'dumb terminal', and all the computing/storage would happen elsewhere. There were many variations on this idea.
* PIMs: Palm Pilots and all the variations. I knew a guy who quit his job to develop Palm Pilot wares, claiming it was 'the future'. In a way, he was right.
* Windows: Linux was barely there, Unix was for neckbeards and universities, and it looked like Mac was mostly dead. It looked like Microsoft was about to swallow the world.
* The end of mice and keyboards: Yes, it sounds silly today. but supposedly VR/voice/whatever was going to replace it all.
> Thin clients
> Palm pilots.
> Windows, Mac, Linux
Windows is still for business/economy. Linux is still for neckbeards. Although Mac is bigger than it was, the same people are still waiting for Apple silicon to catch on.
> Keyboards and mice
Have largely been supplanted by touch or gesture and consumer VR is becoming more and more viable every day.
(And there's also ChromeOS, eating the education sector.)
Of course, as a software developer, I've always used computers differently than the masses. The computers I need are far more powerful than anything they need.
But certainly the majority of "stuff" people do at home now takes place in data centers. And as a percentage of all computing cycles day by day, data centers do effectively all of it.
I can relatively painlessly work both from Windows and Linux - most of what I need is in web browsers and remote systems. Certainly a success of "thin clients" - especially the browser.
What's your prediction for the next 20 years?
Components would be “snapped” together by non-programmers to make an application.
I suppose that pieces of this became true with the proliferation and adoption of open source. Rather than a binary protocol, HTTP came out on top.
SharePoint brought us WebParts which tried to put the power in the hands of business users, but it turned out they were still too technical and not flexible enough.
I don’t see the role of software developer/engineer going away anytime soon.
I’ve seen this with tools like Zapier. I know many non-technical people that put together amazing workflows just piecing things together with webhooks and similar tools.
COM still underlies an abhorrent amount of the Windows architecture.
Around that time I too thought I was seeing the future - component-based development that's language and platform agnostic.
Pity that WinDev keeps producing the same tools as back in the MDIL 1.0 days though; the only good tool, C++/CX, got killed by them.
CORBA - I did not even finish reading "Introduction to CORBA". A book of 500+ pages.
This is how .NET got its crappy generic name. Back in its early days it (and SOAP) was sold as the glue that would stitch all this together: just add a web reference and you'd have all this functionality in your project. It was incredibly disappointing to get through all this marketing and eventually realize it was just an MS version of Java. Enterprise Java Beans, CORBA, COM, and probably some others were also efforts in this direction. These days it's REST and microservices.
Hearing him name drop terms like markup language, stylesheets and internet is mind blowing. If only the promise of “no delay in response” had been achieved :-)
That's basically what Salesforce is.
I can actually see some of this happening in the crypto space. Things like MetaMask and now chainlink combining into some form of low code smart contract app tool. As you say though even in this scenario devs don’t go away.
Then the momentum shriveled up with Perl 6 and it has just become completely irrelevant. Pretty insane trajectory. It's almost as bad as like ColdFusion lol
And then we got things that were "good enough" at everything that anybody wanted a PC to do, and we went past that to silly stuff that very few people wanted a computer to do, like CueCat.
And everywhere you looked, there were new things happening. The internet. The web. MUDs. People were just trying things, and just throwing the results out there for the world to look at.
I just kind of expected more and better to keep coming. I didn't see PS/2 and IBM trying to close the ecosystem. I didn't see Windows taking over the world and then Microsoft trying to close the (web) ecosystem. I didn't see the Facebook/Twitter/Youtube silos taking over. I didn't see cancel culture taking over the diversity. I didn't see hostility, nastiness, and hatred taking over the openness and acceptance that used to be there.
Computers failed to create a space dominated by "the better angels of our nature". We not only expected it to, we tasted it. We expected it because it was what we experienced. And then it got overrun by all the worse parts of human nature that we wanted to get away from. Turns out that making people act better takes more than technology.
Ironically enough, it was free software that powered the building of the web-based world that removed power and control from users and subjected them to the unblinking eye of pervasive surveillance.
Microsoft has given up. I'm convinced that they adopted Chromium for their browser because they see Electron or something like it as the future of application development.
Thankfully Apple seems to still believe in native applications. All the interesting indie software seems to be on the Mac. Apple is trying to make speech recognition and OCR work better on-device than off. They are investing in CPUs because Intel can't figure out how to ship anymore. They still have stores where you can touch and see the stuff they sell. It's the most human big tech company out there right now. It might be the only one.
Microsoft's problem is that nobody cares about Windows anymore. Businesses are moving to web apps which I think makes a lot of sense for businesses. Gamers use it because that's where games run best. Who runs Windows because they are a Windows enthusiast? I know they exist, but not in the same numbers that exist for the Mac even though the Mac's market share is relatively small. (for the record, the only Apple computer I own is an iPad).
So Microsoft mostly wants to keep their users. There really isn't a path for growth which means there's not a lot of reason for investment.
I'm bummed about this and have been thinking about it a lot lately because I've been reading Ellen Ullman's Life in Code. In it she laments the fade out of the PC era (20 years ago!) and the loss of privacy and security and power that users had for a brief moment between mainframes and web apps.
There are still so many use cases that a browser cannot cover.
I agree that web apps can't do everything. I'd go one step further though and say there are lots of things that shouldn't be in the browser.
Businesses love web apps for the ease of administration and (unfortunately) surveillance. Devs like web apps for the write-once, run everywhere aspect and the financial opportunities of SAAS schemes.
Users, I think, are often best served by well written native apps. Unfortunately, outside of Mac apps, iOS apps, and games, they don't seem to be willing to support the development of those types of applications.
I spent 4 years developing in a mix of Forms, WPF and MFC for such domains between 2014 and 2018, and they keep looking for people in 2021, without any plans to migrate to Web technologies.
Regarding MFC, given the way Microsoft killed C++/CX and replaced it with clunky tooling for C++/WinRT, I actually consider it more productive than coding IDL files by hand without any kind of Visual Studio support.
I expected most people to be able to create and publish their own websites using desktop apps like Microsoft FrontPage or Dreamweaver, and that these types of apps would be the dominant way to create content on the web.
The pace of technology change has accelerated dramatically, the complexity of software has grown beyond imagination, and perhaps correspondingly, the fragility and interdependency has grown out of control.
We (myself included) have become so acclimated to complex software as users that we forget how much work is buried under the surface of a "simple" app. Even accurately describing the behaviors of a system is difficult now (and often not even documented, partly because it changes so often).
When I was taking flying lessons in the 90s, I couldn't believe how antiquated and manual-intensive the aircraft systems were. There seemed to be so many places where pilots needed to follow a checklist to perform manual tasks which could have been automated based on various available data. When I asked my instructor why it was so archaic, he explained that it's a long and arduous process to get new things certified by the FAA.
This difficulty in getting advanced systems approved was mostly due to the processes and tests required to ensure a very high standard of reliability and safety, since lives were at stake. At the time, I thought it was ridiculous. But seeing some of the Airbus automation problems (which cost a few aircraft and some lives), and then seeing the Boeing 737 Max disasters, I see how slower advancement, more testing, and slower release cycles can be beneficial.
But in the software world, the more modern approach is "move fast and break things". Not only is software now never complete (in contrast to when software was released once or once per n-years, on disk or CD), now it is released every 5 minutes with old bugs replaced with new bugs. There are days where every time you open an app, it needs to update first. I'm not sure this is a net improvement.
If I could say one really positive thing, it would be that tooling has gotten really nice. Even I recall when syntax highlighting arrived... and at first was laughed at. On the other hand, the better tooling is virtually required to keep up with the ballooning complexity and volume of the code we write.
I could rant and complain many more paragraphs, but it can be summarized thusly: modern software (and its development) has made some great improvements, but those improvements have been largely offset by reductions in the quality and formality of practices.
I'm not really sure what changed, though.
Part of it is that "Agile" development methodologies became popular in the 2000s, and well, let's just say they've turned out to be a lot less 'agile' than was hoped for.
"...the on-board shuttle group produces grown-up software, and the way they do it is by being grown-ups."
I got it right, but with hindsight it was pretty obvious it was going to be a big thing.
More than a decade later, I finally learned how to do it. HTML+CSS+JS and a bit of PHP for WordPress. I'll never be paid a cent for this knowledge as I'm not a web designer, but it feels good to work on my own sites, just as an expression of myself, and as a hobby.
It really felt like the Internet was a fantastical place, a destination for unlimited possibilities.
I actually own some of the more speculative non-fiction books on "Cyberspace," in particular the representational aspect. I did expect some higher levels of abstraction to be present when it comes to working with data, browsing file systems, making connections, and so on. I was not necessarily expecting wireframe graphics, but yes, more abstraction. The closest I have seen is something like WinDirStat when it comes to looking at a directory, and even that is just colored blocks. I knew the graphics processing would be intense to match the bandwidth of the human visual channel.
I did see the Wild Wild West era of the Internet coming to a close, having donned white and grey hats myself off and on. The early Internet had security holes such that you could drive a bus through them honking and dragging a bunch of rusty bikes behind you. The invention of the img tag more or less began the Advertising Era, whether they knew it or not.
I did have an early suspicion about privacy, and permanence, so my Usenet posts from 1989 never had my name on them. So far this has remained a decent decision.
I did not expect a trend in programming languages I have a hard time describing but can only sum as "yes, every programming paradigm and idiom will be crammed into that language over time if allowed."
I have never expected VR to take off until other technologies are in place and cheap. They aren't here yet.
I have never expected the "Fifth Generation" anyone-can-code drag-and-drop thing to happen and I believe that by the time such a thing is available (and effective at scale), it would be AI anyway.
None of the approaches to AI have been promising to me, aside from Cyc, which I would describe as being necessary, but not sufficient.
I still think about that GUI Tom Cruise navigates in Minority Report. The film is almost 20 years old, and yet GUI design today seems to be focused on hiding more and more elements, instead of making the canvas bigger to display more info.
And you're so painfully spot-on about hiding with GUIs. Microsoft continues to wave the utterly unnecessary 3D Objects in my face, and My Videos, and so on and so forth. One Drive appears to jostle for my attention, but meanwhile I have to do additional work just to see where in a directory tree I am.
The grey hat stuff consisted of things like breaking out of the restrictive policies of my campus to get onto the Internet proper, tricking the equivalent of the (yet to be dubbed) script kiddies into incriminating themselves, and sort of Robin-Hooding accounts back away from a gent who spent a lot of time hacking accounts of various campuses to cause havoc, before tying everything up in a neat dossier for someone else to deal with -- there was eventual deportation involved, if I recall, but that may have been just rumor.
I did rather discreetly inform the department of my major that their choices of passwords and PINs were not as clever as they thought they were.
We randomized the exchange listing, split up the results, and then proceeded to war-dial the entire exchange until we found a forgotten (and unprotected) modem line in the math department, which then hosted all of computer science. From there out to what were called "telnet gates," which were known only from rumor and just knowing of them was a commodity in and of itself.