Oh god, there were so many boneheaded 90s predictions about computers.
1) We just expected single core performance to double forever every 2 years or so. Many of us were ignorant of single-core performance scaling and the memory wall issues. I want to be clear: Computer Architects were warning people about this in the early 90s but many ignored their inconvenient truth.
2) 2D GUIs would evolve into 3D GUIs - maybe VR? Maybe something else? And the CLI would be gone
3) Low/no code drag-and-drop style programming tools would take away many dev jobs
4) Microsoft taking over and Unix dying
5) All programming would be object-oriented (We are talking 90s style definition here)
> 3) Low/no code drag-and-drop style programming tools would take away many dev jobs
I started programming as a kid in the early 90s, and went to college in the late 90s/early 00s. It might sound crazy in hindsight, but between these tools and outsourcing, everyone I talked to thought being a programmer was going to be a dead end minimum wage type job, and I was strongly advised by many to pick a different major.
In fairness, lots of tech jobs have died out. For example, there are nowhere near as many system administrators or database admins as there were before the cloud got big.
Equally, tools like Wordpress have killed off/deskilled the old 'webmaster' role - I mean, sure, there are still people making money doing Wordpress sites for businesses, but it's nowhere near as lucrative as it was at the start of the dot-com boom era.
> In fairness, lots of tech jobs have died out. For example, there are nowhere near as many system administrators or database admins as there were before the cloud got big.
During my college internship (2010) I was told that my CS degree would be totally worthless and I should focus on electrical engineering (though that would've been fine too tbh)...oh, and that I should put all the money I could into gold and guns...the dude was a nut is what I'm saying.
I mean, to be fair, gold has sextupled in price in the past twenty years, and Smith and Wesson is worth roughly 20x as much, depending on the point in 2000. So, yes, a nut, but not horrible investment advice, per se
I started my CS degree in 2000 and heard the same: companies would rather fly someone over from India than hire you. Also, my CS advisor said "Java will be the only language anyone uses in 20 years."
#3 - I believe this did happen much more than people realize.
In the 90s and 00s, every company with more than 1,000-2,000 employees would have an internal dev staff. I worked in some companies with 20 devs, some with half a dozen, but there was always a dev team in the IT group.
Today, companies do low code via SharePoint and Salesforce. They do BI with Tableau and Power BI instead of internal BI teams. Their external web presence is done with Wordpress instead of a custom site. Sprinkle in some SaaS, and the internal dev teams of corporations are way smaller than they used to be, with some companies not having any internal devs at all.
Low-code also applies to the day to day management of web content. It used to fall to the 'webmaster', who needed to know HTML/CSS, maybe multiple scripting languages, cybersecurity practices, etc.
Now you can hire content writers with experience in using CMS systems and leave the security/infrastructure part of it to developers.
I remember talking in mid 99 about ipv6, while I was at a web company. Immediately on seeing it I said "this is crazy, and isn't going to take off". "Why not?" I got from some of the network folks there.
My response then is mostly as it is now - it's just too confusing to read and write. The UIs - even at the time - telling people to put in a dotted quad - were... manageable. 4 is an easy number to understand. Numbers only is easy to understand.
I still maintain that if some intermediate format had come out adding 2 (or possibly 4) more octets, and we'd transitioned from
187.43.250.198 to 0.0.187.43.250.198 (with most UIs defaulting to just prefixing the 0.0, then growing from there), we'd have had much faster adoption, and still given ourselves 64k of those 4-billion address spaces.
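(For what it's worth, the arithmetic behind that "64k of 4 billion" figure checks out; here's a quick Python sketch - the two-extra-octet format itself is purely the hypothetical above, not a real proposal:)

    ipv4_space = 2 ** 32                # ~4.29 billion IPv4 addresses
    expanded = ipv4_space * (256 ** 2)  # two extra octets in the hypothetical format
    print(expanded == 2 ** 48)          # True
    print(expanded // ipv4_space)       # 65536 -> "64k" copies of the IPv4 space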
But... I'm not a network technician, nor am I on the committees... I'm just someone who's had to live with the last 20 years of "ipv6 is coming, we're running out of ipv4 addresses!!" and... the last 5-10 years of trying to mess around with ipv6 and realizing it's still mostly out of my control (home ISPs, ime, still do not support it).
tldr, I never expected ipv6 to be adopted quickly. I'm surprised it's made it this far.
I've been working with IPv4 since 1993, back when getting a SLIP or PPP connection was uncommon. I set up my first IPv6 tunnel back in 2007. When properly configured, IPv6 is no more difficult to set up than an IPv4 connection with DHCP: it just works.
It's actually simpler than IPv4 in many respects. For example, prefix delegation: my router is getting an IPv6 /56 block from my ISP using DHCPv6. It is then handing out /64 blocks on several different subnets with minor configuration.
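To make that delegation arithmetic concrete, here's a minimal sketch using Python's standard ipaddress module; the /56 below is a made-up documentation prefix, not an actual delegation:

    import ipaddress

    # A delegated /56 splits cleanly into 256 separate /64 subnets, one per LAN.
    delegated = ipaddress.ip_network("2001:db8:abcd:ff00::/56")
    lans = list(delegated.subnets(new_prefix=64))

    print(len(lans))  # 256
    print(lans[0])    # 2001:db8:abcd:ff00::/64
    print(lans[1])    # 2001:db8:abcd:ff01::/64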
The average user doesn't care about IP addresses. They're using DNS.
Our regional TWC didn't offer it, and our local Spectrum doesn't offer it for residential. I can get it for our office now, and might, but my own desire to experiment/learn isn't there, and no one else in the office here is asking for it right now.
Today people don't even remember DNS names anymore. They just use the search engine/advertisement provider of choice. DNS is of course still very useful to separate service names from physical architecture so you can move stuff around. But I hardly ever see people typing in DNS names or using bookmarks these days.
Regarding screens in glasses: where is the hold up? I've been thinking of prototyping something for the past few weeks (only thinking, hence my question).
Fitting enough battery for all-day power into a device that doesn't look like complete shit (i.e. looks sufficiently like normal glasses or sunglasses) seems to be the problem.
However, as far as I can tell every major tech company still expects that problem to be solved and for them to be the Next Big Thing. I don't know why else they'd be putting so much money & PR into consumer AR efforts when it's niche at best, so long as you have to hold up a device to use it, unless they fully expect hardware development to come through in the nearish future and make AR glasses the next smartphone, in terms of becoming must-have and driving the next wave of human-computer interaction.
I think fewer people would have mocked it if it were in any way accessible to ordinary folks. They set the wrong price point, which made it an easy target for ridicule because only 'rich' people with sufficient disposable income even had them.
It makes sense that the first version of something like that is going to be expensive, though. And compared to some of the high-end smartphones of today, it wasn't even that outrageous, was it? It was aimed at early adopters, which makes sense. If successful, prices would drop, cheap knock-offs would appear, and more people could afford it.
Yeah, and now plenty of adults are running around with powered-up Tamagotchis on their wrists. Probably the same people who laughed at Google Glass would insist on wearing it now or in the near future.
The device was pretty slick, which it should be for $1,500 (typical devkit prices, I guess). I enjoyed recording tutorials where I needed both hands - it reliably captured what I was looking at, and it was lightweight and didn't get in the way. Great tool.
Problem was the only stories about it were about the “glass-holes” walking into bars with a camera on their face. I thought it was an interesting “intervention” style art piece that showed people still expected obscurity, if not privacy.
I've tried the glasses from North, which was bought by Google a few years ago. The projection of the screen on the lens looks cool, but the glasses had no traction in the market and the company tried to charge me C$1,100+ for a pair of prescription smart glasses.
#2 - remember the hype around VRML and how we would have virtual worlds in the browser?
I remember working with Silicon Graphics gear back then and the 3D guys were quite enthusiastic about it. Of course, we had all devoured William Gibson’s novels like Neuromancer so we were naturally attracted.
We even built some 3D desktop apps as portals to the wider internet. They were too static and too hard to build so the Internet with HTML had much more pull for producers and consumers as well.
In practice, I think the 3D stuff that really worked at the time was more games like Quake and avatar-based web apps like Habbo Hotel.
It would still be fun to go back and read Jaron Lanier’s writing from that era.
> Low/no code drag-and-drop style programming tools would take away many dev jobs
When I was a kid my dad refused to teach me to program, he thought it would be a useless skill to learn because he believed in the future there would be no more programmers, as everyone would be able to program anything they wanted through drag and drop interfaces.
> We just expected single core performance to double forever every 2 years or so.
I don't get that. Moore's law is about transistor size shrinking, but that has an obvious end — transistors can't be smaller than one atom across. I feel like "everyone" did know that, even in the 90s. Or, at least, it was mentioned in every explanation of Moore's law itself.
> Low/no code drag-and-drop style programming tools would take away many dev jobs
I mean, they did, but they also created consultant jobs to replace the ones they took away. See: Salesforce.
The need to write source code can be taken away, but the need to do requirements analysis and deliberate architectural design cannot. So you end up with people who do 80% of a programmer's job, except for the coding part.
> Computer perf was doubling from transistors getting smaller. But transistors can't get smaller than an atom. What did y'all expect would save you and keep things going?
We were nowhere near an atom well into 2005 at least, but things were getting pipelined, there was a ton of SRAM on each die, and there were improvements to speculative fetch pipelines (the stuff that later gave us SPECTRE and friends). Everything was happening as fast as it could with out-of-order pipelines, prefetch, and every other speedup out there, including auto-vectorization.
The "how big is an atom?" question is sort of post-2010 thinking, at least the way I saw it happen.
Modems were one place where I personally saw this sort of thing happen - 9600 baud modems to 56kbps happened so fast, and it almost looked like it would keep going that way (EDGE was 115kbps; now I can't even do a page load over it), with DSL suddenly dropping in to keep the same copper moving even faster again.
I’d argue that the larger problem is not that transistors can’t shrink forever, but that we stopped finding ways for additional transistors to usefully increase the speed of processors.
For example, these techniques improve instructions per clock, at the cost of adding transistors:
* superscalarity (early 90’s): Pentium, DEC Alpha 21064, MIPS R8000, Sun SuperSparc, HP PA-7100
* out-of-order execution (mid 90’s): Intel Pentium Pro and later, DEC Alpha 21264, Sun UltraSparc, HP PA-8000
* SIMD (vector) instructions (mid/late 90’s): Pentium MMX (integer), Pentium III (floating point), DEC Alpha 21264
* multithreading (early 2000’s): Pentium 4, planned chips from DEC and MIPS
We also got improved branch predictors and larger, more-associative caches.
But it feels like most of that progress stopped in the early 2000’s, and the only progress since has been slapping more cores on a die. I mean, if you can put 8 cores in a consumer-level CPU, you have 8 times as many transistors (give or take) as you need to implement a CPU. Nobody seems to be building a higher IPC CPU with 2x transistors, even though they clearly have the transistors to do it.
On the contrary, we got a number of improvements for novel workloads by first inventing new things we expected computers to need to do all the time (e.g. encryption-at-rest, signature validation), and then giving the ISA special-purpose accelerator ops for those same operations.
> Nobody seems to be building a higher IPC CPU with 2x transistors
I mean, there are designs like this, but they run into problems of cache invalidation and internal bus contention.
The way to get around this is to enforce rules about how much can be shared between cores, i.e. make the NUMA cores not present a UMA abstraction to the developer but rather be truly NUMA, with each core having its own partition of main memory.
But then you lose backward compatibility with... basically everything. (You could run Erlang pretty well, but that’s basically it.)
> [...] number of improvements for novel workloads by [...] (e.g. encryption-at-rest, signature validation) [...]
Sure, but that's not helping the general case. Only specific types of workloads. You could argue that adding lots of special-purpose hardware doesn't hurt from a transistor count (we have plenty) or power perspective (turn them off when not needed), but it can make layout tricky and reduce clock speed (which slows down everything else).
> [...] cache invalidation and internal bus contention [...] NUMA
Sure, but the context from spamizbad was specifically single-core performance. (I probably opened up a can of worms by mentioning hyperthreading.) The problem is that many real-world workloads that add business value are not embarrassingly parallel problems. If it worked like that, Thinking Machines would have been sweeping the field since the 1980's (they had NUMA like you are describing).
The point is that, since about 2005-2010ish, single-thread performance has mostly stalled. Intel CPUs can issue slightly more instructions per clock. AMD has a slightly better branch predictor. But performance growth has mostly been the result of adding more cores (Except Apple's M1 has some magic).
The things I previously mentioned gave big IPC gains on a diverse set of real workloads. Some innovations, like pipelining and multi-issue, were responsible for 2x-4x IPC each. Pipelining, in particular, was a trick that also helped clock speeds.
All those innovations happened between the late 1980's and early 2000's. So an observer during that time might have just assumed that similar innovations would keep coming. But they haven't. A Pentium III has probably around 15x IPC compared to an i386 (maybe 60x if you include SIMD), in addition to a 40x higher clock speed (some of which came from adding more transistors).
How can you add transistors (say 2x or 3x) to a CPU to double performance on diverse, real-world problems that don't parallelize well? My point is, I don't think anyone knows, so it is irrelevant whether there is a physical limit to transistor shrinkage. We don't even know what to do with the transistors we have, so who cares if we can't have more?
The expectation was that low/no code tools would put many devs out of work and it would only be a niche job. The idea that we would have a decade-plus labor shortage was unheard of until the craziness of the dotcom bubble (which then popped, and people moved to assuming dev work would be permanently outsourced like textiles and never come back due to high US labor costs).
People seem to be in a permanent state of lump-of-labor fallacy and not understanding comparative advantage. The current one is thinking we'll run out of jobs because of AI. (Their scenarios seem to end up with a few rich business guy types owning all the AIs and trading with each other while the rest of the world is somehow unemployed - which I'm pretty sure was the plot of Atlas Shrugged.)
>Moore's law is about transistor size shrinking, but that has an obvious end
Yes, but by itself that isn't a single core vs. multicore issue. It's a process node question generally.
That said, Intel in particular did try to push single-core frequency at least a beat too far. They demoed a 10GHz prototype at one IDF--and other companies were mocking them for it.
The story that a very senior Intel exec gave me at the time was something to the effect that, of course, they knew that they were going to have to go multi-core but Microsoft was really pushing back because they weren't confident in multicore desktop performance in particular.
IKR. Excel and the other spreadsheet variants are the gateway drug to software development. There is no other tool that has such a low barrier to entry and is so enticing to scale up into a ridiculously complex product. It eventually gets to the point that, when the original user/developer is ready to change jobs, someone says 'OK, can we turn this into a webapp or something... this is nuts. Let's hire someone from guru.com to do this.'
However that was more of an afterthought- I remember that when it came out I initially bought into the write-once-run-anywhere hype and the idea that Java was for writing applets.
The UIs were too clunky (AWT) or foreign (Swing, the second attempt) and the Java SDKs were buggy, so in the end HTML just improved faster and took over the presentation layer together with FutureSplash (later renamed to Flash).
For #1, did people really believe this? In hindsight, granted, it just sounds obvious that the doubling would have to slow down pretty quickly. Was there some magical feeling in the culture that’s no longer present?
What many people thought was that the shift to object-oriented development, object oriented RPC systems (CORBA, "JavaBeans", some stuff in J2EE, OLE, etc) and CASE/modeling tools (UML, Rational Rose etc.) would turn the market into more of an automated component oriented system, with a marketplace of ready made components and more reliable systems with a faster time to market etc. Lots and lots of books sold about OO patterns, methodologies, etc.
IMHO that didn't happen, the integration morass remained, and I'd consider much of that model a failure in retrospect.
On the infrastructure front, I seem to remember a lot of talk about Frame Relay (EDIT: actually I was thinking of ATM, thanks for correction below). And fiber installs all over the place, lots of money in getting fiber into downtown areas, etc. Also I don't think people really predicted the possibility of the dominance of the "cloud" as it is now, really. I mean, hosting services were I'm sure a growth market but "serious" companies were seen as wanting to have their own DCs, and I don't recall major talk or trend away from that until the mid 2000s ("SaaS" etc. became the buzzword around then, too).
Also JavaScript as a serious presentation layer system wasn't much talked about until the 2000s.
There is this Jobs interview where he also discusses how there would be a marketplace for objects and people would buy and sell objects instead of software.
In one sense it failed because all it ever saves you is the typing. You still have to learn the whole model, sometimes it is like learning a programming language from scratch.
On the other hand, it may have succeeded. Objects don't exist in isolation but as a library with other objects and frameworks, and APIs are basically object models. You could think of React or the stripe api as object universes.
What is interesting is that you can't sell a developer product like a commercial product. Developers need access to the code, so you either make it open source and sell consulting, or else you make it a web API and rent it out.
There was a market for components back in the nineties. People were buying VBX's and OCX's for Visual Basic and Delphi applications. Component Source had a huge catalogue.
I think he has said a few times, almost certainly also mentioned in "the lost interview".
He usually says something like: they showed me three things at Xerox PARC - first, the windows/mouse graphical interface; second, networked computing; and the third thing, which I didn't notice at the time, is object-oriented software.
Then he mentions how at Apple they did the first, and the internet is the second. At NeXT, they are doing the third, and then he brings up objects.
Walked into new multi-campus hospital; fiber and an ATM core. Didn't take much to configure and start up an IP instance, some token ring I think, and associated support daemons. It worked very well and left me impressed. Oh, and synchronizing two sniffers to actually force our management to apply pressure to AT&T when it wouldn't acknowledge dropping packets.
One of my prouder possessions is the Southwest Bell CO badge they issued me to let me go into their phone switching buildings. I was pretty happy to have survived my teens without ending up on their bad list.
I really believed computers would make people smarter, better informed, able to make more rational decisions. They'd understand and trust science more, make better political decisions.
It is a bicycle for the mind. Remember back when, if you wanted to learn about a topic, you had to travel to a library, spend a couple hours looking up books in a card catalogue, and then find one book that maybe answered your question. I'm constantly using the internet to learn new things, to learn about all kinds of things (e.g. Californian urchin galls, the best way to process small branches for firewood, the use of bison horn fire carriers among Plains Indians, the grub hoe, etc. etc.); I'm constantly using Google Lens to identify plants in my area, which ones are native, which ones invasive, etc.
The problem is something else. The problem is that the internet is more than one thing. It is a couch for the lazy, a distraction for the procrastinator, a massive entertainment delivery device; and most of the monetary value of the internet doesn't come from it being a bicycle for the mind, but from the other things.
Not everyone chooses to ride on the bicycle. Instead of, say, reading Wikipedia, they read Infowars. The world would be a much better place if a lot of people spent their time reading Wikipedia (with all of its faults) instead of scrolling Twitter, Facebook, and slanted news sites like Breitbart, CNN, Fox, etc. etc. etc.
Put another way, computers and the Internet are a tool to accelerate the process of becoming more of what you are. If what you are is a curious, humble person looking to understand the world better, it can help with that. If you're a vicious self-promoter looking to crush everyone on your way to the top, it can help with that too. If you're a terrorist looking for how to build a bomb or fly a plane or use certain guns, again, it can help.
Computers are information. Giving people access to information does not magically make them moral, or curious, or thoughtful. It just amplifies what they were already trying to do.
This sounds too simplistic (idealistic?). Maybe computers at their core are indeed pure, neutral information, but the internet is definitely more than just that. It also amplifies all kinds of adverse mass psychology effects, rewards herd behavior, enables propaganda wars, provides means to monitor and control populations at unprecedented scale, etc. I'm not saying that you're completely wrong, but I think you might be willing to see only the brighter side of it.
I like this, except that I think it misses that the internet doesn't just give people a way to do whatever they were going to do, but:
1. creates a giant, complex vector for the transfer and recombination of ideas and attitudes: it's an influence and innovation engine
2. facilitates certain kinds of collective action and organizing, from traditional interest groups, to flash mobs, and getting ratioed.
3. facilitates certain kinds of illicit or illegal behaviors, activities, etc while making other kinds much harder. Anonymity is difficult, widespread surveillance is the norm; but you can hold entire pieces of infrastructure hostage, like oil pipelines, from your couch.
4. creates a new set of dynamics that we are only coming to recognize. For example, social media platforms like Facebook make it like we've all become more densely packed together, and we've not really developed a set of norms to accommodate getting along at that level of density.
>I really believed computers would make people smarter, better informed, able to make more rational decisions. They'd understand and trust science more, make better political decisions.
As a 90's kid who grew up online, this was my thought at the time as well. Turns out computers were more like an op-amp for the existing inadequacies of human nature, rather than a bicycle for the mind.
Reading too much science is actually common with anti-vaxxers and other such people, they love quoting renegade doctors and having absolute faith in arxiv papers they misread or quoted out of context.
Most published research is false even if it's peer-reviewed, and it may be better to leave it to people who read it full time and understand it's an ongoing conversation.
Hah, so maybe the Catholic Church was wiser than we knew by restricting their libraries to initiated scholars; there's always talk of how the layman will misinterpret the mystical texts, better to protect them from self-harm.
In this case it's similar to how Qanon recruitment works. They say to "do your own research" but then push them so their research only finds the sources suggesting Obama is actually JFK wearing a mask or whatever.
Not that the liberal "trust the experts" worked, since Fauci's policy was to only say things that made him sound trustworthy, whether or not they were true. (and he admits this)
Is it "Reading too much science" or is it just that human characteristic of confirming one's own biases (i.e. anchoring bias, confirmation bias or whatever they're called)? Personally, I think it's a lack of Baloney detection skills: https://www.openculture.com/2016/04/carl-sagan-presents-his-...
Don't just leave it to the experts!
Also, I'm not sure you can say that "Most published research is false". There are many degrees between true and false. In the physical sciences, with which I'm familiar, the papers are demonstrably 'mostly true'. For example, as a by-product of an experiment last week we observed an Ekman spiral (first published in 1905 https://en.wikipedia.org/wiki/Ekman_spiral).
> Also, I'm not sure you can say that "Most published research is false". There are many degrees between true and false. In the physical sciences, with which I'm familiar, the papers are demonstrably 'mostly true'. For example, as a by-product of an experiment last week we observed an Ekman spiral (first published in 1905 https://en.wikipedia.org/wiki/Ekman_spiral).
I agree it depends on the field. I was thinking of medicine, where there's a well known paper about this[1] that's led to some improvements like pre-registered experiments, although there's some newer ones that show ongoing problems in social sciences[2].
When I was choosing my university major, everybody was telling me not to choose computer science: the job market is saturated, everybody is a software engineer, choose electronics. I chose CS.
Fast-forward to today, and most of the people I know who were in electronics have switched to software.
Anecdote from a mid-90s grad. I found that in the late 90s/early aughts my friends with the coolest jobs had degrees in something computational but not CS. Biomedical Engineering, Electrical Engineering, Physics.
They all had to learn to code in school and had background domain knowledge in some specialty. In contrast myself and my CS friends had, well, we had knowledge of Turing machines with infinite tapes.
As the years played out it normalized between the two groups as each filled in their own weak spots with experience. But in those early days I was pretty jealous.
> In contrast myself and my CS friends had, well, we had knowledge of Turing machines with infinite tapes.
Yeah. Going to school for Computer Science is probably the biggest regret in my life. I could have spent that time & money learning something useful/interesting instead, or better yet, not waste my time in University at all.
I completely agree. With a different degree, one can still do programming. I worked with several good programmers who had no CS degree. Some had no degree at all.
For me, CS was largely a waste. Most of the real, required-for-job programming skills I learned were self-taught anyway. I just used CS as an on-ramp to industry. Maybe nowadays CS is required for "signalling" but it definitely wasn't in the 90s.
Wish I had done physics or hell even math. Now I'm middle-aged and the only things I "know" relate to this stupid pile of silicon on my desk.
Over time I found the CS degree worked out. I was able to pick up the necessary domain knowledge for my own specialty. And having the deeper understanding of how/why things work at a theoretical level enabled me to transition to new technologies faster than people I know who just "learned to code".
But yes. As a junior developer where my programming skills were roughly the same as my friends who had non-coding skills as well, there was some second guessing on my part
I graduated in 1990 with a triple major in CompSci, Political Science, and Psychology (which, taken together, sounds a lot like data science!); these gave me the domain and technical knowledge. Fast forward to today, and my biggest regret is either: a) not going directly on to get a PhD in Psychology, which would have been very beneficial now; or b) not going back and taking over the family farm, which would see me truly working for myself at something which has true meaning and not creating TPS reports.
As an inverse anecdote, when I graduated with my CS degree in '97, I didn't expect to be unemployed 23 years later, as I am now. It's partly health related, and partly a result of my own bad decision making. But my impression after a ~20 year career is that continuous tech employment is quite fragile. Maybe different for those starting out now, and certainly different for those at FAANG.
Graduated 2001, everyone I know who wants to still be in CS still is. It definitely takes some willful effort to keep up though, as things change so rapidly. I remember when I first started working is when everything started becoming a web application "so you didn't have to install anything". I remember thinking that seemed like a crazy tradeoff to accept. But then everyone did it. I still think it was crazy.
The only reason why I took up CS in my majors was 'because' it changes fast. Cuz then this was the only job role where nobody could ever condescend to me that 'ye're just a kid!' Unlike a classical engineering branch, say in Mechanical or Electrical where the last major development happened in 1845. Because everybody is a kid their whole lives in the CS world.
I graduated high school in 1988, and made the same assumption. I don't think anybody explicitly said it to me, but it seemed obvious. They're plenty smart and they're sufficiently good at English (often, flawless).
I still don't entirely understand why it turns out to be so hard to outsource. I get some of it, but it seems like something that we should have figured out by now. I'll be interested to see what the post-pandemic shift to remote work does for that.
* It's difficult for Western businesses to time shift, and there are usually caps on work visas.
* Perhaps until recently, most Western programmers genuinely enjoy the work. What I've heard from Indians is that many Indian programmers are more interested in building a career than in the work itself. Intrinsic vs. extrinsic motivation. I think this may be declining as a factor as software eats the rest of the US economy and brain-drains talented people from other fields, though.
* Cultural differences--it turns out that English fluency is necessary but not sufficient to seamlessly slot into an Anglosphere company.
* One way out of this conundrum could be for new software companies to start in India and outcompete Western companies (at which point culture differences and time shift wouldn't matter), except I get the sense there are barriers to that.
I've been thinking the same as I read this discussion.
My current best theory is that the software industry is overflowing with money, and it's still more beneficial to try to optimize for output quality and quantity (i.e. hire better software engineers no matter the cost, bring people into the same locality instead of hampering progress by splitting work into different timezones), instead of trying to cut costs to improve profits.
Once the growth finally ends, outsourcing to cheaper places might become a very attractive option.
> People told me everything would be outsourced to India
I'm not sure that this ever amounted to more than an urban legend. When you actually looked into the supposed outsourcing "trend", all you heard about was TheDailyWTF-grade horror stories.
Now, in India, students starting CS have often been told in recent years that "CS is no longer in demand".
The problem was too many people entering engineering degrees after a temporary 'IT boom' decades ago, seeing that as a way towards upwards social mobility.
In a way they were right. Movies and music have better definition, games have better graphics, but on a functional level everything we do today (web browsing, text messaging, video calls, email, online shopping) already existed 15 years ago. There is more bloat (JS and Electron), more confusing UI design, and more privacy-invasive software now, but for the end user nothing has really changed.
Not true, now you can also do those same things on a cell phone! Seriously though, some things are new but disappointingly few. Voice recognition and machine translation actually work now. VR has gotten pretty good but not many use it. Virtualization has improved a lot. More-or-less functional cryptocurrencies exist. Things like lane keeping and adaptive cruise control are almost standard on new cars. What else am I missing?
I agree that for a tech-savvy consumer end-user, not that much has changed. But endless porting and migrations aside, there are still plenty of businesses and industries that need new software for new ideas. For example most every new concept in medicine or finance needs software.
I graduated in the early 2010s and was told the same thing. Either “everyone” is a software engineer or it was all going to be outsourced to India, causing a race to the bottom in salaries.
The field to look at was cognitive science or neurology, which in fairness makes some sense given where machine learning is today.
Today, people are trying to get into tech companies, and those who take one boot camp course call themselves “software engineers.”
In the mid-1980s my dad (PhD EE) talked me out of majoring in Computer Science initially because "you'd be a glorified typist". Glad I eventually switched to CS.
In the early 1990s, it sure felt like there was a limited market for robotics, machine learning, AI, etc. It was mostly industrial robotics and assembly line inspection cameras.
Several of the comments here echo your father's sentiment to some extent. How did a CS degree pay dividends for you in the 90's as opposed to majoring in math, a hard science, or engineering and learning to program on your own?
You are either building killer robots like Boston Dynamics (morally reprehensible) and drone-striking the Middle East, or helping people make things with production-line robots.
Biggest mistake I ever made was thinking and telling my dad, "Oh, dad, I don't want to major in CS because there will be no jobs; there are so many kids majoring in CS." This was in 2004, freshman year. I still regret that today.
I did electronics - I haven't ever been paid to use my electronics degree (well, I was paid to do my PhD, but I don't think that counts) - but I now get paid to write software.
Definitely thought that consumer software (the thing sold in boxes on store shelves) would continue to get bigger as a category rather than die out completely. Anyone remember that famous photo of the guy jumping with joy after snagging a launch copy of Windows 95?
No one knew what the internet really was and what it would become. Not Bill Gates, not anyone else.
Developers believed that processing power, RAM, storage etc. would continue to grow exponentially to the point where there would just be too much of it and we wouldn't need to care about any resource constraints when writing code.
Writing code line by line in a text editor was supposedly on its way out, to be replaced by fancy IDEs, WYSIWYG, UML etc.
All the jobs were supposed to go to India. Programming as a profession has always been on the brink of death for one reason or another, and yet here we are.
> guy jumping with joy after snagging a launch copy of Windows 95?
I was pretty excited about it too... And I was a kid in the 90s. I once worked with a guy who got excited about new releases of directx... Which I also admit could be a little exciting
It's funny - I used to get excited about new API stacks coming out. They enabled such amazing things. Now I look at them as this thing we drag around (hope you know the latest ones to get a job, and spend 4 months figuring them out). I look at them as this thing that will be quickly abandoned in 4 years and something I will end up supporting once the cool kids have wandered off to something new.
Innovation is exciting. Serious, exciting innovation in most of the computing world has ground to a near halt for over a decade now. Most things, hardware, software, web etc are just iterative, and often a step backward. Nothing to be excited about anymore.
In the 90s, not getting an expensive new PC every 2 years or so meant you were screwed if you wanted to play any of the cool new games coming out. I haven't upgraded my system in over 5 years, and I can still play most new titles with a few minor graphic settings turned down or off.
>Writing code line by line in a text editor was supposedly on its way out, to be replaced by fancy IDEs, WYSIWYG, UML etc.
If you're using something like Visual Studio, then that is a very fancy IDE and a long, long way from writing C in a text editor and dealing with arcane compile errors.
Everyone knew the internet was going to change things massively. It’s just that nobody with any sort of power in the industry knew how exactly things were going to change.
A lot of people overshot by trying to work on video streaming, video chat, document editors (Google Docs like) decades before the technology got there. Microsoft had a demo of Excel running in the browser by 1999.
Yeah, and they responded to it by launching WebTV and Windows Mobile. At least the latter became something of a market standard, although firmly in 3rd/4th place behind Nokia, BlackBerry and Palm.
Not a prediction of the future, but rather something I've found quite funny over the years: watching so many things keep repeating themselves in a never-ending cycle. My two favorite have been...
1. Compiled (native) vs. interpreted (bytecode/VMs). It's this forever cycle of "performance required" that shifts to "portability required", which lacks performance, and gradually shifts back, and repeat.
2. Local processing vs. dumb terminals and shared computing. The current "dumb terminal" phase being the idea that you can buy a $100 Chromebook and work entirely online.
My mid-90's telco experience involved a test deployment of citrix winframe (IIRC) dumb terminals. They worked okay, but not great.
Every tech or bigco job I've had, there's at least one mainframe codger doing the King George "You'll be back" number from Hamilton. Don't think they anticipated the non-MF cloud. Almost always among the best engineers in the joint though, IME.
I miss genuine differences in hardware. Happy hour discussions about alpha vs sparc vs intel etc. I'm hopeful that diversity will come back around as a thing.
> I miss genuine differences in hardware. Happy hour discussions about alpha vs sparc vs intel etc. I'm hopeful that diversity will come back around as a thing.
THIS. Right now there are three mainstream choices: Win, Linux, Mac. And I'm sorry - no matter how diverse your Linux distro is or even what hardware you're running it on (x86 vs ARM etc), it's still just Linux. And again, sorry - Linux just isn't interesting anymore. Extremely useful for its purpose, but it's commonplace enough to be ho-hum.
Can anyone recommend something quite a bit different that a guy could toy with? Or is this why everyone is having so much fun with 8 bit retrocomputers?
But a key problem with alternative OSes is that if you want to use them as daily drivers, your software choices may be limited or nonexistent. Haiku, for instance, has a WebKit-based browser, but no Firefox, Chrome/Chromium, or Safari.
I had such high hopes for Chromebooks, I actually moved so much of my day-to-day onto some digital ocean droplets, scrapped my windows pc and bought a $150 intel-based chromebook.
I thought all I ever really did was VS Code, so I got code-server installed and everything else was mostly online anyway (Office 365, Exchange, Wordpress, etc), but the biggest pain point was the lack of a native terminal and ssh.
Even though VSCode has a great terminal built in, it seemed so wasteful to load into it just to open a terminal window.
After a couple of months, I just switched to MacOS, and still use my code-server install instead of VSCode, and 100% prefer it.
> 2. Local processing vs. dumb terminals and shared computing. The current "dumb terminal" phase being the idea that you can buy a $100 Chromebook and work entirely online.
Joke was on the proponents of this, since web apps bloated to require a damn supercomputer to run them with acceptable performance, until that YC/PG-backed startup that was posted on here recently came along selling mainframe-hosted-browsers-as-a-service-over-VNC.
Oh yes... IBM 3270 terminals sending fields from the screen back to the mainframe. The first time I came across a simple HTML form I wondered where I'd seen the idea before.
And as you say, there's definitely a cycle between "everything local" and "everything central".
Both of these hits home. Working at Sun in the late 90's, Java was a big thing, and while Sun was clearly way too early in their push for a universal virtual machine that could run anything, at the time people were complaining that Java could never be used for real software because of it not being natively compiled and using a GC.
Well, now everyone is using JavaScript and Python, and no one seems to be complaining.
The second point is true as well. No matter where the pendulum is at the moment, there are always going to be people trying to push it in the other direction. Sun was again ahead of its time by pushing the JavaStation, which was trying to do what Chromebooks are doing. Then there was the Sun Ray of course, which was really cool, but a bit too specialised.
> Compiled (native) vs. interpreted (bytecode/VMs)
The debate today is mostly AoT vs JIT compilers, I don't see anyone bring up VMs unless they're talking about how slow Python is.
I also don't see anyone bringing up "portability" anymore - compilers support all the architectures people care about, while libraries and APIs are designed to be platform agnostic as far as they can be while developers recognize there is no such thing as portable software (and where portable software is absolutely required, it shall run in the browser).
Another one along that line that I've seen over and over: "smart" hardware vs. "dumb" hardware. Back in the days of analog modems, we had ones that were built on dedicated modem hardware, and the "Winmodems" that were basically just a DAC and an ADC. Or, the smart laser printers with built in PostScript interpreters vs. dumb printers where everything gets done in the driver.
"90s" programmer only by the skin of my teeth, but I didn't the internet centralizing. It just seemed like every company and person would have their site (and possibly domain) and they'd use that to communicate in their own way. My peers went to college before facebook and even friendster- we communicated by personal websites and blogs. These aren't technologists, but that was what was available to us so they picked up just enough html to publish.
I thought the future was making that publishing easier. I wasn't wrong about that- twitter and facebook are easy. I just missed the consequences of "easy" becoming "walled garden."
Yeah I was big into the idea of how the Internet was going to completely democratize information and education and knowledge. Not everyone predominantly choosing to plug into a few different major ecosystems run by giga corporations.
It still is. But nobody is using that feature. People choose to centralize and visit the same 10 sites and to avoid self-hosting.
Nothing is preventing society from using the internet in that way, but it's sooo much easier for the average non-tech person to use something like Gmail than to self-host an email server. It's the same reason I do not own a lift for my car: sure, technically I could install a lift at my house for my car, but then I'd have to maintain its hydraulics and I'd only use the lift once or twice a year... who has time for that? Especially if my thing is fixing pianos and not cars. Let the car people handle cars, let the piano people handle pianos... I think we are in the same situation with tech.
> [...] but I think you are minimizing network effects by simply calling it "easier".
The gp is saying the millions choosing the "easier" action is what causes the network effects.
E.g. another example that applies to tech-savvy programmers: most developers have the knowledge (or can Google a tutorial) to host their own personal git server, but most choose not to. The collective millions choosing not to bother with self-hosting leads to the emergence of something like Github.
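(As a rough illustration of how little "self-hosting git" actually involves - it's basically a bare repository on any box you can SSH into - here's a minimal Python sketch; the paths and hostname are hypothetical placeholders:)

    # Self-hosted git is mostly just a bare repo reachable over SSH.
    # Requires git installed and write access to the chosen path.
    import subprocess

    repo_path = "/srv/git/myproject.git"                              # hypothetical server path
    subprocess.run(["git", "init", "--bare", repo_path], check=True)  # run once, on the server

    # Clients then clone and push over plain SSH, no extra server software needed:
    print(f"git clone ssh://you@git.example.com{repo_path}")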
Signing up for a free account on Github is easier than:
- hosting a personal repo at home on a laptop or Raspberry Pi
- buying a $10/month shared VPS server to host it
So even before Github had network effects (say, fewer than 1,000 user accounts), it was still easier to create an account on the unproven Github platform than to self-host. Those early adopters led to the strong network effects that came later.
But that also leads to the unwanted side effect of Github deleting repositories from DMCA takedowns, etc. (aka "censorship").
It really is much harder to self-host than it used to be.
90s: ask your ISP for a public IP, register your domain, start Apache and off you go
Nowadays: getting a public IP is iffy. All good domain names are taken. Emails from your self-hosted mail server go straight to spam/junk. Fiddling with TLS certificates is close to mandatory. The moment you start your server, you're bombarded by a flood of gratuitous requests, trying out every known vulnerability under the sun. etc. etc.
I wasn't part of the "ask your ISP" era, but I would say you're painting nowadays as a bit bleaker than it really is.
I can grab a cheap VPS within minutes from DigitalOcean or OVH, with a public IP included. Both offer images with pre-configured software, and there are ample guides for setting up just about anything. Certbot/Let's Encrypt will literally set up and add the generated TLS cert to your nginx config if you ask it to.
I dunno, I think this could have gone a different direction. It never really worked to self host things, because of IP address scarcity. Maybe if IPv6 had actually rolled out in the 90s, everyone could have had an IP address. Maybe then it would have made sense for people to make software targeting self-hosting. DreamWeaver web pages that you don't have to figure out how to deploy to a server somewhere, mail clients that run truly locally, maybe other different stuff that we never thought of. The earlier versions of these would have still had security and usability problems, but maybe they would have been compelling enough that the effort to fix those issues would have made sense.
A contributing factor to this was the early and still fairly-extant asymmetry between upload and download bandwidth. This is going away slowly, especially with fiber (e.g. Verizon), but it was typical from the end of the dialup era right up until today to have upload be perhaps 1/10th of download bandwidth. This puts quite a crimp in self-hosting anything that becomes (or is hoped to become) particularly successful.
I had a friend who was there at the start of IP-over-cable, and they were particularly excited about the promise that they would be offering symmetric upload/download performance. By the time it became a consumer product, that too had this major asymmetry baked in.
Well, if you can afford a cheap VPS, you can use something like WireGuard or OpenVPN to work around NAT - just forward all of the requests to the VPS to your own box instead.
As a 90s kid, I thought the society would be more censorship resistant. The internet is just a reflection of that. Most people claim they are against free speech, except when it comes to their opposite political faction which should be censored.
> Most people claim they are against free speech, except when it comes to their opposite political faction which should be censored.
Let's shut up and let the other side speak? It's usually the opposite in my experience =-P, haha. Sorry, couldn't resist, I know you probably meant to write the opposite.
Sure, you can say, even yell, whatever you want. I don’t have to listen to you though.
On a larger scale, “I” becomes “we” and “you” becomes “them.” The side that feels they are being censored don’t like the consequences of what they’re saying.
I've thought the opposite. I remember seeing Yahoo for the first time in 1993 and thinking "I can't believe they aren't using this to censor ideologically everything". They are doing it now tho.
I was on Yahoo News! daily during the 2004 US Presidential election. I can assure you that the comment sections were not censored in any way, so you can imagine the quality of the insights being posted.
Yes. Also, I was born in a Latin American "third-world" country where you can see that the political class would use that, hands down, if they knew about it and were competent enough to execute such a feature. China was, though.
Somewhat related, I thought the progress of technology would be matched by a moral progress of society. People would become more tolerant. Aided by advanced technology, we would break down the barriers between us and see the similarities between all human beings. Instead we are seeing the rise of populism across the globe. Darfur wasn't the last genocide, we recently had the Rohingya and the Uighur ethnic cleansings... Technology didn't help us see other people like we see ourselves, it helped us distance ourselves further from other people instead.
It was naive to think that just because technology progresses, society progresses too. The 20+ year Iraq and Afghanistan wars really made me realize that advanced technology meant nothing if we were going to regress morally.
We thought everything could be abstracted away into a perfect library and no duplication of functionality should ever need to occur. In the end, we could even drag and drop (or write XML) to connect all those components together.
We thought multitasking was possible and even a good thing to design for. Hence the very noisy and distracting desktop OS designs we still have today.
We thought in terms of a much smaller handful of programming languages. And anything “serious” would end up in C (or C++, maaaybe Java).
We thought Apple was kind of a weird thing, a surprising company that barely hung on, the computer only shown in movies.
We thought democracy “won” so the future could only be one full of enlightenment, right?
We thought CLIs were so passé and every dev tool needs a GUI.
Merges were done with a diff / merge tool and often manually.
Software libraries were bought for big $ or provided by your OS/compiler vendor, not downloaded. Maybe the “reuse” vision of the past actually WAS realized. Instead of an XML file, it’s my package.json :).
Well, I certainly didn't expect to still be coding for loops by hand in a text editor after 30 years. Guess it could be worse, at least I (mostly) don't have to worry about memory leaks anymore. So that's something I guess.
tbf, computer science has not changed. Despite all the tech, we humans still use language to express our ideas, so I don't think the coding of thoughts into serial text will go away.
I thought tech would lead to much more isolation than it has.
By isolation I don't mean lack of socialization of course. There was plenty of socializing back then. But the people I met on Usenet, BBS, IRC and phpBB forums were people like me. We worked on tech projects, we talked tech, we had our specific jargon and subculture.
I distinctly remember when people asked me what I wanted to do back then, I'd make up some lie, but knew that my future would be living in a single room with no social interaction but with my online friends. (I did not think this to be bad at all, this was my idea of a good life).
I thought tech would evolve to make the internet more into an alternate reality, more separate from the real world. Maybe I was reading too much cyberpunk sci-fi.
In any case, I certainly did not anticipate social media, dating apps and the digitization of traditional brick and mortar businesses. And to be honest, I'm not sure we wouldn't be better off without them.
It seems there aren't that many "pure internet" projects out there. IRC replacements are all tailored for the workplace and real-life interaction (maybe except for Discord?). It still feels very strange to me that communication platforms would ask you for your real name (especially when, at this point, there's more "reputation" or "social credit" or whatever you'd like to call it attached to my online persona than to my real-world one).
Crypto is perhaps the last bastion of such an autarkic technology (i.e. by netizens for netizens) but even that is slowly moving toward real world asset tokenization and institutional integration.
I thought a different major OS would supplant Windows. It kind-of came true with more Android devices in use than Windows; but by now I thought Windows would be a minority player and we'd have a bigger variety of different desktop OSes.
I also thought IDEs would be in 3D, with better visualizations of code running in parallel.
And, I didn't expect web applications to be the norm for most people. I thought we'd have a better "standard" cross-platform application development approach.
One thing that doesn't surprise me are application stores. After seeing how 90s applications used to take over your computer and wedge themselves everywhere, I expected that OS vendors would lock things down a bit more tightly.
>And, I didn't expect web applications to be the norm for most people.
Which is at least one reason why Windows kept its crown. By the time there were real alternatives (increased popularity of Macs, albeit mostly at the higher end of the price range, and Linux as a fairly viable option), the desktop OS just wasn't very interesting for a lot of people. And Chromebooks were a reasonable alternative for many, especially in K-12. For those who did still need a "fat client," Windows just tended to win by default.
Lots of good stuff here, but I'll admit I can't remember a single "prediction" of my own. I just thought everything would keep going in the same direction: faster CPUs, faster networks, cheaper disk and RAM, more e-commerce and more information exchange were the expectations I can recall.
While we're reminiscing, the thing I remember the most from the late 90's is that it seemed no one knew what they were doing (or perhaps it was just me). The Internet opened up such a frontier in computing, for effectively hobbyists, as I guess we'll never see again. Perhaps that's why I don't recall thinking about what computing would look like in the 21st century--there was too much to be done right in front of my face!
EDIT: There's a mention or two of thin clients. I think only Sun thought that would work :).
Oh OK, I can think of one that actually worked out: Linux on commodity hardware in data centers. It became clear by '98 or so that the combination of Apache (httpd) on a bunch of cheap Linux boxen behind an expensive LB would tear down the Sun/IBM scale-up model fairly easily, due to both cost reduction and improved resiliency. Of course there were MySQL bottlenecks to work through, but then memcache landed and pushed that off a cycle. Then there were the 64-bit AMD chips and a large bump in RAM density that pushed it back another. And then I guess true distributed computing landed.
I thought everyone in the future would spend their free time reading science journals, which I discovered via Usenet (because no one in my physical environment read them). Two decades later we elected a president who has said that climate change is a Chinese hoax and wind turbines cause cancer.
I mean, no amount of reading science journals convinced us to leave Iraq and Afghanistan until after we sunk multiple trillions of dollars into it and all we got out was an anti-American, pro-Iranian government in Iraq and the Taliban de facto ruling Afghanistan...
This. At bare minimum, a better-informed population that would be increasingly resistant to gov’t and corporate propaganda.
And in a sense, that kind of came to pass. Nobody believes _anything_ the government says anymore, and skepticism is at all-time highs. It’s just… everything.
I also bet very heavily against the widespread popularity of video on the net, not due to any technical limitations, but simply because reading is so much faster.
I don't understand the use of video to teach many things, particularly when it's not just a "building this together" tutorial but a "teaching core concepts" video. You can't go back and forth in a video without quickly skipping through it; you really have to watch the whole thing. And listening to human speech is much slower than reading. I understand it for some things (gaming, DIY, etc.) but not for teaching thinking subjects.
So many 'trends' back then. A few that come to mind (from the pre-web days, anyway):
* Thin clients: everybody was going to have a 'dumb terminal', and all the computing/storage would happen elsewhere. There were many variations on this idea.
* PIMs: Palm Pilots and all the variations. I knew a guy who quit his job to develop Palm Pilot wares, claiming it was 'the future'. In a way, he was right.
* Windows: Linux was barely there, Unix was for neckbeards and universities, and it looked like Mac was mostly dead. It looked like Microsoft was about to swallow the world.
* The end of mice and keyboards: Yes, it sounds silly today, but supposedly VR/voice/whatever was going to replace it all.
Windows is still for business/economy. Linux is still for neckbeards. Although Mac is bigger than it was, the same people are still waiting for Apple silicon to catch on.
> Keyboards and mice
Have largely been supplanted by touch or gesture and consumer VR is becoming more and more viable every day.
Linux did eat the world, in the form of Android and all its variations. Every TV, most cellphones, many game consoles, and even your watch, are all probably running a Linux kernel.
(And there's also ChromeOS, eating the education sector.)
Linux in that case ate the embedded OS world: VxWorks and so on. When trying to make a cheap device, the OS can become the dominant cost. If it had not been Linux it would have been one of the BSDs eventually. In those markets the margin is thin.
Keyboards and mice being supplanted by touch or gesture is only true for phones and tablets, which are new devices that really aren't equivalent to the general computing devices in the 90's: desktops and laptops. If anything, I'd say that tablets are just televisions that became small, light, and interactive, and smart phones are partly that and partly handheld gaming devices with multiplayer capability. Neither are equivalent to a computer (though they have computers built-in.)
Of course, as a software developer, I've always used computers differently than the masses. The computers I need are far more powerful than anything they need.
When it comes to thin clients, a lot of computing does happen via web services these days, albeit mostly video streaming and REST-based applications. Did you envision things a little differently?
I think he probably imagined it being a little thinner. The box on the desk hasn't changed size much.
But certainly the majority of "stuff" people do at home now takes place in data centers. And as a percentage of all computing cycles day by day, data centers do effectively all of it.
We wouldn’t make applications. Rather we’d deliver components which could interoperate via a binary contract.
Components would be “snapped” together by non-programmers to make an application.
I suppose that pieces of this became true with the proliferation and adoption of open source. Rather than a binary protocol, HTTP came out on top.
SharePoint brought us WebParts which tried to put the power in the hands of business users, but it turned out they were still too technical and not flexible enough.
I don’t see the role of software developer/engineer going away anytime soon.
> Components would be “snapped” together by non-programmers to make an application.
I’ve seen this with tools like Zapier. I know many non-technical people that put together amazing workflows just piecing things together with webhooks and similar tools.
In the 90's COM didn't make a lot of sense to me because I never needed it. Then, around 2001, I needed it to solve some problems and I finally invested the time to figure it out. It's actually pretty neat as long as it isn't in your web browser (ActiveX).
Around that time I too thought I was seeing the future - component-based development that's language and platform agnostic.
> We wouldn’t make applications. Rather we’d deliver components which could interoperate via a binary contract. Components would be “snapped” together by non-programmers to make an application.
This is how .NET got its crappy generic name. Back in its early days it (and SOAP) was sold as the glue that would stitch all this together: just add a web reference and you'd have all this functionality in your project. It was incredibly disappointing to get through all this marketing and eventually realize it was just an MS version of Java. Enterprise JavaBeans, CORBA, COM, and probably some others were also efforts in this direction. These days it's REST and microservices.
Hearing him name drop terms like markup language, stylesheets and internet is mind blowing. If only the promise of “no delay in response” had been achieved :-)
This is really interesting and reflected in some things like dreamweaver and Oracle Apex.
I can actually see some of this happening in the crypto space. Things like MetaMask and now chainlink combining into some form of low code smart contract app tool. As you say though even in this scenario devs don’t go away.
Yes, it's very profitable for devs because you can just steal everyone's money, since the smart contract language was invented by people who did zero reading about safe program design.
Sounds like a beefed-up Unix philosophy. I think command pipes in the shell gave us something very similar to this idea way back, except that it doesn't really lend itself to complex programs of the kind I think you're suggesting.
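For the sake of illustration, here is a minimal sketch of that pipe idea in Python rather than in the shell; the stage names and the "access.log" file are made up, but it shows how an "application" can be just a composition of tiny single-purpose components.

```python
# A rough sketch of the shell-pipe idea in ordinary Python. Each stage is a
# small, single-purpose "component"; the program is just their composition.
# The file name "access.log" and the stage names are hypothetical.

def read_lines(path):
    with open(path) as f:
        for line in f:
            yield line.rstrip("\n")

def grep(pattern, lines):
    return (line for line in lines if pattern in line)

def count(lines):
    return sum(1 for _ in lines)

# Roughly equivalent to: cat access.log | grep ERROR | wc -l
print(count(grep("ERROR", read_lines("access.log"))))
```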
Perl is the biggest unexpected casualty I guess. It was so cool and had so much energy. Tons of web applications were being written in it, and mod_perl seemed to be the future.
Then the momentum shriveled up with Perl 6 and it has just become completely irrelevant. Pretty insane trajectory. It's almost as bad as like ColdFusion lol
The late 80s and early 90s were an amazing time. The IBM PC was open to third-party add-ons, and that changed the whole world. Brand-new gadgets let the computer do more and more things, and then do it better than the first-generation gadgets. It just kept coming like a tidal wave. And then...
And then we got things that were "good enough" at everything that anybody wanted a PC to do, and we went past that to silly stuff that very few people wanted a computer to do, like CueCat.
And everywhere you looked, there were new things happening. The internet. The web. MUDs. People were just trying things, and just throwing the results out there for the world to look at.
I just kind of expected more and better to keep coming. I didn't see PS/2 and IBM trying to close the ecosystem. I didn't see Windows taking over the world and then Microsoft trying to close the (web) ecosystem. I didn't see the Facebook/Twitter/YouTube silos taking over. I didn't see cancel culture taking over from the diversity. I didn't see hostility, nastiness, and hatred taking over the openness and acceptance that used to be there.
Computers failed to create a space dominated by "the better angels of our nature". We not only expected it to, we tasted it. We expected it because it was what we experienced. And then it got overrun by all the worse parts of human nature that we wanted to get away from. Turns out that making people act better takes more than technology.
I think that time was a bit of a golden age. Pre-PC, users were at the mercy of mainframe administrators. On their desk was a terminal that gave you a view into somebody else's computer. Then IBM started a revolution - giving control to users and they changed the world for the better. It didn't last long though. In the 90's PCs started their devolution back to being little more than dumb terminals giving us a view into somebody else's computer.
Ironically enough, it was free software that powered the building of the web-based world that removed power and control from users and subjected them to the unblinking eye of pervasive surveillance.
Microsoft has given up. I'm convinced that they adopted Chromium for their browser because they see Electron or something like it as the future of application development.
Thankfully Apple seems to still believe in native applications. All the interesting indie software seems to be on the Mac. Apple is trying to make speech recognition and OCR work better on-device than off. They are investing in CPUs because Intel can't figure out how to ship anymore. They still have stores where you can touch and see the stuff they sell. It's the most human big tech company out there right now. It might be the only one.
I don't think Windows will change much. They will reskin it, add more ways to change settings (especially remote admin), but what else is there to do? A lot of it is still plain old win32 and I don't think that's a particularly bad thing. Windows has been good enough for a long time now.
Microsoft's problem is that nobody cares about Windows anymore. Businesses are moving to web apps which I think makes a lot of sense for businesses. Gamers use it because that's where games run best. Who runs Windows because they are a Windows enthusiast? I know they exist, but not in the same numbers that exist for the Mac even though the Mac's market share is relatively small. (for the record, the only Apple computer I own is an iPad).
So Microsoft mostly wants to keep their users. There really isn't a path for growth which means there's not a lot of reason for investment.
I'm bummed about this and have been thinking about it a lot lately because I've been reading Ellen Ullman's Life in Code. In it she laments the fade out of the PC era (20 years ago!) and the loss of privacy and security and power that users had for a brief moment between mainframes and web apps.
I work in C++ and MFC every day and really enjoy it. Do you see a lot of new projects choosing those technologies?
I agree that web apps can't do everything. I'd go one step further though and say there are lots of things that shouldn't be in the browser.
Businesses love web apps for the ease of administration and (unfortunately) surveillance. Devs like web apps for the write-once, run everywhere aspect and the financial opportunities of SAAS schemes.
Users, I think, are often best served by well written native apps. Unfortunately, outside of Mac apps, iOS apps, and games, they don't seem to be willing to support the development of those types of applications.
I spent 4 years developing in a mix of Forms, WPF and MFC for such domains between 2014 and 2018, and they keep looking for people in 2021, without any plans to migrate to Web technologies.
Regarding MFC, given the way Microsoft killed C++/CX and replaced it with clunky tooling for C++/WinRT, I actually consider it more productive than coding IDL files by hand without any kind of Visual Studio support.
I expected the common programming language would be an improved Visual Basic or other visual programming language (drag a component to the screen, write an event handler), and that desktop applications would continue to be the dominant form of computing. DHTML looked like a toy and I didn't take it seriously, and although I knew the internet was a revolution I only saw it as a means to distribute information, not as a way to do stuff, so you would have to download and install an app, not run it in the browser (or buy the app on CD-ROM and receive patches via the internet). Most people had either limited (pay per minute, modem speeds) or no access to the internet.
I expected most people to be able to create and publish their own websites using desktop apps like Microsoft FrontPage or Dreamweaver, and that these types of apps would be the dominant way to create content on the web.
VT100 -> Net terminal, Web Page -> Web App. It'll move to "native apps" as mobile and desktop OSes merge, and then it'll go full-remote back to the VT100s. The circle of life...
It's less about what I expected, but more about what I did not expect.
The pace of technology change has accelerated dramatically, the complexity of software has grown beyond imagination, and, perhaps correspondingly, the fragility and interdependency have grown out of control.
We (myself included) have become so acclimated to complex software as users that we forget how much work is buried under the surface of a "simple" app. Even accurately describing the behaviors of a system is difficult now (and often not even documented, partly because it changes so often).
When I was taking flying lessons in the 90s, I couldn't believe how antiquated and manual-intensive the aircraft systems were. There seemed to be so many places where pilots needed to follow a checklist to perform manual tasks which could have been automated based on various available data. When I asked my instructor why it was so archaic, he explained that it's a long and arduous process to get new things certified by the FAA.
This difficulty in getting advanced systems approved was mostly due to the processes and tests required to ensure a very high standard of reliability and safety, since lives were at stake. At the time, I thought it was ridiculous. But seeing some of the Airbus automation problems (which cost a few aircraft and some lives), and then seeing the Boeing 737 Max disasters, I see how slower advancement, more testing, and slower release cycles can be beneficial.
But in the software world, the more modern approach is "move fast and break things". Not only is software now never complete (in contrast to when software was released once or once per n-years, on disk or CD), now it is released every 5 minutes with old bugs replaced with new bugs. There are days where every time you open an app, it needs to update first. I'm not sure this is a net improvement.
If I could say one really positive thing, it would be that tooling has gotten really nice. I even recall when syntax highlighting arrived... and at first it was laughed at. On the other hand, the better tooling is virtually required to keep up with the ballooning complexity and volume of the code we write.
I could rant and complain many more paragraphs, but it can be summarized thusly: modern software (and its development) has made some great improvements, but those improvements have been largely offset by reductions in the quality and formality of practices.
What's really weird to me is that is sure seems like non-super-famous non-rockstar teams (so not, like, Carmack & friends) in the 90s could accomplish a hell of a lot with maybe 1/5 as many people as the same thing would take today, and I don't know why that is. Today you give me the spec for a program that a team of a dozen, including the office manager and an intern, did in the 90s in a year, and I'll tell you you'll need 50 people and 18 months, and I won't be wrong (if anything, the time estimate will be way too short). And this isn't just me, I look at team sizes for some products and wow, the WTF factor is high. Or there'd be like a five-person team but they'd write a whole damn GUI toolkit or new language or something crazy like that in the process of building their product, and it wouldn't even take longer than doing that today with a huge team and a ton of off-the-shelf software and all our modern "productivity tools".
Part of it is that "Agile" development methodologies became popular in the 2000s, and well, let's just say they've turned out to be a lot less 'agile' than was hoped for.
I went to university in September 1997. In my first week there I used the internet for the first time. It took me about an hour to decide I wanted to learn how to make my own website, to start figuring out what I needed to learn, and to fire up an editor and to write my first HTML tag. 6 months later I got my first freelance gig. I've been making web stuff ever since.
I got it right, but with hindsight it was pretty obvious it was going to be a big thing.
I'm about 6 years younger than you judging from that timeline. I had a home phone line that could connect to a dial-up server. The desire to create websites didn't hit me until I too was in college, and by that time the web was much more complex than in '97, and I never got beyond basic static sites in Dreamweaver.
More than a decade later, I finally learned how to do it. HTML+CSS+JS and a bit of PHP for WordPress. I'll never be paid a cent for this knowledge as I'm not a web designer, but it feels good to work on my own sites, just as an expression of myself, and as a hobby.
I actually own some of the more speculative non-fiction books on "Cyberspace," in particular the representational aspect. I did expect some higher levels of abstraction to be present when it comes to working with data, browsing file systems, making connections, and so on. I was not necessarily expecting wireframe graphics, but yes, more abstraction. The closest I have seen is something like WinDirStat when it comes to looking at a directory, and even that is just colored blocks. I knew the graphics processing would be intense to match the bandwidth of the human visual channel.
I did see the Wild Wild West era of the Internet coming to a close, having donned white and grey hats myself off and on. The early Internet had security holes such you could drive a bus through them honking and dragging a bunch of rusty bikes behind you. The invention of the img tag more or less began the Advertising Era, whether they knew it or not.
I did have an early suspicion about privacy, and permanence, so my Usenet posts from 1989 never had my name on them. So far this has remained a decent decision.
I did not expect a trend in programming languages I have a hard time describing but can only sum as "yes, every programming paradigm and idiom will be crammed into that language over time if allowed."
I have never expected VR to take off until other technologies are in place and cheap. They aren't here yet.
I have never expected the "Fifth Generation" anyone-can-code drag-and-drop thing to happen and I believe that by the time such a thing is available (and effective at scale), it would be AI anyway.
None of the approaches to AI have been promising to me, aside from Cyc, which I would describe as being necessary, but not sufficient.
This certainly takes me back. I was a wee teen in the late '90s, but I too remember so many commercials and movies that represented the Internet as a new paradigm for visually-oriented computing. Not really skeuomorphic designs of file folders and desktops and 'My Briefcase' icons.
I still think about that GUI Tom Cruise navigates in Minority Report. The film is almost 20 years old, and yet GUI design today seems to be focused on hiding more and more elements, instead of making the canvas bigger to display more info.
Exactly. I don't hate the general concepts or anything, they're obviously important for organizational structure, but the static and uninteresting representations do not take advantage of some of the rich bandwidth of the visual channel. We could see a lot of things about a directory, or at least have them hinted -- a sense of age of files, perhaps ... a collage of some oft-accessed files ... a TF-IDF for a document jutting forward only to be subsumed by a video segment washing over it, and so on. File sizes could be represented on some kind of log-scale.
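As a toy illustration of just that last point (not anything from those books), a few lines of Python can already hint at file size on a log scale; the choice of the current directory and the bar width are arbitrary.

```python
import math
import os

# Toy sketch: render file size as a log-scaled bar, so a 1 KB note and a
# 4 GB video can sit on the same screen without one dwarfing the other.
def size_bar(num_bytes, max_width=30):
    # log10(bytes) is roughly 3 for KB, 6 for MB, 9 for GB
    return "#" * min(max_width, int(math.log10(num_bytes + 1) * 3))

for name in sorted(os.listdir(".")):
    if os.path.isfile(name):
        size = os.path.getsize(name)
        print(f"{size_bar(size):<30} {size:>12}  {name}")
```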
And you're so painfully spot-on about hiding things with GUIs. Microsoft continues to wave the utterly unnecessary 3D Objects folder in my face, and My Videos, and so on and so forth. OneDrive appears to jostle for my attention, but meanwhile I have to do additional work just to see where in a directory tree I am.
I would have to dig through my boxes to find those books. They were rather dense and a little esoteric.
The grey hat stuff consisted of things like breaking out of the restrictive policies of my campus to get onto the Internet proper, tricking the equivalent of the (yet to be dubbed) script kiddies to incriminating themselves, and sort of Robin-Hooding accounts back away from a gent who spent a lot of time hacking accounts of various campuses to cause havoc before tying everything up in a neat dossier for someone else to deal with -- there was eventual deportation involved, if I recall, but that may have been just rumor.
I did rather discreetly inform the department of my major that their choices of passwords and PINs were not as clever as they thought they were.
This was in a university setting I take it? I wouldn't have thought the early Internet/NSFNET was exciting enough to hack into. Usenet, yes. But the Internet? Wasn't it just a document retrieval system for scientific work?
Well, we couldn't get out to it to even see what was there, which, by then had IRC, gopher, archie, and so on. Usenet. It was all locked away and we were curious.
We randomized the exchange listing, split up the results, and then proceeded to war-dial the entire exchange until we found a forgotten (and unprotected) modem line in the math department, which then hosted all of computer science. From there out to what were called "telnet gates," which were known only from rumor and just knowing of them was a commodity in and of itself.
"Telnet gates" being early form of a static proxy funneled through separate hardware? If so, I could see how they would valuable for ,say, the Kevin Mitnicks of the era.
I still remember my first palm pilot and already wondering why it wasn't a cell phone. I had a Nokia 9000iL that connected to the internet using a built in modem at 9600 baud in 1998 so I saw the internet as fully mobile from very early on.
I thought tech would be used more for good. The 90s were an incredible era where the internet and the people using it were usually good people interested in changing the world. Now everything is a scam, or collecting information on you. In the end, all internet money flows to advertising which is just sad. It's so over-commercialized and my original hopes have long been dashed.
I hoped for a future free of Microsoft software, and free of UIs designed by software developers. I hoped for a future with no drive letters. The first two are successful: I use no Microsoft software in my life, and many user experiences are now designed by specialists. Although I don't have to deal with drive letters, it is odd that Microsoft still has them.
> Although I don't have to deal with drive letters, it is odd that Microsoft still has them.
They don't, really. The Windows NT kernel has a unified namespace — it's just not one with a filesystem at its root, but rather an object namespace, so users (and even developers) are almost never exposed to it.
C:\ is just win32.dll-wrapper-ese for the NT object path \\?\Volume{some-guid}\, where \\? is the NT object (local) namespace root. You can find the GUID of your filesystem in diskmgmt.msc and navigate Explorer to \\?\Volume{some-guid}\ just fine. Look ma, no drive letters!
Perhaps surprisingly, despite \\?\Volume{some-guid} being the name of a Volume object, what you find under \\?\Volume{some-guid}\ is a regular filesystem of Directory and File objects.
It's as if Linux was set up such that / was a hybrid of procfs+sysfs+devfs, and all the filesystems were accessed by going to /dev/disk/by-uuid/... and navigating "into" one of the block-device nodes there.
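If you want to poke at this yourself, a real volume GUID path (running `mountvol` with no arguments will list the ones on your machine) works anywhere a drive-letter path does. A minimal sketch, with a made-up GUID you'd need to replace:

```python
import os

# Hypothetical volume GUID: run `mountvol` in a command prompt to see the
# real \\?\Volume{...}\ paths on your machine and paste one in here.
volume_root = r"\\?\Volume{00000000-1111-2222-3333-444444444444}" + "\\"

# A drive-letter-free directory listing of that volume's root.
for entry in os.listdir(volume_root):
    print(entry)
```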
Awesome, thank you for this; I'll dig into it some more, it's kind of interesting to me. Anything else you may want to share with us?
I use Storage Spaces on Windows 10 on one machine with two drives to mirror them... so basically a software RAID. It's pretty interesting and it just works; I don't have to babysit it. Same with ZFS mirroring. I wonder how much these two software mirroring approaches differ from each other.
Can you please expand on your dislike for drive letters? Why should we prefer to not have them? What will be in their place and how will it improve things?
Genuine question, trying to imagine how/why I would not want drive letters. Maybe each device should just use the uuid set by the manufacturer? But then that's too ugly to display in a file explorer.. and often I don't just have 1 partition per physical device, so multiple mount points.. and other times I have 1 mount point with multiple backing devices (mirrors/raid).
I thought WebTV was on to something big, in my mind more people would be spending time in interactive environments/encyclopedias and playing with 3d toys in their living rooms instead of passively watching television
now people watch more static video content than ever and are damaged to a point where it will take a generation or two to recover, if at all -- whooops
Overall, I got everything wrong. Technically, commercially and socially: wrong, wrong and wrong.
I thought things would develop along lines that were of interest to me or which appealed to my personal aesthetic of computing (textual). And that viewpoint turned out to be the most myopic possible in terms of predicting what would actually happen.
If I predict something then almost certainly it will not occur.
To be fair, if you go back to the early releases of Windows and the Mac it wasn't clear that they would succeed. Both struggled to penetrate the market early on. People were happy with WordPerfect, Lotus 1-2-3 and dBase running on DOS. Even the developers of those applications were caught with their pants down when Windows did take off.
I did not expect html, http, or even tcp/ip to become dominant. I didn’t expect CSP centralization. Video over http/tcp? You have got to be crazy. Portable code in text (JavaScript) seemed ridiculous.
I was quite young at the time, writing file-sharing and chat network applications, and did a lot of experiments building WANs for Doom. Most things I was interested in required single multicast and rebroadcast packets with low ping time over a fully decentralized network. Also, tiny bits of mobile code that would load dynamically as you used a network application. Binary serialization versus plaintext. Progressive data loading, especially with fractal image formats. This wasn't really about the future, though; it was how things worked. Looking back, technology was mostly not about how to do things better, it was about raw horsepower to do things quickly and poorly. A lot of things changed with ISPs too, because they started throttling uploads, and now mine doesn't even allow open ports. P2P ping time among neighbors is much longer than cloud ping, which is absurd. A lot of this has to do with security, which is a bigger deal than I thought it would be.
I remember downloading and compiling Mosaic and thinking that this is stupid. A glacially slow way to display text with links and graphics, who's going to want that? Why not display text as it was meant to be displayed, on a terminal.
I still think there will be a return of peer to peer eventually, since every pocket has a computer capable of computation and bandwidth unimaginable back then. The ultimate form of data privacy is when you control the power plug.
I downloaded and installed the first “browser” (what a weird name) to see what the hype about “hypertext” (sounds sci-fi!) was all about.
It was a page of text where some words were underlined and blue. And if you clicked on a blue word, it would jump you to a new page before you were even finished reading the first page!
I remember thinking how dumb that was and must be for people with short attention spans. Uninstall and move on. This WWW thing is going nowhere…
Same here. It was just sending text across a network, how boring!
Also the developers who worked on the Web were using scripting languages like Perl, not real languages that had to be compiled.
In the late '80s I started studying AI in college, fully expecting it to boom. That did not really happen, until it did. Unfortunately, most of what I learned about AI back then is pretty useless now.
I started with symbolic AI: parsers, tree searches, expert systems, logic. Even then it felt like this approach was stuck in a rut. So I changed lanes and studied early ML: decision trees, genetic algo's, nearest neighbours. When I left college SVM's were all the rage.
All that time, neural networks were deemed interesting but unfeasible for anything realistic. Little did we know.
I feel that if I wanted to get back into this field, I might as well start all over again.
Symbolic AI gets used quite a lot, we just don't call it AI. Even expert systems, often seen as a late-1980s dead end, are based on the same sort of technology one would use for efficiently querying any database.
If the computer solves problems and we know what it's doing, that's software engineering
If the computer solves problems and we don't know what it's doing, that's AI
If the computer doesn't solve problems and we don't know what it's doing, that's Friday afternoon
To be fair, if you have the basic stats/code knowledge, you can totally pick up modern ML pretty quickly. You already know the difficult parts, the easy parts are pretty easy.
This may not apply if you want to create new ML algorithms, but those jobs are rare in any case.
I was told when I majored in CS in year 2000 that "in 20 years Java will be the only language" and also "it will impossible for an American to get a job as they will be hiring all of their programmers from India"
At university we were told that we were running out of IP addresses, since in the future even fridges would have an IP address so they could have a webcam to see what you needed to buy at the supermarket. I wish they had told us about cryptocurrency instead of the fridge-cams.
Check out Interix and Services for Unix. The surprising thing about WSL from a 90s perspective is that it's Linux-flavored instead of Solaris-flavored.
Opening up a Linux tab in the open source Windows Terminal from Microsoft would have seemed like dark magic to someone in the '90s. Or maybe I am a witch...
30.7 million sold just in 2020. 40 million has been forecasted for 2021. Various stats for the number of software developers on the planet cite numbers between 24 and 26 million.
MS doesn't, to my knowledge, post stats on WSL installations, but it's not 30 million+
I bet the number of Windows PCs sold worldwide beats that.
Yes, not everyone installs WSL, because it is a tool for selling PCs to GNU/Linux developers who would otherwise rather pay Apple, and who are buying neither Chromebooks nor Apple laptops.
I expected distributed parallel computing to be the norm, for software to get faster, not slower, and for advances in algorithm design to lead to more efficient, robust, advanced solutions to common problems.
We still build ETL pipelines to process text files into new text files on a single machine like it's 1995. Oh, but it's a virtual machine now. Progress?
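For anyone who hasn't lived it, here is a sketch of the kind of step being described: text in, text out, one machine. The file and column names below are invented for the example.

```python
import csv

# The 1995-flavoured ETL step: read one text file, reshape a couple of
# columns, write another text file. "orders.csv" and its columns are made up.
with open("orders.csv", newline="") as src, \
        open("orders_clean.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=["order_id", "total_usd"])
    writer.writeheader()
    for row in reader:
        writer.writerow({
            "order_id": row["id"].strip(),
            "total_usd": f"{float(row['total']):.2f}",
        })
```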
Coming from the Amiga I was hoping that productivity and creativity tools on PCs would eventually be more open to collaboration with other tools, so that the whole computer becomes a highly personalized "workbench" instead of just starting an application and then doing all work inside that application.
And I wouldn't have imagined something as locked up as iOS in my most dystopian nightmares. On the other hand there's Linux and RaspberryPi, which gives me some hope for the future.
I’m embarrassed to say this on HN, but straight up I thought 9600 baud connections were the shit in the late eighties - who would ever need more bandwidth than this?!? Now I get pissed off when the 4K video on my phone stutters for a second.
I would be impressed by that. Our first modem was 300 baud, and that's apparently exactly the speed you can still keep up with the text as it comes in.
Java everywhere. I'll stretch the timetable a little into the early 2000s. At my job, Java did run everywhere, from my Sharp SL-5000 Linux handheld to mainframes. We had applets in the browser and J2ME on phones. Sun was on a roll; Java was a good language, had garbage collection and good object orientation.
In 2021 I'd say that JavaScript has taken its place, running in all those places, nearly as fast as (or as fast as) Java does, and adding functional programming to the repertoire.
I had hoped desktop development would be native compiled RAD tooling similar to Delphi and C++ Builder, that the Web would have stayed as interactive documents, that all computers would be improved Amiga-like experiences, and that the dependency on the CLI would have long been replaced by REPL-based environments.
Ah and all modern OSes would be written in memory safe systems programming languages, following the footsteps from Oberon and Modula-3.
I'm sad that things didn't get all the way there, because it sounds like it could have been an interesting world to be in.
Delphi/Lazarus/Qt Creator are still some of the best RAD platforms for desktop apps, and the idea of being able to easily add abstracted cross-platform components that work with the native frameworks of the platform you're targeting was great.
And REPLs really help with development in every capacity that I can think of, be it Python's, Ruby's or now even Java's REPL - heck, even platforms like Repl.it and Jupyter have gotten more popular and I'm all for it.
While the web being just documents would be a bit boring, I also feel that it'd be way better for energy consumption, and often for performance and user experience as well. Not every site needs to be an SPA if you just want to display some content and allow simple interaction.
And personally, memory safety is the way to go for most things! Not having it at compile time and with zero cost is probably what's keeping us from adding it at the OS level.
I'm not so much thinking of the future, but these are my memories of programming / computers as a teen of the 90s (vaguely in chronological order, but it was a while back, forgive me):
- Grey, so much grey. Backgrounds, buttons, shadows
- Marvel at my FrontPage geocities prowess. <img src="Under construction.gif">
- DHTML
- OS/2 is going to be huge! So much power to be able to multi task
- Eventually everyone will have an AOL account
- Eventually everyone will have a hotmail account
- Access will rule the world. Visual Basic. VBA. Gaudy block-color UIs will be du jour
- OMG Java Applets ruling the world
- VRML will reduce the need to know html
- COM/COM+ is the future of programs transferring data between components
- Buying more machines and spreading the load, rather than bigger machines will be the way of the future
- Novell Groupwise vs MS Mail
- Netscape should just buy Microsoft
- Borland C/C++ was going to rule the world
- Drag and drop UI designers might someday not screw up at the faintest hint of advanced layout (and even might get fast eventually)
- Windows would "Beat" Macs
- Linux will eventually get to be mainstream (desktop)
- ColdFusion will be the fastest / most secure server product
- Having opinions on Lycos / Altavista / Yahoo / Google
- Everyone will want to run their own home squid server to make things faster
- someday programs would be big enough to come on multiple disks
- someday programs would be big enough to come on multiple CDs
- Neuromancer / matrix / shadowrun style human computer interaction would come true in the 2000s
- Human computer interactions would tend towards mostly speech / audio rather than qwerty
- jQuery accordions will rule the world
- XHTML will be the new hotness
- XML will save the world from incompatible data and save so much time
- Source control might someday not screw up XML / UI design files
From the perspective of early-90s, I was all-in on Plan 9 and expected to see a lot more influence from it. Unfortunately that didn't really happen, with a few exceptions (/proc).
Man, I see some really neat concepts in operating systems that were created in the 80s and 90s, and somehow we got stuck with a handful of Unix-likes and Windows, with small parts of past operating systems frankensteined in occasionally.
Rob Pike has a page on this where he laments that Unix-likes and Windows have become the only dominant OSs, so OS innovation these days has mostly died. While there certainly is a resurgence of new ideas in OSs like Genode, Fuchsia, and others, OS development has slowed down a lot. It makes sense: every time an OS is rewritten, applications need to make huge fundamental changes to how they work, for little gain for application developers and users. Until there is a set of incentives to push OS development forward, there won't be much progress.
Given Rob's disdain for academic research and the literature in general, I'm not sure he'd be my go-to guy for analyses about this, but I agree that we don't see the kind of OS research we saw in the 90s.
I think a lot of the things we were seeing at the OS level in the early 90s are now feasible to do in user space. That, plus the fact that it is prohibitively hard to talk to devices [1], has reduced the innovation in this area but also moved it elsewhere.
Having dealt with the pain of writing OS code, or libraries that have to function in one or more OS's, I can see why people prefer to innovate elsewhere if they possibly can.
That being said, there's a lot of possible "systems" work or work that could be described as "operating systems" in the broad sense that isn't in-kernel, and that work seems to have tailed off too.
[1] My favorite data point on this, which isn't even one of the more complex devices: back when I TA'ed CMU's "build a preemptive kernel" 15-410 class, the students wrote a kernel that would really boot. We had an old machine for their kernels if they wanted to see things working on a real machine rather than SIMICS. However, it had to be an old machine with a PS/2 keyboard, as the complexity of talking to a USB keyboard (or so I was told; I never verified this) was roughly equivalent to the whole "build an entire preemptive multitasking kernel" project - and obviously far less pedagogically useful.
My prediction is that once the capability reaches contact lenses, it'll become mass adoption. The physicality of wearing a headset while walking or moving is difficult for lots of people, and tiring. Contact lenses weigh a lot less, and are less intrusive to others.
Of course, it doesn't come without pitfalls, but that's just my prediction.
I think glasses/goggles are less intrusive than contact lenses. Lenses are irritating and difficult to insert. It's also impossible to share contact lenses. I think VR is already pretty close with the Quest 2, to what it needs to be to get mainstream adoption. A little lighter, a little thinner, maybe 5-10 more years of iterative improvements and we'll be there.
I wore lenses for 4 years and couldn't stand it. So I had them blast laser beams into my eyes in an attempt to fix it, so that I never had to wear lenses again. And it worked!
My hindsight also shows me that I trashed the WWW as "yet another system like gopher and WAIS that probably won't amount to much", so I should not be trusted with predicting the future, or even possibly the present.
Right? Tech was going to level the playing field for everything. In a way it did...in so far as it concentrated a lot of wealth and thus 'leveled the playing field' for us losers.
If you're a programmer and in a first-world country I'm not sure I'd count you as one of the losers. I'm thinking of people in the gig-economy and workers at Amazon warehouses etc...
Stores/entertainment venues we've 'lost' (though they've found ways to survive sorta but diminished) that feel like real, societal level losses: pinball halls, arcades, bookstores, music stores. Tech was supposed to make the barrier to be a book author/musician much easier, and it has to an extent, but not without costs. Like would modern media be where it is if newspapers and magazines hadn't gone through the 00s-10s?
Arcades are actually back with Round1. Newspapers never could've lived for long, they essentially survived on Rolex not being able to advertise straight to their own customers.
The 8-Bit Guy had a video on this a few months ago. Basically he opened up an electronics catalog from the 80s and showed how everything in there was an item that did one thing. His phone now does all of that, including buying the one thing.
We expected device convergence. But we expected the PC not the tablet/phone to be that device. We expected that to be an extension of the PC. Not the other way around!
I expect the act of programming will change quite a bit over the next 10-15 years. We will incorporate machine learning algorithms more easily in many workflows, so part of your work will be training models to produce black box functions that you’ll call to get answers to complicated questions.
Obviously, this happens now, but I think it will become more pervasive and many more programmers will find the pattern incorporated into their daily work stream. I see the act of programming becoming the human guiding the computer to write the programs and the human validates it.
I could be way off. I’d have bet anything in ‘94 that Beck would be a one hit wonder.
I realize I'm supposed to be talking about the past. In '93, when I first saw email and newsgroups, I thought the Internet would be a force for good, allowing free communication that would have to be a benefit for humanity.
Like many, I didn’t anticipate how it would also amplify our negative qualities too.
I still think it can be a benefit, but it will be a battle against our dark natures.
Not quite in the 90's but a bit later I was sure that when Moore's Law would stop we would see the trend move to cores. I wrote an article[1] about this change and how we should all prepare for it :-D
If I could tell my young self that as of 2021 we are not talking about kilo- or mega-cores but that my new M1 which I am very excited about is a mere 8, I would have been severely disappointed.
I'm not a chip designer or anything like that, but I think it isn't the difficulty of creating processors with so many cores (or equivalent arrays of processors) that has stopped us, but the benefit of adding cores relative to the expense. I use an HPC cluster at work, but for what I'm doing it makes sense to use the hundreds of cores available. Indeed, most of it is embarrassingly parallel.
Aside from GPUs, we have progressed on cores, just not as fast as we thought, and multi-core hasn't been around for all that long. Multi-chip modules like AMD's chiplet approach were needed to get things like 64-core Threadrippers. I can't imagine trying to get 64 full CPU cores onto a single die at an acceptable yield.
Strictly speaking, GPU's have thousands of ALU's/execution units, spread over a few dozen full cores at most. And reaching full utilization of that amount of compute is not always easy either, many problems just don't parallelize that widely.
I thought the digital divide would be a serious problem. It is in fact a problem now, but not in the way I imagined.
I thought that, in the future, everybody would have a computer, so it would become part of basic literacy. Most of our generation would be obsolete because we wouldn't stand a chance against people who had computers in their homes from birth.
Then the smartphone was invented and most people were satisfied with just that. Now there are even fewer people who know how to use a keyboard than in our generation. No matter how hard you try, you can't match the efficiency of a keyboard for writing with a palm-sized touchscreen input device.
In the late 1990s to 2000s, there were so many personal text websites, public forums, and chats in my language, Japanese. Now it's all gone. There is no place on the modern Internet where I can enjoy serious text-based communication in Japanese, because everyone uses a smartphone now and it's so inefficient to write serious text with one.
I also didn't anticipate the huge rise of SaaS: software that depends on a remote server to function. I was a believer in P2P technology in the 2000s. I believed the future network would work on top of a P2P mesh network. It failed for various reasons, but mostly it was too inefficient to scale.
Then I saw the rise of cryptocurrency. It's horrible. Nobody really cares about the technology. People rely on SaaS wallet services rather than hosting a full node. It's the worst hybrid of client-server and P2P: inefficient and still with an authority. NFTs... they don't even host the data on the blockchain! TITAN... it's not a stablecoin, it's just a Ponzi-scheme coin.
I hate the smartphone. But right now, it's really hard to find a programming job that doesn't involve the smartphone. Even if I don't directly need to deal with it, my work still indirectly supports the smartphone, further helping the spread of an inefficient input device for writing.
I've lost all hope for the future right now. It will be an even more Orwellian world. Most computation will be done on the server side and our computers will be mere thin clients to the server. Everything will be censored and regulated. I wouldn't be surprised if we lost the ability to write freely.
Gradually, I've learned that the history of almost all technology is this:
1. Smart, well-meaning people see a problem, and try to craft a solution.
2. Bad actors completely ignore these good intentions and use the new invention for money and power, harming or exploiting the general population in the process.
This isn't to say progress is never made, but the lion's share of the benefits of any new tech always ends up going right back to the rich and powerful that didn't need it in the first place.
I wasn't a programmer in the 90's, but I did grow up then, and into the hacker culture of the 00s. Looking back, I feel comradeship for the radicals in the 60s. A revolution was in the air, things were going to change! And then it all petered away to nothing.
Smart, well-meaning people constantly dream of freeing everyone: from poverty, or political oppression, or ideological entanglement. One more advance, one more theory or piece of tech and then we'll have it! But the general populace will never be free, because they don't want to be. On an unconscious level, they desire their cage. Freedom is scary and uncomfortable. That's why smartphones replaced PCs, and why software and the web became a closed, user-hostile morass. Most people do not want the power to self-actualize. They just want to be kept comfortable until they die.
> In the late 1990s to 2000s, there were so many personal text websites, public forums, and chats in my language, Japanese. Now it's all gone. There is no place on the modern Internet where I can enjoy serious text-based communication in Japanese, because everyone uses a smartphone now and it's so inefficient to write serious text with one.
If you mean forums in the vein of Ayashii World, they still exist in the form of 2channel/Futaba. Unless you mean something else.
I thought x86 was on its way out in the 90s. It was, but amd64 gave it a new lease on life.
We had search back in the 90s. Archie, Yahoo, Lycos, Altavista, etc. But Google brought better ... advertising.
What did I expect? I expected hardware to get better. I didn't expect the phone to be so dominant. The iPhone 12 to someone from the 90s might as well be a Cray tricorder.
Hardware is a LOT better now. Dunno if software is that different. JITs are crap; they re-invent the same thing over and over. Linux re-invents what VMS and other commercial operating systems have already done. Most progress has been either hardware or enabled by hardware.
It's really a shame what happened to the possibilities for end-user programming/scripting at the OS level. The landscape today is quite dreary. Ask a regular person to "make a button" in their computing environment and they can't, because usually their only choice is to learn a full-fledged language and development environment. And this is in a situation where users make prodigious use of buttons in their daily lives.
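To make the point concrete: even the friendliest route today looks something like the Python/Tkinter sketch below. It is short, but it already presumes a language runtime, a text editor, and knowledge of a toolkit, which is exactly the barrier being described (the button text and callback are placeholders, of course).

```python
import tkinter as tk

# The minimal "make a button" today: short, but it already assumes you have
# installed Python, know what a callback is, and can run a script.
root = tk.Tk()
root.title("My button")
tk.Button(root, text="Do the thing",
          command=lambda: print("clicked")).pack(padx=20, pady=20)
root.mainloop()
```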
I was expecting open source to take over. I was also expecting something like a decentralized form of cloud computing, similar to Plan 9 where you can just attach remote resources to your local PC and get instant access to their CPU, storage, etc. I was expecting the internet to turn into decentralized peer-to-peer "fabric" where everyone could publish or subscribe to whatever they wanted from their own machine. (When you consider that in the 90s a large server had a tenth or a hundredth of the CPU/RAM resources of even a rinky-dink HP Stream laptop from Wal-Mart of today, there's no reason this shouldn't be the case.) I was expecting diversified CPU architectures that could run architecture-agnostic "mobile code" like Java or Inferno, or even NeWS. A remote resource could send my machine code to run to present an interface, etc. We kind of got it, in the form of web apps but it's not quite the same. What we got was a corporate hellscape of siloed data and siloed user interaction.
I was expecting Guile to supplant Perl and Python, and for us all to benefit from the greater reach a Lisp could give us. What we got was motherfucking JavaScript as far as the eye can see, and development processes that hobble productivity so that companies can keep hiring programmer-clerks in legion strength and the executives remain unthreatened.
>I was also expecting something like a decentralized form of cloud computing, similar to Plan 9 where you can just attach remote resources to your local PC and get instant access to their CPU, storage, etc.
This was definitely a common vision. Being able to attach machines together ubiquitously. We would be able to have a room of multiple computers, as well as computers over the network, all working as one machine; sharing peripherals, audio-visual inputs and outputs, CPU, storage, etc. And it could all be used by multiple people at once. There would be no walled gardens required to share my screen with someone else across the world, or play the same music from every device.
I'll concur on the drag-and-drop programming. I started with VB in the 90s while doing tech support and then made my way to C and then Java. I figured Java was short-lived when I took a job during v1.1.7. It was slow and clunky. There were numerous articles about it doing OO the wrong way (not that it made any difference to me).
I thought VB would grow and mature for building UIs and backing them with more robust code, but it just kind of petered out.
If anything iOS storyboards attempted to bring that idea back.
I did not see infrastructure as code being a thing.
I used to work in the telecom world back then. I would have thought that by 2021 the meticulously-architected ATM (Asynchronous Transfer Mode) standard would have replaced the ancient TCP/IP protocol. Nope.
Then I would be wrong again, because I would have thought IPv6 would be everywhere...
Heh I toy around with a lot of retro networking/telecom stuff. Mostly ISDN and X.25, but a little Frame Relay and I’ve also considered purchasing some equipment to toy with ATM. I find it more interesting than the IP monoculture we have now.
I learned about both TCP/IP and ATM in university, and my takeaway was that ATM was specifically for phone calls, not data. Maybe that perception held it back.
Or maybe TCP/IP was simply too entrenched. Even IPv6 has been unable to replace IPv4.
After experiencing Smalltalk (at the very end of the '90s in my case), I did not expect that the industry would continue using primarily text editors (like VS Code) for writing software. It was clear that there was a "better way" that wasn't file-centric and that we'd be freed from debates over code formatting by richer models of the underlying program source. And now I work with people who think VS Code is the best source code editor ever (or vim or emacs or ...).
I completed my studies in compsci in the early nineties. I was in it for video games: optimizing 3D engines, etc. So I was expecting 3D to get bigger and faster, but I got bored before that happened.
Interestingly, I absolutely didn't see the web coming. I was very interested in BBSes and Minitel (France), but I didn't connect the dots until I heard of Amazon and Facebook and Google, which made it big.
But TBH, I didn't see how transformational computing was. Most of the revolutions that came, I saw as mere gadgets (including the mobile phone), or funny things to do. It's only very late that I understood the potential.
The only things I saw right were : Linux will eat everything (not because of hindsight, just because I wanted it to be :-)) and I saw that independent gaming would come to life when tools would be cheaper (and I was spot on).
My third prediction was: computing will become invisible and ubiquitous (c'mon, one has to carry a computer in one's pocket all of the time? tss...). This has yet to be realized :-) Maybe with 5G? :-)
I was also fortunate enough to learn FP back then and it allowed me to see the future of many programming languages.
But overall, I was "inside" the computer and I didn't realize how much it'd change things and how much it'd change itself.
I expected VR, or at least 3D spaces, to become a de facto way of navigating the internet. In hindsight that would be horrible of course, as it would be slower, resource-hungry, and overwhelming to deal with for 10h every day.
In the early 2000s I remember UBUBU which basically was a customizable 3D solar system cover for your desktop + internet links. I found it pretty cool, the possibility of rose-coloured glasses aside.
Didn't realise it in 1995 when I started programming. By 2000, during the bust, many were talking about global disintermediation and empowering of individuals.
And I kept arguing that within a decade and a half there would be a consolidated monolith and individuals would lose power for a long time to come.
Whenever there is a power tool, expect the power hungry to want to control it. My personal motto. Proves right every time.
This fall marks 41 years since my introduction to computers and programming. In the fall of 1980, I got to start using my jr high's DECWriter II teletype terminal (with greenbar paper!) connected to a PDP-11 40 miles away via a 300 baud breadbox modem. BASIC! FORTRAN! Pascal!
But I digress.
Looking back on the three decades since I graduated college in '90 (with CompSci as one of my 3 majors), there were three things which I saw as big game-changers, and became both a user and ardent evangelist for: the TCP/IP Internet, Linux, and Free/Open Source.
Today, it's easy to think of these as a given. But in the 90's they were not, with many battles fought whose outcomes were not always clear and not always successful: competing protocols, OSes, applications and platforms, business inertia and ignorance (not in the negative sense, just unaware/uneducated), and commercial companies protecting revenue flow ("No one ever got fired for buying IBM/Microsoft/Novell/Oracle").
I was sure that money would be effectively eliminated by MLM (multilevel marketing) plus six-degrees-of-separation, aka affiliate networks.
I figured that once MLMs could be automated they would replace advertising. Folks would make money connecting each other to products and services through a kind of generic MLM network. Word-of-mouth on steroids. Heck, I still think it could work...
What else?
Flying cars, and flying cities (see Bucky's Cloud Nine), fusion power and atomic tools, robots that could wash dishes and do the laundry, factories that build and deliver whole houses (Bucky again: the Dymaxion Home), arcologies (ecologically harmonious mega-cities designed and built as single buildings, Paolo Soleri), Moonbase ("Welcome to Moonbase" by Ben Bova), and those are just the plausible things, linear projections of the capabilities on hand. Not even talking about nanotech, longevity drugs, electrogravity, etc.
- - - -
It's much easier to think of things that did happen that I never expected. The first major divergence was Twitter. I don't think anyone predicted Twitter. Then there's Facebook. If you could travel in time to, say, the 80's or a little earlier and you tried to explain FB to people, they would think you were stark raving mad. "Put every detail of your life in the hands of some corporation? A voluntary spy network beyond the wildest dreams of the Stasi or Gestapo? Yeah, right." But here we are... I never thought JS would still be with us, let alone turd-polished to such a gloss. People spreading lies and misinformation, that's not what the internet was invented for. It was supposed to put the world's information at your fingertips, not be a mighty bullshit engine. "Phones" that excoriate minds, no one predicted that (except the Church of the SubGenius: the "white stone" came as prophesied).
> I figured that once MLMs could be automated they would replace advertising. Folks would make money connecting each other to products and services through a kind of generic MLM network. Word-of-mouth on steroids. Heck, I still think it could work...
This is happening now with crypto and the DeFi space. Products create tokens and the users hold it to be able to use some of the functions which drives the price of the token up and the developers profit.
I am still kind of surprised Microsoft has not switched to the Linux kernel yet.
Not much else has surprised me.
I have been programming full time since the 80s.
Still using vi with a shell when I need to get stuff done quickly. My own use of visual tools has come and gone multiple times.
I guess I count as a 90s programmer? I did technically write programs in the 90s.
I thought Java applets (those things you'd embed in a webpage) were hot stuff that would completely reshape the Web someday, but they managed to effectively die before even the proprietary Flash did. I thought Java as a server language was doomed for performance reasons, but that turned out to be just fine. Basically I completely misunderstood where Java was going.
The expectations around smart home stuff were way off too. E.g. expectations were more along the lines of your house reminding you to go buy toilet paper. What we got was the ability to ask a hockey puck to buy some more when we notice ourselves running low, if you can get past the privacy concerns.
I wrote this months ago, but worth repeating- from a Microsoft perspective, as Java was seen as the Microsoft platform killer by many:
“Java was a token threat. Bill knew it would not be performant enough on clients for Applets to succeed. They had to rebuild their language to put every fixed-size object on the stack because sorting an array<int> on a typical machine took forever.
That’s why Java still doesn’t have operator overloading. That Java battle would be on the server, and [MSFT] didn’t have a good language to bring to the battle there. Wired and co loved to paint a picture of Microsoft v Java as Goliath v David. Not true - Java was a speedbump.
The real threat was dumb terminals (aka web browsers). Even today, the world could run on Mac OS / iOS and Safari. If Microsoft was okay with sacrificing Windows market share and instead controlled content and services, they would be much more valuable.”
> The real threat was dumb terminals (aka web browsers). Even today, the world could run on Mac OS / iOS and Safari. If Microsoft was okay with sacrificing Windows market share and instead controlled content and services, they would be much more valuable
They were more valuable (as % of total stock market) back in 1999.
MSFT's massive valuation, and the consequent swelling of Gates's wealth to $150B back in 1999, was the reason the DOJ attacked them.
I thought that by now we would get something like a global virtual reality simulation (something like the one in Ready Player One)... instead now we have Facebook, Instagram, Twitter. Huge disappointment to see so much talent wasted.
I think that's what Fortnite, PUBG, Minecraft and WoW represent. Obviously the traditional, text+image based sites like FB and Twitter dominate global traffic, but as I understand it, WoW did act as a type of public square for a lot of communities.
I expected AOL to buy the internet as it was larger and AOL made money. But, as with many things, I wasn't willing to put any money into AOL so I missed the dramatic price increases in AOL stock. The rest, as they say, is history.
I always expected/hoped for more diversity in the CPU space. Instead, the alternatives sorta died one by one. (After learning how hard semiconductor factories are to build/revamp this was maybe foreseeable.)
Did anybody have 5 GHz as the "max" CPU frequency on their bingo cards?
Actually, I remember touring an Intel fab around 99/00. The guide (an engineer) was pretty sold on the idea that they were "going to soon max out silicon for CPUs and have to move to germanium"-based chips. (Apparently it's too expensive to this day.)
Further back, I seem to remember Feynman being pretty optimistic about quantum computers? But that might just have been my takeaway.
I thought that everyone was going to be on the Internet in a fairly short time, and that would present three major problems.
First, accessible files would become so numerous that no one would be able to navigate them sensibly with the tools that we had. We would drown in a sea of random gratuitous data.
Second, user interfaces would tend toward a swamp of accidental complexity unless someone made it their business to compete on the basis of superior user experience, based on sound UX research.
Third, unless we could figure out a framework for individuals to own and control their own data, everything would end up owned and controlled by large, faceless institutions with no accountability, and regular people would be reduced to serfs with no effective control of their own data.
I wrote an email about these predictions and circulated it at Apple, trying to get someone interested. I tried to pitch the idea of tackling these problems head-on.
Nobody bit.
Partly that's because I'm not a great salesman. I lack the absolute conviction and the necessary zeal.
Partly it's because I was a minor cog in a large machine.
Partly it's because Apple was in a death spiral at that moment, and didn't have a lot of attention to spare for grand visions of the future.
I had worked for a couple of years on Apple's Newton project, then at NeXT. My expectations of the future arose from stuff I saw while working on those projects. Both Newton and NeXT tried really hard to make user experiences that were natural and comfortable and productive. Both were concerned with trying to connect people using ubiquitous networking.
Both succeeded in no small measure, but both also exposed problems that didn't become visible until you started connecting everything together and trying to design interfaces to help people make sense of it all.
I didn't sell my pitch to anybody, and neither did anybody else. Instead, I think we more or less got the dystopia I was afraid of.
People thought software progress would be linear. The 90s were big on OOP and relational DBs, and that was cutting edge... but 15 years later (mid-2000s) both were completely out the door, and 15 years after that they're back.
I thought software systems would become more and more reliable with continuous refinement over time. Servers would never go down, operating systems would have no new features and would be considered finished, applications would be stable and unchanging, and all that code we all hacked would be reviewed and improved. We'd have a reliable stable base to build new things on top of with tried and tested methods.
Instead companies created an endless churn of features, rewrote everything because fixing the old problems seemed too difficult (hence inventing endless new problems), hacked new languages for convenience (and to create lock-in) then rediscovered why all that complexity was in the old languages, security was ignored, everyone built new code layered on top of the old stuff we hacked instead of fixing the old code, and everyone switched to commodity hardware creating a race to the bottom driven by low cost.
I've been expecting computing infrastructure to crumble for many years. I think the many modern problems with computing and networks, security in particular but also reliability, indicate it is starting to happen. It will take decades to fix.
Part of the problem seems to be the limited lifespan of people. The old engineers retire and are replaced by the young, and a lot of hard-won knowledge is lost. Books and the code itself pass on some of that knowledge, but it is incomplete (or ignored, because who wants to spend the time? We need something new to sell in three months).
90's Systems Development Director here: I'd figured that enriched text/HTML-like formatting would make its way into source code, e.g. embedding hyperlinks/tooltips to relevant documentation/screenshots/test debug cases, critical sections of code highlighted in red, sections of code kept for legacy in grey, annotated code with comment bubbles, that sort of thing.
The latest IDEs are certainly doing more of this with code highlighting and autocompletion, but IMHO it could still be easier for coders to visually annotate and document code.
I started in the late 90s when I learnt HTML and tiny bits of Javascript and Perl. The only thing I can remember thinking was this should be easier. Most people used lots of different specialist tools back then, Dreamweaver for web pages, massive IDEs for programming etc. As I got into PHP later I kept thinking I was just repeating the same patterns over and over again.
Much later I discovered things like Ruby on Rails and Django. I switched to emacs and most others switched to editors like vim and VS Code. And things are just generally so much better now. A lot of the things I imagined would be possible back then actually happened.
What I got wrong was how popular the internet would become and what it would be used for. Back then it was mainly a space for geeks. Sure there were a few non-geeks doing internet shopping even back then, but anyone contributing to the internet was more of a geek. There were no girls on the internet. Anonymity was default. While, of course, there was a person behind each handle, it was no concern who they were in meatspace. On the Internet, everyone was equal. Anyone could be anything. Like Neo in the Matrix. That all changed with social media and the big corps decided they wanted to tie us to meatspace IDs. Everyone joined and they brought their meatspace bullshit with them. I didn't see that coming.
Less commercialized for sure, and certainly not being entirely consumed by predatory data gathering tactics. Also, I can recall in the early 2000's being enamored with the idea of cheap, unlimited, and ubiquitous high-speed wireless replacing all terrestrial infrastructure. Sadly, ISPs have colluded to ensure that never becomes a reality. I guess I'm just mad at commercialization overall?
Somewhere around '98, I was thinking the reality-vs-computing line in the what-people-deal-with graph would move toward computing, i.e. more people dealing with more of the reality stuff: many more (business) analysts and similar much-higher-order programming, and far fewer coding monkeys. Instead, coding ate the reality part; now everyone and his brother is a combo analyst-and-coder in some primitive language based on some mathematical concepts, and only very few companies really still employ domain/business analysts. Domain languages (DSLs) didn't get the attention they should have: there are still no tools or practical support for making them, so instead one has to code them into some common primitive language of choice, putting the "why" knowledge into comments.
(Don't get me wrong, I am talking about Python/JavaScript, which, although much higher-level languages than e.g. C et al., are still rather primitive compared to even simplified domains of anything real.)
have fun
While I'm not typically a very good "futurist", I was involved with things like Linux and the world wide web pretty early. I was in my 20's, mostly working on a mainframe based issue database (sort of jira-like) called PNMS, in the very early 90's.
We were migrating it to a Sun 690MP running SunOS. So, for the end users of PNMS, we got to try different ways of getting them access: DOS/Windows and packet drivers, Sun X-Terminals, workstations, etc. The DOS and Windows experiences were buggy and generally lousy for the end users. The Sun xterms and workstations were great, but crazy expensive. So I was able to sell the otherwise stodgy management on Linux. Slackware, specifically, in the Linux kernel 0.99-1.x days.
Linux was still buggy at the time, but if you used the right hardware, and rebooted at night, the end users didn't notice much of the issues.
So, after getting it all working, it was interesting to me that a one-man show OS was working better and/or cheaper than what the industry leads could offer. I knew it would go somewhere.
Similar for the web. This was at a US military base, so they had access to the internet. I was given a Linux PC and modem to take home for on-call, so we had internet at the house. I remember showing my wife the very early Pizza Hut website where people in one city in California could order pizza on the site. She didn't get it, perhaps because she's very outgoing and the phone seemed natural. Being more of an introvert, though, I did get it, and assumed the web would be important. Though, I didn't predict how important.
I also made a lot of utilities for end users, and presumed that Tcl/Tk would take over the world. That was after struggling with X11, Athena, etc. That prediction didn't pan out. But Ousterhout later co-created Raft, which is sort of eclipsing Tcl now.
> The Sun xterms and workstations were great, but crazy expensive.
My dad has told me so many stories about how amazing those Sun workstations were. I think his university got them his junior or senior year, and they were the killer machines. Sometimes I wish I could've been programming in that era, but other times I think about how awesome it is that I can compile my code in under a minute.
I sure as fuck didn't think we'd see a flood of "SEEKING COBOL DEVELOPER" posts. But... as all those old guys fade out of the business, someone still has to maintain the mainframes. These massive mission-critical systems have struggled to migrate to the cloud. Realizing that a lot of the old machines are equal parts software and hardware... it's pretty crazy. I'm sure I'd want to die, but the rates for COBOL devs blow my mind.
Wish someone had told me, "Startups are fun, but look at how few of them make a buck. Be a dev, but focus on enterprise and public sector services..." As someone who likes making money, probably just as much if not more than solving problems... I'm happy for my successes with startups in my 20s, but I don't think it's something anyone should count on. Look for a nice fat enterprise/government contract. Buy yourself nice toys. Don't waste your time on projects that can't self-fund from day 1.
Software services would all support being plugged into a distributed transaction manager, as in https://en.wikipedia.org/wiki/X/Open_XA . Instead there has been more of a shift toward embracing inconsistency.
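For anyone who never ran into XA up close: the core of it is a coordinator running a prepare/commit vote across every enlisted resource. Here is a minimal sketch of that two-phase-commit idea in Python; the Participant class and the resource names are hypothetical, purely for illustration, not any real XA API.

    # Minimal sketch of the two-phase-commit idea behind XA-style coordination.
    # The Participant class and the resource names are hypothetical.

    class Participant:
        def __init__(self, name):
            self.name = name

        def prepare(self):
            # Phase 1: vote on whether this resource can commit.
            print(f"{self.name}: prepared")
            return True

        def commit(self):
            print(f"{self.name}: committed")

        def rollback(self):
            print(f"{self.name}: rolled back")

    def two_phase_commit(participants):
        # Phase 1: ask every participant to prepare (vote).
        if all(p.prepare() for p in participants):
            # Phase 2: unanimous yes, so commit everywhere.
            for p in participants:
                p.commit()
            return True
        # Any "no" vote rolls everything back.
        for p in participants:
            p.rollback()
        return False

    two_phase_commit([Participant("orders-db"), Participant("billing-db")])

The "embracing inconsistency" shift mentioned above is essentially the industry deciding this coordination overhead wasn't worth it for most services.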
Another: There was always this idea that in the future, applications would be built to run on the database, such as with stored procedures. Obviously, this was widely done, and continues to be done, but it never really matured in the way we expected. There is still a big gap in tooling to make this easy.
This is also related to the ideas we had back then about object persistence. One day, we imagined, we would just manipulate objects in memory, and all of the concerns about persisting those changes, including making it performant, would be taken care of.
Also, transactional memory was surely going to be the norm, whether in software or hardware.
> I also thought we'd see new paradigms in GUI and desktop design take over, probably involving 3D.
In a way, this has occurred. The Microsoft Bob, full-virtual-environment sort of 3D is not popular, but modern UIs are generally hardware-accelerated and use lightweight 3D effects to enhance UX, whereas this was very uncommon in the 1990s.
Like a number of other commenters mentioned, folks like me who were in STEM majors all had to learn how to code in some form or another, though I don't think I ever took any classes as such.
A popular joke then:
"what do mechanical engineers do when they graduate?"
"they learn to code and become programmers"
In my case, I had a part-time job in a lab writing programs in C++ to get various lab instruments to be remote controllable from a terminal and had to learn on the job how to do it. One of the professors in the lab gave me a copy of "Numerical Recipes in C" to help me with the C++ code I was writing. It didn't help at all!
I was also part of the generation that learned basic programming on a TRS-80 (aka "trash-80") in BASIC, then in HS some Pascal and FORTRAN in an informal computer club. My family never owned a home computer, but the school had a bunch of Windows machines curious students could play with.
Looking back now, it seems like I was really primed to be part of one of the first waves of professional programmers, and indeed a number of my friends did go that route. But by the time I graduated from university, I kind of hated coding! Recall that in the 90s there was no Stack Overflow, no Google, no way to see how others might have solved the problem. If you couldn't figure out how to get something to work, you had to just grind away at it no matter how long it took.
At the time, I honestly thought whatever future there might be for professional programming I sure didn't want to be a part of it. And I also didn't think it would take over the world like it did. I left uni and I spent a number of years traveling abroad and teaching ESL before dusting off my coding skills and getting a series of coding jobs in various industries.
I am extraordinarily grateful that I learned "the hard way" how to code, but more than once I wish I had gotten my degree in CS instead of ME.
My older brother did this, and his advice to me entering college in 1999 was: don't waste your time, go directly into computer science, you'll end up there anyway... so that's what I did.
I expected Linux to pose a serious threat to Windows and Macs eventually. The rise of OS X and the later polishing up of Windows into something acceptable took me by surprise at the turn of the century.
One of the biggest reasons for the miss was I had expected cross-platform app development to become much more mature much more quickly than it actually did. The long persistence of apps that only ran on Windows was something that I didn't expect.
I also didn't expect the fast decline of desktop computers leading to the vendor lock in effect that led to the best laptops tending to either run Windows or Mac OS, further eroding the relevance of Linux not just because Linux didn't ship on those devices but also because it was a second class citizen for drivers.
I also did not expect to be making this post from an Android phone decades later.
We see robots in manufacturing plants (e.g. cars), but not too many visible when you walk down the street. Of course, they're there, but hidden inside various normal looking objects. Automatic street lights, driverless trains, boom gates for public transport.
Robot/bot can mean many things, but typically means a humanoid object. I suppose controlling street lights used to be an entirely human job, but when the automated version of it looks nothing like a human, is it really a robot? Chat bots lack a physical similarity to the human body, but in text mode communications do resemble human activity.
Skynet won't send terminators. It'll use invisible surveillance, and use automated control systems to gradually increase accident rates and pollute the air, water, and food supplies just enough so that death rates exceed birth rates. Skynet can afford to be patient.
Up until around the time gmail was first released, it seemed like all the innovation and 'cool stuff computers can do' was in operating systems. Customisation was super popular, people were sharing screenshots of their linux desktops and there were all kinds of theming features in Mac OS 9 and Windows 98. In Windows 98 you could have an HTML page as your desktop background. Stuff like Winamp skins were popular in desktop apps. I imagined that the internet would be woven into the OS in the future, and everything would have a customisable aesthetic. Also that we would have more advanced file explorer UIs.
I started working as a VB dev in 1993 while studying for a CS degree. I can't really remember any specific thoughts about future tech; we were too busy literally inventing it. All I remember is this amazement at how most people still didn't realize computers would completely transform the entire society, and how easily solvable many problems are with some simple code. But also reading the Sprawl Trilogy (which has materialized in an uncanny fashion) and understanding the dangers. Our entire society (Estonia) was transforming, and people generally were very hopeful for the future while working hard to get there.
Decentralization, both of computing systems and the physical economy. Also, improvements to education and government transparency. Improved environmental protections, and human rights.
I'd be interested to know what people thought of Semantic Web tech (which barely squeezes in, around 1999).
When I went to university (2013-16ish) there were still a couple of professors clinging to the idea that everyone would see the light and semantic web would be broadly implemented, but aside from that everyone else I've ever discussed it with has considered it a dead or vapourware technology. I am curious as to whether it was seen as viable or realistic back in the day, or just a pipe dream.
I've always believed that semantic web (in the sense of semantic connection technologies) would become a critical technology; just not in the way they said.
The idea of having semantic data alongside of your presentation data (embedding RDF statements in your web pages) is and always was stupid. They are separate concerns, and should live entirely separate from each other. You're not going to have content authors embedding machine-readable knowledge in with the human presentation; who the hell is going to bother with that kind of tedious, annoying work when their audience is humans?
However, data warehouses that can be automatically mined by smart machines for meaning are huge. It's still going to be a while before our systems are powerful enough and there's enough data to make this worthwhile, but at our rate of data generation I doubt it'll be much longer before we simply NEED these kinds of connecting and semantic technologies in order to handle the ever-growing mountain of data.
We still don't have any sizeable shared ontologies, but that's to be expected since only researchers are interested in it at this too-early stage. Once the shit hits the fan so-to-speak, we'll see some company or group build a serious reference ontology in record time.
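For anyone who never saw the semantic-web stack up close, the machine-readable side looks roughly like this: plain triples plus a query language over them. A small sketch using the rdflib Python library; the ex: vocabulary and the facts are made up purely for illustration.

    # Toy example of RDF triples and a SPARQL query, using the rdflib library.
    # The ex: vocabulary and the facts below are invented for illustration.
    from rdflib import Graph

    ttl = """
    @prefix ex: <http://example.org/> .
    ex:alice ex:knows ex:bob .
    ex:bob   ex:knows ex:carol .
    """

    g = Graph()
    g.parse(data=ttl, format="turtle")

    # Who does alice know, according to the graph?
    for row in g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?who WHERE { ex:alice ex:knows ?who . }
    """):
        print(row.who)   # -> http://example.org/bob

The data-warehouse argument above is essentially that graphs like this get built by machines from existing data, rather than hand-authored alongside web pages.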
Having been around for all of the 90s: I was in my twenties. I really didn't care that much about the future of tech, because it was all fun and exciting. Maybe it's just me, but I don't think the twenties are a period of deep contemplation about long-term trends ;)
The things I did see were kind of right there. You could see Linux taking shape - and the biggest driver there, IMHO, was that Unix as an OS removed a lot of constraints Windows and macOS had at the time, that Linux was free & open, and that the entire teaching environment used some form of Unix.
You could see that Internet thing take shape. (I'm still cranky with myself I didn't see it enough, so I turned down several excellent startup options). A commercial future was still dubious. At least to me, but then, I'm not a futurist ;)
You could believe, for a glorious moment, that open sharing was an unalloyed good, and privacy wasn't necessary. (That lasted about 18 months, but for a brief moment in time, it felt amazing)
Most of us spent a lot of time making fun of early video sharing attempts, because we could do math (bandwidth made it impossible). We were less good at estimating how much bandwidth would increase. Also, the funniest early attempt was Oracle's, because Oracle had a reputation even back then.
The one thing I did "see" early on was that single-core was headed to the dustbin. Mostly because I had the fortune to work with a bunch of people who worked on transputers. A dead-end tech by itself, but it made a very clear point that networked systems of cores are extremely powerful.
Looking back, what strikes me most is how little has changed. We had passionate editor wars. We had OS fanboys. We had our own overhyped and underused lazy FP language (Miranda, back then). We still applied functional principles, just not in that language. Lisp was supposed to be cool, but nobody used it. Nobody wrote FORTRAN, but we'd have been doomed without numeric libraries from FORTRAN. C (and figments of C++) were the workhorse for heavy tasks. The lighter lifting, you did in a scripting language (Perl), because ain't nobody got time for C++. GUIs were derided as fat and bloated, and people in general clamored for the good old times of fast and light software. Programs were at the outer limit of complexity we could manage.
I didn't expect the proliferation of tech to result in a total loss of privacy. The internet was promising to be the great equalizer in the late 1990s and early 2000s. Broadband would mean access to jobs in rural communities revitalizing towns with new growth, and it seemed like everyone would be able to participate. That view is hopelessly naive today.
Not a 90s programmer, but I think one of the greater fallacies of the optimistic 90s and early naughts was a belief that technology in general and Internet (cyberspace) in particular will be able to remedy many of humanity's woes. It turned out that, at the end of the day, even very sophisticated technology is infinitely inferior to quite simple and very well understood economic forces and political will.
When I started studying CS in 1990, people told me that "programming" was a dead end: 3rd generation languages wouldn't require programmers and anyone would be able to bend computers to their will without programming. I didn't heed their advice... We're farther from retiring the profession of Software Engineer than we were 30 years ago. But, is that a good thing?
I wanted mostly everything we have now since the 80s (portable computers, access to BBSs everywhere, chat to your friends everywhere so I do not have to call them etc), but while working on some formally verified software back then in uni in the 90s, I thought we would be better at making software by now. It is better but not so much as I thought then. Ah well, it is still a young industry.
I thought asynchronous circuits would be more prevalent in the future because clock speeds couldn't increase indefinitely, but it turned out that everyone would rather use all kinds of weird-ass kludges to avoid them in ASIC and FPGA design, such as frequency scaling and multiple clock domains. I also expected functional programming to become more mainstream, which partly happened.
I was pleasantly surprised that in the early web everything was text based and super easy to mess around with. I didn't seriously expect this to last into the noughties, though. I believed that most of the protocols will quickly be replaced by something more efficient and binary. Maybe even signed and encrypted...
Just when I started to think that would never happen, it happened ;-)
All of the 1990's "Desktop" software projects that failed horribly. I use way better versions of these today on the web and they actually work -- success! I am still not happy with the NCSA Mosaic user interface.
In the 80s, I dreamed of a 1 million pixel monitor: 1024 x 1024. It's 2021, and I am typing on a 1366x768 laptop. Big fail.
I had hoped by now that computing fabrics would be a thing. Just a grid of 4x4 bit look up tables each loaded from RAM, to allow rapid reconfiguration, and clocked like a chess table, so there wouldn't be any race conditions to worry about. You could scale it to billions of gates, route around hardware faults, and stuff would just work. Instead we got FPGA hell.
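For what it's worth, the building block described above is tiny: a 4-input look-up table is just a 16-bit truth table indexed by its inputs. A minimal sketch of the idea in Python follows; this is my own illustration of a single LUT, not any particular fabric's design.

    # A 4-input LUT modelled as a 16-bit truth table; bit i of the table is the
    # output for input pattern i. This is an illustration, not real hardware.
    def make_lut(truth_table):
        def lut(a, b, c, d):
            index = (a << 3) | (b << 2) | (c << 1) | d
            return (truth_table >> index) & 1
        return lut

    # Program one LUT as a 4-input AND gate: only pattern 0b1111 outputs 1.
    and4 = make_lut(1 << 0b1111)
    print(and4(1, 1, 1, 1))  # 1
    print(and4(1, 0, 1, 1))  # 0

The appeal of the fabric idea is that reconfiguring the hardware is just reloading those 16-bit tables from RAM.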
I never expected the top minds of our generation to spend their skills manipulating me into pressing buttons on behalf of other people trying to make me buy things I don’t need.
I knew video game worlds would get bigger and more realistic than the wire renders of the BBC Elite, but I didn’t expect them to be full of loot boxes and other people.
That the immense, infinite pleasure I obtained from using the internet as an academic would continue, instead of the commercialised, centralised silos we have today.
Sure, the web is infinitely more useful today, but I feel like a corporate web slave when using it... dodging trackers, VPN'ing myself against XYZ, etc.
I thought it had peaked towards the end of the nineties and stupidly stopped studying programming altogether. I managed to keep a decent job with the skills I had until about 2010. I thought the internet would have died out by this time. I was such a stupid young kid.
I wanted this to happen so much. I really thought type="javascript" meant that other languages would become available, and that Perl's dominance server-side and embed-ability meant that it would definitely become a client-side option.
I always kick myself that I didn't really appreciate the trends of increasing processing power, reduced size and so on, and what that would mean (smartphones, cloud computers, etc.) While at the same time, also not having much idea about the future now. Visualising the future seems hard.
I expect the future of tech (especially data) to be more decentralized. Not necessarily because of privacy, but because there are several opportunities to use the staggering amount of power available in mobile devices. Someone will figure out a killer app or platform, and open the flood gates.
I attended a meeting on the future of computing in 1992. The speaker discussed the end of the corporate data center, and the rise of "utility computing."
Also discussed was remote storage, globally available file shares.
It looks like this is happening with what we now call "Cloud Computing."
Most applications would be distributed as bytecode (e.g. compiled Java or similar) and not as native code, commodifying hardware. You could swap your stuff over to a totally different CPU architecture and barely notice.
The Internet would be a lot less centralized than it has become with a lot more peer-to-peer traffic.
The Internet would be very disruptive to political systems and large corporations, promoting a lot more small scale DIY and anarchistic type stuff. (Standard issue 90s techno-libertarianism... we did not understand network effects.)
I thought the Internet would decentralize us physically too, making it possible to do the same work from some 'holler in West Virginia as the middle of Manhattan. (I still don't count this one out. COVID pushed us a little further in this direction. This is more a cultural thing than a technical thing, and culture changes slowly.)
C and C++ would be dead. Nobody would write in memory-unsafe languages anymore and garbage collection would be ubiquitous. We'd probably have type systems that caught a lot of bugs too. Rust is finally delivering on the latter, so this may eventually come to pass.
CPU frequencies would continue increasing, or would at least level off at a higher frequency than they have. I would have guessed 10 GHz or so, since beyond that the speed of light vs. the size of the die becomes an issue.
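A rough back-of-the-envelope check of that speed-of-light intuition (a sketch with my own assumed numbers, not the original poster's):

    # How far can a signal travel in one clock cycle? Vacuum speed of light is an
    # upper bound; real on-chip propagation is well below it (the 0.5 factor and
    # the die size are assumptions for illustration).
    C = 299_792_458          # speed of light, m/s
    DIE_MM = 20              # assumed size of a large die, in mm

    for ghz in (1, 3, 10, 30):
        period = 1 / (ghz * 1e9)                 # seconds per cycle
        vacuum_mm = C * period * 1000            # mm light travels per cycle
        on_chip_mm = 0.5 * vacuum_mm             # crude on-chip estimate
        print(f"{ghz:>2} GHz: ~{on_chip_mm:5.1f} mm per cycle vs ~{DIE_MM} mm die")

Around 10 GHz, a signal covers on the order of a die width per cycle, which is roughly where that guess came from.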
AI would remain very far off. I did not imagine self-driving cars until ~2050 or so. The 90s was still the "AI winter."
Apple would be dead by now.
X86 would be dead by now. I never would have imagined it would have been possible to get this kind of performance out of that hairball of an instruction set.
Spinning disk would be dead. (It is on PCs and mobile devices, but I mean everywhere.)
I didn't predict mobile at all. I would have said such devices are too small to have good UIs. How would you use something with a tiny screen like that for anything but maybe texting?
I thought we'd all have at least a gigabit (full duplex) at home by now.
I was really pessimistic on space, at least in the near-mid term future. I didn't expect to see humans still in space by 2010-2020, let alone something like SpaceX.
I was really pessimistic on energy. By now I would have predicted we'd be well into a global depression brought on by fossil fuel exhaustion. We'd be deploying nuclear fission as fast as possible to keep the grid up, but cars and airplanes would become extremely expensive to operate (thinking ~$20/gal gas). We would have reduced mobility but we'd all be connected.
I wasn't concerned about climate change because of course we would run out of cheap fossil fuels before that would become a really big issue.
Early 90s?
Networking: ATM to the desktop, QoS everywhere, apps reserving capacity as consumers.
Software: 4GL, low code, data definitions which trigger code generators.
Add to this the paradigm of intelligent agents doing your programmatic bidding.
I thought we would have different debugging tools.
Specifically, I thought languages would be designed around debugging use cases. The fact that we are still using debuggers and loggers and printlns to debug blows my mind.
Grew as a programmer and got a degree in the 90s. I remember seeing VisualAge for Smalltalk and Delphi and thinking “wow, the future is going to get even more advanced from here”.
Linux on phones. I waited and waited. Bought every single gadget with anything resembling Linux on it (remember the Sharp Zaurus?). Got hyped about the Ubuntu Touch system.
Sometimes you can be right about a thing but wrong about about the timing, so much that you're wrong anyways. I resisted learning Javascript+HTML+CSS because I felt it was an awful hack and that it was just a very, weird GUI toolkit-- and something better would come along, and lets just learn that. Well, That something better was probably Silverlight and we all know what happened to that.
I still have dreams about being able to point a cell phone at a physical object (television, fast food restaurant, car etc), and being presented with an AR GUI that controls that device. I suspect this is where things will head with Apple's AR headset. This is not a new idea really... in the mid to late 90s there were specialty AR headsets which could read QR codes and look up schematics and overlay them for you, But it was literally, a 486 that sat on your head.
Other things that blindsided me... The current advances in AI. I worked at an AI lab in the late 1990s. Our pride and joy was a demo that could, given a dataset of 500 images, differentiate between a few classes of images... really simple stuff like flowers, animals, landscapes, machines. The images were like 48x48 and it took weeks of training on a Sun Ultra workstation.
And this was an interesting lesson for me. To quote a friend of mine who was/is an AI researcher: "These are the same algorithms we used 25 years ago. It's the computation that has changed." He had an idea to make a botany app which could tell what kind of plant something was based on a picture of its leaves; this was at least 12 years ago. I told him it was impossible because I couldn't fathom how it could work. But he was right of course, and now I believe there are such apps.
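To make the "same algorithms, more compute" point concrete, here is a toy nearest-neighbour classifier over made-up 48x48 "images"; the data is random noise and stands in only for the mechanics, not for any real dataset or the lab's actual method.

    # Toy 1-nearest-neighbour image classifier; the "images" are random vectors
    # standing in for 48x48 pixels, so the numbers here mean nothing real.
    import numpy as np

    rng = np.random.default_rng(0)
    flowers  = rng.normal(0.3, 0.1, size=(50, 48 * 48))   # class 0
    machines = rng.normal(0.7, 0.1, size=(50, 48 * 48))   # class 1
    train_x = np.vstack([flowers, machines])
    train_y = np.array([0] * 50 + [1] * 50)

    def classify(image):
        # Label of the closest training image by Euclidean distance.
        distances = np.linalg.norm(train_x - image, axis=1)
        return train_y[np.argmin(distances)]

    print(classify(rng.normal(0.3, 0.1, size=48 * 48)))   # expect 0

Methods this simple were known for decades; what changed is that scanning millions of high-resolution examples per query became cheap.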
Back at that university in 2000, I had an array of Sun A1000's-- like 70 disks in total to make 1TB of storage. It cost the university 250k. Today, I have about 100TB of storage at my house personally, and it cost me... 5k maybe? Similarly, my cheap graphics card has more power than every computer on earth before at least the mid 90s.
This is to say, when the SCALE of things change, your capabilities change a lot. This is something that worries me about Google/Apple etc. They have access to oceans of compute power/storage. How could a small company ever compete with that? I would like to get back into AI as I always found it fascinating-- but how can I even learn when access to resources like that cost billions of dollars? I can't even get a decent video card or RAM for my computer at the moment.
We also seem to be at the end of this era of ever-increasing capabilities. We will need a big breakthrough to keep up the pace, or we will have to get much better at using resources.
--
I'm also very surprised by how little what I consider fundamentals of CS matter in the day-to-day. I probably had the last of the classical engineering educations-- full tour of calculus, physics, chemistry, partial tours of like EE, chip design, automata, relational algebra, to mention nothing of the humanities.
When I was a team lead at a large corp, the young folks would be impressed about how I would use an idea from another discipline to solve an engineering problem-- but I think the era of that kind of guy is over. I think what's important now is speed and loyalty. I watch in amusement as the JavaScript community discovers stuff that has been known to the wider world for decades.
Regarding loyalty-- I was raised with the Scientific mindset-- that our commitment was to the truth. A colleague of mine once worked for Sandia National Laboratories and he jokingly suggested that they falsify the results of an experiment-- and he was pulled aside and told something to the effect of, "We are scientists, we do not do that. Don't even joke about it."
I feel like that mindset is absent, even in science, let alone business. The most important skill is loathing yourself while pretending that whatever bullshit, alternate version of reality your boss's boss is forcing on you is great.
This is something I could never adapt to. Working 60 hours weeks simply because of bad management. In my corporate gigs, it was always the same-- when a project went well it was because the manager did a good job. When a project went poorly-- it was because engineering fucked it up. I've worked on projects where 9 months out we warned management about XYZ. They ignored it, then blamed us when what we said was going to happen, happened. I think engineers are ill-equipped to defend themselves in these political environments.
In short, I never imagined most of my job would be going to meetings and lobbying for common sense.
OP here. You're free to contribute too. I'm of the opinion that the '90s was a massively successful if not revolutionary showing for tech towards both consumers and insiders (academics, startups, working programmers, et al.) as well as a heyday for irrational exuberance. I don't think that was the case in the '70s and '80s where, for example, most who came in contact with computers treated them as flashy typewriters, toys, or defense industry lab projects. There didn't seem to be much in the way of "something new" aside from Usenet or BBSs. Maybe early 3D ray tracing, Hypercard, A/V capabilities (i.e. MIDI, Genlocks, 16-bit color), etc. But all in all, the 70's and 80's seemed more like an era of standardization than an era of realized experimentation (Amiga excluded). If there was a revolution in the making, it was (outside of insider circles or corporations) bubbling underneath rather than exploding into view. But I could be mistaken in my assessment if you'll enlighten me.
I was a programmer in the 90s but I didn't become a paid professional until the early 2000s.
I thought that the internet would kill the tabloid press (I am in the UK), and give the public better quality news and information. Instead, the internet became the tabloid.
I thought that it would be the "other guys" who would first leverage the internet as a political platform. People descended from 90s hacktivists, DMT-eating civil libertarians and the like. The rise of internet driven populism was a surprise to me.
I thought that in the future I would own an SGI home computer with superpowered custom chips, or something like a Playstation 2 with an os and keyboard. That is, something highly integrated like the home computers were.
I kind of miss the immersive experience of working with a computer where the whole thing is a unified design and you have a couple of big books which tell you everything you'll need to know to program them, and all the layers of abstraction are accessible and make sense to the programmer-user.
Back then, I thought that the computers of the future would be more profound rather than merely more complex. That they would still be knowable to the lone hacker, only they would demand more from them. Instead we have these huge complexes of ordinary complexity, reminiscent of power station piping diagrams, that we work on in teams as bureaucrats of abstraction. Of course, back then I was a kid, in love with the mystique of computers, there was a lot I romanticized, and a lot I didn't know.
I thought there would be more visual tools for programming, things derived from NeXT Interface Builder, Visual Basic and multimedia authoring systems that people used to use to make CD ROMs and information kiosks in the 90s. And I thought that the development process would be more integrated. We wouldn't be bothering with stacks of batch-mode build tools and things like that, and it would be relatively easy to open a window or play a sound in a high level language without having to do C bindings to third party cross-platform toolkits etc.
I didn't think programming would be replaced with graphical tools, I thought it would be augmented by them, in a unified, ergonomic package.
I thought VR would get going sooner than it did. I was a big fan of Howard Rheingold's 1991 book "Virtual Reality" (a great guide to early VR) and got to try it out in 1993 at a computer exhibition. I used to fantasize about going to VR arcades and playing flight simulators with a bunch of people. Now I do it at home with IL2: Great Battles.
One prediction I got right was shaders on graphics cards. Back in the late 90s I was a user of BMRT, a shareware Renderman-compliant raytracer which introduced me to the idea of programmable shaders. I figured it was only a matter of time before they ended up in consumer hardware, and my ears pricked up when the author of BMRT went to work for Nvidia.
I initially thought that AI technology would fully mature in the early 2010s. I never thought that programmers and developers would become such a popular occupation in the modern era.