
Oh! Don't feel too bad. While the late 70s and early 80s were a great time to be alive (and interested in small computers,) there was still that nagging feeling that if you could only get a system with just 64k more memory, you could do WONDERFUL things, instead of the mundane things you were working on.

But I will say... the thing that made it interesting to me in retrospect is that none of us knew what we were doing. I mean sure, we could write code, but we wrote it in languages that let us shoot ourselves in the feet. Repeatedly. The first time I saw Smalltalk was like a religious experience. The day I realized why I couldn't use Smalltalk for "real" systems was like losing my religion.

On the hardware side... it seemed there was a new peripheral every week: 3" floppies, 2.8" floppies, 3.5" floppies, hey! an affordable hard drive!, voice recognition, head mounted displays, etc.

No one seemed to have figured out what the market would want to buy and certainly not how to make it at a profit.

And to me... that spirit of adventure... the "hey, let's just try this new thing and see if it works," was what characterized that era.

And you, right now, can have the same experience. Just go out and learn about something halfway new, like how to use LISP in a modern web stack (okay, maybe it's not that new), and try to find a group that's experimental.

The hardware and software tools we have now are INSANELY better than when I was a kid. Approach the market with that "beginner's mind" and it'll be great.

And from the description of the timing of your arrival in computing, you're probably senior enough that people will have to take you seriously.

-cheers!



>Oh! Don't feel too bad. While the late 70s and early 80s were a great time to be alive (and interested in small computers,) there was still that nagging feeling that if you could only get a system with just 64k more memory, you could do WONDERFUL things, instead of the mundane things you were working on.

Which eventually turned out to be false. Instead we got Slack, Facebook, and Electron apps using those gigabytes for things one could do on a PDP-11.


> ...there was still that nagging feeling that if you could only get a system with just 64k more memory, you could do WONDERFUL things, instead of the mundane things you were working on.

To me, that feeling only really went away in very recent times, when common desktop platforms stopped being starved for RAM. Before we could have 2GB+ RAM on our computers and not even think about it, it really was the case that adding more would make for a "wonderful" experience; there's just no comparison between running with free RAM vs. without! The transition to 64-bit compute everywhere also helped here, of course - no more address space constraints.

On the peripherals side, USB was a huge advance in retrospect. It did come at a bit of a price, in that every peripheral now has to run a fairly complex microcontroller just to deal with the high-level communication protocol (and in turn this led to the rediscovery of GPIO as a "thing" - it used to be the case that GPIO was how you did device interconnect!)
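To make "GPIO was how you did device interconnect" concrete, here is a minimal sketch of bit-banging a byte out over one clock line and one data line, which is roughly how simple peripherals were wired up before USB put a protocol stack and a microcontroller into everything. It assumes a Raspberry Pi with the RPi.GPIO library; the pin numbers and timing are made up for illustration.

    # Bit-bang one byte out over two GPIO lines (a crude SPI-style link).
    import time
    import RPi.GPIO as GPIO

    CLK, DATA = 17, 27            # hypothetical BCM pin numbers

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(CLK, GPIO.OUT, initial=GPIO.LOW)
    GPIO.setup(DATA, GPIO.OUT, initial=GPIO.LOW)

    def send_byte(value):
        # Shift the byte out MSB-first, toggling the clock once per bit.
        for i in range(7, -1, -1):
            GPIO.output(DATA, (value >> i) & 1)
            GPIO.output(CLK, GPIO.HIGH)
            time.sleep(1e-4)      # the receiver samples DATA on this edge
            GPIO.output(CLK, GPIO.LOW)
            time.sleep(1e-4)

    send_byte(0x42)               # "device interconnect", one bit at a time
    GPIO.cleanup()

No descriptors, no enumeration, no firmware on the far side - just wiggling pins, which is exactly the simplicity that got rediscovered.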

The next real advance will probably be software that actually uses heavily-multicore systems (and even GPU compute) effectively, for general-purpose tasks rather than niche special cases. Basically an equivalent to Firefox Quantum, for everything else that we do on our machines.
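For a sense of how little it takes to put every core to work on an ordinary task, here is a minimal sketch using Python's standard concurrent.futures to hash a directory of files in parallel. The directory and chunk size are arbitrary choices for illustration; the point is that the hard part of a "Quantum-style" advance isn't the API, it's applying this kind of parallelism pervasively.

    # Hash a batch of files across all available cores instead of one.
    import hashlib
    import os
    from concurrent.futures import ProcessPoolExecutor

    def sha256_of(path):
        # Hash one file in 1 MiB chunks; each call runs in a worker process.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return path, h.hexdigest()

    if __name__ == "__main__":
        paths = [e.path for e in os.scandir(".") if e.is_file()]
        # One worker per core by default; files are hashed independently.
        with ProcessPoolExecutor() as pool:
            for path, digest in pool.map(sha256_of, paths):
                print(digest, path)
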

(Oh, and of course we're still waiting for a mobile platform that runs a genuine general-purpose OS. But hopefully we'll solve that shortly anyway, thanks to efforts like Purism and pmOS. And even the ubiquity of "smart" mobile hardware is in fact somewhat recent. This is linked to the upcoming advance I was just discussing, because power-efficient platforms like mobile are big on multicore compute and the use of GPU.)

Added: I just saw a sibling reply that talked about the crappiness of Windows 10 and recent OS X versions as proof that hardware innovation is dead, and a reply that in turn blamed the end of Moore's law. I think that both are missing the point quite substantially! Linux works just as well as it always did, and surely it should be the real standard if we're talking about innovation!



