
I wonder how much the effects vary between different professions.

I'm in my 40s. Incredibly old by HN standards. And yet, I feel no nostalgia for the "good ol' times." I mean, don't get me wrong, I'm sure there are a lot of things that set me apart from newer generations -- I don't get Snapchat at all ;) -- but I don't see myself being happier by being put in a house set up to look and feel like the 90s/80s.

Is it maybe because we as programmers tend to be less prone to being stuck in the past? Just wondering.



As a 40s hacker/entrepreneur what I miss most from the 90s was the feeling of having a vast unexplored frontier with endless possibilities ahead of us even for the little guys. These days, with the web and mobile revolutions maturing, it feels like the 5 or so giant tech monopolies have locked up most of the future potential. But maybe that feeling is part of having an older mentality.


Computing has become so banal. We used to be working on important problems. "The best minds of my generation are thinking about how to make people click ads. That sucks." - Jeff Hammerbacher, formerly at Facebook.

(There are more people working on important problems than in the 1980s. Computer science research used to be about a hundred people each at MIT, CMU, and Stanford, with a few smaller groups elsewhere, plus internal efforts at IBM and Bell Labs. The whole field was tiny. Now, it's larger, but overshadowed by the massive level of activity associated with ads.)


>Computing has become so banal. We used to be working on important problems.

1. Blockchains/smart contracts,

2. Garbled circuits/Snarks/MPC (Multi-Party Computation)

3. iO/VBB program obfuscation,

4. FHE (Fully Homomorphic Encryption),

5. Machine Learning/Vision,

6. Global/Solar-scale performant and secure routing protocols,

7. TEE (Trusted Execution Environments),

8. Advanced P2P systems like IPFS,

9. Bioinformatics.

...


I don't understand what you are trying to say with your answer: the person said "we used to be working on important problems" and you responded with a list of random technologies. Technologies can sometimes be "problems" (and sadly often are :/), but that means they were failed solutions.

A list of hard problems we could be tackling:

1. The world is going to run out of fossil fuels.

2. We are destroying the human ecosystem by global warming.

3. There exists a very large amount of inequality between the upper and lower classes in our society, and the gap is only increasing.

4. We have more and more humans of whom society demands "work" to get "pay" in order to survive, even as we come up with ways of replacing more and more "jobs" with "automation".

5. There are many subsets of our population, divided along axes such as race and sexuality, which are discriminated against by others in both direct and indirect systematic ways.

6. We have a limited number of antibiotics that are generally safe for widespread usage, and pathogens are adapting.

7. For numerous and potentially diverse reasons, an increasingly large fraction of our society is being turned off of science and has stopped believing in basic things like the benefits of even our oldest and most trusted vaccinations.

8. Humans continue to die from diseases like cancer.

9. Governments and companies have begun to usher in a dystopian era of surveillance under the guise of protecting us from terrorists and spam and serving us advertisements.

A couple of these problems can be addressed with the technologies you listed, but even in the core of some of these communities that want to address problems 5, 7, and 9 you honestly just end up finding a lot of people who are exacerbating problems 2, 3, 4, and (annoyingly) 9.

I despise the cloud :(. It was just so much harder for people to abuse the crap out of us when the concept of a computer was something that, even if it could connect to other computers to get information, was not something that fundamentally relied on other computers and which stored all of its information on other computers and could be remotely controlled by other computers. We are to the point where arguing that I "own" the device on which I am typing this message almost doesn't make sense: I am borrowing it from Apple and I can only hope that they don't screw me too hard :(.


Many of those are problems, not technologies. Saying blockchains are just a technology is like saying operating systems or compilers are just a technology. They are an active area of computer science research.

Many of the problems you listed are likely only solvable by large scale social movements. Solutions to important technology problems change the landscape over which progress is made but they are not social movements themselves.

>It was just so much harder for people to abuse the crap out of us when the concept of a computer was something that, even if it could connect to other computers to get information, was not something that fundamentally relied on other computers and which stored all of its information on other computers and could be remotely controlled by other computers.

The cloud is the new word for mainframe. The PC pushed things away from mainframes, then the internet/cloud pushed back. Blockchains are an interesting space between these extremes.


I am pretty damned certain that the "important problems" being referenced here, and the ones we thought we would be able to affect in the heyday of computing, were not "we need to learn how to make a faster compiler" but "we are going to change the world". You seem to have conflated "interesting" with "important".

If anything, we agree on one point: that computers failed to solve those problems, and where we thought they would--Twitter being a great example--they quite arguably made the problem much much worse.

That is why I will argue computers feel so much more depressing today: we have been slowly coming to the realization--not just in the past few years but since at least the 50s (if not the turn of the last century)--that the entire concept of a utopian technofuture is probably a fantasy and dystopias now seem so close that we barely find the idea compelling to talk about anymore.


So why don't you start to tackle them? Problems get solved when real actual people do hard work, not when some ephemeral "they" decides it is time.


Meanwhile people have to eat.


Well, we can eat and enjoy a lot of things we enjoy because people who also "had to eat" and also "had families and mortgages" did their part, and even sacrificed their lives, for making things better.


Just nitpicking a bit about something I'm passionate about: Becoming a multi-planetary species should at least be on that list, if not first item.


To clarify for anyone who isn't convinced:

Not dying should be the first on that list, for obvious reasons, and becoming multi-planetary reduces the chance of (all the) humans dying exponentially for each planet colonized.
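As a rough back-of-the-envelope sketch of that "exponentially" claim (assuming, hypothetically, independent per-planet catastrophe risks and a made-up 1%-per-century figure):

```python
# Toy model: if each settled planet independently has probability p of a
# civilization-ending catastrophe in a given period, the chance that ALL
# planets fail together shrinks exponentially with the number of planets.
def extinction_probability(p: float, planets: int) -> float:
    """Probability that every planet suffers a catastrophe (independence assumed)."""
    return p ** planets

# With a hypothetical 1% risk per planet per century:
for n in (1, 2, 3):
    print(n, extinction_probability(0.01, n))
```

Independence is doing all the work here; correlated risks (say, a mistake we export to every colony) wouldn't drop off this way.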


To satisfy your imagination, let's coerce everyone to believe we can be a multi-planetary species, never mind that we only know of one planet that supports us without extensive engineering effort.

Are we even the same species with any of the same concerns by the time we get to Alpha Centauri planet?

These notions still coming from the mouths of my generation are getting a bit bonkers to me as I age.

Let's work towards a goal we'll never be able to validate actually happens, as we'll be dead before we get close. Let's build a system that coerces people towards that goal.

Let's chew up more and more of this planet researching and building towards technologies and fucking over the next gen of humans here.

Because a generation that grew up watching Star Trek wished it would happen real bad!


What you are misunderstanding.

1. The earth will still be habitable in the short run, even with worst-case global warming x10. It will be a lot different, and probably not better, but we can still survive here.

2. For 99% of human history, people lived without the liberties or human rights that became a theme of the anti-autocratic philosophies and cultures that grew out of the Enlightenment and made the modern world possible. Alan Turing would have been tortured to death in earlier eras. These last few hundred years may very well be an anomaly, and since we know humans exhibit a tendency toward immoral, autocratic leadership, we should take advantage of the fact that we live in the best 0.1% of time in history to develop such technology. The short run may be the only foreseeable window in which to become inter-planetary.

3. Competition is what propels human progress.

4. Global warming bureaucracies have the same problems the War on Poverty, the War on Drugs, and Prohibition had. All these problems could be solved by your standard intelligent young adult, but they aren't, because of bureaucratic inertia and politics. They are problems of people and a human culture of greed.

5. I don't watch Star Trek.


Software is the new means of production.

While management is cluelessly guiding what software is developed, we'll never put the requisite effort into those problems.

They want return on investment, not idealism.


It's hard to be excited about that list when so much of it seems like technology that will be used against me. For example, by businesses to further reduce the freedoms we have when using proprietary software.


I'd look at research efforts that have been focusing on reviving the heart and original spirit of computing. YC's HARC especially - harc.ycr.org

Bret's talk about their group's vision for computing - https://vimeo.com/115154289 - really helps people who think computing is "done" and all the interesting stuff is figured out.


Most people here are simply users of these technologies, not creators. The actual programming is still banal.


Like with all the sciences, the role of the individual has been diminished. Interesting technology may be seeded by some individual genius but it takes a huge number of people coordinating to build something like a self-driving car, a breakthrough AI, the LHC. And it takes a lot of really smart people spending most of their time doing relatively banal tasks to pull these off. I think the bar for innovation is just so much higher than it used to be, there is so much knowledge required across so many fields that you either end up a generalist, who rarely gets a chance to dive deep into any one thing, or a specialist who is forever stuck in their one area of expertise.


How many people are working on these problems?

The sad reality is that most software engineers want to work on these problems, but instead have to work on pumping out CRUD apps and proprietary APIs of questionable social value to pay the bills.


Who is working on better routing protocols? Displacing BGP seems like a monumental task so it seems like bgpsec is about as good as we are going to get. Would love to be proved wrong though.


>Who is working on better routing protocols?

Scion is a clean slate internet architecture research design (including routing)[0].

>It seems like bgpsec is about as good as we are going to get.

I was an author on a paper [1] that reduced some of the downsides of the RPKI (the PKI BGPSEC relies on).

There is also interesting work on getting most of the security from BGPSEC without complete BGPSEC deployment [2].

In a different direction, as we build the internet of the inner solar system we will need protocols with different properties than those we needed for terrestrial networks.

[0]: https://www.scion-architecture.net/

[1]: "From the Consent of the Routed: Improving the Transparency of the RPKI" http://cs-people.bu.edu/heilman/sigRPKI_full.pdf

[2]: "Jumpstarting BGP Security with Path-End Validation" http://dl.acm.org/citation.cfm?id=2934883&dl=ACM&coll=DL&CFI...


I've seen scion before but IMO "clean slate" protocols are DOA when it comes to displacing BGP. Just look at the mailing list. A protocol with an integration model of forming an overlay has no integration model. ETHZ and Co have been marketing that at all kinds of academic conferences and it's seen approximately 0 uptake outside of organizations that volunteer to run it as an experiment.

Sorry about the rant, but I get the impression that the authors of replacement protocols like these are more interested in becoming academically famous for being the inventor of the Internet (i.e. the next Vint Cerf) rather than proposing solutions for existing systems.

I would be interested in new protocols for super-high-latency networks (i.e. the solar system model).

The RPKI work and path end validation work is interesting and I haven't seen those papers before (been disconnected from publications recently). Thanks for the links.


Bioinformatics is very often pretty basic counting and basic statistics applied to moderately large datasets and no concept of best practices.


yes...and important problems that we have barely begun to solve


The funny thing is Hammerbacher worked on trading problems which have tons of super fun CS problems even if it is a bit nihilistic making rich guys richer ... then founded Cloudera, which, I dunno, seems pretty "beige" as far as CS goes. Didn't realize he did a stint at FB. That could suck the life out of anyone.

Physics has the same problem. Upwards of 20k people in the APS... most working on very obscure problems spraying "shittonium on silicon 111" or making no progress in various theories of everything.

Still plenty of important and interesting problems in CS, some of which I've been lucky enough to work on.

Blockchain stuff is pretty useful and interesting, though it could all fall apart with some breakthrough in crypto.

Machine learning is mostly used for stupid stuff like getting people to click on ads, but there's lots of interesting use cases for it which haven't been explored yet.

There's also a lot of work to be done on the core tools used in ML; while everyone babbles about deep learning, boosting, PGMs and topological data analysis still have a lot of low hanging fruit IMO.

One that people don't think about enough: non-standard computing architectures. Quantum computing hasn't produced anything of note yet, but it's hardly the only potential area of research here. Simply using stuff like old-school Harvard architectures has tremendous implications for security (no more buffer overflows, yo), but nobody bothers thinking these things through and implementing them.


> most working on very obscure problems spraying "shittonium on silicon 111" or making no progress in various theories of everything.

That's the problem with foundational research: It always looks obscure and impractical until suddenly there's a huge breakthrough out of nowhere. Same dynamic as with startups, where 99% will never make a significant mark of any kind, while 1% change the world forever. And you cannot know in advance which startup (or which foundational research) falls into which category.


Surface science (shittonium on silicon 111) has been promising to explain catalysis for 40 years now... it's still important to do, makes the chips run faster, but it's usually not considered real foundational.


>Blockchain stuff is pretty useful and interesting, though it could all fall apart with some breakthrough in crypto.

At this point it seems unlikely that a breakthrough in crypto could kill blockchains without killing nearly all of modern cryptography. For instance we can build blockchains which are secure even against quantum computers. We can build blockchains that make no number theoretic assumptions using only secure hash functions. Any technique which can break all techniques we have of inventing hard to solve problems (i.e. cryptography), would have a massive impact on technology.
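As a toy illustration of the hash-functions-only point (a minimal sketch, not any real system's design): a chain where each block commits to its predecessor with nothing but SHA-256, so tampering with any block invalidates every later link.

```python
import hashlib

GENESIS = "0" * 64

def block_hash(prev_hash: str, data: str) -> str:
    # The only cryptographic primitive used is a hash function.
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def build_chain(entries):
    """Build a toy hash chain; no signatures or number-theoretic assumptions."""
    chain, prev = [], GENESIS
    for data in entries:
        h = block_hash(prev, data)
        chain.append({"data": data, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain):
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(prev, block["data"]) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain(["alice->bob: 5", "bob->carol: 2"])
assert verify(chain)
chain[0]["data"] = "alice->bob: 500"  # tamper with history
assert not verify(chain)
```

(A real blockchain layers consensus on top, but the integrity argument above rests only on the collision resistance of the hash.)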

A breakthrough in crypto that would destroy blockchains is much more likely to be a technology which is significantly better than a blockchain for what we want to do with blockchains. That would also be an exciting result.


I was thinking something along the lines that hashing proofs aren't very rigorous. I mean, it all looks pretty good, and I have no idea how to break this stuff (not my department), but some weird topology guy could wake up one day and discover that hashes aren't as good as they looked.


Nonsense, we've been working on payroll registers for decades. You just remember the fun stuff.


That's possibly false: the best minds are likely working at places like Genentech (no relation to my world) and other places where building truly useful things requires deep knowledge of multiple technical domains.


SpaceX?


SpaceX is doing more or less the things that have already been done 50 years ago by truly the best minds of that generation, except they're cheaper and run as a private enterprise. They're standing way up there on the shoulders of those giants, and we should recognize that.


Making things cheap is just as important as inventing things. Nobody would care about computers if they still cost $10m and took up a whole room.


We're not talking that kind of price differential. A factor of 5x at most, with worse specific impulse than the Russian engines from the 90s. I get it, Elon is the second coming of Steve Jobs for many, and it is admirable that he's pursuing the fields that are hard, but he's not the second coming of Sergei Korolev or Wernher von Braun, or even the countless others who made the space age possible.


Well, most folk here still would. Point valid, nonetheless.


Yet nobody else had landed and reused the first stage of a rocket. They are doing truly novel things.


Didn't NASA do that in 1996?


If I understand correctly, the Space Shuttle's SRBs were recovered, refurbished, and reused starting in 1982?


Yes, though they didn't make controlled landings, they used parachutes.


Yes, all those rockets landing their first stages back in the 60's, that was quite boring and old. Oh, wait, they didn't do that at all.

> They're standing way up there on the shoulders of those giants, and we should recognize that.

We do. And they do too.

But they are innovating, and bringing down the cost of putting stuff in orbit is very important progress, no point in belittling that.


It wasn't because it was impossible. Shit, those guys landed on the Moon and Venus with nothing but slide rules. It was because government doesn't spend its own money, so it can afford five times the cost.


Landing on the moon is a bit different than landing here on earth.

If you wanted to say that SpaceX had done 'nothing new' then you're about a year late to the party, they are definitely innovating.



You only have to look at WiReD Magazine from those times (ca. 1993-2000) to appreciate how different the zeitgeist was during the personal computing revolution. Everything was changing and anything was possible.

Personally I think it's not because we were working on great new tech products then. I think it's because we knew we were at the forefront of a tsunami of a tech sea change of historic proportions that was obviously reinventing much of everyday life in ways we didn't yet understand, from the global public commons to the local grocery store. The question on everyone's lips was, where will it end? What's happening? Where is the world going next? And can I play?

Then from 2005 to maybe 2012 we saw an apparent renewal of tectonic forces as mobile smartphones surged forth and all of us became incessantly and globally interconnected.

But now, in today's hindsight, we've had some time to get perspective, and we're disappointed. Where was that brave new world we were promised? We've gained tweets, social and news feeds, e-books and e-tail, and handheld GPS. But we've never been more aware of how meaninglessly trivial tech's ultimate tsunami turned out to be -- small talk -- or how thoroughly manipulated we've let ourselves be by corporations whose highest calling is to sell us ever more trumped-up commodities and ever greater dependency on their latest bit-of-the-nonce drug fix.

O where have you gone, Gene Roddenberry?


Same here. The 90s were me working my butt off to get a Ph.D. which I didn't end up using, and watching my pals who worked at X, Y, Z cashing out. Some very good times were had for sure, but I don't feel nostalgic about times when I was eating ramen, working 36-hour shifts at the synchrotron and driving a $200 car.

I do feel a bit nostalgic about the 80s: the music, the big hair, the feelings of imminent nuclear doom, and the eventual fall of the Iron Curtain -- a very unique feel to that era. Granted, I was a teenager for most of it, so maybe that's why.

As for age: the main thing I've noticed is my hairline and my joints aren't what they used to be. Exercise, eating right and occasional fasting seem to stave off most of the physiological effects of aging. I'm pretty sure working in a creative field (and being around a lot of younger people) helps a lot with the psychological aspects.


The feelings of imminent nuclear doom were the best. Really made the pop culture special.


(Another 40's hacker/entrepreneur here)

Something I try to continuously tell myself is that "It's still early days" because while it doesn't feel like it - I think it is.

Look at the massive shifts we've seen in social media use (Friendster > MySpace > Facebook + Twitter > Snap) - at near any point in their usage you could have reasonably said: "I don't see how anyone could overthrow one of these - they're too big"

There is so much future still to invent.


IMO, each generation faces a different landscape of software, hardware infrastructure, and end devices. Existing social-network and communication solutions can't meet the delta between this generation's needs and the previous ones', hence the big unmet needs of the emerging generation. However, solution builders have to be close enough to this generation to have the sense/instinct for that delta needed to build products that can actually cover it.

Just my observation, and probably not described accurately or clearly enough here. But I hope you get what I'm trying to say.


Search went through many shifts, but hasn't since Google. When a new technology matures, someone eventually figures out how to dominate it, and does.

The difference with computing was ongoing Moore's law (forever young), which is now slowing/ending. Worse, though transistors still shrink, the cheapest node remains 28nm. Cheapness fuels revolution. Peak silicon has passed.

We could get a new technology (like fracking etc and peak oil), as Kurzweil suggests for the singularity.


>When a new technology matures

I don't believe search is mature. We haven't seen a lot of innovation in web search for a while, and what we know as web search is probably the best Google can do. It's probably not the best possible web search, though. Just the best from Google.

The reason nobody is there to challenge Google is that no one has come up with a way to fund a web search engine other than through ads. Challenging Google probably also means challenging the whole ad industry. It's a rough climb.


Also 40s hacker/entrepreneur here. Your complaint is like a 40-year old in the 90s complaining that IBM, Microsoft, Intel and the like have locked up most of the future potential of computing.

The internet titans of today will be replaced or be forced to change as the next wave of innovation takes over.

There may be a lull right now, but either AI or augmented reality looks like a good candidate for the next cycle.


I see your point, but I also think that in the 90s the computing world was more open and accessible. I guess that happens to most maturing industries. The market players become bigger and start dominating. Around 1900 a guy had a reasonable shot at building a car (or an airplane) in his garage and starting a business. Not possible anymore. You need millions to even get started.


I was a teenager in the 90s, and the consensus among most of the adults I knew working in tech was that it was over. Microsoft had won. Programming was a commodity, all the tech jobs would be outsourced to India, and you were much better off getting into finance.

Remember that towards the end of the 90s, Netscape - the one promising new startup that threatened to break the Windows monopoly - had just had its ass handed to it by IE, with some spectacularly underhanded tactics. All the dot-coms had required hundreds of millions in funding and were rapidly blowing through them, with no sign of a profitable business model.

Hope didn't really return to techies until the mid-2000s, when a bunch of events (Google's IPO, acquisitions of Flickr & del.icio.us, development of Rails/Django/JQuery/MySQL, and founding of Facebook, Reddit, and YCombinator) made people realize "Hey, maybe it's not the end of history after all. People have actually been doing pretty cool stuff all this time when we weren't looking."

2015 seems like it was the new 2000. That'd put us around 2002-2003 right now, complete with the huge focus on P2P, security, and government overreach, and with a couple years to go before we realize that not everybody gave up.


Interesting perspective. In the 90s I worked on handheld devices like the Apple Newton and that a lot of fun and there was a lot new and exciting stuff happening in that area. It seemed a much smaller world where you could keep track of what's going on.


Yeah, there was tons of interesting stuff happening in R&D in the 90s. The problem was that Microsoft held a complete stranglehold on distribution, which meant that for the average consumer (i.e. teenage me, not in Silicon Valley, and everyone I knew), it might as well not have existed. I remember, when I first started to read academic papers and surf programming sites in college around 2003, thinking "Holy shit, there's all this cool stuff that was invented over a decade ago, and I can't use any of it unless I switch to UNIX."

What happened in the mid-2000s was that open-source software (and later Apple) finally broke Microsoft's hold on the route to the consumer, and suddenly small teams of people could deliver software that billions of people could use. There's a similar situation in R&D now; I remember that when I left Google 3 years ago, there was a whole lot of really cool stuff being developed that has yet to see the light of day. No corporation can hold innovation back forever though, because of Joy's Law: "No matter who you are, the majority of smart people do not work for you."


But that wave of innovation was fueled by Moore's Law... now deceased.

Mature industries, absent disruptive technology changes, tend to consolidate and be dominated.


I have a somewhat contrarian prediction for Moore's Law. I think the next wave or two of innovation will obliterate the current progress of Moore's Law.

Yes, Moore's law may seem to be slowing down right now, but all you need to do is look at the size, energy consumption, and ability of a human brain to realize that nature has blindly created a much more advanced "technology" than what we currently have.

Next, look at flying. Birds are nature's most advanced flying machines. We've been able to expand on that to the point where we have jets and even reusable rockets that can go a thousand times faster than the fastest bird.

Right now, computing is at the "man puts big flaps on his arms and jumps off a high ledge" point. Once we figure out the "first airplane," a.k.a. the Wright-brothers moment, Moore's law will explode.


It means it's time to start looking for the next revolution.

Before the web, there was the PC revolution. And...microcomputers before that, I think?

It's true, computing technologies mature and there are fewer opportunities for real breakthroughs. But then new technologies arrive and the cycle starts over again.


That's how biotech, in particular neural technologies, feel to me right now. In reality, it's still such early days there that it's probably closer to computers in the late 70's than the web in the 90s, but I think the spirit is similar.


I would say the opposite. You used to have to do everything yourself, but now you can leverage the infrastructure of amazon/google/microsoft/... to do amazing things by yourself. The potential of ai is just starting to get unlocked, and we're going to see lone wolf developers do incredible things in the coming years.


40s hacker here too. I felt that way until I discovered VR. I know it's not popular on HN and many are skeptical, but man, I am loving it again. So done with stupid screen apps :)


That's how I feel about AR and HoloLens; I just picked up a dev kit. Hit me up if you want to collaborate on a holo app :)


I think we have endless possibilities but it seems nobody has time, the necessary resources, wisdom, and/or courage to try new paths that would compete with those 5 giants.


Those frontiers still exist. They've just moved to other technologies now. 3D printing, VR/AR, robotics, blockchains, IoT, etc.


I think the thing that really did it for me was seeing Windows 93 on the front page today. So much possibility and creativity lost.


As a counter point, perhaps, if that were true we should expect to see much less VC money being directed at tech startups.


Well, we're already seeing VCs diversifying into other sectors. Also, I'm inclined to see the huge amount of VC money as an indication of an overall shortage of investment opportunities.


Before the web and mobile, people thought programming was done, e.g. locked up on the desktop by Microsoft. There's a vast unexplored space in AI/autonomous/AR/VR/sensor networks/IoT/smart things, and I am sure I'm missing something.


I'm 40 and I feel this way now about cryptocurrencies


On the contrary, I see at least social as being open for disruption. FB did displace MySpace after all.

Google, Apple, and Microsoft, though, seem pretty safe.


I think Apple jumped the shark quite a while ago. I would love to ditch my Macs, but the problem is that as much as Macs suck, everything else sucks more. If someone built something that Just Worked the way Macs did back in the Snow Leopard days I would replace all of my Macs immediately. And I think this is possible starting with Linux as a base. Linux has come a long, long way since 1993.

I just bought a couple of Raspberry Pis and I get the same exciting this-is-the-start-of-something-really-cool vibe from them as I did from my first Apple II back in 1980.


I jumped ship in snow leopard days when I saw what they were doing with their shared object formats (basically making my life difficult for no reason).

I've used Mint since then; never looked back. Mint on an old X220 thinkpad is better than anything mac has done since then. Requires a bit of overhead on setup to get power management right, but I paid that price years ago and can endlessly clone HD images.

Bonus is, most of my work product ends up on the EC2 anyway.


My problem is that I need to figure out a solution that works not just for me but also for my very-non-techie spouse. Mint might work for me, but I'm pretty sure it won't for her.


Mint+KDE is pretty good. Sadly the akonadi junk needs to be sabotaged.


I think iOS 11 is an example of Apple realizing that they need to take a step back and focus on the "it just works" aspect of their mission. There are dozens of small fixes that I've wanted for years that they've finally fixed, all in one release. Haven't had the chance to look at High Sierra, but I'm hoping it's the same there.


All of their products used to behave much more synergistically. There was a real feel when you bought an iPod and hooked it up to your iMac that the two popped together. Now, an iPad barely ever needs to even touch a Mac or even PC. The technology is at the point where it ought to be possible to offload display/UI tasks to an iPad - but you can't in an integrated way. They really need to tie their whole product lines together again (through synergistic features).


Not iPad as much as iPhone, but do you use Handoff? I do all the time, between work and home MBP, my phone and watch. For iMessage, web pages, emails, phone calls, etc.

I feel like the only thing we've actually lost was the requirement to use iTunes to set up a new iDevice, and good riddance to that. An iPod really was a peripheral whereas their modern successors are fully-fledged mobile computers.

If you mean use an iPad as an external display, that's a rather niche use. Is there a reason that the 3rd party app Duet Display doesn't meet your needs?


I'm 33, young-ish but I still have some nostalgia for the early days of the internet. I miss the lightness and innocence, when viruses had silly graphics and at worst screwed up one computer. When the biggest internet companies were still frivolous and playful and inconsequential.

Today tech has gotten heavy. Hackers work for organized crime and espionage, ruin lives, and cost the world countless billions. Tor and Bitcoin would have felt badass back in the day. Now they are used to dodge totalitarian regimes and to run black markets that are spreading untraceable opiates throughout middle America. And the iconic internet companies now shape the world, for better or worse.

Tech has arrived and I wouldn't roll back the clock, but I am sad that there's so little lightness left.


I'm considerably older, and I too have no nostalgia for the past, but I certainly have a broader view on many tech topics than my younger colleagues. A couple of days ago I was explaining to a couple of them how at one time the line printer was the primary output for users, no monitor of any kind. This made separate characters for line feed and carriage return important, in that you could program with them to create better (but still very primitive) graphics.
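To make the trick concrete, here's a small Python sketch (my own illustration, not period code): a carriage return without a line feed sends the print head back to column one, so the next characters strike over the same line. Printing the text twice made it bold; striking underscores over it made it underlined.

```python
# "\r" returns the carriage without advancing the paper ("\n" is the
# line feed), so whatever follows overstrikes the same line.

def overstrike_bold(text: str) -> str:
    # Printing the same characters twice darkened them: poor man's bold.
    return text + "\r" + text

def overstrike_underline(text: str) -> str:
    # Striking underscores over the text produced underlining.
    return text + "\r" + "_" * len(text)

# On a real line printer these control codes did the work; a modern
# terminal just overwrites, so you only see the second pass.
print(overstrike_bold("REPORT"))
```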

But today is certainly the golden age of software engineering. I'm still excited about many things, and learned about a significant new area of interest to me just a couple weeks ago.


I'm turning 30 this year. Not old at all, but just old enough to not be young.

I don't feel nostalgia, what I do feel is envy for the younger. Especially those younger than me that I feel are more successful than I. Not in the "Ugh, fuckers" way, but in the "God damn it if I knew then what I know now or at least listened to the people who did know, I could be that ultra successful young hotshot! I had all the opportunities and I didn't take them for various reasons. Grargh!"

Not sure if that's what feeling old feels like but I certainly feel a level of "Poop, there's def less time ahead of me for all the shit I want(ed) to do than there used to be"


>>Not in the "Ugh, fuckers" way, but in the "God damn it if I knew then what I know now or at least listened to the people who did know, I could be that ultra successful young hotshot! I had all the opportunities and I didn't take them for various reasons. Grargh!"

Eh, the types of success we envy are mostly a matter of luck. Being born to a family with means and connections, being good-looking, being at the right place at the right time, etc.

Obviously if you actually knew back then what you know today, you would buy lots of bitcoin and Tesla stock. But in terms of "soft" knowledge, I'd say you would probably end up in a similar place.


Maybe you're not old enough?

When you're 80 you may miss the heyday of your 40s when you were at the peak of your career (and possibly family life).

There are probably a lot of 30-somethings for which their teenage high-school years were the best years of their lives (I know several, I'm 34 myself and prefer my present to my past), so the right age for nostalgia is probably a very individual thing.


I think for people who don't stagnate and continue to progress throughout their lives, nostalgia will be limited. People who peaked in High School (for example) and stagnated in life afterwards will probably tend to look back on their past years with fondness.

I see this with people in the town that I grew up in who couldn't get out. After High School, they largely languished in the same town and haven't really done much with their lives since. I see them talking about past years and High School with far more regularity than the more successful people I know. The successful people on the other hand - they bring up the future more than they bring up the past. That's probably because they have something to look forward to on the trajectory they're on - it's a mental state, a mindset.

Now this is a very simplistic version of my observations but it's what I've seen so far. I'd love for people to bring in some kind of science to this theme that I've observed. My observations could certainly be isolated and wrong but from conversations I've had with other friends from similar smaller towns, they've reported observing similar things.


Your observations strike me as all too accurate due to my own anecdotal experience. I've tasted both sides of the coin. After leaving a phase of torpor that lasted years and started taking life into my own hands, my mindset completely changed to the point where my past self feels almost alien. The effect nostalgia had on my daily life essentially went from a driving force to a non-factor.


I would still add the disclaimer that you may simply not be old enough; I say this after recent experiences with my elderly and currently very sick father.

When you get older your body wears out, even if you take care of it. You may not notice it as an active 40 year old but you definitely will as a 70 year old.

Other things like culture and values change around you and you can't always keep up, causing the feeling that "the world is going to hell". Maybe your field was doing great 20-30 years ago and is now on a downward trajectory (again you can't always plan for that decades in advance).

It doesn't have to be that way but getting old at some point does suck for most people, some earlier than others.


You're right, of course. The idea of slowly degenerating and becoming irrelevant like this has filled me with a sense of dread for a long time, although I've largely accepted the inevitable now.

I would say I'm attempting to get at least one decade of vaguely defined "intense" living. If I can get that it would already be more than what a lot of people get!


As a developer who is almost 60, and the father of two budding developers, I can't imagine a better time to EVER have lived.

The technological landscape is amazing, new things in tech, biotech, materials, and just human understanding of ourselves.

I don't feel old because I am excited about every day and what adventures it will bring. I can still beat the twenty-year-olds in speed chess, though my twitch reactions are not fast enough to keep up in modern FPSes. The trick is to challenge yourself every day in mental, physical, and spiritual ways. Having great kids helps with that!


> "I can't imagine a better time to EVER have lived."

I dunno. The other day, my wife and our 5 children and I were all having a relaxing evening together as a family in the living room, and something my 4 year old daughter said accidentally set off Siri who said "sorry I didn't catch that" and my daughter said "never mind" and we all had a good laugh. Then my daughter said "I love you, Siri." My heart sank. I'm probably not giving enough context to explain it, but she seemed to genuinely think Siri was a real (and neglected) person in the family.


And in a few years Siri will probably be a sibling intelligence. I do believe our new computer overlords will be kind to us ;-)


Same here, same age range. I miss nothing about the 80s or 90s except not investing in the right skills or stocks earlier. Today is full of far more potential and choice, imo.


I'm about that age as well. I had a pretty rough entry into adulthood thanks to my school peers. Being a "geek" was tough in those decades. I harbor no nostalgia for that hot mess.

So there's one more reason a lot of older folks on here might not be quite as stuck on the past as most.

Watching my son sling 'duinos and rock NodeJS on his raspis and not get bullied for it is sweet. The golden age is now.


Yes! It's true for my kids too, although I'm guessing mine are a tad younger.

It's interesting to try to put a finger on when computers became "cool". I suspect it's only with the rise of the smartphone, as even video games were seen as a tad nerdy in the mid-2000s.


I'm only nostalgic for retro-technology because it's what got me into the field. I might boot up a copy of SIMH and run a DEC PDP-11 image of RSTS/E or the like just to see it again and wonder at the fact I have control over a simulation of a machine I lusted over and learned so much from. That usually lasts about half an hour and then I'm on to learning something new. The novelty and innovation is what makes this industry so much fun. I can't imagine ever growing tired of it.
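For anyone who wants to try it, a SIMH run can be driven by a short startup script; a minimal sketch, where the disk image name and CPU model are assumptions (substitute whatever RSTS/E image you actually have):

```
; pdp11.ini -- hypothetical SIMH startup script
set cpu 11/70          ; a PDP-11 model RSTS/E ran on
attach rq0 rsts.dsk    ; attach the disk image to the first MSCP drive
boot rq0               ; boot the simulated machine from it
```

Then `pdp11 pdp11.ini` drops you straight into the booting OS.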

In terms of "good old times" I just lament the fact kids today don't have the unstructured freedom to explore the physical world like we did. People today (from my perspective) perceive risk very differently than older folks do.


I agree with your last point--it's a real shame my kid will never have the experience of getting on her bicycle at 7AM, exploring a 10-20 mile radius all day without a cell phone or GPS, and being trusted to return by sundown. I'd have no objection to allowing it, but every nosy ninny in the area would call 911 at the mere sight of an unaccompanied kid roaming the neighborhood.


I implore you to try the Microsoft Hololens. It changed my life, AR is the future!


No, it's you. I'm 43 and while I don't see myself as particularly nostalgic either, I have plenty of colleagues who are. I have no idea how we compare to general population.


I'm 55 and not nostalgic at all. Weirdly, I can hardly stand to listen to music of my teenage years because it puts me in a pretty sour mood -- and I was quite successful and happy as a teenager and in my early 20's.

Just yesterday I competed in the open division of a regional racquetball tournament, and in a month and a half I'll compete in the open division of a regional powerlifting meet. This is not a humble-brag so much as a contrast with the vast majority of the folks my age with whom I hang out.

Moreover, I'm heavily invested in growing intellectually which seems a contrast to many of my peers.

These folks have largely, though not all of them, adopted the "I'm too old" mentality about competing or growing in athletics or in intellectual pursuits.

The point is that this article, I believe, tends to comport with reality: many people begin to accept an easing of their personal standards and drive for growth because they feel "too old" to keep pushing and competing.


At my first job, my boss, who was in his seventies at the time, told me that once people get to a certain age, societal expectations of sharpness decrease rapidly. According to him, this led to many people eagerly taking up the mindset you describe.

Maybe your peers never really had the personal standards you thought they had in the first place, and simply stopped trying once the peer pressure eased up and they got a good excuse to lay around?

This type of pressure is often met with scorn as something we are too good for. In my case however, it really helped me become a better version of myself over time.


I'm 55. I'm nostalgic, but also forward seeking. There were certainly things about the 70's and 90's that were better than now. Not sure I'd want to go back though. I'd rather change the future...


The best thing about 15 years ago was that my body was 15 years younger. When people miss their youth, it probably has more to do with that than with any particular cultural artifact of the time.


For me it's: I miss my youth and health, and I'd love to be able to go back just so I could have the time and energy to work on the things I'd like to work on now.

I am still making things, it's just more difficult now.

But video games are a hell of a lot better now than they were back then. Legend of Zelda: Breath of the Wild? The Witness? Life is Strange? Persona 5? So, so good. And there's nothing stopping me from playing the old games when I feel like it (except having the time).

Also I wouldn't mind having another replay of my high school and college years with the confidence that I have now. A lot of my experiences with women would have gone a lot more positively if my attitude and approach back then had been what they are now (well, what they've been in the past five years; I'm in a solid relationship now). I wouldn't want to replay the sitting around a classroom or schoolwork part of it though.


Your outlook will change in your 50s for sure, things will get punishingly clearer then.



