
>Now, with that sort of context, there seems no possible chance that we can know how far, far away we are from the Manhattan project era of computing.

Just piling up years doesn't mean much. Yeah, civilization might go on for 100,000 years. Or 1,000,000 years. Or 20,000,000. That doesn't mean it will progress the same as it did the first x years.

Just think of how Homo sapiens has been around for 200,000 years, but progress was almost flat until around 4-5 thousand years ago. The mere availability of time doesn't ensure incremental progress at scale, or even monotonically increasing progress.

There are things such as low hanging fruit, diminishing returns, etc.

One can already see in physics that the most earth-shattering progress was made up until around 1930-1950, and from then on it's mostly slim pickings. When you start fresh, there is lots of low-hanging fruit to pick. At some point, you've reached several limits (including physical limits in measuring and experimental equipment, without which you can't go further, and which you can't overcome because they're, well, physical limits).

And that's even without taking into account a regression (e.g. because of a huge famine, a world war, nuclear war, environmental catastrophe, a deadly new "plague"-like thing, etc.).




>One can already see in physics that the most earth-shattering progress was made up until around 1930-1950, and from then on it's mostly slim pickings. When you start fresh, there is lots of low-hanging fruit to pick. At some point, you've reached several limits (including physical limits in measuring and experimental equipment, without which you can't go further, and which you can't overcome because they're, well, physical limits).

Certainly. And you've hit on the core reason why computing is so eternal: there aren't physical limits.

When a computer becomes ten times faster, we don't affect the world ten times more profoundly. So the physical limits that hold back clock speeds don't matter too much. But when we suddenly no longer need to own cars because we can hail one on demand, our lives become completely different.

The limitations are social, rather than physical. Our own minds, and our lack of ability to manage complexity, are the primary bottleneck standing between us and affecting the world, right now. Tonight. Imagine a hypothetical superhuman programmer who can write hundreds of thousands of applications per day. Think of how that'd reshape the world with time.

It seems true to say that the amount it would affect the world is linear w.r.t. time. The longer that superhuman programmer churns out permutations of apps in as many fields as possible, the more the world will change.

But that's exactly what's happening: hundreds of thousands of applications are being written per day, by humanity as a whole. Project that process forward to 102,283 AD. Are you sure the rate of change will fall off sigmoidally, the way it did in physics?


>Certainly. And you've hit on the core reason why computing is so eternal: there aren't physical limits.

On the contrary, there are several. The speed of light. The Planck constant. Heat issues. Interference issues. Issues with printing ever smaller CPU transistors. Plasma damage to low-k materials (whatever that means).

And all kinds of diminishing returns situations in Computer Science (e.g. adding more RAM stops having much of a speed impact over some threshold, or you can build a supercluster of tens of thousands of nodes but you're limited by the communication speed between them, unless the job is totally parallelizable, etc.).
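To put a rough number on that last point, here's a minimal sketch (assuming simple Amdahl's-law scaling and an illustrative 5% serial fraction, both just assumptions for the example) of how quickly adding nodes stops paying off:

    # Amdahl's law: speedup from n nodes when a fraction `serial`
    # of the job cannot be parallelized (5% here, purely illustrative).
    def speedup(n, serial=0.05):
        return 1.0 / (serial + (1.0 - serial) / n)

    for n in (1, 10, 100, 1000, 10000):
        print(f"{n:>6} nodes -> {speedup(n):5.1f}x")
    # 10,000 nodes buy you roughly a 20x speedup, not 10,000x.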

>When a computer becomes ten times faster, we don't affect the world ten times more profoundly. So the physical limits that hold back clock speeds don't matter too much.

Huh? What does that mean? If we can't make faster CPUs, then we're not getting much further. Increasing cooling etc only helps up to a point.

>Imagine a hypothetical superhuman programmer who can write hundreds of thousands of applications per day. Think of how that'd reshape the world with time.

We already have hundreds of thousands of applications. It's not exactly what we're lacking. We're talking about qualitative advances, not just churning out apps.


>If we can't make faster CPUs, then we're not getting much further.

On the contrary. In our day to day lives, we can do far more today than we could a decade ago.

In 2006, Intel released Pentium 4 chips clocked at 3.6 GHz. In 2016 we use quadcore chips where each core is about as fast as that. Yet we're way more powerful today than a mere 4x multiplier, if you measure what we can do. Think of how limited our tools were just a decade ago.

Clock speed isn't a great measure, but the point is that making CPUs faster isn't what makes the field more powerful.

It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time. If you're saying the rate will slow sigmoidally, similar to the progress in physics, I agree.

But the thesis is that the rate at which the world changes w.r.t the field of computing is unrelated to the total available computation power. Our minds are the bottleneck.

Compare this situation to the field of physics. The rate that the world changes w.r.t physics was related to how important the discoveries were.

The contrast is pretty stark, and it might have some interesting long-term consequences.


> It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time.

Interesting, because you seem to be arguing for each other's side. If you look at what we have then yes - for scientific purposes, total computational power has increased enormously and continues to do so. But if you ask what effects computers have on daily life of you and me, then not much has changed in the last two decades. Software bloat, piling up abstraction layers, turning everything into webapps - it all eats up the gains in hardware. Yes, the screens have better resolution and we now have magic shiny rectangles as another interface, but it seems like software only gets slower and less functional with time.

> Yet we're way more powerful today than a mere 4x multiplier, if you measure what we can do.

Scientists? Maybe. SpaceX can run moar CFD for their rockets on a bunch of graphics cards than the entire world could a decade or two ago. Rest of us? Not much. It really feels like our tools are getting less powerful with time, and I don't feel like I can do that much more (and if anything, the primary benefits come from faster access to information - Googling for stuff instead of being stuck with just the spec and the source code speeds things up considerably).


It's pretty clear at this point that I've failed to communicate. I'll bow out now.

I have a magical device in my pocket that can summon a car on demand.

Two effective hackers can set up a complete product within a few weeks, and then host it without having to think too much about what we now call devops. And when their servers start to melt, they can spin up however many they need.

We no longer get lost. Remember what it was like to get lost? Like, "I have no idea where I am. Hey, you there. Do you know how to get over to X street?"

These things were not possible ten years ago. Maybe people here simply don't remember, or choose to forget. Or maybe I just suck at writing. But every one of these incredible advances was thanks to advances in the field of computing, both theoretical and practical. For an instance of the former, see the recent neural net advancements; for the latter, Rails, Homebrew, the pervasiveness of VMs, and everything else that our forerunners could only dream of but we take for granted.

Have a good evening, and thanks for the enjoyable conversations. You and coldtea both do really cool work.


>I have a magical device in my pocket that can summon a car on demand.

As a gadget lover, it seems magical to me too, especially since I once lusted over things like a ZX Spectrum. But in the big picture, is it really life-changing technology? You could do sort of the same thing already in 1970 with a landline phone and a cab service (and in 1995 with a mobile phone). Not sure about the US, but where I live I've used cab services all the time since the eighties -- it took around 10 minutes after the call for the cab to get to you, so not totally unlike hailing one with an iPhone.

Same for not getting lost. GPS is nice and all, but was getting lost much of an everyday problem in the first place (for regular people of course, not trekkers etc.)? Maybe for tourists, but I remember the first 3-4 times I visited the States, when I did two huge road trips with McNally road maps, and I didn't have much of an issue (compared to later times, when I used an iPhone + Navigon). I got lost a few times, but that was it; I could always ask at a gas station, or find where I was on the map and take the exit in the right direction.

I'd go as far as to say that even the two biggest changes resulting from the internet age -- fast worldwide communications and e-commerce -- haven't really changed the world either.

Some industries died and some thrived -- as it happens -- and we got more gadgets, but nothing as life-changing as when printing or toilets or electricity or cars or even TV were developed (things that brought mass changes in how people lived and consumed, how states functioned, in urbanization, in societal norms and mores, etc., heck, even in nation-building -- e.g. see Benedict Anderson on the role of print in the emergence of nation states).

What I want to say (and this is a different discussion than the original one about limits to computing power over time, etc.) is that technology also has some kind of diminishing returns. Having a PC in our office/home was important and enabled lots of things. Having a PC in our pocket enabled a few more things (almost all because of the added mobility). Having a PC on our wrist? Even less (we already had the PC+mobility combination).

>Have a good evening, and thanks for the enjoyable conversations. You and coldtea both do really cool work.

Thanks, but don't let our counter-comments turn you off! The way I see it, we're painting different pictures of the same thing (based on our individual experiences and observations), and those reading can compose them into a fuller picture.


>In 2006, Intel released Pentium 4 chips clocked at 3.6 GHz. In 2016 we use quadcore chips where each core is about as fast as that. Yet we're way more powerful today than a mere 4x multiplier, if you measure what we can do. Think of how limited our tools were just a decade ago.

While the 2006 Pentium 4 chips might have had a 3.6 GHz clock, they were also slower in execution (due to architecture and x86 cruft) than a single 3.5 GHz core today. Factor in the much slower RAM and much slower spinning disks of the time and you can see where most of the multiplier comes from.
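A back-of-the-envelope version of that multiplier (the per-clock performance ratio is a made-up figure, just to show the shape of the calculation, before even counting faster RAM and SSDs):

    # Hypothetical numbers, for illustration only.
    p4       = 3.6 * 1.0    # GHz x relative per-clock performance (baseline)
    per_core = 3.5 * 2.5    # assume a modern core does ~2.5x the work per clock
    total    = per_core * 4  # four cores

    print(total / p4)  # ~9.7x the Pentium 4, if the work parallelizes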

But even so, I don't see us being "way more powerful" in what we can do today. In raw terms, mainstream CPU development has plateaued around 2010 or so, with merely incremental 10%-15% (at best) improvements year over year. In qualitative terms, are we that much more advanced? Sure, we've gone from HD to 4K (including in our monitors), but it's not like we're solving totally different problems.

>It seems like we're talking past each other. I was referring to effects that computers have on the world and our daily life, but it sounds like you're referring to total worldwide computation speed, and how it will change over time. If you're saying the rate will slow sigmoidally, similar to the progress in physics, I agree.

Yes, I was talking about advancements in computing as a field (and I think the thread started out discussing just this, hence the OP talking about Newton in the comment that started the subthread, etc.), and not about what we can apply existing computing equipment to.

>But the thesis is that the rate at which the world changes w.r.t the field of computing is unrelated to the total available computation power. Our minds are the bottleneck.

I don't think this observation is specific to the field of computing. Our minds are the bottleneck in all kinds of world-changing things (e.g. achieving world peace) -- applying computers is just another area where our imagination might limit us.

That said, increases in computational power would enable novel technologies that we can already imagine, which could bring far more change to the world than current computers can (not necessarily all for the better, either). Advanced AI is an obvious example, but so is more basic, computing-hungry stuff, from entertainment to drug research.


>Think of how limited our tools were just a decade ago.

Not sure what you mean here. What were the limits a decade ago?

Many top programmers still swear by vi(m)/emacs, both of which started 40 years ago. C is 40+ years old, C++ is 30+ years old, Python is turning 25, JavaScript is 20, and Rails is 10 years old.

Turbo Pascal, which is over 30 years old, was arguably on par with modern IDEs on very many fronts.

I would say the contrary: there has been very little improvement on the tools front.



