
You are far more optimistic than I. I expect developers will quickly find ways to write software which consumes an entire 1Gbps link.


I expect that Google already has plenty of ways to consume an entire gigabit link; e.g., Google Play music/movies/TV, Google Drive, etc. I'm sure they'd be more than happy for all of your data to be stored in the cloud, and have all of your access streamed to you in realtime over that gigabit link.

Come to think of it, I would be more than happy for all of my data to be held securely in the cloud so that I would never need to worry about backups or syncing my data between multiple machines. Assuming reasonable privacy and security practices, of course.


Assuming reasonable privacy and security practices, of course.

How about an in-kernel or FUSE Tarsnap driver? Are you reading this, cperciva?
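
The userspace half wouldn't even be much code. Here's a minimal read-only sketch using fusepy; the Tarsnap-backed data is faked with a hard-coded file, since tarsnap exposes no library API for this today, so treat that part as purely hypothetical:

    #!/usr/bin/env python
    # Minimal read-only FUSE filesystem sketch (fusepy). A real Tarsnap
    # driver would populate self.files from archive metadata instead of
    # this hard-coded stand-in -- that part is hypothetical.
    import errno, stat, sys
    from fuse import FUSE, FuseOSError, Operations

    class TarsnapFS(Operations):
        def __init__(self):
            self.files = {'/hello.txt': b'restored from backup\n'}

        def getattr(self, path, fh=None):
            if path == '/':
                return dict(st_mode=stat.S_IFDIR | 0o755, st_nlink=2)
            if path in self.files:
                return dict(st_mode=stat.S_IFREG | 0o444, st_nlink=1,
                            st_size=len(self.files[path]))
            raise FuseOSError(errno.ENOENT)

        def readdir(self, path, fh):
            return ['.', '..'] + [p.lstrip('/') for p in self.files]

        def read(self, path, size, offset, fh):
            return self.files[path][offset:offset + size]

    if __name__ == '__main__':
        FUSE(TarsnapFS(), sys.argv[1], foreground=True, ro=True)

Mount it with 'python tarsnapfs.py /mnt/backup' and the archive shows up as ordinary files. All the hard parts (caching, the crypto, write support) live behind that stub, of course.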


> I expect developers will quickly find ways to write software which consumes an entire 1Gbps link.

Twitter's .js app will be like 400 MB.


That's a good thing. Instead of relying on third-party providers with loads of bandwidth for cloud storage, you and your family could create your own private cloud, with direct peer-to-peer transfers across your gigabit links. Reliability would scale with the size of your family.
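
At its simplest, the direct transfer is one socket between two houses. A bare-bones sketch (the port and file paths are placeholders, and a real family cloud would add TLS, authentication, and actual sync logic on top):

    # One peer serves a file, the other pulls it straight over the link.
    # No discovery, encryption, or resume -- just the raw point-to-point
    # transfer that a gigabit pipe makes practical.
    import socket, sys

    def serve(path, port=9000):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind(('', port))
        srv.listen(1)
        conn, _addr = srv.accept()
        with open(path, 'rb') as f:
            conn.sendfile(f)      # zero-copy send of the whole file
        conn.close()

    def fetch(host, path, port=9000):
        conn = socket.create_connection((host, port))
        with open(path, 'wb') as f:
            while True:
                chunk = conn.recv(1 << 16)
                if not chunk:
                    break
                f.write(chunk)
        conn.close()

    if __name__ == '__main__':
        # e.g. 'serve movie.mp4' at mum's house, 'fetch mums-house movie.mp4' at yours
        if sys.argv[1] == 'serve':
            serve(sys.argv[2])
        else:
            fetch(sys.argv[2], sys.argv[3])

At gigabit rates that's on the order of 100 MB/s, so even raw HD footage moves between houses in minutes.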


The thing stopping that from happening today is not bandwidth.


In my case it is definitely one of the limiting factors. Try editing your family's latest HD home movies over the 2 Mbit/s upload speed that comes with most cable plans.
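
The arithmetic is what kills you; a quick back-of-the-envelope (the 30 GB project size is just illustrative):

    # Time to push a home-movie project upstream at different link speeds.
    size_gb = 30                     # illustrative: an hour or two of HD footage
    size_bits = size_gb * 8e9

    for name, mbps in [('2 Mbit cable upload', 2), ('gigabit fiber', 1000)]:
        hours = size_bits / (mbps * 1e6) / 3600
        print('%-20s %6.1f hours' % (name, hours))

    # 2 Mbit cable upload    33.3 hours
    # gigabit fiber           0.1 hours (about four minutes)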

Good software support is another hurdle; I believe AeroFS is one of the companies working on this, though since I'm fully comfortable with sftp et al., I never signed up for their beta to find out.


I expect developers will quickly find ways to write software which consumes an entire 1Gbps link.

You say that like it is a bad thing!


It usually is. Designers tend to say, "Gee, I've got an entire 1Gbps link! I don't need to think about efficiently using bandwidth ever again!" This is fine if the pipe never fills up, but if it does (as I am sure it will) you pay the price.

This is a cycle that has repeated for decades with every type of computing resource, and the end result is usually that a 10x improvement in hardware capacity yields a correlated but much smaller improvement in performance.


It usually is. Designers tend to say, "Gee, I've got an entire 1Gbps link! I don't need to think about efficiently using bandwidth ever again!"

This is true, but it's only a problem if everything else stays equal - and it doesn't.

I'm not defending badly performing, graphics-heavy websites whose only reason for existence is to display 24-bit versions of things that could be done in CSS.

BUT I am looking forward to the ability to transmit multiple streams of 1080p (and higher) video while my children play games with rich, 3D video assets and my electricity supply optimization company monitors the temperature of every cubic centimetre of air to determine if the air conditioner's fans need to be turned up.

Yes, that will chew up a lot of bandwidth - and I'll love every single bit travelling over that beautiful, beautiful fibre...


What I'm mostly getting at is that if the growth comes slowly, software will make better use of each bit of bandwidth.


That's fair enough.

But complaining that fiber is getting rolled out too quickly... that's a problem I'd love to be able to complain about in my area.

Realistically, it's a 3-5+ year project before it sees significant penetration.


AKA the Jevons paradox, first observed when more efficient steam engines ended up increasing Britain's total coal consumption.
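
The rebound is easy to model: when demand for the service a resource provides is price-elastic (elasticity above 1), an efficiency gain lowers the effective cost enough that total resource consumption goes up. A toy sketch; the elasticity value is made up:

    # Toy Jevons model: service demand = k * cost^(-e). Doubling
    # efficiency halves the cost per unit of service; with e > 1 the
    # induced demand outweighs the savings and total resource use rises.
    def resource_use(efficiency, elasticity, k=100.0):
        cost = 1.0 / efficiency                  # cost per unit of service
        service_demand = k * cost ** -elasticity
        return service_demand / efficiency       # resource needed to meet it

    e = 1.5  # assumed (made-up) price elasticity of demand
    print(resource_use(1.0, e))   # 100.0  -- baseline consumption
    print(resource_use(2.0, e))   # ~141.4 -- 2x efficiency, *more* coal burned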



