Hacker News

2030 is only 12 years from now, so we should assume it will be about as different from today as 2006 was, which was about the same as now except for smartphones everywhere. Yawn.

Maybe a better game would be: what single technology will become ubiquitous in 2030 that is at the early-adopter stage today?




Indeed, this is from the World Economic Forum, yet some of these predictions seem more likely for 2130 than 2030.

And nothing about social change, economic power shifts, etc. -- the pampered American calling in to see how the minions in China have done the day's work.


Smart glasses/contacts will bring about ubiquitous AR.


Smart glasses, maybe, if there's an iPhone-level breakthrough in optical tech.

What will likely be ubiquitous is simply a seamless, always-on, phone-based AR experience. Lock screens will become a relic of the 2010s as phones become AR-first devices, with apps joining SMS and phone calls as secondary use cases.

Glasses will be on the market and in use, but they'll probably be too bulky for ubiquity in the next decade. Smart contacts will be the long-term goal, but that tech will be missing a lot of prerequisites for some time.


> Smart glasses, maybe, if there's an iPhone-level breakthrough in optical tech.

Highly unlikely, unless we somehow get to treating each photon quantum-mechanically.

Engineering-wise, the hard part about 'smart contacts' is the power/data transmission problem. You'd need a wire hooked up to each one, maybe routed from the back of the eyeball. Might as well just use optogenetics and modify the optic nerve outright.

Physics-wise, the focal length of the lenses matters a lot. Just putting a flex-screen on contacts would lead to VERY blurry images, especially as the eye's lens changes shape to adjust focal length. You'd have no ability to discern display changes, as your data is essentially ON the front aperture. Maybe you could do DIC (Differential Interference Contrast) or SIM (Structured Illumination Microscopy) kinds of tricks and get really good resolution for things up close to your eye, but that would be a lot of effort to just maybe be able to count the hairs on an ant. Also, you have to be at least a focal length away to do even that, which is roughly the diameter of your eye.
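The focal-length point can be made concrete with the thin-lens equation. A quick sketch (the reduced-eye model and the ~17 mm focal length here are illustrative assumptions, not measurements): an object sitting essentially on the cornea, much closer than one focal length, can only form a virtual image, so no accommodation will ever put it on the retina.

```python
# Rough thin-lens sketch. Assumption: a "reduced eye" model with a fixed
# effective focal length of ~17 mm (real eyes accommodate, but a relaxed
# eye is close to this). 1/f = 1/d_o + 1/d_i gives the image distance.
def image_distance_mm(f_mm: float, object_mm: float) -> float:
    inv = 1.0 / f_mm - 1.0 / object_mm
    return float("inf") if inv == 0 else 1.0 / inv

F = 17.0  # assumed effective focal length of the eye, in mm
# Far away, reading distance, exactly one focal length, and on the cornea:
for d_o in (10_000.0, 250.0, 17.0, 0.5):
    d_i = image_distance_mm(F, d_o)
    note = ("virtual image, can never land on the retina"
            if d_i < 0 else f"image at {d_i:.2f} mm")
    print(f"object at {d_o:>8.1f} mm: {note}")
```

The last case is the contact-lens display: the image distance comes out negative (a virtual image), which is the math behind "your data is essentially ON the front aperture."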

Glasses are much better suited for this, as they sit at a sufficient distance from the eye. But still, you'd have to focus on the glasses in order to read anything. Sure, you could track each eye in real time and adjust the 'blurriness' of the glasses' output to match the focal-length changes of the eye's lens, but the processing there would be really demanding, and any lag would likely cause nausea as the vestibular organs go haywire. Imagine being on a boat in very rough, random seas.
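To get a feel for why lag is the killer, here's a back-of-the-envelope latency budget. Every number below is an assumed round figure, not a measurement; the ~20 ms motion-to-photon comfort target is the one commonly cited in VR work.

```python
# Back-of-the-envelope latency budget for the per-frame eye-tracked
# refocus/blur pipeline described above. All values are assumptions.
BUDGET_MS = 20.0          # commonly cited motion-to-photon comfort target
eye_tracker_ms = 5.0      # assumed: camera exposure + readout + gaze estimate
pose_filter_ms = 1.0      # assumed: smoothing / prediction
display_scanout_ms = 8.0  # assumed: ~120 Hz panel scanout

remaining = BUDGET_MS - (eye_tracker_ms + pose_filter_ms + display_scanout_ms)
print(f"left for rendering the defocus simulation: {remaining:.1f} ms per frame")
```

Under these assumptions only a few milliseconds per frame are left for actually rendering the focal-blur correction, which is why "the processing there would be really demanding" is an understatement.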

Likely, just using optogenetics to hack the optic nerve would be easier. Getting those proteins to behave well is an entirely different task, though, and there is really no path forward at this time for stimulating individual fibers in a transfected optic-nerve bundle.


Unless Murphy's law has anything to say about it, there should be a lot more progress in the same amount of time, right?


Robotic chefs. Nice handle by the way, no more waiting! :)


That assumes technology evolves linearly.



