100 Gbps achieved from space to Earth (news.mit.edu)
482 points by sizzle on Dec 7, 2022 | 151 comments



Don't optical links to space suffer terribly from atmospheric distortion?

Imagine looking at a shell on the bottom of a swimming pool while there are ripples in the water....

Usually the shell is a bit distorted. But at some points in time, you see two shells... And other points in time, none.

If the water represents the atmosphere's shimmering due to changing density, and the shell represents the satellite you're trying to receive data from, then at some points in time you won't be able to receive any data at all, because the receiver cannot see the satellite.

Network links that are up and down every few milliseconds aren't very useful for much apart from bulk science data download. Perhaps that's why this is marketed for science missions rather than space internet?


Yes, weather is still a challenge for space-to-ground communication. However, there are numerous techniques being developed to handle this. One is to use adaptive optics, similar to what is used by telescopes; a similar approach can be done digitally (like MIMO antennas in wireless). If the sky is overcast, none of these techniques help, but you only need relatively few ground stations to get to essentially above 99% availability.

That said, the first rollout of optical comms is to intersatellite links. There the only issue is loss due to diffraction (and pointing etc., but that's a different discussion). Optics has a big advantage because beam divergence due to diffraction is proportional to wavelength, and the wavelength of light is several orders of magnitude smaller. This determines the size of the transmit and receive antennas/apertures needed, or the loss for a given size. For realistic parameters optics would have 30-50 dB less loss, which translates directly to SNR and thus allows the much higher rates.
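For a feel for the numbers, here's a rough back-of-the-envelope in Python (a sketch with made-up but plausible parameters, not a real link budget; it ignores pointing loss, atmosphere, and detector noise):

    import math

    def link_ratio_db(d_tx, d_rx, wavelength, distance):
        # Friis: Pr/Pt = Gt * Gr * (lambda / (4*pi*R))^2,
        # with idealized circular-aperture gain G = (pi * D / lambda)^2
        g_tx = (math.pi * d_tx / wavelength) ** 2
        g_rx = (math.pi * d_rx / wavelength) ** 2
        path = (wavelength / (4 * math.pi * distance)) ** 2
        return 10 * math.log10(g_tx * g_rx * path)

    R = 4_000e3  # hypothetical 4000 km crosslink
    optical = link_ratio_db(0.1, 0.1, 1550e-9, R)  # 10 cm telescopes, 1550 nm
    radio = link_ratio_db(0.1, 0.1, 0.01, R)       # same apertures at 30 GHz
    print(f"optical {optical:.0f} dB, RF {radio:.0f} dB, "
          f"advantage {optical - radio:.0f} dB")

For equal 10 cm apertures this comes out to roughly a 76 dB optical advantage; real RF dishes are much larger than 10 cm, which is how you land in the 30-50 dB range quoted above.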

Source: Doing research in this area


Can confirm this from my own experience. Just elaborating on the availability requirements, the idea is that clouds aren't everywhere at the same time, so if you have enough in-space bandwidth, you can hop your traffic around the constellation until you reach a satellite that is above a cloud-free area.


Can orbital angular momentum modulation help here? I'm a deep noob here, but IIRC some forms of polarisation are more or less affected by 'water in the sky' and masks.


I'm not an expert in this area, but if I remember correctly a lot of satellites use radio links in the neighborhood of 10 GHz for ground communication because they've found that band isn't affected much by atmospheric conditions.


10-100 GHz has very few bands where water is an absorber. It tends to fall between the molecular and atomic modes of many molecules, so you only suffer free-space loss (which is still significant over long range at high frequency).


"traditional" geostationary satellites use various c, ku, ka and high ka bands. Anywhere from 6 to 30 GHz. Depends a lot on how new it is, what its intended use is and transponder/antenna configuration. and also depending on the intended size/type of earth stations and remote terminals.

That's in addition to low-data-rate tracking/telemetry/control radios in the L/S-bands.


A large reason why these are aimed at scientific missions is that the single-photon detectors used for receiving these transmissions are still incredibly expensive. They are basically liquid-helium-cooled arrays of nanowires (SNSPDs) which act like micro-bolometers - pretty sweet tech, but not commercial grade just yet.


Is there a reason you can’t use an avalanche photo diode biased in Geiger mode to achieve this?

If biased properly, an APD can be used for single-photon counting, and they only cost $70-250. I'm just not sure if they hold up in space without modification.


The nanowire localizes the photons in space, time, and energy. APDs can't do energy.

The cryogenics allow using cryogenic-level bandgaps/carrier-energies, so each optical photon turns into many carriers which then flow out the wire in both directions, the delay between the pulse on both ends localizes the impact along the wire.


SNSPDs have lower dark count rate, and especially lower timing jitter in the near-infrared than APDs.

The deep space optical demonstrations I know of all work in the near infrared (e.g. 1550 nm), just like most of terrestrial fiber optics used for communication. Probably because there's lots of infrastructure at that wavelength already, and it's a good passband through earth's atmosphere.


In network terms the result is packet loss, which is unavoidable. You could shoot through the atmosphere with lower-frequency light like infrared, but that suffers a reduction in bandwidth and is still subject to packet loss, just less.

I have had to live off a personal satellite in the middle of uncivilized nowhere for a year (literally). Packet loss is the true regulator of speed irrespective of bandwidth numbers.


There's a big section of the article that talks about that (at a high level) - essentially the whole second half, under the heading "From radio waves to laser light".


There's an entire field of Adaptive Optics for Free-Space Communications (AO-FSOC) where the turbulence of the atmosphere is actively monitored in real-time, and the laser wavefront is pre-compensated to minimise the effects of atmospheric distortion.


Hey, I worked on this. Glad to see it getting a lot of press :).


Is this likely to work at lunar distances anytime soon? I saw that the James Webb telescope people were unhappy about losing much of their communication time on the Deep Space Network to the Artemis 1 mission. Could this be more cost-effective than a major upgrade to the Deep Space Network?

How about at Earth-Sun L1 and L2 distances?


I would expect L1-to-earth communication to be problematic because you'd have to distinguish the signal from the background radiation of the sun.

It'd be interesting to know what the technical limits are in terms of output power and aim/focus. Generally, doubling distance means the signal power drops to 1/4, and the maximum data capacity of a communication link grows with the signal/noise ratio. So a 100 Gbps link might drop to 25 Gbps. You might be able to bring the signal/noise ratio back up by using a better detector or a more powerful laser, or aiming better. Or maybe the 100 Gbps data rate is limited by the transceiver, and there's actually plenty of S/N ratio margin that can be traded for range without affecting data rate at all.
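One nuance: the Shannon capacity is B*log2(1 + SNR), i.e. logarithmic in SNR rather than linear, so quartering the signal power hurts less than 4x at high SNR. A toy calculation in Python, with assumed (made-up) bandwidth and SNR figures:

    import math

    def shannon_gbps(bandwidth_hz, snr):
        # Shannon-Hartley: C = B * log2(1 + SNR)
        return bandwidth_hz * math.log2(1 + snr) / 1e9

    B = 10e9   # assume 10 GHz of usable bandwidth
    snr = 100  # assume 20 dB SNR at the demo distance
    print(shannon_gbps(B, snr))      # ~66.6 Gbps
    print(shannon_gbps(B, snr / 4))  # distance doubled: ~47 Gbps, not 1/4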


If L1 is a problem, then L2 would be a problem as well. After all, at L2 the spacecraft is receiving its commands from the direction of the sun.

However, the problem is not quite as bad as it seems. Spacecraft at L1 and L2 Lagrange points actually are in a halo orbit that "orbits" around the Lagrange point. Attempting to stay at exactly the L1 or L2 point is unstable, since gravitational forces tend to knock you away from that point. The halo orbits are much more stable. And for a spacecraft in a halo orbit, you never have to point your antenna directly at the sun.

The problem is solvable for radio communication at least. There are currently 4 spacecraft orbiting the Earth-Sun L1 point (ACE, DSCOVR, SOHO, and WIND) as well as 3 spacecraft at the L2 point (Gaia, James Webb, and Spektr-RG).


Optical to Orion (O2O) is a plan to do a lasercomm demo on one of the future Orion moon missions.

When the Psyche spacecraft launches and heads to the asteroid belt (it was supposed to launch in August), it will do the farthest (by far) lasercomm demo. I work in the group that made the SNSPD ground receiver. As my boss says, with a distance 1000x farther than previous space lasercomm demos, closing the link is 1 million times harder...

Fun fact: when the Psyche comm laser is pointed at Earth, the size of the spot will be roughly as large as California. Even with the largest optical telescopes, the loss in this link will be insane. That's why you need single-photon detectors.

As you get to farther and farther distances, one thing you can do is shift from on/off keying to large-M Pulse Position Modulation. This way you can save up the power on your satellite to send fewer but higher-power laser pulses, each of which carries more bits of data. I believe the DSOC mission will go up to M=256, meaning each pulse of photons received on Earth will carry 8 bits of information based on when it arrives within an alphabet of 256 time bins.
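A toy illustration of that mapping in Python (just the bits-to-pulse-position idea, not the actual DSOC modem):

    # 256-ary PPM: each symbol window has 256 time bins, and a single
    # pulse in bin k encodes the 8-bit value k.
    def ppm_encode(data: bytes):
        for byte in data:
            yield byte  # the byte value *is* the pulse position

    def ppm_decode(positions) -> bytes:
        return bytes(positions)

    msg = b"psyche"
    positions = list(ppm_encode(msg))
    assert ppm_decode(positions) == msg
    print(positions)  # one 8-bit symbol per (expensive) laser pulse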


The issue is not so much background radiation (you'd have similar issues with RF); your SNR is going to be reduced because of diffraction (which, as you rightfully point out, goes as r^2). However, the SNR would still be much better for optics because the diffraction loss shrinks with wavelength.

The reason why JWST didn't get an optical link is that people developing these things are rightfully conservative, and optical links in space are really still under heavy development.


Are you able to talk about what kind of hardware is on the satellite? I'm curious if it's commodity like the Mars helicopter or something made for the purpose.


Not OP, so I'm guessing purely on my knowledge of how most of the industry works, but there's likely some sort of FPGA with custom IP at the center, connected to a powerful optical transmitter/receiver.

Associated reading can be found here: https://www.esa.int/Enabling_Support/Space_Engineering_Techn...

https://ntrs.nasa.gov/api/citations/20150009433/downloads/20...

https://www.fierceelectronics.com/electronics/fpga-enables-h...


The main paper seems to be behind a paywall (rolleyes), but I did find this synopsis that covers the basic approach and hardware: https://digitalcommons.usu.edu/cgi/viewcontent.cgi?article=4...

It sounds like this test payload was part of a larger CubeSat built by NASA, but the actual datacom components seem pretty much off the shelf (besides the optics): 100 Gbps transceivers, an optical mux, and an EDFA - all common in terrestrial telecom - and some IR optics to collimate the beam.


Anything you can share on latency?

(congrats on the achievement)


Isn't it just "distance/c"? (or something like 0.999c since light is a bit slower in the atmosphere)


That's why I actually ask.

E.g. "light travels approximately 1.5x slower through optical fiber than in a vacuum"

https://www.commscope.com/globalassets/digizuite/2799-latenc...


Fun and somewhat related fact - this is one of the main advantages of hollow-core transmission fibers! As far as I know, currently only used to extend the range of HFT orgs...


Hollow core fiber? Total internal reflection requires the core to be a higher index of refraction than the cladding, so I wonder how that works!

Turns out, it functions differently. Instead of total internal reflection it has to rely on weird physics like photonic crystals. The pictures are absolutely wild.

https://www.rp-photonics.com/hollow_core_fibers.html

https://www.google.com/search?q=hollow+core+fiber&tbm=isch


Fascinating; thanks for the links. I wonder how the splicing process works for that.


Terrestrial radio links are similar in that they can be lower latency than fibre, though spectrum concerns can come into play.


Radio links can also often be straighter. Easier to go over someone's house than to ask permission to dig a trench.


("1.5x slower" == "its speed in optical fiber is 1/1.5 = 2/3 of the speed in vacuum")


It's all relative.


So nice to hear from you, thank you and your teammates for advancing the state of science and space exploration.


It was a great experience (I was on the satellite side of development).


That's awesome! I work with CubeSats too. Would you happen to be at the Albaorbital PocketQube Workshop?


This isn't that hard to do, even with 1960s technology.

It takes about 3 days with the Apollo launch system to get from the Earth to the Moon. So if you load roughly 800 4TB portable hard drives into one of these and fly it to the Moon, this will result in an average transmission speed of 100Gbps, if my math is correct.
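The math roughly checks out:

    bits = 800 * 4e12 * 8        # 800 drives x 4 TB x 8 bits/byte
    seconds = 3 * 24 * 3600      # ~3-day translunar coast
    print(bits / seconds / 1e9)  # ~98.8 Gbps average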


Good luck slapping 800 4TB hard drives from the 1960s onto a rocket. The Delta-v on that baby would suck


Well, I meant today's portable 4TB hard drives, using rocket and lander tech from the late 60s.


The purpose of this is to go the other way, from orbital sensor platforms that aren't intended to ever return intact. To do it your way, you'd need to send a disk-carrying vehicle into orbit, dock with the satellite, download onto the disks, and then return. Since the satellites don't exactly have USB ports on them, you couldn't do that anyway; in practice, you would still need to retrofit them with some form of high-speed data transfer mechanism to get the data onto your disks. The best option would very likely still be optical transmission, as trying to insert a cable into a USB port on something moving at 7,000 to 17,000 mph is not exactly trivial.

At that point, what advantage is there in sending a rocket to receive the transmission and then carry the data back when you can just use the exact same link to send the data straight to Earth?

Theoretically, they could carry some number of disks into orbit, fill them, and then drop them back on parachutes. Something like that is actually what the earliest surveillance satellites did. They dropped film from orbit that the military retrieved and developed.


Huh? What a bizarre nitpick. I could fit 800 4TB portable hard drives into my van, and drive to the next city. That would be a quicker data transfer than via the internet. I guess that means fiber isn’t anything notable?


The age-old adage is part of the OP's point, tongue in cheek of course.

"Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway."


The point is that latency also matters


Amazon Snowball works like that for large data migrations.


Or Amazon Snowmobile, which is a literal container truck on the road.


Oh right, that's what I had in mind. For some reason Snowball came to my mind instead of Snowmobile.

Was surprised to see that Amazon Snowball also exists and serves the same purpose :)


There is an old joke of data transfer speeds reachable with a railroad car full of CDs. Though I do not know the RFC number for TCP-over-Railroad.


A 1960s 4 terabyte portable hard drive, you say?


But goodness the ping time sucks in that one!


Sure, but if people only focus on bandwidth and ignore latency, as they usually do, my solution is easily done with off-the-shelf technology.


Where does one pick up a Saturn V rocket?


You don't "pick one up". You make it yourself.


“…to downlink all the data they could ever dream of.” I think they are significantly underestimating the dreams of users of comms.


Things are much more difficult from space, but DLR/Mynaric posted 1.72 Tbit/s over 11 km back in 2016, so we might optimistically not be too far off.

https://www.dlr.de/content/en/articles/news/2016/20161103_wo...


Yeah, satellites potentially have gigantic camera arrays and sensors of all kinds.


I wonder if re-transmission is the best solution to corrupt blocks. Some students who commuted to school on a train (MIT?) figured out that if they transmitted ECC-style blocks combined with RAID-style parity blocks, they could instead rebuild corrupt data.

It all depends on the kind of corruption. Periodic spikes, white noise, blackouts - different problems need different solutions.


This is known as "forward error correction". If the nature of the corruption is known and predictable, then you can design a reasonably efficient scheme to mitigate it. Otherwise, I suspect some amount of acknowledgement and retransmission at the MAC layer or above is a good idea. It really depends on the latency of the link, though, and the probability of corruption. If a round-trip delay is larger than a reasonable window size, and corruption is frequent, then FEC would help a lot. It's the only option for one-way comms such as digital broadcast television.
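To make the idea concrete, here's a minimal erasure-style FEC sketch in Python. Real systems use Reed-Solomon or LDPC codes, but a single XOR parity block already shows the principle of rebuilding lost data without a resend:

    def xor_parity(blocks):
        # Parity block: bytewise XOR of all data blocks.
        parity = bytearray(len(blocks[0]))
        for block in blocks:
            for i, b in enumerate(block):
                parity[i] ^= b
        return bytes(parity)

    def recover_one(blocks, parity):
        # Rebuild at most one missing (None) block from the parity.
        missing = [i for i, b in enumerate(blocks) if b is None]
        assert len(missing) <= 1, "XOR parity repairs only one erasure"
        if missing:
            rebuilt = bytearray(parity)
            for block in blocks:
                if block is not None:
                    for i, b in enumerate(block):
                        rebuilt[i] ^= b
            blocks[missing[0]] = bytes(rebuilt)
        return blocks

    data = [b"spac", b"e-la", b"ser!"]
    parity = xor_parity(data)
    received = [data[0], None, data[2]]  # middle block lost in transit
    assert recover_one(received, parity) == data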


> This is known as "forward error correction".

Fun factoid: Reed and Solomon were working at MIT Lincoln Laboratory (i.e., where the OP result is from) when they invented RS codes. ISTR they were also working on satellite comms, in which Lincoln has a long history.

[Source: I spent a decade there myself and drank the kool aid.]


Awesome, what are you up to nowadays? Anything interesting brewing you can share?


These days, most of my time is spent on video encoding and distribution for fun and profit: https://www.centaurdelivery.com


That’s super cool. Error correcting codes are truly amazing technology.


Which is exactly what Wi-Fi does: adjusting the amount of parity data depending on link loss, to keep retransmissions to a minimum.


Indeed. It's fun to look at the MCS table for something like 802.11ac: a lot of different rates are actually the same underlying modulation, but with varying amounts of overhead from forward error correction.

802.11 data frames also have an acknowledgement at the MAC layer. The radios dynamically ramp up the MCS rate until packets start dropping, then ramp the rate back down.


And still, wifi links are among the top sources of packet loss. Without sophisticated forward error correction it would probably be much worse, but it shows that this is not an either-or question. The correct answer is: do both.


I think communications theorists would argue that retransmitting is essentially an FEC, and likely a suboptimal one for these links. That said, real-world constraints don't always match up with theory. While modern optical communication makes extensive use of FEC (RS codes are not used much anymore; in long-haul it's LDPC codes), AFAIK FECs for these types of "intermittent" links are very underdeveloped. Moreover, if the probability of long outages is too high, one needs a lot of memory for the FEC, which becomes difficult to do in real-time processing (remember we need to process at >100 Gb/s).


US geointelligence uses error-correcting codes, but beyond that, individual detectors on sensor arrays fail all the time, and you can't go up there to replace them and you don't want to bring down an entire satellite when it still mostly works, so ground processing needs to be robust to missing data anyway. It's fairly straightforward to just interpolate pixels, but the full process is a bit more sophisticated and also involves building overlay layers rating the quality of each pixel, so follow-on processing like object and change detection is able to take into account not just hey, what am I looking at, but also how reliable is each individual part of the scene. To a human looking at the final image, though, you'd never know the difference.

I'm assuming most of what this would be used for is imagery collected from space. For whatever reason, other commenters seem to think this is for holding conversations, but it clearly says it's for data from science missions. Even if you were trying to talk to someone on the other end, though, it's rarely that big a deal if part of a word cuts out. That happens all the time with existing ground-based calls or even just two people in a loud room and the human brain knows how to handle it.


Another comment has pointed this out, but Reed-Solomon coding was invented as a method of forward error correction. It was only applied later to storage systems as an "erasure code" because it can detect and correct extremely long runs of bad bits; comparatively, it can correct many fewer randomly scattered bad bits.


On that error correction: there's a great video by 3Blue1Brown explaining how one such algorithm works:

https://www.youtube.com/watch?v=X8jsijhllIA


You must have a retransmission mechanism anyway. You can always implement such mechanisms on top of a retransmitting protocol.


It amazes me that we can do this from space, but I can't do this from my house. /s


Nearby to me is the Microsoft HQ campus. A few miles away is Amazon, not to mention most every other major software company. Even SpaceX has an office here. My home has one available ISP, and that's Comcast. I pay monthly for 100/30 what other homes pay for 1000 up and down. The home 50 ft behind mine has fiber. It's insane to me.


Maybe you can get a better deal from the neighbor than from Comcast.


What's your budget?


Happy to pay up to £5,000 for setup/digging and then up to £60/month for 1 Gbps in Central London. G.Network, Pure Fibre, BT (not a business address), and Hyperoptic all don't want to bite, while some have fibre in a street a short distance (~100 meters) from my house.


It's tough in the UK, especially London, especially especially central London.

I moved from NZ, where we'd done probably one of the best fiber rollouts of any country; I had 1 Gbps into an apartment from about 2016 or so. Before that I'd had 200-ish Mbps into a house in the suburbs in about 2014, thanks to the previous tenants having run a business from the place and paid for a line in.

Now that I live in the UK I can see why it's hard: there's too much history, and housing is either built up far too much or not built in a modern way. Also, when obtaining permissions from property owners in the case of flats, I think Kiwis are generally more likely to approve of something like that than people here.


Yeah, on the mews street I live on we are all property owners. Only the middle bit is city council property.


That got me thinking about trying to find the mission budget for TBIRD and I came up empty handed after a bunch of googling...


300 dollars is enough if you live in Switzerland; 30 for 10.


Line of sight is easier to arrange upwards.


Exactly. Which is why I'm always less than enthused by articles such as this. With the telecommunications industry, it seems breakthroughs such as this never make their way to the average or even slightly high-end user such as myself.

Wake me when telecommunications industry gets their shit together.


I might be wrong here (for lack of in-depth knowledge), but aren't the full-body motion of the CubeSat (for aiming) and the ARQ protocol the real development here?

I know the laser links themselves might not be new, and while it hasn't been done like this, we knew it would be possible considering we've done a ton of variants (sat-to-sat, station-to-station, over fiber, over glass, in open air, in a vacuum); even just hitting the reflectors on the moon and measuring the return somewhat shows that long-distance links can be done. Maybe the newness in the laser aspect is the small satellite and energy package compared to the high-powered lasers you might expect?


Kind of nice that CubeSats and (SpaceX) rideshares bring down the costs enough that more of such experiments can be done.


Useless but slightly relevant aside: I was almost part of the first ecommerce transaction from outer space. This one civilian guy was in the middle of a large luxury item purchase before his space ship took him to outer space. My bosses emailed him and I think he responded from space somehow. They wanted him to sign off on the purchase to make it official, but he didn't until he got back to Earth Prime. Almost!


I don't always want to be the party pooper, and this tech does have a lot of benefits when trying to transmit large amounts of data. However, you aren't going to be video conferencing to the moon or Mars with it. Not a single word about latency in the article.


For those curious, I went and looked it up: the one-way (not round-trip!) light travel time to the moon is about 1.3 seconds, and the one-way light travel time to Mars is 3-22 minutes or so, depending on how far away they are at the time.

So maaaaaybe you could have a really painful conversation with someone on the moon, but not Mars.

edit: This is WITHOUT any latency introduced by the link/protocol, or any routing of the message from one side of the earth to the other - just time-of-flight distance calculations, so the absolute minimum possible latency.


1.3 seconds delay for conversation is mildly inconvenient, not painful.


Just another bad day on Cisco Webex™


Really? I'd be driven crazy by a 1.3 second delay. 300ms is bad enough.

You'd be continually talking over the other person.


That’s the speed of light, an aspect we can’t improve. The article is about bandwidth, an area we are able to affect.


You can send the data through a different medium to improve latency which satellite optical links could help with.


This is already sending data via lasers traveling through free space (and air, which has nearly the same speed of light as free space). You're not going to get a medium with a faster speed of light than free space unless you get into space-warping theoretical stuff. Even if you managed to run a fiber optic cable between a satellite and the earth, it would be around a third slower than lasers through space.


Speed of light is still a limit, fastest possible roundtrip time between the Earth and moon is 2.5 seconds. Round trip between the Earth and Mars would be over 6 minutes when Mars is at its very closest to Earth.


In space, latency will be dominated by distance. In low orbits, lasers will have low latency, good enough for interactive applications. Geosynchronous, Moon, and farther can't be helped.


In space it's the distance; light has a speed limit.


> latency

Well, the Moon's semi-major axis is about 380,000 km, which means latency is lower-bounded at about 1.3 s.

Similarly, Mars' closest approach is 54.6 million km. That means a latency lower bound of 3 minutes.
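Plugging the numbers in:

    C = 299_792            # speed of light, km/s
    print(384_400 / C)     # Moon, one way: ~1.28 s
    print(54.6e6 / C / 60) # Mars at closest, one way: ~3.0 min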


The moon is only a second away. That's awkward for conversation, but perfectly possible for a structured meeting setting.


Laser link (not RF) implies a whole bunch of things:

1. Speed of light through the atmosphere, so basically c.
2. Line-of-sight is required, likely stationary base stations. Probably also subject to atmospheric and weather conditions.


Just to clarify, item 1 is exactly the same for RF.


Humans will adapt. If you want to call Mars, an AI will predict what the question or following sentence will be, so you can answer the predicted question in one swoop. Problem solved.


Or you could write an email? I don't see how that works at all; you'd still be sitting in silence for minutes while the response loops around.


They did that recently on the TV show Avenue 5.


The satellite is only 300 miles up. The processing power on the satellite itself probably matters more for latency. At 200 Gbps max (2x100), it's probably no slouch.


Encouraging to see the CubeSat program producing such notable progress. Hopefully, we'll see many more reports of engineering advancements from miniaturization and commercial components adapted to the rigors of space.


Anyone familiar with this start-up? https://www.aalyria.com/

Seems like they are miles ahead of the MIT team, which is still in the demo stage.


I don't think they have demonstrated any space-to-ground comms yet. There are a number of players working on this, but I would say Lincoln Lab is still the leader in this field.


Now they just need to work on the latency.



Yes, but at what latency? That's the big pie in the sky with space intercommunication. Typical (pre-Starlink) latencies with satellite are like 3,000 milliseconds.


On what service? I have done a lot of military SATCOM, and our normal round-trip latencies to geostationary could be as low as ~560 ms. Not anything like as low as LEO (which is far, far closer to Earth), but not unusable.


Why is latency the "big pie in the sky"? We're not talking about Counter-Strike or Halo CE matches here; we're talking about the ability to transmit large amounts of data from deep space very fast.


The fastest way to transmit data (albeit not from space) is FedEx (mailing hard drives). This is not ideal. Why? Latency. That's what makes networks better than pigeons.


Quick! Someone tell NASA they can FedEx to Mars!


I'm curious how many households this would support, taking into account usage distributions over time. Would this be sufficient for a medium city? A large town?


Zero households. This system is meant to communicate with one ground station at a time, and only for a very brief window. It's really only good for downlinking bulk data that was collected by the satellite, not Internet access.


I've seen numbers of about 2.7Mbps[1] for average peak traffic rates per subscriber on cable, which would give you about 37,000 users if I didn't mess up my bits and bytes.
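That is:

    link = 100e9             # 100 Gbps downlink
    per_user = 2.7e6         # avg peak per cable subscriber [1]
    print(link / per_user)   # ~37,000 users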

But as the FSO link is point-to-point, you would need something like a high-altitude platform station (HAPS) such as a blimp, UAVs with RF, or an RF tower on the ground to receive the FSO signal and then broadcast it to many users.

[1]https://www.nctatechnicalpapers.com/Paper/2018/2018-analysis...


You forgot latency... not usable for real-time communication.


How so? Stock markets are using lasers to reduce trading latency vs everything else available, so what do you mean, not usable for real-time communication?


To shorten the distance, not make it longer (as a satellite does). And I don't think it's laser but microwave.

https://www.six-group.com/en/products-services/the-swiss-sto...


There's an inherent limit to how fast you can communicate with objects in space, so a video call where you will have 1-2 s of lag (guaranteed by the laws of physics, not just occasionally) might be unusable. There is also the impact of distance: you might have even higher latency to the same object depending on the time of year (Earth-Mars today vs Earth-Mars a few months out).


Your figures are totally wrong. The distance to geostationary orbit is around 36,000 km, so at the speed of light (~300,000 km/s), that is about 120 ms of latency.

Low earth orbit is 3,000 km or so, meaning that's only 10 ms each way.

Odds are good that if you had a mesh network of low earth orbit satellites (like Starlink) you could actually get an antipodal point-to-point video call with less latency than with terrestrial fiber. That's not a function of bad terrestrial switching/routing: it's the fact that light travels faster through vacuum.


Latency to the Moon and back is around 2.5 seconds. Bouncing a signal off the Moon is a known pastime of ham radio operators.


"household" i assumed person was talking about geo or low earth orbit communication, everyone knows light can only go so fast.


You're dreaming, buddy. 5G can't even be successfully utilized in the richest nation on earth (the U.S.). This brand-new tech is decades from customer applications. Cool display of technology, but I won't get excited over something that won't improve the lives of anyone except the military-industrial complex and its beneficiaries while internet and cell services continue to degrade every year.


The US is notoriously bad for fast internet in general though. Here in Asia 5G is mostly rolled out in some places, I was getting 100Mbps on my phone the other day.


Since you've already started breaking the site guidelines again, I've banned this account.

If you don't want to be banned, you're welcome to email us at hn@ycombinator.com with reason to believe that you'll stick to the site guidelines in the future. They're here: https://news.ycombinator.com/newsguidelines.html.


Gamers in space with 300fps and no lag. In the future when I get sniped in matchmaking it’ll be by an astronaut.


Imagine the cooling system!

A gaming computer can use 1000W. Google says spacecraft radiators reject 350W/m^2.

If my math is correct (no guarantees) you'd need a king-sized-bed-sized panel just to cool your rig.
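Checking that napkin:

    rig_watts = 1000
    reject_w_per_m2 = 350               # figure quoted above
    print(rig_watts / reject_w_per_m2)  # ~2.9 m^2 of radiator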


I don't think that because a computer uses 1000 W of power it would produce 1000 W of heat, though. I think if that were true then there could be no computation done, since all the energy of the electrons would be converted to heat. I'm not sure what the efficiency rating of computers is, or how much is turned into heat.

Not to rain on your back-of-the-napkin math. Although I think your sentiment is right, the cooling systems for computers in space would need to be bigger, since you can't use convection to move the heat away.


It is all heat, except for the light from your monitor which then turns into heat when it's absorbed.

Computing isn't doing work in the physical sense. An HDD of ordered data doesn't have more potential energy than an HDD of data in a different order.


It's all the rage in spacetech these days... Especially among American and Chinese startups.


And somehow I cannot even get a 100 Mbps internet connection in Central London (Zone 2).


Does this have implications for real-time satellite video data?


Meanwhile in the US, consumer internet is barely at 1 Gbps


Signal in space travels 3 times quicker than fiber optics. This means that in some cases the latency may be lower than fiber.


It's actually only 50% faster. Light in an optical fiber travels at ~(2/3)c; in a vacuum it travels at c. So it's

c / ((2/3)c) = 1.5


Is there a reason some US stock markets are using lasers? Wouldn't fibre be just as fast as a laser going through the air?


Yes, this is why many high-frequency traders have installed laser or microwave links to their nearest stock exchanges, simply to gain a few microseconds' advantage in their trades. Fibre isn't as fast as a laser going through the air: light in fibre travels about a third slower (equivalently, free space is ~50% faster, as the parent comment states), unless you are using special hollow-core fibres, which are uncommon for now. More importantly, fibres rarely go in a straight line between two points; they wind their way through buildings, down into basements, through buried pipes, etc., and this all adds extra distance to the route and hence more delay to the signals. A line-of-sight link is the shortest route between two points.


That question doesn't quite make sense as posed: fibre links use lasers too. Maybe what you are asking about is free-space vs fibre links. Yes, some HFT players use free-space optical or microwave links because of the improvement in latency; they typically also eliminate all (or most) FEC, which introduces significant latency as well. However, they don't really care about data rate so much.

For almost all other applications, the factor of 1.5 you gain from free space vs fibre is not worth it. A trip around the equator in an optical fibre takes 200 ms, which is acceptable for almost everything. Moreover, the latency of a free-space route would likely not be much better, because one would need to regenerate the signal at least 3 times (which might add ~10 ms or so for each regeneration).
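The equator figure, for reference (a quick sketch that ignores routing and regeneration):

    C = 299_792                          # km/s in vacuum
    equator = 40_075                     # km
    print(equator / (C * 2 / 3) * 1000)  # in fibre (~2/3 c): ~200 ms
    print(equator / C * 1000)            # line of sight: ~134 ms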


Lasers are easier to set up (you just need two end stations) and don't need dedicated lines (you don't have to lay any fiber). If you have an existing connection between two points, then fiber might be better or the same, but if you had to set up something for cost/speed, then laser would win.


The higher the index of refraction of a medium, the slower light travels in it.

Air barely refracts light, and glass refracts it heavily.

BTW, laser links tend to work worse in bad weather. Microwave links are often used instead.


But what are the pings?


And yet here in 2022, cell reception quality is quite literally worse than it was in 2010. The telecommunications industry is really baffling at times. Just like when people were excited for 5G, I'm very skeptical that this technology will ever actually improve our lives in the short to medium term.

Overall, services across the board seem to be worse than they were 10 years ago. And yet the tech is certainly more advanced. Really disappointed as a whole with the telecommunications industry. Maybe that's the field I should have focused more on, as they seem to be struggling to improve things even with technological breakthroughs such as this.

Just a shame.


This is one point-to-point link (laser) with direct aim required. No forwarding, no more than one user.

I don't see how this relates to the use case of millions of broadband users you're talking about, where you're routing fiber cables all over the place. We do have 800-gigabit fiber in core networks; we just don't route it to every home because why would we. And 5G radiates to thousands of users simultaneously - regularly getting multi-gigabit speeds on mmWave as a regular user is pretty amazing to me.


I'm sure this varies by region, since anecdotally reception and bandwidth is significantly improved everywhere I've been in the past 10 years.


This. Rural Rocky Mountains here, and in the last year 5G has made service exponentially better.


Pretty rare for me to see it below 50 Mbps. Usually around 200 or above.


>Overall, services across the board seem to be worse than they were 10 years ago.

Do you live in the US? The pathetic state of telecoms in the US is a US problem, not a global problem. In many other places, our internet and cellular service is fast and cheap. But you guys don't like regulation or competition, so this is what you get.


MKBHD recently shared the same sentiment in a video. He feels, anecdotally, that 5G is worse than LTE. I feel the same way and recently switched my preferred network to LTE.

You might find a better experience doing the same. I've found 5G to be truly awful and I live in one of the biggest cities in America where you would expect better infrastructure.


5G is better in some use cases. LTE gets destroyed at sporting events or festivals, while 5G can support more devices and faster speeds.


[citation needed]


Now they can watch YouPorn on the ISS; don't forget to turn it off when livestreaming on NASA TV.



