blhack's comments

I will happily pay for high quality news. Every few months I check back in with The Financial Times to see if I can get it delivered to my house again (they used to deliver in Phoenix, but stopped, presumably they lost their printing partner here). My wife even tried to set up a PO box in another state and have the contents forwarded to us, but we could never get it working.

I also paid for Foreign Affairs for a long time, but eventually the quality of the paper (as in the physical material) dropped down a lot, and the number of ads went up.

Lapham's Quarterly (now defunct) wasn't really news, but I happily paid for that too.

Also plenty of substacks, patreon podcasts, etc.

--

My local paper just ran a story about a woman "trapped" in her Tesla because the battery died. They started the story with a "warning" to anybody who might be considering buying one. The solution, according to the article, was to locate the "secret" release button that opens the door. Of course, to anybody who has ever ridden in the front seat of a Tesla, this is an absurd framing of the physical door handle, which opens the door in the exact same fashion as every vehicle door manufactured in the last 100 years. If you own a Tesla, you have probably had to tell somebody not to use this handle (since it seems like such an obvious way to open the door) because it doesn't crack the windows and could damage the window seal (or so the warning that pops up when you use it says).

I'm not going to pay for that.


Substack simply has better quality.

A handful of times I've been involved in things that made it into the paper. Technical laws being passed, corruption, complaints about a system failure... In every instance the only thing that was really correct was the simple facts (law X passed, thing Y failed, person Z arrested). Anything more nuanced tended to be 'technically' correct but was phrased in a way that often would make you think the opposite of what actually happened.


How could this possibly comply with European "right to be forgotten" legislation? In fact, how could any of these AI models comply with that? If a user requests to be forgotten, is the entire model retrained (I don't think so).


This "AI" scam going on now is the ultimate convoluted process to hide sooo much tomfuckery: there's no such thing as copyright anymore! This isn't stealing anything, it's transforming it! You must opt out before we train our model on the entire internet! (and we still won't, spits in our face) This isn't going to reduce any jobs at all! (every company on earth fires 15% of everyone immediately) You must return to office immediately or be fired! (so we get more car data, teehee) This one weird trick will turn you into the ultimate productive programmer! (but we will be selling it to individuals, not really making profitable products with it ourselves)

And finally, the most egregious and dangerous: censorship at the lowest level of information, before it can ever get anywhere near people's fingertips or eyeballs.


Machine Unlearning is a thing; see e.g. here [0] for an introduction.

[0] https://ai.stanford.edu/~kzliu/blog/unlearning


> how could any of these AI models comply with that? If a user requests to be forgotten, is the entire model retrained (I don't think so).

I don't believe that is the current interpretation of GDPR, etc.: if the model is already trained, it doesn't have to be deleted due to an RTBF request, AFAIK. There is significant legal uncertainty here.

Recent GDPR court decisions mean that this is probably still non-compliant due to the fact that it is opt-out rather than opt-in. Likely they are just filtering out all data produced in the EEA.


> Likely they are just filtering out all data produced in the EEA.

Likely they are just hoping to not get caught and/or consider it cost of doing business. GDPR has truly shown us (as if we didn't already know) that compliance must be enforced.


Have you ever seen the Northern Lights with your eyes? If so I'm curious where you saw them.

I echo what some other posters here have said: they're certainly not gray.


Was there a similar backlash to this identical ad from LG in 2009? https://www.youtube.com/watch?v=NcUAQ2i5Tfo


The popular sentiment has changed from enthusiasm about "digital", to disillusionment about big tech inserting themselves into our lives to monetize everything.

In 2009, smartphones were a novelty, and the iPad had not been announced yet. People were wowed by the new capabilities that "multimedia" devices were enabling. They were getting rid of the old, outdated, less capable tools.

Nowadays "multimedia" is taken for granted. OTOH, generative AI is turning creative arts into commoditized digital sludge. Apple acts like they own and have the right to control everything digital. In this world, the analog instruments are a symbol of the last remnants of true human skill, and of the physical world that hasn't been taken over by big tech yet. And Apple is forcefully and destructively smushing it all into an AI-chip-powered, you-owe-us-30%-for-existing Disneyland dystopia.


I guess people back then assumed it wasn't really possible to replace all those instruments and tools with a small phone.

So the ad was probably punching up in a way back then.

Today there is a real recognition of how pervasive digital devices and AI tech are becoming.

With all the might and influence Apple and tech companies now have, this ad might have evoked a sense of punching down.


Apple's ads team should apologize to LG for stealing their ad.


Apple steals _everything_ and never apologizes.

https://www.theregister.com/2012/10/12/apple_licenses_swiss_...


Synecdoche NY, specifically this scene: https://www.youtube.com/watch?v=Z9PzSNy3xj0

But there is also a scene (which I cannot find online) where the main character (a playwright) is explaining that the only way to make his play work is to make everybody a main character. He realizes that everybody, everywhere, is living out a rich life and that they're the main character of that life.

It is a fantastic movie and I highly recommend it.


Oh my.

I love Philip Seymour Hoffman (RIP), but I found Synecdoche, New York to be sooo incredibly anti-interesting in its multiple layers of ironically unironic self-indulgence that I really truly hated the film.

I have never had such a strongly negative reaction to any other movie. I'm having trouble thinking of even a somewhat-close second. Maybe Blue Velvet?

If I heard someone else say that, I'd imagine that they had missed something, or just didn't "get it", but I don't think that's the case! :)


Wow I love both Synecdoche and Blue Velvet! What didn’t you like about Blue Velvet?


The first revulsion that leaps to mind is Dennis Hopper and the gas mask rape scene.

There are others, but it's been a while, and my memories of it are mercifully fading. I prefer to not interfere with that process!

If the goal of art is to induce thought and make an impression, these are both successful projects. :)


I watch a lot of films, mostly older ones these days, I don't want to estimate how many.

This film has haunted me with its beauty. Not as airy as Malick's work, a little softer than Aronofsky (The Fountain and Black Swan are masterpieces).

I think Mr. Nobody attempts a weaker version of this film; Cloud Atlas also.

Gattaca, Inception, 2001, K-PAX, maybe.


Yes! I love Aronofsky's work. The Fountain is a hauntingly beautiful film. The poster from this film hangs in my children's bedroom and I can't wait to watch it with them when they're ready for that type of story.


I knew a guy who worked on that film and still didn't know what it was about. I personally enjoy the scene where she's touring a house with a broker while it's on fire.


Probably the most memorable scene and not a spoiler:

https://youtu.be/WFwS_Dqd-IU?si=A4h7BSXx90CxqvhV


This is for very low bandwidth text communications when you're out in the country and can see the sky.

Stuff like this has existed from companies like Garmin for some time. This is very cool, though.

Here is when this was announced: https://www.youtube.com/watch?v=Qzli-Ww26Qs

Pretty cool! Also kind of funny to see the T-Mobile CEO trying to hype people up and Elon sort of reining it in.


But low-bandwidth text is all I need. I guess I won't be renewing my InReach Mini subscription forever (although that form factor is pretty nice).

I don't need to be able to stream Netflix when I'm in the backcountry, it's just that my wife insists I need to be able to get a helicopter if I break my leg.

Now we need app developers to log off their fiber-served wifi when writing messenger apps, and log into a high-latency, low-bandwidth, high-packet-loss network instead so Messages will actually open instead of whatever it's trying to do to upload my location history and download contact pictures...


> I won't be renewing my subscription for my InReach Mini forever (although that form factor is pretty nice)

Before getting an InReach, confirm a 406MHz emergency beacon doesn't fit your bill [1][2]. (I have this one [3].)

You can't text a loved one. But with no subscription, a years-long battery and powerful radio that works around the world, you can call emergency services to your precise location.

[1] https://www.sarsat.noaa.gov/emergency-406-beacons/

[2] https://www.rei.com/learn/expert-advice/personal-locator-bea...

[3] https://www.rei.com/product/161982/acr-electronics-resqlink-...


My wife hikes in Western Colorado quite a bit and has helped rescue 3 parties over the past 4 years. In one case, a young woman (an ER nurse) was in dire distress for no apparent reason and fading quickly (in retrospect possibly COVID related sequela). Initially, emergency responders were going the route of a land-based rescue, but the Mini allowed the urgency to be communicated more clearly. The helicopter touched down on a scree field (but couldn't land) while my wife and a couple others helped the EMT load her. Docs said that she had less than an hour or so before long-term consequences or death.

Any emergency beacon is better than none, but two-way communication can be a literal lifesaver. I hope that the Starlink system is eventually linked into our 911 infrastructure and available to anyone with an LTE phone regardless of carrier.


Two-way communication is important to prevent false positive emergencies, too. There have been several times I've used my inReach to message that I'm going to miss my planned check-in time, but everything is OK - no need to alert SAR.

Also to communicate on-the-fly decision-making to inform potential SAR - "I'm making good time, going to head up this extra peak before continuing the planned route."

I'm happy to have a cell phone backup to the inReach, but I don't see this Starlink offering as a replacement. That goes double if you're out in winter. Phone batteries aren't great in the cold.


Phone batteries are fine down to -20C and usually are functional to about -30C.

You should have no problem with that anywhere in the world as long as you keep it in an inner pocket. Perhaps pack wired earphones just in case you need to make a long call while keeping the phone itself warm.

Notably the battery should not be charged below about 5C. If you do, it will be permanently damaged.


Respectfully, I don't know about that. I wouldn't trust it if my life was on the line. I go for walks and bike rides when it's -25C to -30C air temperature (before windchill), and I've had old Samsung Galaxies and newer Pixels both crash if I take them out to take a photo, presumably due to voltage dip.


This is exactly the reason why I'm sticking with the CAT S6x series phones and willing to put up with mediocre performance/features, as far as smartphones go.

They've been the only ones that don't just turn off in really cold temperatures, even without babying them in warm pockets.

The general ruggedness is also pretty good to amazing, depending on how fragile your previous phones were. For example: it survived a ~10 meter (32 ft) drop onto rocks, when everyone was convinced it was done for.


Do you have one of the newer models with the thermal camera? I was very interested in seeing how well that works.


Yes, currently the S62 Pro. What would you be interested in knowing?

To be honest, I first thought it would be a gimmick, but being able to see heat / rough temperatures has proven itself quite useful.

But you don't really realize it until you have it available. It's kind of like having another sense available to you.

Random examples of where it was useful:

- finding shitty chargers/electronics which were really hot while doing nothing -> wasted power

- spotting a water leak before it was visible (think cold spot in the middle of the ceiling)

- checking car tire alignment (one side hotter than the other)

- finding buried hot water pipes

- finding where cold air is leaking in during winter

- spotting damp areas

and probably others that I'm forgetting right now


Thank you for taking the time to respond to me and giving me a good breakdown!


Yeah having a newer phone helps.

Once I was stuck with a ridesharing car open and off in -20C because my old Samsung S8 throttled so badly I couldn't even reboot it. It was completely unresponsive, with the flashlight still on, discharging it even faster.

I couldn't leave the car either, being responsible for damages during the rental.

After a bit of time I remembered that to force shut down newer Androids you have to hold volume down and power. Why they didn't keep it as a long press is beyond me.


To someone that lives in a place where we just had an entire summer of 40°C and higher, -25°C sounds like Antarctica, or Mars. And you do that on a bicycle? Clearly, you must be an alien!


Thanks for the laugh. I send my Australian friends photos of snowbanks and iced-up hair. The hardest part is having a plan if something goes wrong, e.g. I'm not changing a tire or fixing a chain at that temperature. I've been tempted to wait on a bus at a slow train crossing. At least with cold I can layer up and move. I'd die at +40C.


Actually, an Italian guy is about to start a bike trek to the South Pole. This is the first link I found to a page in English about it: https://aminhacorrida.com/en/omar-di-felice-comes-back-to-at...


> -25°C sounds like Antarctica, or Mars

I had that in my hometown in eastern France at 800m altitude; a town a few km away has a record of -41. -15/-20C was an almost yearly occurrence when I was a kid.


Do people in those areas need phone cases that are insulated to retain heat?


I don't know if those exist. I imagine if you are a lineman or working in the oil field, you might have a rugged phone/radio or leave it in the truck cab.


> Notably the battery should not be charged below about 5C. If you do, it will be permanently damaged.

This depends on the specific battery chemistry. The BMS should prevent the charging scenarios which would damage the battery. For some it's 0C not 5C. Some can handle charging below 0C.


I have never seen a phone BMS that prevents charging due to undertemperature


Isn't that a stock feature on pretty much every chip? I mean, if you've got a temperature sensor for overtemperature protection, including undertemperature protection is trivial.

Even the cheapest chips from TI and ST seem to include it. And modern cell phones often advertise extra-fast 20W+ charging, so I doubt they're using the cheapest chips.


I can already imagine freezing to death while my phone refuses to charge because it wants to protect its precious battery


My iPhone refuses to charge when too hot or too cold.


My last two phones (Xiaomi 9T and Pixel 7) have struggled below about -5C. They still work but the battery drops very quickly so you can't rely upon them. Keeping them in an inner pocket works but I then take them out and see the battery drop. The sudden temperature change might not help there.


Samsung Galaxy XCovers have user-replaceable batteries, so you can at least keep spare(s) warm.


iPhones often turn off around -20c with a warning about temperature, it happens to me every time I travel to Lapland in the Winter.


My iPhone 6s and 12 Pro beg to differ. Both saw serious battery drops in pretty normal 10-15F Chicago winters while I was ice skating outdoors or even just going for a long walk with the phone in my pocket.


> I hope that the Starlink system is eventually linked into our 911 infrastructure and available to anyone with an LTE phone regardless of carrier.

If Starlink's solution is really completely unmodified LTE, I'd expect 911 calls to work regardless of being a T-Mobile subscriber, just like for existing terrestrial networks (where you can dial 911 even if your carrier does not have signal in a given location, or even entirely without a SIM card).


I don't think LTE can cope with the extreme doppler shift present when the other end is a satellite in LEO


The shift is actually not that extreme in LEO (a few kHz at most).

And couldn’t the satellites mostly adjust for that, given that the relative doppler shift should be pretty constant between mobiles in the same spot beam?


LTE can't even handle a high-speed train; 1 kHz causes significant degradation.

https://eudl.eu/pdf/10.1007/978-3-319-66628-0_41

Doppler shift is not constant for a stationary observer on Earth.


Obviously it's not constant, but if it's uniform (enough) within the footprint of a single spot beam, the satellites can adjust for the global component both in their transmitter and receiver, and the mobile devices only have to compensate for (or tolerate) their local difference from that.
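For intuition, the first-order shift is just f·v_r/c. A quick sketch of the magnitudes involved (the carrier frequency and velocities here are illustrative assumptions, not Starlink's actual parameters):

```python
# Rough first-order Doppler estimate for an LTE carrier seen from LEO.
# All numbers are illustrative assumptions.
C = 3.0e8  # speed of light, m/s

def doppler_shift_hz(carrier_hz: float, radial_velocity_ms: float) -> float:
    """First-order Doppler shift: f_d = f * v_r / c."""
    return carrier_hz * radial_velocity_ms / C

carrier = 1.9e9  # ~PCS-band carrier, Hz (assumption)

# Orbital speed at LEO is several km/s, but only the radial component
# toward the user matters, and the satellite can pre-compensate for the
# beam center, leaving a much smaller residual per device.
print(doppler_shift_hz(carrier, 7600))  # uncompensated worst case, tens of kHz
print(doppler_shift_hz(carrier, 200))   # residual after per-beam correction, ~kHz
```

This is consistent with the point above: the raw shift is large, but if the satellite corrects for the common component per spot beam, each device only sees the small residual.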


I don't know how much I'm allowed to say, but at least part of what they're doing is normal Cat-1 LTE that any modem that supports it will be able to pick up


How are they accounting for the presumably very high timing advance?


As far as I understand, the timing advance only matters to compensate for differences in distance/latency between different devices (i.e., to avoid uplink transmissions talking over each other on the same frequency).

The satellites can correct for "global" latency themselves (unless there are higher-level parts of the LTE/E-UTRA radio protocol that can't tolerate such long latencies, e.g. ARQ timers).

For GSM as a half-duplex technology, there's also the matter of devices not being able to transmit and receive at the same time, but I believe the same principle applies: as long as the timing differences between different devices in the same spot beam aren't too large, that's something the satellites could globally correct for.

The same probably applies for doppler corrections.


> you can call emergency services to your precise location.

I guess this is the first time it's occurred to me that hikers and backpackers and such carry them too, which is odd because I hike and backpack.

I picked up the InReach after finding myself 200 miles offshore in a sailboat without an engine or battery power (which meant no VHF, electronic navigation equipment, autopilot, or even red/green lights for others to see us at night), nor any way to make power.

I assume the 406 Mhz beacon has a range of a few miles?


406 MHz is, I think, referencing PLBs and EPIRBs, which I'm guessing you're familiar with from sailing. The power is high enough that satellites can be used to get a gross location, with radio direction finding used by the rescue party.


AHA! Yes, thank you.

Other than the compass (which was only visible in the daytime), the EPIRB was the only piece of equipment we assumed to work on the aforementioned voyage. We of course didn't test it as we were able to make it back safely, but I should probably learn more about how it functions.


I’m surprised that things like US military-style tritium-illuminated compasses aren’t more popular or widely used for cases like this.

They’re very safe, last something like 10 years, and aren’t really expensive (compared to “I need to know what direction I’m going and have no way to do so, or can’t see it”).


It would have been a delightful addition and honestly would have kept us much safer. It was fairly easy to keep Orion's belt just port of the mast, but occasionally cloud cover or weather would make that more difficult.


With no propulsion, I'm assuming you were just drifting and not really doing too much navigation. Just curious if you tried any celestial navigation at night to attempt any sense of direction?


Yeah. We mostly just picked out a set of stars and kept them roughly to position. Every so often we'd try to illuminate the fixed compass with a phone to ensure that we hadn't wildly drifted from our expected bearing (also, stars move over the course of the night and I have no idea how much)


If there was wind, they may have been sailing.


It was an old race boat, we had ample wind, and the owner casually bragged that he'd replaced the sails with ones that were "about 30% too big for the boat."

We were sailing like demons. One of my biggest regrets of the trip was that we didn't have any wind or speed instrumentation to let us know how fast we were going, but it felt extremely fast.


That sounds like a hell of a story. How did you find yourself in that situation? And how did you get out of it?


If it were me, I’d assume backup paper navigation charts and either a handheld battery powered backup GPS or my smartphone that has an app to parse data from the GPS chip. Could use primarily dead reckoning plus turn the phone on every couple hours to update true location. Most sailors are using paper charts for tricky sections anyways — you just normally have a constant GPS location provided by your navigation system.

There are also lots of established shipping lanes you could find that would be relatively densely traveled and give you periodic feedback on which direction to go. A bit more dangerous due to collisions without lights, but as long as your retroreflector is hoisted, big ships should see you clearly on radar. Plus, you'll generally be swapping sleep shifts and always have someone manning the helm on a sailboat, though seeing large vessels at night can be somewhat more difficult.


Mostly all that's right. We had three phones and a few battery packs (though we'd exhausted them more quickly than expected and lost one to rain), because the paper charts aboard were from the late '80s, when the boat was built, and were all for the Pacific Ocean it was built to sail.

The only other complicating factor was that due to the August Florida heat, nobody really had the luxury of sleep for the first couple of days.


> Most sailors are using paper charts for tricky sections anyways — you just normally have a constant GPS location provided by your navigation system.

My experience in Marine SAR is that this is no longer true. Apps like Navionics, SeaPilot, etc - often without any backup at all - are by far the most common form of navigation.


Yeah. I've spent a lot of time learning charts and a terribly small amount of time actually using them. In this case, we didn't have the right ones anyway.


Offered to help the owner (a friend of a friend of a friend) of an older racing sailboat move his boat from Florida to Maryland. It was planned to be us and 4-5 other crew, but (red flag the first) it ended up being only me, my wife, and the owner.

There were lots of red flags before we found ourselves squarely over our heads -- an overheating motor was explained away as having had an undersized thermostat installed. Plausible enough. The lack of a bimini in August in Florida was just forgotten, but led to pretty significant overheating for me. The autopilot not working we didn't realize until about 15 miles offshore. That the autopilot was draining all the other batteries we didn't realize until we lost navigation lights. Etc., etc.

This was embarrassingly recent, but suffice it to say a LOT of lessons were learned. The boat was foreign enough that I accepted too many "explanations" as comfort when they should have been a reason to abort. Failures compounded and voila, we're now 15 hours away from civilization, flying a spinnaker through thunderstorms at night and positively hauling ass.

Eventually we got the owner to appreciate our discomfort enough and how over our heads we were to head to safety in Charleston (he'd still just been heading east, which was baffling -- but apparently it is not everyone's first instinct to go to safety when life-threatening failures crop up) but that brought its own perils -- coming into a crowded channel at night without navigation lights isn't advised. We were shining a flashlight onto the sails, but the flashlight would change modes if it wasn't held steadily enough. One of the storms we'd sailed through had killed the owner's phone as well as his phone charger, so the little bits of navigation we had were precious, but necessary coming into shallower coastal waters. A cargo ship coming out of harbor kicked us out of the channel just enough that we ended up grounded and stuck pretty squarely about a half mile away from restaurants, but late enough on a Sunday that there wasn't any other traffic we could hail down. (A radio would have been lovely in that case)

I was sunburnt and heat-stroked enough that despite guzzling water constantly, I hadn't urinated in 24 hours. Though we were in relative calm, given the totality of circumstances I used the last of the dwindling battery on the last usable phone among us to call the Coast Guard for evacuation. The owner of the boat stayed behind.

To paraphrase Cheryl Strayed -- If you'd asked me at any point in the journey, I was absolutely miserable, but on the whole it was miraculous. Gained a ton of skills. Learned a ton of red flags to look out for. Experienced a lot of firsts, not the least of which included a crash course in celestial navigation. And being 200 miles offshore and awake the whole night during the Perseid meteor shower was absolutely brilliant.


"If you'd asked me at any point in the journey, I was absolutely miserable, but on the whole it was miraculous. Gained a ton of skills."

Sounds a lot like ocean racing sailboats under normal circumstances.

I quit ocean racing (after about 5 seasons) when I had a sudden realisation that the only part that was any fun in the last 3 days was sitting in the bar after it was over and talking about it. And there were ten times as many non-crew people there enjoying that with us as there were crew on the boat I raced on.


I quit when I realized I despised the cold so much that if I fell in I would probably give up after 5 minutes.


Yep, that sounds like a wild ride!

I'm glad you lived to tell the tale, stranger.


Thank you. I also am.

Honestly glad my wife was aboard as it lowered my risk threshold enough to get me to want to abort.


What happened to the owner? Some say he is still there with his doomed boat to this day...


lmao -- we checked on him the next morning. He didn't answer my call, but that made sense as his phone had died, so we drove by where the boat had been stranded; it wasn't there, so he'd moved. We felt guilty that we couldn't get in touch with him to offer him a ride back in our rental car, but we caught up with him a few weeks later and he'd managed to get the boat into a slip somewhere down there.


Re: range, it's still satellite-based. It's purely for emergencies, though, rather than for convenience.


Thank you


Something else to keep in mind is that the dedicated PLBs often have 10 times the transmit power as the combo satellite messengers.


As usual, there's pros and cons, and talking about just one side causes bias. Yes, a PLB can work with more obstructions like trees. However:

With a PLB, satellites can't accurately measure where the beacon is; the search and rescue crew will find you by local radio reception.

With a satellite messenger, the gadget knows its own GPS location to a pinpoint and can send that as part of the SOS signal. And you can text the SAR staff and say "I fell and I'm now stuck on a tall ledge halfway up the cliff".


>Now we need app developers to log off their fiber-served wifi when writing messenger apps, and log into a high-latency, low-bandwidth, high-packet-loss network instead so Messages will actually open instead of whatever it's trying to do to upload my location history and download contact pictures...

FWIW, it's perfectly possible to simulate arbitrary levels of bandwidth/latency with a variety of tools even while having a fiber connection. For example, Macs have long had the "Network Link Conditioner" tool as a free utility included with the Additional Tools for Xcode package, which allows simulating configurable bandwidth, latency, and packet loss. There are similar tools for Linux as well; tc is powerful.

Most firewalls with quality traffic shapers also allow at least the first two at the network level. I used that on OPNsense to simulate a VSAT connection with 750ms latency and 4/.5 to a specific VLAN, so that we could just connect systems and then see how applications worked. It's been a while, but it was eye-opening. A nice thing about that approach is that you can just connect devices to a given VLAN, which makes testing back and forth super easy. Wired is trivial of course (have a switch where each port is a different test VLAN), but even for wireless, if you have WAPs that support PPSK/MPSK, hopping between test VLANs just means reconnecting with a different password.

Simulating packet loss with a network device seems to be more niche and complicated, and I don't know if any firewalls put a GUI on it. tc queue disciplines can be used, though, so any Linux device with two network ports can sit inline and modify the traffic to simulate loss/latency; an RPi would be fine for that.
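As a concrete sketch of the tc approach, something like the following degrades a link to satellite-ish conditions (needs root; the interface name `eth0` and all the numbers are assumptions you'd tune for your scenario):

```shell
# Add latency (750ms +/- 50ms jitter) and 2% packet loss with netem
tc qdisc add dev eth0 root handle 1: netem delay 750ms 50ms loss 2%

# Chain a token bucket filter under it to also cap bandwidth
tc qdisc add dev eth0 parent 1:1 handle 10: tbf rate 512kbit burst 32kbit latency 400ms

# Inspect what's configured
tc qdisc show dev eth0

# Remove everything when done
tc qdisc del dev eth0 root
```

Running this on an inline bridge (rather than on the developer's own machine) is what makes the VLAN-per-condition setup described above convenient: the impairment applies to anything you plug into that network.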

I agree it'd be nice if more app developers would test under less than ideal conditions, particularly since it's so trivial to do so. I think most simply don't think about it though, same as many GUIs (web or local) not testing for stuff like various types of color blindness.


> it's perfectly possible to simulate arbitrary levels of bandwidth/latency with a variety of tools even while having a fiber connection.

Yes it is perfectly possible, but it’s very difficult. Having just gone through this process I can attest that this is not something the average “app developer” is going to be capable of doing, at least on Linux and Windows.

The raw tools are there, but they are complicated, poorly documented, and require network engineering knowledge on top of software development knowledge.

Probably there is some nice package to wrap up the whole "generate a virtual network and add latency + packet loss between these two endpoints"; I just never found it.


It's not that they won't be "capable of doing" it, it's that the effort required will be more than they care to put into solving a problem they can't sympathise with and/or that isn't a requirement handed down from their boss. App developers have to jump through all sorts of technical hoops to get certain things done, but since their jobs depend on it, they power through. If "work reliably even in shit network conditions" were a baseline requirement that their jobs depended on, you bet they'd find a way.

As for how hard it actually is: iOS has already been explained, and the Android emulator has network latency and speed simulation right in the GUI, as do most web browsers. And there's always the option to switch your phone to 2G-only in the settings and/or go into the basement or an elevator.


>If "work reliably even in shit network conditions" was a baseline requirement that their jobs depended on, you bet they'd find a way.

This is the real problem - there are deadlines and most projects don't even consider working in questionable network conditions. Developers aren't going to put in the extra effort when it doesn't contribute to the job they are asked to do.


I have had to deal with poor App/Play Store reviews for both of "It doesn't work on the train while I'm commuting in the morning" and "It doesn't work at the event when there are 100,000+ other people there" flaky network related problems.

I always at least _ask_ in the requirements gathering stage for a new mobile app: "how much effort do we want to dedicate to app performance/reliability under marginal network conditions?"

As it turns out, pretty much all mobile app owners are as apathetic about that as most mobile app developers. (On the other hand, once you've got a reputation for being able to handle those sorts of flaky network edge cases, you get more and more work for the sort of apps that benefit from them. The downside of that is it's never the flashy resume-building-apps that come to you for this.)


> Probably there is some nice package to wrap up the whole “generate a virtual network and add latency + packet loss between these two end points”, I just never found it.

Linux has a userspace tool for this: trickle

https://linux.die.net/man/1/trickle

https://stackoverflow.com/questions/10328568/simulate-limite...


trickle is clever, but when I've tried to use it for "use no more than this" bandwidth shaping (basically going for "rsync --bwlimit" but for a set of "related" rsync processes) the arguments given had very little to do with the amount of bandwidth actually consumed. (We found another way but definitely look at tc instead.)


> Yes it is perfectly possible, but it’s very difficult. ...

shrug

Disagree, it's no harder than any other Linux subsystem: you can use `tc`, or you can attack it a different way by using virtualization and then messing with it at the vde layer. I've added random storage and network latency for testing code before; it took a day or so to get working perfectly, but now it's in a shell script.

Hm, alternatively.... you're right, it's very difficult. You should hire a software engineer, such as myself, to do it for you. ;-) ;-)


Linux has been explained to death in the replies, but on Windows you can use clumsy[1] - it's as simple as it gets.

[1] https://jagt.github.io/clumsy/


Like parent said, Network Link Conditioner is exactly that tool. It's even built directly into iOS (though only visible in Settings once "used for development" via Xcode).

Both Firefox and Chrome dev tools also have built in throttling of network connections. https://blog.nightly.mozilla.org/2016/11/07/simulate-slow-co...

The tools exist and they're easy to use. Most people just don't bother.


> For example, Macs have long had the "Network Link Conditioner" tool as a free utility included with the Additional Tools for Xcode package, which then allows simulating configurable bandwidth, latency, and packet loss.

Well, I sure wish Apple would make use of that functionality themselves once in a while.

It's frustratingly impossible to enqueue an iMessage message while out of signal and have it be automatically delivered once back in cell coverage. Bizarrely WhatsApp, a third-party application (with all the background execution restrictions that go with that), manages to do just that!


Firefox's console in its Network tab also has a way to choose what tier of bandwidth you'd like: GPRS, 3G etc. Thought it was worth mentioning


I find Firefox and Chrome's network-limiter feature to be "unrealistically unreliable" insofar as it limits data transfer bandwidth, and simulates some kinds of latency issues, but doesn't seem to implement things like DNS suddenly stopping working for 5 seconds, or sporadic network drop-outs, or how some things will arrive out-of-order, or an entire web-page blocked by a single synchronous <script> from one particular external host.


I agree. It needs to simulate a phone right on the edge of connectivity, with the random 5 second dropouts, followed by 50 Mbps but downstream only, followed by another dropout, followed by 1kbps up and down, etc.

That's the way real networks behave, and what real users have to manage with.


> Now we need app developers to log off their fiber-served wifi when writing messenger apps

You should try WhatsApp - there really are not many messenger choices when you have unreliable 2G(!) connectivity. For all the faults of Meta, WhatsApp seems to be the only company that cares about people with bandwidth measured in kbps, sometimes fractions thereof.


> For all the faults of Meta, WhatsApp seems to be the only company that cares about people with bandwidth measured in kbps, sometimes fractions thereof.

WhatsApp was built with low-bandwidth/high-latency/high-packet-loss in mind well before Meta acquired them.

I suppose we can give them vague amounts of credit for not making those use cases worse since the acquisition.


They have made it worse...

It used to work just fine on 1kbps.

Now it is pretty much unusable over dialup or GPRS.


To be fair, it used to be "encrypted" using a scheme not too far removed from ROT13; now it's using the Signal protocol, which probably requires quite a few more bytes per message.

For example, messaging a contact with multiple devices connected means that your phone has to encrypt your message to each of their clients independently.


WhatsApp also gracefully handles being offline and queuing messages for later.

That can’t be said of iMessage.


Offline queueing is a fantastic feature! I know someone who lives in a remote area whose phone only gets network connectivity when they walk up a specific hill. In many ways, they use WhatsApp the way most people used email in the dialup days: read and compose offline, then go online once a day to send and receive new messages.

Edit: I sometimes get a chuckle when they say "Look at this cool picture I took", and then receive the picture some days later when the conversation thread has moved on. I guess they'd have stayed in the coverage area for long enough for the entire image to upload.


This is so bizarre to me. How can a native application not implement that, while a third-party application nails it!?


I can tell when people are in a low-signal area: I get a green (SMS) bubble, then exactly the same message in blue some time later.


Which creates issues when you have signal but don't have a subscription that can actually send that message over SMS.

Sometimes SMS failover creates further problems, because when travelling you're more likely to have intermittent wifi than intermittent SMS access.


I think luckily you can turn off auto SMS fallback - but it does indeed default to on.


Any idea how it compares to Signal? They should be essentially using the same protocol. I also used it with good results with slow connectivity, but never tried on bad 2G...


The encryption protocol is fairly unrelated to how an app handles poor connectivity. It’s a bit like comparing different websites handling poor connectivity differently even though they are all served over HTTP.


I have tried to use Signal in a location that had voice but no data (a very old, very rural tower in the middle of nowhere) a few years ago, and Signal was completely worthless. Even though it can send and receive SMS, it seems to require a data connection to send.


Yes, everything except voice and SMS requires data


Yes, that would be the case for any OTT messenger (so, anything but SMS).


I believe their communication layers are fairly different – WhatsApp used to use something based on XMPP at least until a while ago; I'm not sure what Signal uses, but I vaguely remember it being based on HTTP?


Can definitely be, I never investigated in detail, although I used alternative open source clients for both (signal-cli and a long time ago yowsup), so the information should be in there...

I hoped that Signal's use of HTTP was as a fallback in case a direct connection could not be established to use a more "compact" protocol, but I possibly don't remember correctly what I've read...


Unfortunately I don't have any data points on how Signal operates 2G speeds


Telegram also has pretty amazing support for poor-connectivity environments


Because Whatsapp is huge in Africa IIRC


It's huge in many countries where Internet service has been expensive or difficult to come by. Case in point, all my relatives in Eastern Europe use WhatsApp, and with better bandwidth options, some are using FB Messenger for video calls,.as well.


Facebook used to have (has?) 2G Tuesdays, where devs had to test their stuff at 2G speeds, for this exact reason.


Everyone I know in Belgium/The Netherlands/... uses Whatsapp. It's even got into the language: "to app someone", i.e. "iemand appen".


While it's plausible that I, too, will cancel my InReach subscription over this, I'm not entirely convinced.

My InReach (I have an Explorer, not the mini) is an exceedingly rugged bit of kit with an effectively infinite battery life.

It is, by no means, a sleek or svelte device like my iPhone but if I fell down on a trail and broke all my fingers (or had severe frostbite or something) I'm fairly confident I could operate the device with some combination of my teeth, nose and toes.

I'm also fairly sure that it will keep functioning even if I had accidentally dropped it off a cliff, in a blizzard, into an ocean.

The mini is basically the same kind of rugged so you know exactly what I mean :)

The battery also lasts basically forever.

I turned mine on about 48h ago before a trip and it's been on ever since and I just checked and the battery is at 86% with 2-minute tracking intervals.

Anyway I am still very happy for the starlink service for the 99% of people who don't have an InReach, it will definitely save lives, I'm just not yet ready to ditch my InReach.


For the InReach use case, there's still something to be said for a rugged, long battery life, dedicated device (assuming you're cautious enough to have one and subscribe to the service). On the other hand, everything is a tradeoff and if your lifestyle is such that having a dedicated inReach is unlikely to ever be a key piece of safety equipment, there's a lot to be said for just relying on your phone.

There's very little bandwidth you need when you're in the backcountry, especially given some reasonable pre-download of maps and other information.


I'm not a backcountry skier or much of a hiker but if I was, I'd much rather have a rugged device that can survive much better than an expensive piece of glass. I've damaged phones in the past by them getting bent in my pocket during rigorous activities. If my crumpled body is laying at the bottom of a cliff, I doubt my phone fared much better. A solid body device with a simple interface could be the difference in me being found in a couple hours or a couple weeks.

Direct to phone satcom is neat and I can see plenty of applications for it but I know that I wouldn't stake my life on it simply due to it being a consumer-grade phone.


I do a lot of hiking and have never broken my phone in the wilderness (or in town either, I've scratched and cracked the screen, but never enough to make the phone inoperable). I generally travel with someone else, so it's even less likely that we'll all break our phones at the same time -- if I were hiking by myself, I'd be a lot more worried about having an injury prevent me from calling for help rather than having a broken phone prevent it.

I do carry an InReach when I hike in remote areas but if my phone could make satellite emergency communications, I'd stop paying $144 a year for the InReach.

If InReach dropped the price to $50/year, then I'd consider still subscribing to it.


100% - They don't make phones strong enough to be reliable as a life safety device. As an active person I've broken my pocketed phone several times in incidents that were otherwise unmemorable as they were minor spills.

I laughed at that picture of the mountaineer in full Alpine climbing attire, on a tablet.

What's he doing on it, checking his email? The view behind him isn't good enough?


I'm thinking the same.. if I've fallen off a cliff or crashed my bike and broken my leg, possibly in shock too, I want something rugged & simple to activate like a PLB


I do trail-running, backpacking and skiing in the Colorado backcountry. I'd never trust my safety to a cell phone due to the fragility, temperature sensitivity and poor battery life. I carry an inReach and have used the SOS feature. That thing is bullet proof and can tether to my cell phone anyway if I want the convenience of the phone for typing messages etc.


So I went looking for stats to validate how reasonable this concern is, based on what on average kills folks: https://www.projectuntethered.com/hiking-statistics/#:~:text...

I think your concern is warranted: vehicle accidents and falls may call into question the durability of the device in question. However, based on the mix of accidents, and especially the prevalence of medical misadventures, it's probably a worthwhile feature even if you don't opt for something designed for extra durability.

Another logical concern is battery life. A huge chunk of non-fatal misadventures are day hikers and the dominant cause is actually dummies wandering off trail as opposed to injury.


I literally just destroyed an iPhone which was in my pocket, in a case; I apparently smashed it into a rock (which I wasn't even really aware of at the time, other than that it was rugged hiking generally). Probably protected my thigh.

I do carry a spare battery etc. I suppose I could keep my phone in a rugged case in my pack and use something else for pictures and maps but that's sort of getting away from the idea of one device you always have right with you.


> I've damaged phones in the past by them getting bent in my pocket during rigorous activities

I got a 4g nokia dumb phone just for this occasion - few days of battery life, sturdy build, good for small trips/hikes where you usually have signal or as a backup phone


this is my opinion, too. and i do a good bit of time in the backcountry.


Personally I just keep a spare cellphone battery in a pocket for emergencies. Would be awesome to have this service. I'm using a T-Mobile MVNO and last I checked they had no idea if T-Mobile would be extending them the service.


there's certainly a use-case for something like an inreach as a dedicated safety device for true backcountry adventures. but there's a whole lot of cases that aren't that, but you can still be out of cell service and have an urgent need to communicate.

for me, even just driving to the next town over, or going for a bike ride that takes me less than two hours from my house, there's places where the cell connection cuts out. i'm not packing an inreach for an hour drive, but it's nice to know that if i need it i can count on my cell phone.


Not sure if it’s still this way, but Kotzebue AK was the perfect place to test low bandwidth internet. The over the horizon satellite uplink the entire town shared could only allocate 56k speeds to each client and had horrific packet loss.

I had to rearchitect an entire file synchronization and uplink system for resilience while sitting in a hotel room in the middle of a blizzard during the shortest days of the year so I could deploy software for clients up there.


> The over the horizon satellite uplink the entire town shared could only allocate 56k speeds to each client and had horrific packet loss.

I found your uplink. IDK the date of the photo.

https://uploads.alaska.org/blog/Carl-Johnson-Blog/Kotzebue/_...


So weird to see a dish that size almost pointed horizontally!


Yes! It literally pointed at the horizon. Awesome find.


It's actually trivial to simulate a high-latency low bandwidth network on Linux with the traffic control subsystem, one need not switch networks at all.

  tc qdisc add dev enp1s0 root netem delay 300ms 200ms loss 10% 80%
This command delays all packets in and out of the wired interface enp1s0 by 300ms, +/- 200ms, and drops 10% of packets (the second value, 80%, is the correlation with the previous loss, which makes the drops bursty).


all packets in and out

Are you sure about that ? According to man tc, it only works on egress.


Nope, I would definitely trust the man page.


Where's the low bandwidth portion of the command?


You can also limit interfaces with TC to a certain bandwidth. It's super easy!
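For instance, netem (delay/loss) and tbf (a rate cap) can be stacked on one interface. A hedged sketch ("eth0" is a placeholder, the numbers are arbitrary, and it needs root, so it's written as a dry run that prints the commands):

```shell
#!/bin/sh
# Chain a bandwidth cap (tbf) under a delay/loss qdisc (netem) on one
# interface, shaping everything that egresses it.
cmd() { echo "# $*"; }  # dry run; change the body to "$@" (and run as root) to apply

# Root qdisc: 200ms delay plus 3% random packet loss.
cmd tc qdisc add dev eth0 root handle 1: netem delay 200ms loss 3%
# Child qdisc: cap throughput at ~56 kbit/s.
cmd tc qdisc add dev eth0 parent 1:1 handle 10: tbf rate 56kbit burst 16kbit latency 400ms
# Undo everything when finished:
cmd tc qdisc del dev eth0 root
```

Like netem on its own, this only shapes egress; to degrade inbound traffic too you'd apply the same thing on the other end of the link (or use an ifb device).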


I have an InReach Mini that I use when I go offshore fishing, where I'm out of cell range by 40 miles or more. The Mini is nice, when it works. Most messages I send take 15-20 minutes to send, and the delay is even longer on receiving most of the time. And that's in the middle of a large body of water with no obstruction of the sky.

I've been waiting with bated breath since the announcement of Starlink in the hopes that it could knock out the three satellite services I use the most: InReach Mini for text comms and location sharing, XM Radio, and XM Weather.

I'm really looking forward to when this becomes available for individuals and not just businesses. Also really looking forward to 12v Starlink equipment and affordable marine plans.


So basically app developers should build low-bandwidth apps - that would be a win for high-bandwidth users too. Everything transfers massive amounts of data, quickly exhausting even gigabit fibre.


There are also environments that are high-speed but high-cost, or capped at some number of GB/month.

I like how I can tell an iPhone that a wifi AP is a “low bandwidth” one so it holds off on many tasks. But you can screw yourself over if you tether a computer, even a Mac.


TripMode is great in those cases.


I’ve found recently that TripMode keeps forgetting the apps I’ve blocked. Haven’t done any investigation, but it’s disconcerting when I check the list and find that iCloud is once again enabled.


The US is fairly unique in that everyone still texts. The rest of the world use apps that need data, like WhatsApp.

While SMS would be great for low-bandwidth use cases like this, most people don't use it to the point where having contacts accessible with it on the other end would be an issue.


As far as I understand, this is because US carriers throw in unlimited SMS for free while data is costly. In my country, it's possible to get unlimited SMS, but for an additional fee on some carriers (other carriers just provide N SMS per month), and data prices are rather low even now.


If they're on WhatsApp, then they've got their number. Surely everyone can receive an SMS? If it's an emergency, you're not going to care what app you're using, just so long as you can communicate.


This messaging app already exists; it's called Telegram. I was on a boat around the Aeolian Islands and Telegram started working once my phone got Edge (=2G) coverage. Slack and WhatsApp were unable even to connect to the server without at least 3G.


> Now we need app developers to log off their fiber-served wifi when writing messenger apps

or also any web app.

In a Rails app I recently started working on for a customer I saw many controller actions starting with a call to a method called simulate_delay_for_development. I checked the code and when run in development mode it basically sleeps for a random value between 0.1 and 3.0 seconds.

I'm sure that there are gems for that or proxy servers sitting between the browser and the server, but it's a cheap and effective way to make every developer experience delays with zero installation costs.


Browsers can simulate that without a proxy too: https://firefox-source-docs.mozilla.org/devtools-user/networ...

Chrome can also throttle the CPU power: https://www.wikihow.com/Throttle-Your-Browser-for-Testing


I was recently in a fairly remote area of the Sierras and my iPhone went totally haywire and then died. Somehow water had gotten inside. That makes me a little concerned about depending on my phone for everything in the backcountry, but maybe what I need is a really good case for it.


Or get a cheap "armored" waterproof Android phone. There is a whole category of Android phones that look like Black and Decker designed them, some of them with extra large batteries. My son has one that is pretty much indestructible. Also ugly enough that some other kids who cornered him and wanted to "see it" a while back decided it wasn't even worth stealing.


I have one of those - “builder phones” as some call them - for outdoor activities.

Cheap, fucking indestructible, and you could probably bludgeon someone to death with it in a pinch.

It also has a thermal imaging camera which is surprisingly handy for stuff like “finding where in the garden my cat is hiding at night” or “figuring out what’s overheating” or monitoring the state of my compost bin.


Not cheap, but CAT phones are pretty awesome: https://www.catphones.com/en-gb/


Depending on the remoteness and risks, I would go for a better case and a backup simpler and more reliable device.


The simpler and more reliable backup device is the Garmin InReach.


I can take my inreach mini underwater up to 50 meters with the official accessory dive case (I bottom out at 45). Short of making a custom case I can't do that with a cell phone and I'm not sure it's rugged enough to take even small waves.


> Now we need app developers

If I’m reading the docs correctly the satellites will offer SMS service directly so you won’t need an app, you’ll just need your carrier or roaming provider to support it.


Unfortunately, it's incredibly hard to convince an iPhone to actually send an SMS to a specific number, at least if the contact is registered for iMessage...


If you have no data connection it will fall back to SMS.


What if I know that the recipient doesn’t have a data connection, but I do?

This is all incredibly clunky and one of the weakest points of iOS in my view. How can it be so hard to provide a switch that lets me use SMS proactively?


Obviously we shouldn't pretend sat phones from Garmin and others haven't existed for decades, but Starlink direct to cell is very different. First, as others have mentioned, traditional sat phones require dedicated hardware with a chonky antenna; this works with normal cellphones, which means order of magnitude higher adoption. Maybe more fundamentally, the satellites that existing sat phones use are in much higher orbits, so there is irreducible latency (due to the speed of light) and bandwidth costs that LEO Starlink sats will likely crush in the next few years.


Order of magnitude? Try 5 orders of magnitude.

I’d be willing to bet the ratio of cellphones to sat phones is > 10,000 to 1


Tbc, this was a typo by me. I definitely meant “orders” (hence no “an”). Thx.


It’s too bad you’ll still need a “regular” cellular sub to take advantage of this. If I didn’t, this gets me a lot closer to “cutting” the wireless plan cord.

I’m usually in wifi range and being able to have a few kb/s of comms is basically the lifeline I need.

Most of my cell activities are cached on my phone: podcasts, maps, some increasingly out of date weather.

Would be even cooler if it could “broadcast” some regular stuff like news and traffic updates (dunno how that could integrate with Waze…)

The first kb/s gets me a loooot of value and each additional is less useful than the last. Data is very much diminishing returns.


> It’s too bad you’ll still need a “regular” cellular sub to take advantage of this.

For the emergency use case: Would it even be legal to require a subscription for 911 calls?

As far as I know, these are possible even without a SIM card in the US and most other countries.

> The first kb/s gets me a loooot of value and each additional is less useful than the last. Data is very much diminishing returns.

And it’s arguably priced accordingly: The difference between metered plans and unlimited data is pretty small these days in my observation.


By lifeline I don’t mean 9-1-1, but rather some basic barebones comms.

> The difference between metered plans and unlimited data is pretty small these days in my observation.

That’s if you have a competitive environment. In Canada, the cellular providers are the fixed-line internet providers and very much don’t want to tank one for the other.

(But maybe Freedom Mobile, without much of a physical wireline footprint, will pull a t-mobile and go all-in on a 5G wireless home internet plan… we can pray)


> In Canada [...]

Oh, yeah, all of these considerations only apply to a somewhat functioning/competitive market, unfortunately.

I have my fingers crossed for you! I've had friends of mine live in Canada for a while – it sounds as bad as (or even worse than) what was going on in Germany in the 2000s, when mobile operators had spent ridiculous sums on 3G licenses and almost ruined themselves in the process financially and as a result just sat out on infrastructure investments for the next decade or so.


Likewise, this would be enough to replace my cell plan. I don’t need high bandwidth stuff, just text communications, PagerDuty, gmail. The rest I can do over Wi-Fi.


You can get 1 year global plans from esimdb.com pretty affordably for this use case right now.


Thanks, I’ll check them out


Given the price per bit per second per square meter, I'd be surprised if this will be more economical for operators to provide in anything but the most rural areas.


No Maps?


Maps can be downloaded offline and directions calculated locally. Only really need updates for traffic (which could be broadcast, which I think some old gps receivers had)


Yeah, I use offline maps. Lack of traffic info is sometimes an issue.


I suppose they need to piggyback on each mobile operator's licensed frequency bands, per country/region.


There's another potential advantage (or disadvantage, depending on how you view it). Some countries, like India, don't like sat phones on their territory. Some, like Russia, are OK with them but require a local connection to terrestrial networks and local licenses. Some countries are in territorial disputes. What if SpaceX or another such company decided (or were "gently asked") to ignore local regulations in such a country? What if it were done for good?

Possible example: somebody lobbies the ITU (via some kind of "emergency license for humanitarian purposes only") to allow SpaceX to serve Gaza for free, with limits, and the USA is OK with it. Israel is against it, but what could they do? Jam Starlink? What if they also use it? What if, 10 years in the future, the same thing is done to North Korea? (It could be even more legal: as far as I understand, both Koreas claim to be the only legitimate government of Korea, so SK could issue a formal license.)


Looking forward to a future Starlink+Apple/Google collab.

Dunno what frequency this is running at, but if it’s in the 10s of GHz, I’m assuming licensing doesn’t get tooooo pricey given how problematic it is on land-to-land links.


It works with existing 4G/LTE phones, so it will be using the standard 4G/LTE bands, which run up to 2.6 GHz.

Presumably that's why SpaceX needs to ride on existing mobile operators: they already own LTE spectrum and can allocate a chunk of it for satellite services. It would be a lot of work (and cost) for SpaceX to go out and buy LTE spectrum in every country they want to operate in.


Sacrificing a chunk of that precious spectrum is huge though. Is LTE flexible enough to run some form of on-demand TDM between multiple base stations on top of it? Can it operate smaller cells on the same channels within the range of the larger cell, forcing the smaller cell to make do with other parts of the spectrum while the larger calls dibs?

Afaik older cellular protocols were relying on zero overlap between base stations serving the same frequencies, leading to a nice coloring problem that would seriously suffer if someone tried to fit in LEO cells.


I'm no LTE expert, but I'm pretty sure that it handles overlap: you can run LTE diagnostic apps on Android that will display all the visible base stations - often many are visible, sometimes 3 or more on the same band. Your phone looks at what's available and picks a combination with the best signal strength (sometimes it will connect on 2 or 3 different bands simultaneously with carrier aggregation).

But even then, most carriers have many bands/channels (EARFCNs) available - and presumably only a small slice of bandwidth is needed for this service considering it's (initially) only for text messages. Finding 5 MHz on one of the higher LTE frequencies wouldn't be so hard for many carriers, even if it does need an exclusive channel.


TDM seems difficult with the long time delays to satellites, but maybe. LTE is pretty flexible and can use bands as small as 1.4 MHz.

It was only 1G and 2G (GSM) that needed to avoid overlap.


Apple/Google don't hold any RF spectrum rights worldwide...

This is presumably about 4G/LTE, so mostly between 0.4-2.5 GHz or so.

https://en.wikipedia.org/wiki/LTE_frequency_bands


I wouldn't be surprised if Apple were to end up buying Globalstar, which would get them access to valuable globally available L-band spectrum.

They have already invested hundreds of millions into Globalstar's ground station hardware [1] and future satellite launches [2].

[1] https://www.apple.com/newsroom/2022/11/emergency-sos-via-sat...

[2] https://spacenews.com/apple-loans-globalstar-252-million-for...


> Apple/Google don't hold any RF spectrum rights worldwide...

yet. Apple (and android baseband suppliers, “Google” may be the wrong name to mention here), have the ability to take relatively worthless spectrum and make it widely useful in a way nobody else can.

I wonder how spectrum allocation works for C-Band satellite broadcasters that just blast continents with their signals. Grandfathered?


Those several-GHz bands don't really work indoors.


Most satellite-y things don’t and Starlink’s direct to cell won’t either.

Hence why the spectrum should be pretty cheap (along with its incredible susceptibility to obstructions… which is less of an issue when you’re going roughly “up” without pesky considerations like curvature of the earth)

And you do get a ton of gain with a small antenna at those high frequencies.


> and Starlink’s direct to cell won’t either

I could see it kinda working, at low frequencies. They are not that far away and there's not many things blocking the signal besides your roof.


Me too, but at slower data rates. At 550km above, that’s still a big signal loss by inverse square law. Problem with slow data rates is that they clog up the channels.


The inverse square law effect can be remedied using similar tech as in the Starlink consumer antennas. You aim the signal to where it's consumed.

> Problem with slow data rates is that they clog up the channels.

Yes, we must protect the tubes! :)


Phased arrays work without having to repoint, but they don't overcome lack of incidence (insolation?)

(e.g. we can't build a solar panel that works perpendicular to the sun (or nearly perpendicular) using phased array technology because there just isn't much solar radiation hitting the "dish" in the first place)


SpaceX already works with Google, where Google allows Starlink to use its datacenters' network connectivity for downlink, while Google gets to use Starlink for data transfer.


SpaceX has applied to test the service at an Apple office


You'd expect T-Mobile/starlink to provide this service for free?


No, I want to sign up for it without bundling it with a local Canadian oligopolist’s other services.


The difference here is you don't need a dedicated device or service, you can just use your existing cell phone. As Starlink is only providing the backhaul and cell carriers their spectrum, service will improve as the constellation grows not whenever the carriers feel like investing in it.


> Stuff like this has existed from companies like garmin for some time.

No, this is entirely different from what Garmin and others sell.

Those devices require dedicated hardware (and a subscription plan) to talk to satellites. Also you can only send and receive text messages. (no voice, extremely limited data (like weather reports))

What SpaceX are doing uses a completely normal phone, with a completely normal phone plan. It's text for now, but voice and data are coming.


> It's text for now

Text is announced for 2024. And personally, I'd be really surprised if that actually works out for unmodified phones sold today. I've written more about that here: https://news.ycombinator.com/item?id=37848212


> This is for very low bandwidth text communications when you're out in the country and can see the sky.

From the graphic in the article:

> Text: 2024

> Voice and Data: 2025

> IOT: 2025


The starlink website directly says "texting, calling, and browsing"


I believe this is addressed in the video that I linked. There are very good compression algorithms to allow voice, and "browsing" means something akin to slow dialup.

This is not something that could, for instance, replace your cell phone carrier.

In the video at 21:00 Elon clarifies: 2-4 megabits per cell zone.
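At those rates, voice only works because modern vocoders go very low. A rough sketch of how many simultaneous calls that capacity could carry, assuming a hypothetical low-rate codec around 3.2 kbit/s (Codec 2's top mode) and a 2x fudge factor for framing/retransmits; none of these numbers come from SpaceX:

```python
def concurrent_calls(cell_capacity_bps: float, codec_bps: float, overhead: float = 2.0) -> int:
    """Rough count of simultaneous voice calls a cell can carry,
    padding each codec stream by an overhead factor for framing/retries."""
    return int(cell_capacity_bps / (codec_bps * overhead))

# 2 Mbit/s cell, ~3.2 kbit/s vocoder, 2x overhead (all assumed)
print(concurrent_calls(2e6, 3200))  # → 312
```

So a few hundred calls per cell zone is plausible; a cell zone full of people trying to browse is not.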


> and "browsing" means something akin to slow dialup.

So it will be faster than my carrier..


This is a service being provided by your carrier.


Sorry if that was confusing. I mean that, for instance, T-Mobile will be adding this to their coverage map, but they're not getting rid of the towers, and the primary way your phone communicates with the world is not going to be via Starlink satellites.


I occasionally get kicked down to 128 kbps for a few hours at the end of a month and a surprisingly large fraction of the internet still works.

They are targeting voice, which can go really low, but a moderately optimistic ~100 kbps is vastly better than 0. Much below that and the number of people using it is going to drop to near 0.


Yeah, it says so in my AT&T contract as well. And I was concerned when I first signed up. But I haven't ever actually seen it happen.


AT&T actually enforces it for tethering. If I really need something I can still use my phone, but it’s convenient to leave the phone where it gets reception and then use the tablet nearby.


If I can get a plain text weather report, emergency notification or email then that is HUGE.

The difference between no communication and "take 5 minutes to get 1kb of text" represents a huge, huge jump.
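That quip implies a rate in the tens of bits per second. A quick sanity check (the 27 bit/s figure is just back-solved from "5 minutes for 1 KB", not any published spec):

```python
def transfer_seconds(payload_bytes: int, rate_bps: float) -> float:
    """Time to move a payload at a given bit rate, ignoring protocol overhead."""
    return payload_bytes * 8 / rate_bps

# 1 KB of text at ~27 bit/s is indeed about 5 minutes
print(round(transfer_seconds(1024, 27) / 60, 1))  # ~5.1 minutes
```

Even at that rate, a weather report or an SOS goes through, which is the whole point.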


> browsing" means something akin to slow dialup.

I mean, I fire up and use lynx from time to time for a couple of newspapers and magazines. Beyond evading some paywalls, it also presents them the way I want: a wall of text. “Slow dialup” to me is 2.4 kbps.

The HN crowd could get a lot done over a console.


That’s not what the site says. It’s different than what’s available now.


The gain on those antennas must be insane, any idea how they work? There are millions of devices down there (from the angular resolution perspective of the satellite).


You can still access a lot of stuff via just SMS - you could set up an SMS ChatGPT service for yourself and use it via this in the middle of nowhere.


It’s still invaluable at low bandwidth text up there.

Satellite data to run infrastructure (even credit card processing) where there is zero connectivity or power for hundreds of miles will likely become at least an order of magnitude more accessible price-wise.


What about the voice and data they have planned for 2025?


never trust a 2 year estimate from elon musk


Why is it limited to only very low bandwidth applications?


Because Shannon.
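Spelled out: the Shannon-Hartley theorem caps capacity at C = B · log2(1 + SNR), and a phone's tiny antenna talking to a satellite 550 km away has a terrible SNR. A sketch with purely illustrative numbers (5 MHz of spectrum, -10 dB SNR; neither is a real Starlink figure):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (-10 / 10)  # -10 dB -> 0.1 linear (assumed)
c = shannon_capacity_bps(5e6, snr)
print(round(c / 1e6, 2))  # ~0.69 Mbit/s for the whole cell
```

No amount of clever coding gets you past that ceiling; you can only add bandwidth or improve SNR (bigger antennas, more gain).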


You're talking about the "coke freestyle" machines. Yes that story is true for those machines, but McDonalds doesn't use them.

(Burger king does)


McDonalds did have them for a few years. Allegedly they went back to the tried and true because the maintenance costs were too high.


Please don't take this criticism harshly:

You say that you've been working on this for 4 years, but there's no content here. Have you not been using it yourself for 4 years? Where is the content that you and your friends have been posting?


I've been working on the code for 4 years. Only opened up posting to others today. Obviously have test posts, but I deleted many of them prior to this post.


Hmm, there's one commit at the end of 2022, a few at the end of 2021, three in early 2021, and then the initial repository creation in 2018.

You haven't been working on it for 4 years, you started it 4 years ago, have made disconnected updates here and there since, and are releasing a pre-alpha to capitalize on the Reddit story.


You should learn about git submodules before interpreting their git activity, especially if you're going to be so confidently judgmental.

You've missed over 900 commits. Oops. :)


My bad, I was on mobile and seeing the commits is always a pain there.


Oh my god. I remember this happening, and I remember email drew about it at the time.

It really never did recover from this. I "grew up" on Fark, and lots of my early web projects were clones of it/inspired by it. Fark really was a special part of the internet, and it's sad that it fell off. I wonder if the fact that they never moved on from this design was Drew being gunshy about changing it again.


I briefly conversed with Drew on what seemed like the obvious next step of threaded conversations. He was clear that this way kept things in order for him, and didn't let stuff get bogged down.

In other words I think he's happy with it, and doesn't want it to change. The main change since then was more ads or paid accounts, which seems a fair tradeoff. Anybody who wants Reddit knows (or knew) where to find it.


Squarespace was supposed to take away the jobs of people who make websites, but it didn't. The same people who you used to hire to make a website for you are still doing that, but now they just use squarespace for it.

I saw a recent quote for a very basic, static website that was greater than $10k. This would be about a day of work to put together in squarespace.

AI is not going to take away jobs, it's just going to make the people already doing them more efficient.


> AI is not going to take away jobs, it's just going to make the people already doing them more efficient

In the same sort of way that word processing just made typists more efficient.


Or if you want to make a more direct comparison, it’s like high level languages and compilers making punch card writers more efficient.


A few points on this from someone who's been building websites for 25 years (I started around when images were added to HTML) -

The web designers don't hear from many of these previous prospects who now go straight to Squarespace.

When people come to me for a Shopify site, it's usually because they've done all but the hardest 10%. Then they want to pay me a tiny amount to do the most unpredictable and difficult 10%. Usually something custom/difficult within the parts of the platform that are locked down.

I've seen budgets from local brand-name companies go from $20k for a build to $2k.

Often, the people charging $10k for a Squarespace site are justifying the majority of that with related services (copywriting, photography, content, marketing, etc). Many surviving web companies needed to become agencies. Shopify has some automated marketing options now. Copywriting is increasingly done with ChatGPT/similar.

Don't get me wrong - this is all very liberating for the client side and a boon for platforms like Squarespace and Shopify, but don't underestimate the upheaval for web designers.


We've been re-inventing the wheel for mom and pops for so long it made sense that someone made it more efficient. I used to do it for small businesses and now I do it on the exact opposite side of the spectrum building web apps and my industry is ripe for the same kind of thing.

In the near-term I think we'll see it as an efficiency gain for developers, but longer term we will be able to just make applications on the fly.


It's hard to predict. Consider bank tellers. ATMs initially allowed banks to run more local branches, resulting in more jobs. But now that ATMs, especially the ones inside, are VERY full function, the job numbers are reducing fast.

https://www.vox.com/2017/5/8/15584268/eric-schmidt-alphabet-...


I don't think it's because ATMs are so good/functional as much as it is that people just don't use cash and deposit checks like they once did.

> “because of industry consolidation and technological change,”


> The job numbers are reducing fast.

Are they? That piece is from 2017, yet we still see bank tellers everywhere.

It seems the increased efficiency of ATMs caused some reduction in the count of teller jobs - but their number has again stabilized.

ATMs meanwhile seem pretty much maxed out with their current feature set.


https://www.bls.gov/ooh/office-and-administrative-support/te...

Already reduced numbers, and a prediction of a 12% drop over the next 10 years, versus an average 5% gain for other jobs. As mentioned in other replies, not just ATMs, but cashless self-service of all types.


I stand corrected (thank you, btw) as to the continued reduction - but a 1.2 percent annual rate does sound a bit on the slow side.


But it's true that many banks are closing branches in less-visited areas, and justifying it (to their local community) with an ATM.


Even that one's a little bit meta, since fewer people are using cash now.


Right. But now Squarespace can integrate a prompt that has a conversation with you and build a site and continues to iterate on it until you’re happy. This adds very little cost to Squarespace. Maybe it gets most people to 80%. The last 20% will be a service provided by a human.


Then the last 20% was the only interesting part in the first place.


Until gpt-5 shaves away 17% more, then gpt-6 erases that final 3%.


This is an interesting example, because they probably don't use squarespace for it as they are likely more effective with other types of tooling. Whilst squarespace is a great general purpose tool for people who don't know specialized tools, a specialist will be more effective with different tools.

That's where I see the difference in AI as well. A specialist is probably faster using their own tooling rather than muddling through an AI interaction. But the AI gives non specialists the ability to muddle through tasks they can't do on their own, or don't have specialized knowledge for.


Where do you see quotes like this?

