Numerlor's comments

Apple doesn't expose the kind of introspection necessary to compare with the data the article is about. Any mention would just be about Apple's chips existing and being better

Wouldn't having an adversarial country spying on you be the better option for you personally? At least privacy-wise, as long as it isn't using your machine as an infiltration point, since the country you reside in has many more opportunities to abuse the data.

How can the solar panel itself radiate heat when it's being heated up while generating power? Looking at pictures of the ISS, there are radiators that look like they're there specifically to cool the solar panels.

And even if it's viable, why would you not just cool with air down on Earth? Water is used for cooling because it increases effectiveness significantly, but even a closed-loop system with simple dry-air heat exchangers is quite a lot more effective than radiative cooling.
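
For a rough sense of scale, here's a back-of-the-envelope Python sketch comparing the heat flux each approach rejects per square metre. All the numbers (radiator temperature, emissivity, convection coefficient, temperature difference) are illustrative assumptions, not measurements:

    SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)

    # Radiator in space (absorbed sunlight ignored for simplicity):
    T_rad = 330.0         # K, assumed radiator surface temperature (~57 C)
    emissivity = 0.9      # assumed coating emissivity
    q_radiative = emissivity * SIGMA * T_rad ** 4    # ~600 W/m^2

    # Dry-air heat exchanger on Earth:
    h_air = 50.0          # W/(m^2 K), assumed forced-convection coefficient
    delta_T = 30.0        # K, assumed difference to ambient air
    q_convective = h_air * delta_T                   # ~1500 W/m^2

    print(f"radiative:  {q_radiative:6.0f} W/m^2")
    print(f"convective: {q_convective:6.0f} W/m^2")

Even with generous assumptions for the radiator, plain forced air at a modest temperature difference rejects a few times more heat per unit area.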


You take the amount of energy the solar panels absorb and subtract the amount they radiate; the panel's temperature settles where the two balance. Most problems like this in physics come down to a simple energy balance.
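
A minimal sketch of that balance for a flat panel at Earth's distance from the Sun, assuming it radiates from both faces and ignoring Earth albedo/IR and conduction (the absorptivity and emissivity values are assumed, not measured):

    # Equilibrium: absorbed solar power = radiated power
    #     alpha * S = 2 * eps * SIGMA * T**4
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
    S = 1361.0         # solar constant at Earth's distance, W/m^2
    alpha = 0.9        # assumed solar absorptivity
    eps = 0.85         # assumed infrared emissivity

    T_eq = (alpha * S / (2 * eps * SIGMA)) ** 0.25
    print(f"equilibrium: {T_eq:.0f} K ({T_eq - 273.15:.0f} C)")  # ~336 K / ~63 C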


It would be way more productive for you to ask these questions of ChatGPT or a similar AI with reasoning. The equations are quite simple, but I'm not going to dump them into an HN comment.

You don't have experience being in space, so your "intuition" about cooling is worth literally nothing without formulas or numbers.


It's a matter of deploying it more cheaply or with fewer downsides than what can be done on Earth. Launching things to space is expensive even with reusable rockets, and a single server blade would need a lot of accompanying tech to power it, cool it, and connect it to other satellites and Earth.

Right now the only upsides of an expensive satellite acting as a server node would be physical security and avoiding various local environmental laws and effects.


> Right now the only upsides ...

You are missing some pretty important upsides.

Lower latency is a major one. And not having to buy land and water to power/cool it. Both are fairly limited as far as resources go, and get increasingly expensive with competition.

The major downside is, of course, cost. In my opinion, that has never really stopped humans from building and scaling things up until the economies of scale work out.

> connect it to other satellites and Earth

If only there were a large number of satellites in low Earth orbit and a company with expertise in building them ;)


> And not having to buy land and water to power/cool it.

It's interesting that you bring that up as a benefit. If waterless cooling (i.e. a closed cooling system) works in space, wouldn't it work even better on Earth?


I mostly agree with you, but I don't understand the latency argument. Latency to where?

These satellites will be in a sun-synchronous orbit, so they're only close to any given location on Earth for a fraction of the day.


Samsung has been kind of sidegrading its flagships and offering worse SoCs depending on region. Paired with there plainly being more options for Android, there'll be more variety spread across the different manufacturers.


> It only became off by default after those "daily rage sessions" created sufficient public pressure to turn them off.

99% of the daily rage sessions happened before it was even released


Preventive care is better.


NTFS getting corrupted by the tiniest errors would be one reason to use ReFS

Using it for the OS partition is not very well supported right now, though (for a consumer): installing etc. works fine, but DISM doesn't support ReFS, so adding features generally doesn't work.


Can't recall the last time I saw a corrupt NTFS volume... even when using Storage Spaces. I'm sure it's happened to someone, given Windows is in use on billions of machines, but NTFS becoming corrupt can't be all that common.

Besides, ReFS doesn't checksum file data (integrity streams) by default.


Have you tried e.g. Mojo, which can vectorize / do SIMD without having to use intrinsics everywhere?


Computer monitors have been getting a lot better while getting cheaper, with no ads or services. You can get a high-refresh-rate 4K IPS monitor for about $200 nowadays. Display tech is just advancing faster than other tech at the moment.


Huh, interesting. My experience has always been that computer monitors have been more expensive than TVs, even when the panels are ostensibly the same. I've attributed it to the comparative volume of the TV market versus the desktop monitor market.

At this point (as opposed to a decade ago) there's arguably no difference between a TV and a monitor anymore outside of packaging and the bundling of a remote and input defaults.


How does this work with respect to using a remote? I know something like a Roku remote would work display-wise, but you usually program it to use the signal that your brand of TV responds to. That way you can use the Roku/whatever remote to turn on the actual TV and control audio. Speaking of, how does audio work for this setup?


HDMI standards (CEC) allow plugged-in devices to control the power state of the TV. For example, my Apple TV will turn the TV on when I press a button on the aTV remote and will turn the TV off when I turn the Apple TV off.
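
As a minimal sketch of doing the same thing from a PC, assuming libcec's cec-client is installed and the HDMI port actually carries CEC (most desktop GPUs don't expose it; a USB-CEC adapter like Pulse-Eight's is the usual workaround) — the helper function here is just illustrative:

    import subprocess

    def cec_power(on: bool) -> None:
        # In the CEC spec, logical address 0 is always the TV/display.
        cmd = "on 0" if on else "standby 0"
        subprocess.run(["cec-client", "-s", "-d", "1"],
                       input=cmd.encode(), check=True)

    cec_power(True)   # wake the display, like the Apple TV remote does
    cec_power(False)  # put it into standby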

Audio is a separate challenge; I'm not sure what you'd do there. Do computer monitors have eARC outputs? None of the ones I have do. Again, if you had an Apple TV you could pair it with a HomePod (or a pair of them) to avoid the issue, but that's a niche solution.


Samsung already makes a bunch of "smart monitors", shipping them with the same software it uses on its TVs. Not sure about other manufacturers, but I'd be surprised if they don't catch up soon.


Why is their browser using so much memory? I'm quite bad at closing tabs since I switched to vertical tabs and just open new windows instead. Even with that, I don't think I've ever broken 10 GB in Edge, except when I opened many YouTube videos at once and then went through them, which kept the tabs loaded.

