It's not about where the energy is sourced but where it's used. Assuming it's used on the civilisation's home planet, like Earth, the whole energy of the swarm ends up there, and it has to increase the planet's temperature (since it's additional energy delivered to the planet).
If they can build a sphere or swarm megastructure, then obviously they would have built orbiting habitats too, either from scratch or by terraforming planets or asteroids.
> The whole energy of the swarm will be used there
that sounds like a made up problem. the Dyson swarm isn't to collect energy to send to the home planet. it's to collect energy. where that is used is going to be wherever it's needed. mining the asteroids, local computing (the cloud is no longer just a computer on earth, it's the cloud of the swarm elements), powering interstellar trips remotely, etc. the only thing that needs to get to earth is the imports of goods and services.
That actually makes no sense. You can't use up (destroy) energy; you can only run it through a processor, which turns it into heat as the transistors inside switch on and off. It's like a water wheel doing work as water flows through it: the amount of water upstream and downstream of the wheel is the same.
Thus the question still stands: what happens to the heat in such a thing? Does it get recycled by some unknown device? Then it's a closed system and you don't need input from outside. It doesn't get recycled? Then the device has to get hot while dissipating that heat.
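The point about getting hot can be made concrete with the Stefan-Boltzmann law: a structure in vacuum can only shed heat by radiating, and its steady-state temperature follows from the power it must dump. A minimal sketch, with illustrative power and radiator-area numbers that are my own assumptions, not from the thread:

```python
# Steady-state radiator temperature from the Stefan-Boltzmann law:
# P = sigma * A * emissivity * T^4, solved for T.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_temp(power_w, area_m2, emissivity=0.9):
    """Temperature needed to radiate `power_w` of waste heat from `area_m2`."""
    return (power_w / (SIGMA * area_m2 * emissivity)) ** 0.25

# Illustrative: dumping 1 TW of waste heat through a 1 km^2 radiator
print(radiator_temp(1e12, 1e6))  # ~2100 K
```

Because temperature only grows with the fourth root of power, radiating more heat from the same area makes the element hot fast, which is exactly the commenter's objection.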
This hypothetical kid would have the same brain size/number of neurons anyway. In the case of LLMs, one could create a smaller model by leaving out knowledge of unnecessary languages. A problem, though, could be the lack of training data in other languages.
Hmm, I would say it's always worth sharing knowledge. Could you paste some links, or maybe type a few keywords, for anyone willing to research the topic further on their own?
If you only want your ads next to quality content or on sites that are not drowning in ads so your ad is more impactful, then Google won’t help. But a third party can provide that through their curated list of sites.
Basically they are in the business of providing human-curated targeting parameters instead of the algorithm-based ones that Google supplies.
My stab: a Mac Studio will have 400 GB/s or 800 GB/s of memory bandwidth. Not that you can't get there on x86, e.g. a 12-channel Epyc Genoa setup can do 460 GB/s, or 920 GB/s total when doubled up. But now you're talking about buying two latest-gen Epycs and 24 high-speed DIMMs to get the raw bandwidth back, all while ignoring that the access pattern is a bit different.
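The 460/920 GB/s figures fall out of simple arithmetic: channels × transfer rate × bytes per transfer. A quick sketch, assuming DDR5-4800 and the standard 64-bit (8-byte) bus per channel:

```python
# Theoretical peak DRAM bandwidth = channels * MT/s * bytes per transfer.
# Real-world throughput is lower, and access patterns differ, as noted above.
def peak_bw_gbs(channels, mts, bytes_per_transfer=8):
    """Peak bandwidth in GB/s for the given channel count and transfer rate."""
    return channels * mts * 1e6 * bytes_per_transfer / 1e9

print(peak_bw_gbs(12, 4800))  # 460.8 GB/s: one 12-channel Genoa socket
print(peak_bw_gbs(24, 4800))  # 921.6 GB/s: dual socket
```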
This. I was just looking for exactly the same thing. For now I use tmux with tmux-resurrect to persist state between reboots. It works okay-ish; I'd call it a good hack, but still a hack. It's sad there isn't really a solution for this problem, aside maybe from Warp. My little UX dream is to have such a workspace-saving solution integrated with the whole OS and the apps inside it. That would be cool.
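For anyone wanting to try the same setup, a minimal `~/.tmux.conf` sketch using TPM (the tmux plugin manager); the plugin repo names are the public ones, and adding tmux-continuum for periodic auto-save is my suggestion, not something the comment mentions:

```shell
# ~/.tmux.conf (sketch)
set -g @plugin 'tmux-plugins/tpm'
set -g @plugin 'tmux-plugins/tmux-resurrect'
# Optional: tmux-continuum auto-saves the environment periodically
set -g @plugin 'tmux-plugins/tmux-continuum'
set -g @continuum-restore 'on'

# Keep TPM initialization at the very bottom of the config
run '~/.tmux/plugins/tpm/tpm'
```

With tmux-resurrect's defaults, `prefix + Ctrl-s` saves the environment and `prefix + Ctrl-r` restores it after a reboot.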