> until there is actually a quantum computer that can break it
There isn't one yet (at least not one the general public knows about), but that doesn't mean we can put off doing anything about it. See, for example, the "harvest now, decrypt later" problem, which would affect today's encrypted data if it were captured and stored until a capable quantum computer exists: https://en.wikipedia.org/wiki/Harvest_now,_decrypt_later
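One way to do something about it now is hybrid key exchange: derive the session key from both a classical and a post-quantum shared secret, so a future quantum computer would have to break both. Here's a minimal sketch of just the combining step, using only Python's standard library; the two "shared secrets" are random placeholders standing in for, say, X25519 and ML-KEM outputs:

```python
import hashlib, hmac, os

# Placeholder shared secrets: in a real handshake these would come from
# a classical exchange (e.g. X25519) and a post-quantum KEM (e.g. ML-KEM).
ss_classical = os.urandom(32)
ss_pq = os.urandom(32)

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract step (RFC 5869) over SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand step (RFC 5869), enough for one short output key."""
    t, okm, counter = b"", b"", 1
    while len(okm) < length:
        t = hmac.new(prk, t + info + bytes([counter]), hashlib.sha256).digest()
        okm += t
        counter += 1
    return okm[:length]

# Concatenate both secrets before extraction: recovering the session key
# then requires breaking *both* exchanges, which blunts harvest-now-decrypt-later.
prk = hkdf_extract(salt=b"hybrid-kex-demo", ikm=ss_classical + ss_pq)
session_key = hkdf_expand(prk, info=b"session key")
print(session_key.hex())
```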
There's also a lunar occultation of Mars (which is near opposition, making it relatively bright) happening in a few days, and then again in February; it should be visible from parts of the northern hemisphere: https://in-the-sky.org/news.php?id=20250114_16_100
By precisely timing them you can measure or check quantities like distance and diameter. In fact, if you time them precisely from different locations on Earth, you can determine the shape of the occulting body (e.g. an asteroid occulting a star). And on occasion you get a 'grazing occultation': a star passes behind mountains at the moon's limb, blinking on and off; observe from multiple latitudes and it's possible to recover the profile of the range.
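As a toy illustration of the multi-station idea: each observer's disappearance/reappearance times give one chord across the occulting body's shadow, and the set of chords sketches out its profile. A minimal sketch with made-up timings and an assumed shadow velocity:

```python
# Toy multi-station occultation timing: each station records when the star
# disappears and reappears; chord length = shadow velocity * duration.
# All numbers below are made up for illustration.

SHADOW_VELOCITY_KM_S = 5.2  # assumed speed of the asteroid's shadow across Earth

# (station name, disappearance time, reappearance time), seconds from some epoch
timings = [
    ("Station A", 12.80, 18.30),
    ("Station B", 12.10, 19.05),
    ("Station C", 13.45, 17.10),
]

for name, t_dis, t_reap in timings:
    chord_km = SHADOW_VELOCITY_KM_S * (t_reap - t_dis)
    print(f"{name}: chord of {chord_km:.1f} km")

# The longest chord bounds the diameter from below; fitting an ellipse (or a
# more general limb profile) to the chord endpoints recovers the body's shape.
```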
Occultations can also tell you about the atmosphere of the object in front. The rate at which the background object fades tells you about atmospheric density, composition, and so on; if it disappears suddenly, that indicates there may be no atmosphere.
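To make the fade-rate point concrete: an airless body cuts the starlight off in a near-instant step, while an atmosphere dims it gradually. A crude sketch contrasting the two light curves, with a simple exponential fade standing in for a real refractive model:

```python
import math

# Crude comparison of occultation light curves. The exponential fade is only
# a stand-in for a proper refractive/scale-height model of an atmosphere.
FADE_SCALE_S = 2.0  # assumed e-folding time of the atmospheric dimming

def flux_airless(t: float, t_contact: float = 0.0) -> float:
    """Star winks out essentially instantly at the limb of an airless body."""
    return 1.0 if t < t_contact else 0.0

def flux_with_atmosphere(t: float, t_contact: float = 0.0) -> float:
    """Gradual dimming as the starlight passes through deeper atmosphere."""
    return 1.0 if t < t_contact else math.exp(-(t - t_contact) / FADE_SCALE_S)

for t in [-1.0, 0.0, 1.0, 2.0, 4.0]:
    print(f"t={t:+.1f}s  airless={flux_airless(t):.2f}  "
          f"atmosphere={flux_with_atmosphere(t):.2f}")
```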
> If you do consider paying for either Wubuntu or LinuxFX, it's worth keeping in mind that in the past, the developer's activation system and registration database have both been investigated and found to be horribly insecure. However, from the database, it looks like some 20,000 people did pay.
Even for someone who wanted to use it for anything serious without paying (or otherwise handing over any personal information in the process), this is a huge turnoff.
It looks like there's also a fix for that nested virtualization bug causing host reboots on Ryzen 7000/8000 CPUs [0]. It's nice to see that the cause appears to be known and that it's being addressed (even if the kernel here is technically not at fault).
Apparently the problem is down to a CPU erratum (or perhaps just unsupported functionality?), with some (many?) BIOSes still reporting the broken/unsupported VMLOAD/VMSAVE instructions as available on these CPUs, at least according to some discussion about it on the LKML [1].
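If you're on one of these CPUs and curious whether your BIOS/kernel combination still advertises the feature, the kernel exposes SVM sub-features as flags in /proc/cpuinfo; as far as I know the relevant flag is named v_vmsave_vmload (treat the exact name as an assumption). A quick Linux-only check:

```python
# Quick Linux-only check for the virtualized VMLOAD/VMSAVE SVM sub-feature.
# Assumption: the kernel names this flag "v_vmsave_vmload" in /proc/cpuinfo.
FLAG = "v_vmsave_vmload"

with open("/proc/cpuinfo") as f:
    for line in f:
        if line.startswith("flags"):
            flags = set(line.split(":", 1)[1].split())
            state = "advertised" if FLAG in flags else "not advertised"
            print(f"{FLAG}: {state}")
            break  # the flag set is the same for every core
```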
Yep, you're getting it. It's that same universal principle where the interesting things all happen at the boundary between boring and random (i.e. between the trivially predictable and the not-predictable-under-any-conditions).
If you haven't already, I heartily recommend James Gleick's book "Chaos" (https://en.wikipedia.org/wiki/Chaos:_Making_a_New_Science) which is, to me, the introduction to both self similarity and sensitive dependence on initial conditions.
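If you want to see sensitive dependence on initial conditions for yourself, the logistic map (which features prominently in the book) takes about a dozen lines: in the chaotic regime (r = 4.0), two trajectories starting 1e-10 apart diverge to order-one differences within ~50 iterations, while in the boring regime (r = 2.5) they converge to the same fixed point. A minimal sketch:

```python
# Logistic map x -> r * x * (1 - x): boring at r=2.5, chaotic at r=4.0.
def trajectory(r: float, x0: float, steps: int) -> list[float]:
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

for r in (2.5, 4.0):
    a = trajectory(r, 0.2, 50)
    b = trajectory(r, 0.2 + 1e-10, 50)  # nudge the initial condition
    gap = abs(a[-1] - b[-1])
    print(f"r={r}: |difference after 50 steps| = {gap:.3e}")
```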
It depends on the type of sailing and where the race will take you. The participants in a race such as the Vendée Globe [0] are almost certainly using synoptic-scale models like the GFS and ECMWF to plan their routes.
The GFS is a coarser model covering the entire globe, so while the overall situation at the synoptic scale tends to be modeled quite well (at least out to a few days into the future), it just doesn't have the resolution to capture smaller-scale weather phenomena driven by local factors.
For something more useful at the local scale, you can also look at a model like the HRRR (which I believe does take into account terrain and other local effects from things like larger bodies of water). While it only covers the conterminous United States and southern Canada, I've generally found it good for shorter-term, local weather details, including forecasting convective storms and winds on and around the Great Lakes.
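If you want to poke at the raw HRRR output yourself, NOAA serves the GRIB2 files over plain HTTPS from its NOMADS server. The directory layout below matches what I've seen, but treat it as an assumption since it does change occasionally. A stdlib-only sketch that grabs one surface-forecast file:

```python
import urllib.request
from datetime import datetime, timezone

# Assumed NOMADS layout for HRRR surface files (verify before relying on it):
# https://nomads.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod/
#     hrrr.YYYYMMDD/conus/hrrr.tHHz.wrfsfcfFF.grib2
BASE = "https://nomads.ncep.noaa.gov/pub/data/nccf/com/hrrr/prod"

day = datetime.now(timezone.utc).strftime("%Y%m%d")
cycle, fcst_hour = "00", "01"  # 00z run, 1-hour forecast
url = f"{BASE}/hrrr.{day}/conus/hrrr.t{cycle}z.wrfsfcf{fcst_hour}.grib2"

print(f"Fetching {url}")
urllib.request.urlretrieve(url, "hrrr_surface.grib2")  # hundreds of MB
```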
Since light travels at 100% of the speed of light in a vacuum (by definition), while in glass fiber it only manages roughly two-thirds of that, I have wondered whether latency over long distances could be improved by sending the data through a constellation of satellites in low Earth orbit instead. Though I suspect the set of tradeoffs here (much lower throughput, much higher cost, more jitter in the latency as the satellites constantly move relative to the ground) probably wouldn't make it worth a slight decrease in latency for most use cases.
Hollow core fiber (HCF) is designed to substantially reduce the latency of normal fiber while maintaining equivalent bandwidth. It's been deployed quite a bit for low latency trading applications within a metro area, but might find more uses in reducing long-haul interconnect latency.
Absolutely! The distance to LEO satellites (like SpaceX's or Kuiper's) is low enough that you would beat the latency of fiber paths once the destination is far enough away.
I am pretty sure this was one of the advertised strengths of Starlink. Technically the journey is a bit longer, but because the signal travels at the full speed of light, you still come out ahead.
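To put rough numbers on this thread: solid-core fiber carries signals at about 68% of c, hollow-core fiber at roughly 99.7% of c, and a LEO path adds two ~550 km hops (assuming a Starlink-like altitude) but runs at full c in between. A back-of-the-envelope sketch; the route lengths are idealized great-circle distances, so real paths would be longer on all three:

```python
C_KM_S = 299_792.458        # speed of light in vacuum
V_FIBER = 0.68 * C_KM_S     # solid-core fiber, refractive index ~1.47
V_HCF = 0.997 * C_KM_S      # hollow-core fiber, near-vacuum propagation
LEO_ALT_KM = 550            # assumed Starlink-like orbital altitude

def one_way_ms(distance_km: float, velocity_km_s: float) -> float:
    return 1000.0 * distance_km / velocity_km_s

for route, dist in [("London-New York", 5_570), ("London-Singapore", 10_850)]:
    fiber = one_way_ms(dist, V_FIBER)
    hcf = one_way_ms(dist, V_HCF)
    # LEO: up to the constellation, along it at c, back down. Ignores
    # switching delays and the zig-zag of real inter-satellite routes.
    leo = one_way_ms(2 * LEO_ALT_KM + dist, C_KM_S)
    print(f"{route}: fiber {fiber:.1f} ms, HCF {hcf:.1f} ms, LEO {leo:.1f} ms")
```

Under these idealized numbers, the LEO path pulls ahead of solid-core fiber once the route exceeds roughly 2,300 km, which matches the "far enough away" intuition above; hollow-core fiber beats both where it's available.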
Obviously, nobody is going to outright admit they put profits above security; indeed, they will often state the opposite. But their closely-held beliefs will shine through when it comes time to make decisions and the outcomes of those decisions are exposed to their customers and to the public.
Does Bill or Satya write code anymore? It could very well be that they consider security the top priority but it's a moot point because they're so removed from operations.
Although I suspect you're effectively right: either they don't have it as a top priority, or they think they do but their revealed preference says otherwise. For example, an engineer who does rigorous security testing, finds nothing, and launches one project gets promoted less often than an engineer who launches two projects and skips the rigorous security testing.
I'm sure there are some companies that realize security (or rather, the critical lack of some important aspect of it) can impact profits, but that depends a lot on who their customers are, too. Ultimately, if the customers paying for a vendor's products and services don't value it, then the vendor won't value it either, short of any regulatory or legal requirements compelling them otherwise. Given that many large organizations (including governments) are Microsoft customers, though, it's strange to see here. Maybe there's some "it can't happen to us" or "nobody will find out about it" arrogance going on, but they must now be seeing that the reputational damage is likely to have negative impacts, including hurting future profits, down the road.