My "fear" has always been that Meta/Alphabet would slowly but surely migrate their apps over to their own third-party App Store to get past the pesky IDFA limitations[0] and other tracking hurdles.
So far nothing seems to indicate that it's happening. The question is whether it's due to Apple's "measures" or just because it is not worthwhile for Meta/Alphabet. I think it's a combination of the two. But if it were as easy to "side-load" an app on iOS as on macOS, per your suggestion, I'm confident Meta would have made the switch in a heartbeat.
Just imagine if Apple provided nice APIs for auto-updating, essentially no limitations on which binaries can be attested, APIs/mechanisms for easy migration between App Store apps and side-loaded ones, no scary screens, etc. Essentially implementing the DMA to the fullest extent, really honouring the intent of the law. Why wouldn't all the mega apps just move over? And what consequences would it have?
I think it would be awesome to e.g. lift the JIT blocking and allow more strange niche things in alternative app stores. But getting all regular people onto wild-west third-party app stores for the (ad-financed) apps they use every day is just begging for a huge _actual_ loss in privacy.
I think it's just not worth it for them; look at Android, where sideloading has always been available as an option.
Facebook does offer separate APKs on their website (as do, in fact, most major services - Netflix and Spotify also offer APKs from their websites), but practically the only reason people end up using them is that they're on a device that doesn't support the Play Store (for whatever reason).
The only serious Play Store competitors on Android are either vendor-specific (like Amazon's store) or wouldn't host Facebook's apps to begin with and are unambiguously a positive force for users thanks to their standards (F-Droid, whose policies are designed to protect users from the typical mobile-app rugpull of filling an app with ads down the line). Anything outside of this tends to be independent hobby projects or corporate business apps.
The inertia of being the default is still really strong. (For a related example, much of Google's strength comes from the fact that they paid billions to browsers to be their default search engine, a practice that's been found to have violated antitrust laws - it's telling that Google really wanted to keep doing this.) That inertia is still enough to keep Facebook attached to the Play Store, and it's probably why they won't try to leave the App Store either.
Hmm. I'm not a WebRTC pro, but I looked into it recently for a hobby project and felt that the typical WebRTC TURN setup still leaves the TURN server in quite a trusted position. My rough understanding:
- (1) Each client generates a key pair
- (2) The fingerprint of the public key is part of the SDP message each client generates
- (3) The SDP message is somehow exchanged between the clients, out of band, and the session is started. The clients verify the remote peer using the public key fingerprint from the SDP message.
The problem is that it's not really feasible in most circumstances to exchange the full SDP message out of band, so instead the service provides some mechanism to map a shorter ID (typically in a URL) to a centrally stored copy of the SDP. I think this might be where it happens for filepizza [0].
This means that a malicious file-sharing operator, being in control of both the TURN service and the out-of-band signalling mechanism, could just MITM the DTLS connection and snoop on all the data going by. The peers think they have each other's public keys, but they actually have two keys owned by the server.
Only the initial SDP needs to be fudged. The attacker could just set up two clients that pretend to be the sender/recipient. Then the data can go through regular Google TURN servers.
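To make the trust issue concrete, here's a minimal sketch using the browser WebRTC API. Everything here is illustrative: `signalingChannel` is a hypothetical out-of-band transport, and the TURN URL/credentials are placeholders.

```typescript
// Minimal sketch (browser WebRTC API). The DTLS identity a peer will
// accept is whatever "a=fingerprint:" line arrives via signalling, so
// an operator that controls signalling can substitute its own.
async function startCall(signalingChannel: { send(msg: string): void }) {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "turn:turn.example.com", username: "u", credential: "c" }],
  });
  pc.createDataChannel("file"); // ensure the offer has an m-line

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // The local self-signed cert's fingerprint is embedded in the SDP, e.g.:
  //   a=fingerprint:sha-256 AB:CD:...:EF
  const fingerprint = offer.sdp
    ?.split("\r\n")
    .find((line) => line.startsWith("a=fingerprint:"));
  console.log("our fingerprint:", fingerprint);

  // Nothing binds this fingerprint to an identity the peers verified
  // themselves; whoever stores/relays the SDP can rewrite it before
  // the other side calls setRemoteDescription().
  signalingChannel.send(JSON.stringify(pc.localDescription));
}
```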
In other comments on this link, people are describing GPS according to my mental model, which is hard to reconcile with cryptography making it un-spoofable.
If someone can re-broadcast the keystream and control the latency I perceive as a receiver, how would my checking that the MAC is correct help?
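To illustrate what I mean, here's a toy sketch (using HMAC as a stand-in for whatever scheme the real signal uses; the key and message are made up): the tag still verifies after a replay, because it covers the bits, not the arrival time - and in GNSS, a millisecond of added latency is roughly 300 km of pseudorange.

```typescript
// Toy illustration: authentication covers the message contents, not
// the moment it arrives. A meaconing attacker records the authentic
// signal and re-broadcasts it with extra delay; verification passes.
import { createHmac, timingSafeEqual } from "node:crypto";

const key = Buffer.from("shared-broadcast-key"); // stand-in for the real scheme

function mac(msg: Buffer): Buffer {
  return createHmac("sha256", key).update(msg).digest();
}

const navMessage = Buffer.from("ephemeris + time-of-transmission");
const tag = mac(navMessage);

// Attacker replays the exact same (message, tag) pair 1 ms later.
const addedDelaySeconds = 1e-3;
const verifies = timingSafeEqual(mac(navMessage), tag); // still true!

// Pseudorange error: c * delay ≈ 300 km per extra millisecond.
const c = 299_792_458; // m/s
console.log(verifies, (c * addedDelaySeconds) / 1000, "km of range error");
```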
"Modern" .NET (previously ".NET Core" v1, v2, v3 - but now just ".NET" v6, v7, v8, v9) works really well on Linux and in containers etc. "Legacy" .NET, .NET Framework, version 4.X, does not.
If you build something new today in .NET land, you are using a version that is compatible out of the box with Linux, but there are gazillions of lines of .NET Framework code out in enterprises that have not yet been migrated/rewritten.
But I don't actually know if Mono is stable enough to run Framework services on Linux?
I don't know how (or if at all) Kudu is related to Mono, but on Azure you can run .NET Framework in e.g. App Services (which use Kudu under the hood). It's probably the only way to host a Framework service outside of IIS on a Windows VM. And Kudu contains references to Mono, and looks really Linuxy when I've used it.
It 100% is! I used Mono years ago to run older .NET in VMs and containers. We found a single service that didn't really work well, and we spent our time working on that instead of rewriting the world.
> If you build something new today in .NET land you are using a version that is compatible out of the box with Linux
This is not a generally true statement, particularly for anything involving UI. WPF and (even modern) WinForms are not supported, and MAUI has no official Linux support.
And neither is recommended over AvaloniaUI (or Uno, or Gir.Core if you want to target just Linux), all of which do support Linux.
Given the amount of whining about Linux GUI support despite good and proven solutions being available, it leaves me with the impression that some parts of the Linux community don't actually want support at all - the goal is to have a convenient scapegoat to complain about.
> as if no amount of solar and wind power generation capacity could be an adequate substitute for any amount of geothermal power, because you don't have solar power at night, for example. But actually that's just a question of how much it costs to store the energy until it's needed or transmit it from where it's still being produced.
I guess this depends on the region, at least to some extent. In Northern Europe we've had these periods during fall/winter in recent years where it's cold, essentially dark, and (worst of all) there's no wind. It's not really feasible to store multiple days' worth of consumption for tens of millions of people.
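A back-of-envelope calculation, with rough assumed numbers (the average load and battery cost are my guesses, not official figures):

```typescript
// Back-of-envelope: what would bridging three dark, calm days with
// batteries cost for a Sweden-sized grid? All inputs are assumptions.
const avgLoadGW = 16;         // assumed average national electric load
const windlessHours = 72;     // three days of Dunkelflaute
const batteryUsdPerKWh = 150; // assumed installed pack cost

const energyTWh = (avgLoadGW * windlessHours) / 1000; // ≈ 1.15 TWh
const costUsd = energyTWh * 1e9 * batteryUsdPerKWh;   // TWh -> kWh

console.log(`${energyTWh.toFixed(2)} TWh ≈ $${(costUsd / 1e9).toFixed(0)}B in batteries`);
// -> "1.15 TWh ≈ $173B in batteries", for one country, for one event.
```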
In three of the four Swedish price regions I think we are essentially at the point where wind power is "worthless" and can't be built out any more, at least not without major changes to consumption patterns. When the wind is blowing, production is so high that prices drop almost to 0 and the operators earn next to nothing; and when prices go up, there is no wind, so no-one can produce.
Storing multiple days of consumption is feasible but definitely harder than the usual case of storing hours.
The pricing problem sounds like an artifact of how you've structured the market, not a fundamental obstacle to the profitability of intermittent power sources.
An alternative structure that would solve the problem would be for generation operators to buy put options for energy they expect to be able to produce, eliminating the risk of a price collapse. Consumers who want access to such intermittent energy would have to write those put options, which would be limited to particular times on particular days when they could use the energy. Having written the option, they would have to accept the generation operator's decision whether or not to exercise it. Utility-scale storage providers could write puts for low-demand times and buy puts for high-demand times, or they could write puts for low-demand times, write futures contracts for high-demand times, and make up the shortfall on the spot market. This might produce major changes in consumption patterns, but, more likely, would enable continued investment to minimize those changes.
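A toy payoff model of the structure described above, with purely illustrative numbers (the strike, premium, and volume are made up):

```typescript
// Toy payoff sketch: the generator buys a put with strike K; the
// consumer who wrote it must buy at K if exercised, so a windy-day
// price collapse no longer zeroes out the generator's revenue.
function generatorRevenueEUR(
  spotEURPerMWh: number,
  strikeEURPerMWh: number,
  premiumEUR: number,
  mwh: number,
): number {
  // Sell at spot normally; exercise the put whenever spot < strike.
  const effectivePrice = Math.max(spotEURPerMWh, strikeEURPerMWh);
  return effectivePrice * mwh - premiumEUR;
}

const strike = 30;   // EUR/MWh floor the generator locked in
const premium = 200; // EUR paid up front for this block of options
const mwh = 100;     // energy the wind farm expects to deliver

console.log(generatorRevenueEUR(1, strike, premium, mwh));  // windy glut: 2800, not 100
console.log(generatorRevenueEUR(60, strike, premium, mwh)); // normal price: 5800
```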
A step forward would be to show, for all those solar/wind + battery proposals, the expected uptime and its distribution given historical weather, and how climate change might affect that.
For example: this solution would be sufficient, with no blackouts/brownouts per year, for 99% of historical data...
We can do a pretty good job of predicting solar and wind production; that's done routinely. What's harder is predicting how load will change when electric energy is nearly free during most daytime hours and expensive on calm nights.
That does not matter. You can only postpone washing your clothes for so long, and heating in northern winters is non-optional. The problem is things like: https://en.wikipedia.org/wiki/Dunkelflaute
Extreme weather events are what is important and we do have the data.
Heating in particular is an especially easy problem to solve; various kinds of thermal energy storage (sensible-heat, phase-change, TCES) can store heat for later climate-control purposes several orders of magnitude more cheaply than batteries. Washing your clothes is a harder problem, although a stockpile of clean clothes is easily stored.
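A rough comparison for the sensible-heat case, with assumed costs (the tank price is a guess, and note the caveat that this compares thermal kWh against electric kWh):

```typescript
// Rough numbers for a plain insulated hot-water tank vs. a battery,
// to show the scale difference. Costs are illustrative, not quotes.
const volumeM3 = 10;            // domestic-scale insulated tank
const massKg = volumeM3 * 1000; // water, ~1000 kg/m^3
const cpJPerKgK = 4186;         // specific heat of water
const deltaTK = 40;             // e.g. heated from 30 °C to 70 °C

const energyKWh = (massKg * cpJPerKgK * deltaTK) / 3.6e6; // ≈ 465 kWh (thermal)

const tankCostEUR = 3000;     // assumed installed cost
const batteryEURPerKWh = 150; // assumed installed pack cost

console.log(`tank: ${(tankCostEUR / energyKWh).toFixed(1)} EUR/kWh_th`); // ≈ 6.5
console.log(`battery: ${batteryEURPerKWh} EUR/kWh_e`);
```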
Predicting how willing Swedes will be 10 years from now to buy extra jeans and install phase-change energy storage in their houses, however, that's beyond anybody's ability.
This phenomenon is something I would like to learn more about. Of course there is an element of frequency illusion mixed in, but this happens every now and again: some random subject is all of a sudden talked/written about by unrelated actors.
There doesn't necessarily have to be anything nefarious about it; papers and YouTubers need stuff to write and talk about, after all. But at the same time it can be very beneficial for e.g. Quaise in this case. How does it work - I'm guessing a "publicist" is involved somehow? How much does it cost? Has anyone here done something similar?
I have worked both as a reporter and in public relations (not at the same time), and stuff like this happens for various reasons:
- an institution can publish a report as part of a regular schedule (e.g. unemployment figures from the BLS) or as a one-shot thing (e.g. a study on the distribution of clowns in arid areas). This leads many reporters to publish articles about basically the same subject, but in an uncoordinated manner;
- PR agencies often coordinate with media outlets from various backgrounds and markets to publish about some particular topic (company, product, campaign, ...) either at the same time, or in coordinated waves;
- trends and public discourse can make it so that many sources cover almost the same thing at the same time (e.g.: resting bitch face, rat boy summer, ...);
- luck is, always, a factor.
I recently checked it out as an alternative to renewing our signing cert, but it doesn't support issuing EV certs.
I've understood it as: having an EV code signing cert on Windows is required for drivers, but it somehow also gives you better SmartScreen reputation, making it useful even for user-space apps in enterprisey settings?
Not sure if this is FUD spread by the EV CAs or not, though?
(Really) not saying it's a good idea, but if Swedes were required to fill in the paperwork, or even better/worse, to actually transfer the taxes themselves (maybe including payroll taxes), we would probably be more upset with how our taxpayer money is spent.
If you are politically motivated to minimize the tax burden, it makes sense to be skeptical of direct filing (even if you are not bribed by Intuit).
Is there a drop-in "thing" for using DeviceCheck? I would guess that something like Auth0 uses it (or maybe not? [0]). It seems like this could be a feature in any API Gateway / WAF'y product?
Not that I'm hoping for it - I too like to play around like OP. But I'm surprised how little I've encountered it in the wild.
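For reference, a minimal server-side sketch of what validating a DeviceCheck token looks like, based on my reading of Apple's docs - the JWT signing is elided, and `appleAuthJWT` is assumed to be an ES256 token created from your developer key:

```typescript
// Minimal sketch of server-side DeviceCheck validation. Not
// production code; error handling and JWT creation are elided.
import { randomUUID } from "node:crypto";

async function validateDeviceToken(deviceToken: string, appleAuthJWT: string): Promise<boolean> {
  const res = await fetch("https://api.devicecheck.apple.com/v1/validate_device_token", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${appleAuthJWT}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      device_token: deviceToken, // sent up from the app via DCDevice
      transaction_id: randomUUID(),
      timestamp: Date.now(),     // milliseconds since epoch, UTC
    }),
  });
  // HTTP 200 means the token came from a genuine Apple device
  // running your app; anything else should be treated as suspect.
  return res.status === 200;
}
```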
[0]: https://en.wikipedia.org/wiki/Identifier_for_Advertisers#App...