They ported it to 64-bit proper last year, and that helped performance on modern systems a bunch! Of course, it's still a far cry from being able to run on the hardware of the time it came out in 2000.
The institution I work for is a completely Google-based shop, with an Enterprise agreement with Google for data retention and various other requirements.
Even for us, Google Takeout is a complete mess that fails all the time, or straight up corrupts files that need to be exported. It doesn't surprise me at all that the service sucks for general users, but the fact that it's terrible for Enterprise customers really tells you all you need to know about Google.
Similar to the OP, we also run into issues with logs: the Google Admin Console just straight up doesn't provide detailed information about what went wrong, and questions via our relationship manager often get passed around for what feels like years.
If you're on Enterprise, you can use the Legal Discovery feature (Vault) instead - it worked at my previous employer when all the HR files mysteriously vanished. I think they maintain it properly because otherwise they might get subpoenaed in a case. The downside is that it converts everything to Word format (and presumably Excel for Sheets), and it doesn't export in the same directory structure (although I bet the metadata exists in the dump somewhere).
> If users could set their own preferred AI bot as the go-to for Siri requests or writing help, like Anthropic’s Claude or — say, xAI’s Grok — it’s doubtful that Musk would be yelling this loudly about the dangers of such an integration.
This is basically what I thought when I first saw this headline. Musk doesn't give a damn about the usage of ChatGPT; he's just annoyed that it's not _his_ AI tool that's ingesting Apple users' data.
> “if your objection is honoured, it will be applied going forwards”.
It is absolutely wild being in the "before times" (i.e. before any real legislation is applied to the use of data for AI training) and seeing statements that basically amount to "eh if we feel like it, we'll exempt you".
It's not very often that you see an "if" in statements like this.
It's probably going to get worse. Now that we have normalized using 2 GB of RAM for a text chat app because it's a bit easier to code, I bet it's only a matter of time before they go one abstraction level higher and start emulating an entire OS.
Nowhere have I mentioned the conflict, nor have I mentioned whether reporting on it is easy. Not blatantly lying is pretty easy, though, and Al Jazeera fails that test.
More importantly, Al Jazeera is not merely reporting on the conflict; they have been "reporting" on many other things for a long time. That's enough data for me to compare against the sum of ALL other outlets.
None of that is surprising either - Al Jazeera is a government propaganda tool, run by a dictatorial kingdom that ranks rather low on freedom of the press.
The fact that you drew a comparison to BBC is telling.
> The fact that you drew a comparison to BBC is telling.
That indeed was a specific choice - you could also state that the BBC is a state-run "propaganda tool". It's also going to have a Western (specifically UK) view of the media it's reporting on. The same is true of Al Jazeera, or the Australian Broadcasting Corporation.
That, in and of itself, does not mean it is "blatantly lying". No news agency is without bias, and believing so is naive.
Does Al Jazeera have areas for improvement that are unlikely to materialise given the context in which the company is based? Sure, but that's also true of Fox News, CNN, the BBC, ABC, or random Reddit commenters.
I didn't ban it and the Israeli government certainly didn't ask me for my opinion. They had their own reasons and their own evidence, and I only gave you _my_ reason for considering it fake news, which most of it is.