Being so absolutist is silly, but their counterargument is very weak. Can I invalidate any memory-safe language by dredging up old bug reports? Java had a bug once, so I guess it's over; everyone back to C. The argument is so thin it's hard to tell what they're trying to say.
It's just as reductive as the person they're replying to.
> Being so absolutist is silly but their counter argument is very weak.
The entire point is that being so absolutist is silly.
The comment reflects the previous poster's logic back at them so they (or others) can hopefully see how little sense it makes.
You seem to be trying to see some additional argument about Rust being bad/invalid, but there isn't one... The reason that argument is, indeed, "very weak" and "so thin", as you say, is that it isn't there at all.
It seems odd to me to put this much effort into misunderstanding what people are saying. You just end up talking past everyone, essentially talking to no one about nothing.
If it wasn't obvious from my ramble, concerns about Rust are pragmatic, not absolutist. The only absolutism is that for memory safety to be truly upheld, you can't half-ass it (Zig) or ignore it (C).
> An ABI can't control whether one or both parties at either end of the interface are honest.
You are aware that Rust already fails that without dynamic linking? The wrapper around the C getenv functionality was originally considered safe, despite every bit of documentation on getenv calling out thread-safety issues.
Yes? That's called a bug? The standard library incorrectly labelled something as safe, and then changed it. The root cause was an unsafe FFI call which was incorrectly marked as safe.
It's no different than a bug in an unsafe pure Rust function.
I'm choosing to ignore that libc is typically dynamically linked, but linking in foreign code and marking it safe is a choice to trust the code. Under dynamic linking anything could get linked in, unlike static linking. At least a static link only includes the code you (theoretically) audited and decided is safe.
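The pattern being debated is easy to show concretely. Here's a minimal sketch of a safe-looking wrapper over C's `getenv` via FFI (the wrapper name is mine, not std's); the `unsafe` block is exactly the "choice to trust the code" described above, and it's only sound if nothing else is concurrently mutating the environment:

```rust
use std::ffi::{CStr, CString};
use std::os::raw::c_char;

// Declaring the C function: calling it is inherently `unsafe`.
extern "C" {
    fn getenv(name: *const c_char) -> *mut c_char;
}

/// Hypothetical wrapper. Exposing a safe signature over an unsafe FFI
/// call is a *claim* that all of the C function's preconditions hold;
/// for getenv that claim breaks down under concurrent setenv/unsetenv,
/// which is why std's original "safe" labelling was a bug.
fn getenv_wrapped(name: &str) -> Option<String> {
    let c_name = CString::new(name).ok()?;
    // SAFETY: this is the "trust me" assertion under discussion.
    let ptr = unsafe { getenv(c_name.as_ptr()) };
    if ptr.is_null() {
        None
    } else {
        // SAFETY: getenv returned a valid NUL-terminated string.
        Some(unsafe { CStr::from_ptr(ptr) }.to_string_lossy().into_owned())
    }
}

fn main() {
    // PATH is set in virtually every environment.
    println!("PATH present: {}", getenv_wrapped("PATH").is_some());
}
```

The compiler can't check the safety claim; only auditing (or, as here, the C library's own documentation) can.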
You're both wrong: the Mirai uses a fuel cell as the voltage source for an otherwise ordinary EV drivetrain. The Mirai is an EV with a fuel cell instead of a battery.
For Vulkan you already ship "pre-compiled" shaders in SPIR-V form. The SPIR-V needs to be compiled to GPU ISA before it can run.
You can't, in general, pre-compile the SPIR-V to GPU ISA because you don't know the target device you're running on until the app launches. You would have to precompile ISA for every GPU you ever plan to run on, for every platform, for every driver version they've ever released that you will run on. Also you need to know when new hardware and drivers come out and have pre-compiled ISA ready for them.
Steam tries to do this. They store pre-compiled ISA tagged with the GPU+Driver+Platform, then ship it to you. Kinda works if they have the shaders for a game compiled for your GPU/Driver/Platform. In reality your cache hit rate will be spotty and plenty of people are going to stutter.
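Steam's cache is effectively keyed on that whole tuple. A toy sketch (field names illustrative, not Steam's actual schema) shows why hit rates are fragile; a change to *any* field means a miss and a recompile:

```rust
use std::collections::HashMap;

/// Everything that can invalidate a precompiled shader binary.
#[derive(Hash, PartialEq, Eq, Clone)]
struct CacheKey {
    spirv_hash: u64,  // hash of the portable SPIR-V module
    gpu: String,      // e.g. a PCI vendor/device id string
    driver: String,   // driver version
    platform: String, // OS / driver stack
}

struct IsaCache {
    entries: HashMap<CacheKey, Vec<u8>>, // key -> compiled GPU ISA
}

impl IsaCache {
    fn new() -> Self {
        IsaCache { entries: HashMap::new() }
    }

    fn insert(&mut self, key: CacheKey, isa: Vec<u8>) {
        self.entries.insert(key, isa);
    }

    /// A miss on any field (new driver, different GPU, OS update)
    /// forces a recompile -- hence the spotty real-world hit rates.
    fn lookup(&self, key: &CacheKey) -> Option<&Vec<u8>> {
        self.entries.get(key)
    }
}

fn main() {
    let mut cache = IsaCache::new();
    let key = CacheKey {
        spirv_hash: 0xBEEF,
        gpu: "10de:2684".into(),       // illustrative values only
        driver: "545.29".into(),
        platform: "linux-x86_64".into(),
    };
    cache.insert(key.clone(), vec![0u8; 4]);
    println!("hit: {}", cache.lookup(&key).is_some());
}
```

A driver update alone changes the key, so every cached binary for that machine goes stale at once.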
OpenGL/DirectX11 still has this problem too, but it's all hidden in the driver. Drivers would do a lot of heroics to hide compilation stutter. They'd still often fail, though, and developers had no real way to manage it outside of some truly disgusting hacks.
There are two tiers of pre-compiled, though. Even if you can't download shaders pre-compiled, you can compile them before the game launches so there are no stutters afterward.
Yes, many games do that too. Depending on how many shaders the game uses and how fast the user's CPU is, an exhaustive pre-compile could take half an hour or more.
But in reality the exhaustive pre-compile will compile way more than will be used by any given game session (on average) and waste lots of time. Also you would have to recompile every time the user upgraded their driver version or changed hardware. And you're likely to churn a lot of customers if you smack them with a 30+ minute loading screen.
Precisely which shaders get used can only be correctly discovered at runtime in many games: it depends on the precise state of the game/renderer, the quality settings, and often the hardware vendor if there are vendor-specific code paths.
Some games will get QA to play a bunch of the game, or maybe set up automated scripts to fly through all the levels and log which shaders get used. Then that log gets replayed in a startup pre-compile loading screen, so you're at least pre-compiling shaders you know will be used.
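The record-and-replay idea boils down to a few lines; in this sketch the function names are hypothetical and `compile` stands in for the real driver compile call:

```rust
use std::collections::HashSet;

/// During a QA playthrough (or automated fly-through), record each
/// shader variant the renderer actually requested.
fn record_usage(log: &mut HashSet<String>, shader_id: &str) {
    log.insert(shader_id.to_string());
}

/// At startup, replay the log and compile only variants known to be
/// used, instead of the full (much larger) set of possible variants.
/// Returns how many variants were compiled.
fn precompile_from_log<F: FnMut(&str)>(log: &HashSet<String>, mut compile: F) -> usize {
    for id in log {
        compile(id);
    }
    log.len()
}

fn main() {
    let mut log = HashSet::new();
    record_usage(&mut log, "pbr_opaque_v3");
    record_usage(&mut log, "pbr_opaque_v3"); // duplicates collapse
    record_usage(&mut log, "water_caustics");
    let compiled = precompile_from_log(&log, |id| println!("compiling {}", id));
    println!("compiled {} variants", compiled);
}
```

The catch, as noted above, is that the log is only as good as the QA coverage: any variant the playthrough never triggered still compiles (and stutters) at runtime.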
I don't think this is as much of an issue as you are making it out to be. I have my Steam Deck on the main branch release which seems to exclude it from downloading precompiled shaders. When a game updates it has to compile the shaders first, but even on a big game this does not take an unreasonable amount of time. Less time than it takes for game updates to download at least.
Steam could improve the experience here by having the shaders compile overnight in the background so it presents zero delay, but the current way doesn't bother me much at all.
I remember Star Wars Jedi Survivor had a 5-6 minute shader pre-compile on my 5950X. I heard of people well into the 30 minute mark on lower core count machines. Battlefield 6 was a few minutes on my 9950X, higher again on lower core count CPUs.
Really depends on the game.
There's no easy way around this problem. It never came up as much in the OpenGL/D3D11 era because we didn't make as many shaders back then. Shader graphs and letting artists author shaders really opened Pandora's box on this problem, but OpenGL was already on its way out by the time these techniques were proliferating, so Vulkan gets lumped in as the cause.
You're getting lucky with the games you're playing, then; there are absolutely PC games that have had 20-30 minute long shader compilation times _on high-end gaming hardware_. (I think some of Sony's ports were known for this; Googling tells me Borderlands 4, Stalker 2, and Starfield also had notably long shader times.) Typically those occur within the game's UI after launch but before the game starts playing, though, which makes me wonder if Valve might still be caching a non-GPU-specific intermediate of the DX12 to Vulkan conversion, and _that's_ what Linux Steam clients are compiling pre-launch and/or sharing with other clients. That's pure speculation on my part though, as I haven't played any of the worst-case-scenario games on my Deck, nor have I done anything that would cause the shader downloading to not operate.
So is this why, on my laptop, when I start a game after an update it starts "compiling Vulkan shaders" for a few minutes? I've never understood what that was actually for, but it takes 100% CPU on all cores, so it's clearly doing something.
The CPUs in their SOCs were not up to snuff for a non-portable game console until very recently. They used (and largely still do, I believe) off-the-shelf ARM Cortex designs. The SOC fabric is their own, but the cores are standard.
In performance, even the aging Zen 2 would demolish the best Tegra you could get at the time.
You should note that the Switch, the only major handheld console of the last 10 years, is also the only major console using a Tegra.
And from everything I've heard, Nvidia is a garbage hardware partner who you absolutely don't want to base your entire business on, because they will screw you. The consoles all use custom AMD SOCs; if you're partnering at that deep a level, you'd want a partner who isn't out to stab you.
It's probably worth watching TNG before DS9. The contrast between TNG and DS9 with DS9's darker tone is an important part of the show. Probably the best episode in the whole series, "In The Pale Moonlight", is made all the better when you've seen what they're contrasting against.
The first season is definitely the most conventional (for the time), and I think that's reflected in some of JMS's statements that the show was still getting onto its feet through the first season. A serialized story was very unfamiliar territory for Hollywood television back then; they were learning on their feet.
If I recall correctly, JMS wrote basically every episode after season 1, whereas season 1 had a few guest writers. The guest-written episodes did not do well, including episode 14, which is probably the worst episode in the entire series.
Agreed! In fact it is kind of annoying. Every finite set of orderable elements has a worst element, so every show has some bad episodes. You want to tell new viewers to just skip those episodes if they want, but it's practically impossible with B5. If you skip TKO because part of it is cliche, then you also miss the essential key to understanding Ivanova.
Tangent, but a cartoon I immensely enjoyed as a young kid popped up recently on my YouTube feed - Jayce and the Wheeled Warriors. That day I learned JMS wrote the story and it too featured an overarching story that backed the otherwise “episode of the week” format.