Conversely, in my career there have only been a handful of times where cryptographic randomness was too slow.
I'd argue it's better to do the safe thing by default and switch to a faster alternative only when you have proof you need it. Doing the fast thing by default and fixing security later is how we got Meltdown/Spectre.
It's going to depend on your experience. For me, I've often run into the exact opposite extreme... where even the non-secure random is too slow for my use cases (games, graphics, procedural texture generation, etc.) and a much faster but less statistically random generator better suited my needs.
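For illustration, here's a minimal sketch of the kind of fast, non-cryptographic generator I mean (Marsaglia's xorshift64; the function name and seeding convention are just for the example):

```c
#include <stdint.h>

/* xorshift64 (Marsaglia): a tiny, fast, NON-cryptographic PRNG.
 * Fine for procedural textures and gameplay noise; never use it
 * for keys, tokens, or anything an attacker must not predict. */
static uint64_t xorshift64(uint64_t *state) {
    uint64_t x = *state;  /* seed the state with any non-zero value */
    x ^= x << 13;
    x ^= x >> 7;
    x ^= x << 17;
    return *state = x;
}
```

A generator like this is a handful of register operations per call, with no syscall and no shared state to lock, which is why it wins for per-pixel or per-particle workloads.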
I would argue that the default behavior should favor the novice and the non-domain-expert. Should game programmers, graphics programmers, etc., be expected to know that they need to tune the performance of random(), or should the domain experts writing cryptographic algorithms be expected to understand the limitations of random() as it applies to their use case?
The game programmer who needs better performance can "just" switch to a faster algorithm when profiling calls for it (and if you never notice the slowdown, well, no harm done anyway).
The one who just needed to generate some cryptographic keys? They get to rotate everything, and they were sitting on some pretty horrible hidden vulnerabilities in the meantime.
> Conversely, in my career there have only been a handful of times where cryptographic randomness was too slow.
Yeah, that's the opposite of my experience. Often the libc random() is too slow and limits performance, and I need to substitute something even faster.
But I agree that using something securely random by default is a good idea. People can substitute a faster generator fit for their purpose if needed.
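To make that concrete, here's a minimal sketch of what the "secure by default" call can look like, assuming Linux with glibc 2.25+ where getrandom(2) is available:

```c
#include <stdint.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/random.h>  /* getrandom(), glibc >= 2.25 / Linux >= 3.17 */

/* Fill buf with cryptographically secure random bytes, or abort.
 * Per the man page, requests of up to 256 bytes from the urandom
 * source always return in full once the pool is initialized. */
static void secure_random_bytes(void *buf, size_t len) {
    if (getrandom(buf, len, 0) != (ssize_t)len)
        abort();
}
```

Other platforms have equivalents (arc4random_buf() on the BSDs and macOS, for example); the point is just that the safe call should be the one that's a single function away.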