It's trivial when you can pass arbitrary legislation.
Back in the 90s we had to deal with US government restrictions on encryption export. Software companies fell into line. It was a big deal when 128-bit keyed Netscape became available globally in 1997, per State Dept approval, but even then full-strength server-side SSL was still restricted to 'approved' entities.
gen x = y if z > 4 // headaches abound: rows where z is missing also match
* For reference, Stata treats a missing value as larger than any number (effectively +Inf), so any "greater than" comparison silently matches missing values too. And yes, there have been research papers retracted due to this behavior.
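The usual defensive idiom is to exclude missings explicitly. A minimal sketch, reusing the hypothetical x, y, z from above:

gen x = y if z > 4 & !missing(z) // guard: missing z sorts above every number, so rule it out

The same caveat applies to >= and != comparisons, and to sorting on z, since missing always lands above every nonmissing value in the ordering.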
Their next goal is 6,000 a week. Musk should just take a week off and let his team work on it.
I see more value in it for clear technical tasks with technical descriptions, and not so much for discussions. Proposals could then be peer-reviewed before being accepted as work that needs to be done (as a merge request to master, for example).
Another problem that comes with that, especially for golang users, is that your master branch gets bombarded with issue updates that look like releases even though nothing has effectively changed.
I'm not sure genetic mapping is really the core of the issue anyway. What we need is the equivalent of a reverse vaccination, since celiac is essentially an unwanted immune response to wheat. Such a concept might do for chronic disease what vaccination did for acute disease.
> However, building around sequential reads gives you the wrong idea when the hardware changes underneath you.
If you're going to build a system tuned to the hardware, you can't ignore the hardware. What abstraction should the author be using instead?
If it’s unnecessary to ever prefer /dev/random to urandom, then why does /dev/random’s blocking implementation still exist? Surely, among all the kernel developers across all the Unix-ish OSes, someone over the last couple of decades would have realized that /dev/random should simply act like /dev/urandom?
Some consulting company like McKinsey might have advised big retailers on how to reduce or recoup the shrinkage. These cease-and-desist letters, vague laws, etc. are a product of that effort.
This doesn't refute his argument, which I read as being, essentially, against performance optimization of software during initial construction. Knuth famously bemoaned premature optimization as well.
Sorry for leading with a negative; I know it can come across more strongly than I intend.