I think you are using "sparse" and "non-linear" as scare terms. Sparse is a good thing, as it reduces degrees of freedom, and non-linear does not mean unsolvable.
Also "impedance mismatch" doesn't mean no go, but rather less efficient.
That doesn't fix suboptimal algorithm choices, but neither would a small NN in the compiler. A big NN could rewrite large sections of code without changing the logic, but why do that during translation instead of rewriting the source?
It’s not music from the classical period. Indeed, it’s from the baroque period. But in my decades of talking about and performing classical music, the term has never led to confusion.
That comment is mind boggling. I've spent a large fraction of my life playing in classical orchestras and also never heard anyone get confused. Yes, classical music is a genre and a period. Bach is in the genre, but not the period.
I hope the commenter learned something from their attempt at pedantry.
There are two uses of "classical" in modern parlance: classical as in the Classical period of Mozart, and classical as in anything from the Baroque through the late-Romantic era of Rachmaninov, plus other composed music that uses largely traditional harmony, or atonal music along the lines of Schoenberg.
"builtins" are primitives that Bash can use internally without calling fork()/exec(). In fact, builtins originated in the Bourne shell to operate on the current shell process, because they would have no effect in a subprocess or subshell.
In addition to builtins and commands, Bash also defines "reserved words", which are keywords to make loops and control the flow of the script.
Many distros will ship a default or skeleton .bashrc which includes some useful aliases and functions. This is sort of like a "standard library", if you like having 14 different standards.
'[' also exists as an external binary, to support any shell or script whose shell doesn't implement it as a builtin. There may be a couple more like it. Under normal circumstances it won't actually be invoked, since Bash interprets '[' as the 'test' builtin.
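A quick way to see all three categories from a bash prompt (the /usr/bin paths are what coreutils installs on a typical Linux system and may differ elsewhere):

    $ type [ test if
    [ is a shell builtin
    test is a shell builtin
    if is a shell keyword
    $ ls /usr/bin/[ /usr/bin/test    # the external fallbacks
    /usr/bin/[  /usr/bin/test

So in a bash script the builtin wins, and the external '[' only matters for shells or exec() callers that don't provide their own.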
I don't think this is true in general for planets or stars; you're conflating multiple effects. For a fixed number of particles, increasing metallicity (which tracks average particle mass) should reduce radius, but for a fixed metallicity and temperature, adding particles will increase the radius. Temperature has the effects you stated. You can roughly sanity-check this with the fact that massive planets and stars are bigger than less massive ones. Obviously many other things start happening as stars reach end of life...
Gas giants generally only get slightly larger than Jupiter (even with adding a lot of mass), until they start to shrink - and eventually with enough mass, turn into actual stars [https://en.m.wikipedia.org/wiki/Gas_giant]
So generally, gas giants don’t get much bigger than Jupiter.
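The usual back-of-envelope reason (ignoring the thermal bloat of young, hot planets): once the interior is supported mostly by electron degeneracy pressure, the cold mass-radius relation scales roughly as

    R \propto M^{-1/3}

so adding mass past a few Jupiter masses makes the body denser rather than larger, and the maximum radius for a cold hydrogen sphere ends up close to Jupiter's.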
Yes, that's why I said buff. The radius as defined would increase, because surface pressure is 90 bar, so at 1 bar, you're pretty high in the atmosphere. I can see merit in such a definition because that is the level at which we wouldn't have to pressurize our space stations to be comfortable. (Really 1/3 bar is fine too.)
An EMR will necessarily have low churn rate because the cost of switching is tremendous. I've been doing this a long time, and Epic is kind of the "EMR of the day". No doubt a new one will come along, solving the UI problems, offering new capabilities, and hospitals will switch over.
Eventually Epic may be replaced but it's really hard to break into the hospital market. Any new competitor would have to meet all of the ONC Health IT Certification requirements, plus a bunch of other checklist requirements imposed by hospital purchasing departments. It doesn't matter whether they solve the UI problems or offer new capabilities if they don't have the basics finished first, and that takes many person years of work.
The only way a startup might be able to eventually replace Epic is to target a niche ambulatory care specialty first where meeting all the checklist requirements is less important than really nailing an optimal clinician workflow. Then gradually expand out from that foothold by adding the features that hospitals need.
Also, the amount of antimatter storable in a Penning Trap is limited such that its mass energy is comparable to the stored magnetic energy of the trap, which is small compared to the energy released by the same mass (as of the trap) of high explosives.
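Back-of-envelope, with an assumed (made-up but representative) 5 T trap field, the magnetic energy density is

    u_B = B^2 / (2\mu_0) \approx (5\,\mathrm{T})^2 / (2 \cdot 4\pi\times10^{-7}\,\mathrm{H/m}) \approx 10^{7}\,\mathrm{J/m^3} \approx 10\,\mathrm{J/cm^3}

so a liter-scale trap stores on the order of 10 kJ of field energy, which caps the storable antimatter near 10 kJ / c^2 ~ 10^-13 kg, while the kilograms of magnet and vacuum hardware, if replaced by high explosive, would be good for tens of MJ.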
Make an antimatter liquid with strong intra-molecular bonds (e.g. anti-H2O) that is only slightly ionized. That would be easier to contain magnetically.
Fusion of protons is theoretically possible, in the sense that you could theoretically build a star out of antihydrogen. Laboratory fusion of protons is very unlikely ever to be practical: the "S factor" (representing the fusion cross section aside from geometric and tunneling-rate factors) for pp fusion is something like 24 orders of magnitude smaller than for DD fusion.
I'll add that maybe one could achieve fusion in a different way, by a means that's highly wasteful of energy. After all, the goal here is to get the fused nucleus, not achieve net energy production in doing so (as is the goal in fusion of ordinary nuclei).
Here, one might exploit the reaction p(p,pi+)d (or rather its antimatter equivalent, which makes a negative pion), at a center-of-mass energy above the threshold for creating the charged pion (which has a mass of 139.57 MeV). This is wildly energy negative, but if one has already invested many GeV in making each antiproton, that's presumably acceptable. The cross section for this reaction is only ~200 microbarns, but that is still many orders of magnitude higher than the ordinary fusion cross section.
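For scale (back-of-envelope with the standard particle masses, not a design number): the center-of-mass threshold is just the rest mass of the products,

    \sqrt{s_{\mathrm{min}}} = m_d + m_\pi \approx 1875.6 + 139.6 \approx 2015.2\ \mathrm{MeV}

versus 2 m_p ≈ 1876.5 MeV for the incoming (anti)protons, i.e. only about 140 MeV of extra CM energy, which is tiny next to the many GeV already invested in each antiproton.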
The solid antimatter one would probably target as an end goal would be anti-(lithium hydride).
There was a time when that was said about nukes. They didn't really make sense: we could only make minuscule amounts of U-235 / plutonium at very high cost... and had we wanted evil bombs, we already had a thousand ways to make them.
Didn't stop people then. And it won't stop sufficiently criminal governments today.
I'm not saying that already having nukes will stop people from wanting a better bomb. I'm saying we already have a much better bomb than antimatter, so why would people invest in making a much shittier bomb?
And we don't even have to reach for nukes. Dynamite is also a better choice for blowing things up than antimatter.
Look up comparative damage stats for Tokyo and Hiroshima/Nagasaki in WWII. The nukes were nothing compared to firebombing.
Now do the same for Gaza and anywhere other than WWII Warsaw. Carpet bombing isn’t necessary if you can aim with precision at the support infrastructure of occupied structures.
You're making my point: we had cruder and cheaper ways to butcher each other, and yet we built nuclear bombs.
Carpet bombing is not meant for eliminating military targets - it's an act of state-sponsored terrorism to get a population to rise against its leaders. Only, it has been proven over and over again: That doesn't work and often results in the exact opposite, making them rally behind the flag against the barbarous enemy who would attack civilians. Which is why we stopped doing that. Mostly.
Energy. Creating a single anti-hydrogen atom requires an absurd amount of energy: first to create a collision in a particle accelerator, and then to capture that anti-hydrogen before it annihilates against ordinary matter.
Only about 0.01% of the energy used to operate the particle collider creates antimatter, the vast majority of which is impossible to capture. All in all, the efficiency of the entire process - if you were to measure it in the E^2 = (pc)^2 + (mc^2)^2 sense - is probably on the order of 1e-9 or worse.
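To put that in perspective (illustrative arithmetic using the ~1e-9 figure above):

    E_{\mathrm{rest}}(1\,\mu\mathrm{g}) = m c^2 \approx 10^{-9}\,\mathrm{kg} \times (3\times10^{8}\,\mathrm{m/s})^2 \approx 9\times10^{7}\,\mathrm{J}
    E_{\mathrm{in}} \approx 9\times10^{7}\,\mathrm{J} / 10^{-9} \approx 9\times10^{16}\,\mathrm{J} \approx 25\,\mathrm{TWh}

That's roughly three years of a 1 GW power plant's output per microgram of antimatter produced, before the capture and storage problem even starts.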
Has there been research on more efficient ways to generate antiprotons? (By the way, anti-hydrogen isn't how you would store it, as anti-hydrogen can't be trapped.)
Nothing we do creates it at any kind of scale, and it's a pain in the ass to store.
Not to mention the only way to create it is with energy (it doesn't exist on Earth), and we can only do so at terrible efficiencies. So even theoretically it's pretty bad.
The sad thing is, cutting down on the streamers does make an actual dent in monthly spending. Each platform is at least $9 USD, and subscribing to them all at this point is easily $100/month. Obviously, some are higher than $9, but cord-cutting to save money often ends up costing more than the dreaded cable bill.
In a perfect world, maximizing (EV/op) x (ops/sec) would be done even for user software. How many person-years of productivity are lost each year to people waiting for Windows or Office to start up, finish updating, etc.?
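Even with made-up round numbers the scale is striking:

    10^{9}\ \mathrm{users} \times 30\ \mathrm{s/day} = 3\times10^{10}\ \mathrm{s/day} \approx 950\ \mathrm{person\;years\ of\ waiting\ per\ day}

(using ~3.15e7 seconds per person-year; of course not every second spent waiting is a second of productivity lost).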
Also "impedance mismatch" doesn't mean no go, but rather less efficient.