So, the biggest question I have about this whole eminently theoretical physics video is: has anyone ever managed to experimentally demonstrate that you can __voluntarily__ bend the brane our reality resides on by any measurable amount, and also demonstrate that this causes an actual translation in 3D space? At all? Ever?
Or are we conjecturing what would happen if this was even slightly viable?
We've only very recently managed to experimentally demonstrate that unimaginably huge cataclysmic events can make this happen. But has anyone ever built a device that can make a crease of any significance at all in reality, and do so at will?
Apologies in advance if this sounds excessively rude, but I'm not asking for anything, as there's no one to actually ask; I'm asking about that.
That distinction matters, since I personally find it easier to derive enthusiasm from concrete, tangible tools that may prevent the entire species as a thing from dying choked in a toxic miasma of our own creation than from the purely numeric advances in our grasp of the rules of a universe we're basically guests in.
> I'm not asking for anything, as there's no one to actually ask; I'm asking about that
We have witnessed spacetime ripples bending the length of an interferometer chamber. These ripples are far weaker than those we can, with known physics, create. (Most of LIGO's work involves removing the effects we unintentionally create.)
> tangible tools that may prevent the entire species as a thing from dying choked in a toxic miasma of our own creation
Bending materials with spacetime ripples is known physics. You are asking for (or about or whatever) something we have.
If you are unsatisfied with anything short of a functioning interstellar drive, see the earlier point about demanding light bulbs from those just grinding lenses. There are those doing the work. There are those oblivious of it. And there are those shouting at it.
- Move all documents to a simple doc server like idk Alfresco
- Any and all single read requests to this server are authorized only after biometric confirmation of all parties involved, who are then added to the users' "file".
- Any use of this data that can be construed as malicious carries 4 to 8 years of jail for everyone involved in the chain of custody of the specific bits of misused data.
You don't need superstars on the project, you need the right balance of incentives around mucking with it.
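The scheme in the list above can be sketched in a few lines. This is a toy, in-memory model with hypothetical names (`DocServer`, `read`, `chain_of_custody` are all invented for illustration, not any real Alfresco API): every read is gated on a confirmation step and appended to a per-document chain of custody, so later misuse can be traced to everyone who touched the data.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DocServer:
    # document id -> ordered chain of custody: (reader, timestamp) pairs
    custody: dict = field(default_factory=dict)

    def read(self, user: str, doc_id: str, biometric_ok: bool) -> bool:
        """Authorize a single read; refuse without biometric confirmation."""
        if not biometric_ok:
            return False
        entry = (user, datetime.now(timezone.utc).isoformat())
        self.custody.setdefault(doc_id, []).append(entry)
        return True

    def chain_of_custody(self, doc_id: str) -> list:
        """Everyone on the hook if this document's data is misused."""
        return [user for user, _ in self.custody.get(doc_id, [])]

server = DocServer()
server.read("alice", "doc-1", biometric_ok=True)
server.read("bob", "doc-1", biometric_ok=True)
server.read("mallory", "doc-1", biometric_ok=False)  # refused, not recorded
```

The point of the sketch is the incentive structure, not the mechanism: the audit trail is cheap, and the deterrent comes from the legal consequences attached to everyone in `chain_of_custody`.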
It's oddly infuriating to read this from a "developing" (i.e. poor) country.
We get permanently schooled by all these rich countries, which ransacked both our and, later, their own natural resources in order to build themselves into prosperity.
Yes, it would be awesome if every one of us adopted a hectare of land and personally hugged every little bird and fox every day to try to ensure their precious existence (which is undoubtedly precious, and a tragedy if lost), but we cannot and won't do that if they sit on a source of precious, scarce income.
It's because of stuff like this that poachers exist; you don't risk death killing rhinos for food, you do it because selling a chunk of one may feed your family for a week.
People don't risk whatever arcane disease some random swamp animal carries for the thrill of it; they do it because that's their only means of making money.
If richer countries wanted to stop poorer countries from exploiting whatever sources of income they can find, then they should help build other means of obtaining said income.
As in, not just throwing money at people and expecting that to magically solve anything, but actually building means of making a living, and doing so in a way that doesn't set up a predatory structure the way businesses will.
EDIT: Loving the insta-downvote with no comment. Please, come out of hiding and explain yourselves.
OK, given the sudden surge of anonymous morons unable to follow basic prompts for civil discussion, and the refusal of the implementors to do anything about the mounting garbage on this site, I'm adding Hacker News to my block list.
Yeah... About that... I mean... no. When the possible death of thousands upon thousands of people is on the line, your proposal had better be ironclad and well equipped to withstand many lines of attack if you want it to be taken seriously.
"Hey, maybe let's not do the one massively researched thing that we've done since the times of the plague because there's not enough data" is neither of those things.
Open dialogue in science needs to take place in the forums already in place, not on clickbaity money-starved legacy media.
EDIT: Dear rando downvoters: instead of lurking, please help foster debate by stating why, in your opinion, untestable, indefensible claims should be allowed to pass as science in the name of "open dialogue".
Twitter has degenerated into one of the least friendly things to build an app for. I've had a better dev experience with Gab, ffs.
Not only that, but their community building and tending are abysmal too, so your app will end up used mostly by the kind of user that brings infinitesimal value.
Yeah, I'm pretty sure the language that has birthed conventions like HasThisTypePatternTriedToSneakInSomeGenericOrParameterizedTypePatternMatchingStuffAnywhereVisitor and RefreshAuthorizationPolicyProtocolServerSideTranslatorPB is rife with great practices that are in no way obscured by layers upon layers upon layers of half-assed abstraction.
The one with System.ServiceModel.Install.Configuration.ServiceModelConfigurationSectionGroupCollection in it must surely be awesome too; I've heard great things about it.
A language having more ornate and towering footguns than another doesn't make it better; the ability to produce resilient, maintainable solutions for a given problem space in it does.
Ah yes, Sargon of Akkad, of "I wouldn't even rape you" fame.
If anything, I'm pretty glad the absolutely despicable thing is no longer funded through them and kinda makes me want to contribute more money to Patreon somehow.
I still remember when Clang bringing LLVM along was seen as SO OUT THERE. I'm only mentioning it because I find it weird to be old enough to have seen fads in systems languages come and start to go.
Just curious, do you have any examples of these "limitations" you speak of? Sounds like a very interesting read.
In broad strokes, LLVM chooses to optimize for generating good code for statically compiled languages more than for, say, memory usage, compilation speed, or the ability to dynamically change compiled code. That doesn't make it optimal for JavaScript, a language that's highly dynamic and often used in cases where compilation time can easily dwarf execution time.
Worth noting that B3's biggest win was higher peak throughput: it generated better code than LLVM. It achieved that by having an IR that lets us be a lot more precise about things that were important to our frontend compiler.
It's not even about what language you're compiling. It's about the IR that goes into LLVM, or whatever you would use instead of LLVM. If that IR generally does C-like things and can only describe types and aliasing to the level of fidelity that C can (i.e. structured assembly with crude hacks that let you sometimes pretend that you have a super janky abstract machine), then LLVM is great. Otherwise it's a missed opportunity.
And aliasing. The aliasing story in B3 is so wonderful. That was one of the biggest wins: being able to say, for example, that something can side-exit (and can do weird shit after exit) but doesn't write any state if it falls through.
LLVM's MCJIT library is 17MB. If you have a language that you want to JIT, and you thought you could embed it the way you can Lua (<100k) or Python (used to be ~250k, now <3MB), you're looking at almost 20MB out of the gate. Not ideal!
Also, if you want to use LLVM as a backend for your project and expect to build LLVM as part of a vendored package: the LLVM libraries with debug symbols on my machine were about 3GB. Also not ideal.
LLVM makes some questionable choices about how to do SSA, alias analysis, register allocation, and instruction selection. It also goes all in on UB optimizations, even when experience from other compilers shows that they're not really needed. Maybe those choices are really fundamental and there is no escaping them to get peak perf, but you're not going to know for sure until folks try alternatives. Those alternatives likely require building something totally new from scratch, because we are talking about things that are fundamental to LLVM even if they aren't fundamental to compilers in general.
I dislike UB, but at the language level. By the time LLVM is reached, UB can only be removed, never added (from a global point of view; applying the usual as-if rules, a compiler can always generate its own boilerplate in which it knows something cannot happen, and then maybe later leverage that "UB" to e.g. trim impossible paths, paths that really are impossible in that case, at least barring other language-level "real" UB). So is there really any drawback to internal exploitation of "UB" (maybe we should call it something else, then) if, for example, the source language had none?
It is true that compilers sometimes have to have operations whose semantics are defined only if some conditions hold. But LLVM's and C's interpretation of what happens when the conditions don't hold is extraordinarily liberal, and I'm not sure that is either beneficial or sane.
Like, LLVM tries not to add UB, but design choices it made to support optimization with UB do sometimes result in new UB being introduced, like the horror show that happens with `undef` and code versioning.
So, I think that optimizing with UB internally is fine but only if it's some kind of bounded UB where you promise something stronger than nasal demons.
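The "compiler-internal assumption" mechanism debated above can be sketched with a toy pass over an invented mini-IR (this is not LLVM's IR or API; `assume`, `branch`, and `trim_impossible_branches` are all hypothetical names). The frontend emits a fact it knows to be true, and a later pass uses it to delete a branch that is genuinely unreachable, without any source-level UB being involved.

```python
def trim_impossible_branches(ir, assumed_true):
    """Drop ('branch', cond, target) nodes whose condition contradicts an
    assumed fact; keep everything else untouched."""
    out = []
    for op in ir:
        if op[0] == "branch" and op[1] == ("not", assumed_true):
            continue  # provably dead under the assumption; safe to trim
        out.append(op)
    return out

# Frontend knows x is non-null, so it emits that fact alongside the code.
ir = [
    ("assume", "x_nonnull"),
    ("branch", ("not", "x_nonnull"), "raise_npe"),  # impossible path
    ("load", "x"),
]
optimized = trim_impossible_branches(ir, "x_nonnull")
```

This is the benign, "bounded" flavor: the assumption is a fact the frontend actually established, so trimming on it can never miscompile. The controversy starts when the assumed fact is merely "the programmer promised not to do this", which is where nasal demons come in.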
> LLVM makes some questionable choices about how to do SSA, alias analysis, register allocation, and instruction selection.
Do you mind expanding on these points, or directing me to some places where I can learn more about them? Compilers are a fairly new field for me, so anything I can learn about their design decisions and tradeoffs is worth its weight in gold.
I mean, if I google the phrase "shirts -stripes" and click the Images tab I see mainly shirts without stripes.
So it's essentially the same input, and essentially the same expected output, but there must be quite a knot between understanding the word "without" and literally just using the - operator.
Right, but the - operator requires prior knowledge that the search engine understands boolean operators, and that "a -b" is an alias for "listings with a in their text that don't also have b in their text", whereas "a without b" is immediately recognizable to whoever is writing the search as "I want something of kind A without property B".
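The mapping being discussed is mechanically trivial for the single-clause case, which is part of why the gap feels so odd. A toy sketch (the function name is invented, and real query parsing is of course far messier than one regex):

```python
import re

def to_boolean_query(query: str) -> str:
    """Rewrite 'A without B' into the engine's 'A -B' exclusion syntax."""
    return re.sub(r"\bwithout\s+(\S+)", r"-\1", query, flags=re.IGNORECASE)

print(to_boolean_query("shirts without stripes"))  # shirts -stripes
```

The hard part isn't the rewrite; it's knowing, with confidence, that "without" in a given query really means exclusion rather than being part of a phrase ("the man without a face").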