Seriously, depending on how long your game is, it might make sense. You would expect at first the claim would be completely disregarded, and then in some number of centuries enough inter-planetary infrastructure could develop that includes a legal framework under which you could sue for damages. Or, something...
I wonder if you could send over some type of networking equipment, and then anticipate charging one heck of a bandwidth premium?
The resource curse and the Dutch disease are two distinct, but linked, phenomena.
The long and short of the Dutch disease is that if you introduce a revenue source that is independent of the rest of your economy and pour all of the money into your economy, it will cause your currency to appreciate, making every product made in your country less competitive. So even while your country keeps getting richer through oil revenues, the rest of your economy stagnates or decays, until you import most of what you consume and are wholly dependent on your oil revenues.
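To make the mechanism concrete, here's a toy arithmetic sketch (not a real economic model; all numbers are invented for illustration) of how currency appreciation alone prices exporters out of foreign markets:

```rust
// Toy illustration of Dutch disease: a windfall of foreign currency
// appreciates the home currency, so goods priced in home currency
// become more expensive abroad even though nothing about their
// production changed.

fn export_price_abroad(price_home: f64, exchange_rate: f64) -> f64 {
    // exchange_rate = units of foreign currency per unit of home currency
    price_home * exchange_rate
}

fn main() {
    let price_home = 100.0; // price of a domestically made good, in home currency

    let rate_before = 1.0; // before the resource windfall
    let rate_after = 1.5; // after: the home currency has appreciated 50%

    let before = export_price_abroad(price_home, rate_before);
    let after = export_price_abroad(price_home, rate_after);

    println!("foreign price before windfall: {}", before);
    println!("foreign price after windfall:  {}", after);
    // The same good now costs 50% more to foreign buyers, so domestic
    // producers lose competitiveness while oil revenue keeps flowing in.
}
```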
Note that this is not limited to oil. The Spanish went through this in the 16th and 17th centuries because of silver imported from the new world.
The effect seems different for regions of countries rather than countries as a whole.
For most of the first half of the 20th century, the Saudi Arabia of the world, in terms of oil extraction, was the United States.
But the situation was moderated by a few factors, at least as I see it:
1. The US had an industrial capacity (though much of that developed over this period), largely in the northeast. Much of the oil production was in economically underdeveloped regions of the country, particularly Texas and Oklahoma.
2. There wasn't a large world market for oil initially. Oil demand was being built up as supply was established. Flooding the market with cheap oil money simply didn't happen. Actually, demand was so low relative to extraction costs that oil prices hit a low of two cents per barrel after the 1930 East Texas Oilfield discovery (see Daniel Yergin's The Prize for an extended discussion of this).
3. WWII. The economic climate from 1930 to 1945 was not what we would consider a conventional growth market. Initially the problem was the Great Depression, during which global economic activity contracted markedly. Then came the command economy of WWII. In the aftermath, the US was both the sole oil power and the sole economic power left standing.
4. By the time other countries were establishing themselves as substantial consumers of oil, the US itself had begun importing oil from Saudi Arabia.
I'm not aware of formal economic treatments of this, though it strikes me that the US experience was significantly different from that of, say, Argentina, Saudi Arabia, or even Russia / the USSR.
> We are talking about 3/4 of games, easily. Games where you are a character and you run around in virtual space.
Then don't make those games for VR.
This might be cliche, but VR is not just a new accessory for an existing platform. It's a completely new platform. It needs new content that fits it -- shoehorning existing content into it will generally suck. Wandering around Skyrim sounds cool but will not work. Sorry.
As an example I've tried a lot: sim games (car, plane) work really well with VR. If there is clearly a cockpit around you and a seat you are sitting in inside the virtual world, your brain is fine with you sitting in your comfy chair.
Of course, not everyone will enjoy sim games. However, I'd be willing to bet a lot of money that there are and will be more new games that fit VR well. These will probably include entirely new genres of content that are not compelling or interesting on a monitor but work well on VR. Conversely, some genres will never translate well into VR.
This 100%. From my experiences with VR, the "character moving freely in the world" paradigm never works. As you point out, developers need to rethink the type of software they're putting out for these platforms.
I think games where the player is static are a good place to start. In addition to simulators, I could see god games, rail shooters and puzzle games all working well in VR. Even a game with restrained movement, like a slow-paced mech shooter or an old-school first-person RPG (think Ultima Underworld), could turn out great. Making the player aware of the dissonance between the virtual and physical worlds is the immersion-killing poison that VR game developers must avoid.
Finally, I'd like to make note of the many non-gaming applications of VR that could turn it into a compelling platform. The platform has tons of potential in fields such as art, manufacturing, education, health care, and the list goes on. VR can be used to enhance these applications, but not if it's treated like a cheap gimmick.
If VR flounders it will be because developers failed to tap into the vast market of VR software waiting to be unlocked. The key is imagination.
Exactly. I'd go so far as to suggest a new term, since "Virtual Reality" seems to bear such a terrible burden of near-complete immersion in some practically unbounded realm with total mobility and Holodeck-like possibilities. Instead of virtual reality, think about virtual presence. VP, if you will.
What are some really cool situations in which you can simply be present and be completely immersed? When I refocus my perspective this way, I think about being at a movie -- in a movie. Maybe I'm the gunner in the Millennium Falcon. Maybe I'm just a spectator on a nearby ship. Or maybe I'm sitting in a sports arena, as if really at the game, watching it and taking it in in a way I can't through my TV alone. Maybe I couldn't afford tickets to the Radiohead tour, but I can buy a virtual seat at any of the worldwide tour venues and see the concert with those crowds. That seems really fucking cool. I want to do that. I want to do that right now. And I don't need to run or jump around when I'm there. All I want to do is sit, and watch, and listen, and be present. And be immersed.
Whenever people think of VR, their thoughts immediately turn to gaming, and then they get disappointed when they realize that VR Skyrim or VR Grand Theft Auto (or what have you) isn't in the cards anytime soon. Instead of thinking about awesome scenarios where VR will fall short of the status quo, we should think of fun scenarios where VP will be a big step up from the status quo.
> Instead of thinking about awesome scenarios where VR will fall short of the status quo, we should think of fun scenarios where VP will be a big step up from the status quo.
Does anyone have links to actual examples? The Rift dev kit has been available for a while now. Developers should have gotten past the 'VR makes too many people sick' phase and started looking at new ideas. What are these new ideas?
Personally I'm a little disappointed that most of the hype around VR has been explicitly focused on the gaming aspect. The non-gaming applications of VR are much more exciting to me. I want a meditation room, where I can sit on a stormy mountain to think. I want to program in a virtual room with monitors on every surface, and pieces of code fixed in space, where I move to the code instead of the other way around. Maybe it'll work and maybe it won't - it's impossible to know until the hardware gets good enough to try.
VR has been around since the 60's, and has not "unlocked" anything noteworthy.
Sure, this "round" of VR might be the best yet graphically, but it still hasn't solved any of the problems that prevented the original 1960s implementations from taking the world by storm.
These being: No "AAA" support, too few games, too expensive of devices, motion sickness, and more!
It's almost like folks think this is the first time VR has come to market... "imagine all the possibilities!" - the same promises time after time after time again. What makes this iteration anything different?
Maybe some folks have high hopes due to the Facebook stewardship - I don't mean to burst anyone's bubble, but Facebook has been anything but a good steward of Oculus.
They have not used their size or bank accounts to convince "AAA" studios to embrace VR, nor have they subsidized Oculus purchases to lower cost-of-entry for consumers. About the only thing they have done is go against all of their initial promises to not "Facebook-ify" VR and turn the thing into yet another extension of the Facebook Social machine.
> The platform has tons of potential in fields such as art, manufacturing, education, health care and the list goes on
VR will ultimately do nothing for these fields. AR advances will ultimately be what these industries end up embracing. You lose far too much environmental awareness when both eyes are staring at a narrow-focus screen without peripheral visibility, etc. AR allows the user to work and behave like normal, only their vision is augmented with additional useful information. It's less intrusive and disruptive, and has very few of the issues that plague VR.
AR is already widely deployed and in mass use. Any HUD is a form of AR (this covers everything from airplanes to cars to robots to google glass, camera viewfinders, etc...).
> it still hasn't solved any of the problems that prevented the original 1960s implementations from taking the world by storm.
So much progress has been made with VR since then. A lot of problems have been solved.
Motion sickness has largely been solved - when your motion matches that of your virtual body 1:1, there is no problem. Where there is a mismatch, there can be a problem (for the roughly 30% of the population who are sensitive to this). So informed developers know how to create experiences that induce no sickness.
Regarding no AAA support / too few games - they are working on it. There will be over 50 games available for the Vive at launch, including some pretty big names like Elite Dangerous. Google, Valve and other big names are working on, or have already made, VR content.
>What makes this iteration anything different?
Because it's actually at a point where it's usable now. The popularity of mobile phones was one of the things that made this wave of VR possible, as it drove forward the technology used.
>VR will ultimately do nothing for these fields.
There are VR applications already targeting the fields mentioned; for example, Tilt Brush is generally received very well by artists and creative people.
I believe VR and AR will converge somewhat. VR has a head start, and AR will adopt much of the technology, tools and techniques that VR uses. The Vive has a camera and can do limited AR, as can most mobile VR headsets.
I agree that for a lot of applications - especially work-related ones - AR is better.
But when I game, I don't care about peripheral vision, I want to be in another world. Ultimately I want a headset that can switch between AR and VR modes.
I agree too. "3D" had another push half a decade ago in theaters and on TVs, and by this point it's reasonable to call it a failure. VR is much in the same vein: it's inconvenient enough in our lives that we're more likely to justify getting away from it than going to it. The VisiCalc moment should either have happened by now or be right on the horizon. It hasn't. The demos are still basically reliant on the same kind of "spectacle" elements that 3D movies use, and the track record there is bad for franchise media - that's the stuff that sells tourist tickets for theme park installations, but not repeat visits from loyal fans.
But AR, AR stands a chance. It is not hugely different from our existing uses of media technology - one more screen, in a different location, supplementing the existing experience. And it has good crossover into VR for the remaining experiences that do work well in immersion mode. That might buy VR time to develop gradually over a few decades, like silent film. There are some things that are worth exploring with VR, but the tech really needs mass adoption first.
As an indie gamedev myself, I can think of dozens of potential games that would likely work well in VR but not on a monitor, that don't require physical movement, and that can be played sitting in a swivel chair. Having the limit of not being able to run around may be disappointing to some, but any limit just forces us to innovate and achieve greatness through new paths.
There are several instances in the last 150 years where a patent or some other restriction caused others to innovate around it, and the end result was far superior because of it.
We're talking about big-budget "AAA" games, which drive console and PC sales alike. Excluding them immediately destroys any notion that VR may become anything more than a niche product and go fully "mainstream".
> As an example I've tried a lot, sim games
I enjoy these games as well, but recognize their market appeal is very narrow. These games don't drive sales of consoles or PCs.
> I'd be willing to bet a lot of money that there are and will be more new games that fit VR well
Unfortunately this is unlikely to happen until VR becomes mainstream, which it probably will not for the reasons listed in the GP post. In order for a studio to take the enormous risk of developing something entirely new, they need as close to a guarantee as possible to get back at least their investment. With VR poised to be nothing more than a niche product, this becomes a huge risk.
Now, maybe Facebook and their deep pockets could subsidize development of these games (effectively pay "AAA" studios to natively support VR), but that doesn't appear to be happening... otherwise they would have done that before the flagship product launched, and "AAA" studios are not in a "release cycle" at the moment (which tends to be October-November for the really big "AAA" projects).
Again, you're looking at this from the perspective of what the industry looks like now, not what VR enables. AAA games are generally of a certain type now, but that is not a universal law which will remain true forever.
> you're looking at this from the perspective of what the industry looks like now, not what VR enables
VR has been around a long, long time. I know, most folks say "but this time it's different", but in reality it's plagued today by the exact same issues it has been plagued with all along.
No "AAA" support, too few games, too expensive of devices, motion sickness, and more!
> AAA games are generally of a certain type now, but that is not a universal law which will remain true forever
Essentially, it is. These are the games that grab people's attention, have grabbed their attention, and will continue to grab their attention. Nobody is going to spend $600+ to play Bejeweled in VR... but they might to play CoD, Fallout, Battlefield, GTA, Skyrim, etc. All are "AAA" titles, and all could offer a new level of immersion with VR - but none support VR natively (and they had plenty of opportunity to do so before launch).
What hardware was there "all along" for consumers? As I understand the latency and tracking problems have just now been solved with the latest revisions of oculus/vive. You can't say it failed in the past because of X and Y external factors when the product itself wasn't even a thing.
> I enjoy these games as well, but recognize their market appeal is very narrow. These games don't drive sales of consoles or PCs.
Well, they don't drive PC and console sales, perhaps in part due to the limitations of the existing platforms.
Driving and especially flying games suffer more from the limited point-of-view. I'm often switching to the 3rd-person view to get a better handle on how my car is placed on the track.
For combat flight sims, one of the UI staples for the last 20 years has been showing markers on the edge of the screen to indicate the current target. You spend a lot of the time in those games without your enemy in view, which, frankly, stinks. But that's the limitation of the current platforms based around a monitor / TV. There have been hacks to help with this, but they aren't that great IMHO.
VR could really help these types of games have a much broader appeal, because of the immersion.
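Those edge markers amount to a simple bit of math. A minimal sketch of the usual clamp-to-border approach (the normalized coordinate convention and function names here are invented for illustration):

```rust
// Hypothetical sketch of the classic off-screen target marker: scale the
// direction toward the target so it lands on the screen border.

/// Target position in normalized screen space, where the visible area is
/// [-1, 1] x [-1, 1] and the view center is the origin. Returns where to
/// draw the edge marker, or None if the target is already on screen.
fn edge_marker(x: f64, y: f64) -> Option<(f64, f64)> {
    if x.abs() <= 1.0 && y.abs() <= 1.0 {
        return None; // target visible, no marker needed
    }
    // Shrink the vector so its larger component just touches the border.
    let scale = 1.0 / x.abs().max(y.abs());
    Some((x * scale, y * scale))
}

fn main() {
    println!("{:?}", edge_marker(0.5, 0.2)); // on screen: no marker
    println!("{:?}", edge_marker(4.0, 1.0)); // far right: marker on right edge
    println!("{:?}", edge_marker(-3.0, -6.0)); // behind-left-below: bottom edge
}
```

In VR the same bookkeeping becomes unnecessary for anything inside your natural field of view - you just turn your head - which is the broader-appeal argument above.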
I would imagine that Command & Conquer-like games could be a good fit for VR, as the genre requires an overview: today you have to scroll in cases where, in VR, you'd otherwise just lift an eyebrow.
I could also imagine a window manager where every window is placed in a large (huge) sphere that surrounds you and your chair (you are stationary in the middle). You could turn your head to see and interact with windows that are nearby, or you could zoom out and rotate the whole sphere using some shortcut or other device (maybe an ergonomic mouse with a ball on top of it, such that the sphere around you rotates in the same direction as the ball of the mouse). You could still map virtual desktops to a zone on the sphere (or perhaps just create virtual spheres). With some handy keyboard shortcuts I could imagine this becoming quite awesome, especially for programmers that use many windows at the same time.
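One piece of that idea can be sketched concretely: picking which window the user is looking at, given windows pinned to a sphere. This is a hedged illustration only - all names, coordinates, and the gaze representation are invented:

```rust
// Hypothetical sphere window manager: each window sits at an
// (azimuth, elevation) on a sphere around a stationary user, and the
// focused window is the one nearest the current gaze direction.

struct Window {
    title: &'static str,
    azimuth: f64,   // radians, 0 = straight ahead
    elevation: f64, // radians, 0 = eye level
}

/// Angular distance between two directions on the sphere,
/// via the spherical law of cosines.
fn angular_distance(az1: f64, el1: f64, az2: f64, el2: f64) -> f64 {
    (el1.sin() * el2.sin() + el1.cos() * el2.cos() * (az1 - az2).cos())
        .clamp(-1.0, 1.0)
        .acos()
}

/// The window closest to where the user is looking.
fn window_under_gaze<'a>(windows: &'a [Window], gaze_az: f64, gaze_el: f64) -> Option<&'a Window> {
    windows.iter().min_by(|a, b| {
        let da = angular_distance(gaze_az, gaze_el, a.azimuth, a.elevation);
        let db = angular_distance(gaze_az, gaze_el, b.azimuth, b.elevation);
        da.partial_cmp(&db).unwrap()
    })
}

fn main() {
    let windows = [
        Window { title: "editor", azimuth: 0.0, elevation: 0.0 },
        Window { title: "terminal", azimuth: 1.2, elevation: -0.3 },
        Window { title: "docs", azimuth: -1.0, elevation: 0.4 },
    ];
    // Head turned ~70 degrees right and slightly down: the terminal is nearest.
    let focused = window_under_gaze(&windows, 1.1, -0.2).unwrap();
    println!("focused: {}", focused.title);
}
```

Rotating the whole sphere with the mouse ball would then just be an offset added to every window's azimuth/elevation before this lookup.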
Just to clarify: Archive.is decided to block all of Finland not because of some Finnish legal requirement, but because the guy running it had a bad experience in Finnish customs and wanted to have revenge of some sort. Not really censorship.
Nothing the size of the Med, of course. There are several major below-sealevel basins in Northern Africa. The most significant is probably the Qattara Depression, which probably won't ever naturally flood, but is the target of many plans to flood it intentionally for power production and improvements in local climate.
They are currently developing the methalox Raptor engine, which is planned to have 2300 kN of sea-level thrust. That would be more than six times as powerful as the Merlin 1D+s in the current Falcon 9, and solidly in the heavy-lift rocket engine category, specifically as they intend to cluster them like they cluster engines in the Falcon 9.
"foo" in Rust has the type &'static str. That is, a pointer and a length to an immutable piece of memory that never goes out of scope, which in the case of a literal is allocated statically. Because Rust tracks ownership of memory, this kind of a reference is of a different type than a heap-allocated owned string.
Rust also does no allocations or conversions without you telling it to do so. The programmer wanted a heap-allocated String, so he had to call a conversion function, which allocated memory and copied the string.
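A minimal illustration of the distinction described above (standard Rust, nothing project-specific):

```rust
fn main() {
    // A string literal is a &'static str: a borrowed view (pointer + length)
    // into immutable, statically allocated memory. No allocation here.
    let s: &'static str = "foo";

    // Nothing is allocated or copied until you ask for it explicitly:
    // to_string() (or String::from) allocates on the heap and copies the bytes,
    // producing an owned String of a different type.
    let owned: String = s.to_string();

    println!("borrowed: {} ({} bytes)", s, s.len());
    println!("owned:    {} ({} bytes)", owned, owned.len());
}
```

Because ownership is tracked in the type system, the conversion cost is visible in the source instead of happening implicitly.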
Oh, far from that. Last time I checked, Cargo didn't understand the difference between downloading things (dependencies) and compilation. Even its build process didn't have that separated, and happily downloaded a random pre-built ELF binary to build itself.
I'm not sure what you mean by not understanding it. It is true that if you don't have the dependencies, "cargo build" will go get them, but if you do, then it shouldn't. There have been some bugs that caused it to over-download, but those were bugs.
> happily downloaded a random pre-built ELF binary to build itself.
Yes, Cargo builds itself with Cargo, so you need to get a previous Cargo. The binary is very much not 'random'.
> It is true that if you don't have the dependencies, "cargo build" will go get them, but if you do, then it shouldn't.
The main problem is you can't download dependencies separately from compiling your code. It should never be the same step. Even pip got this mostly correct, by providing an option to disable network communication. It is somewhat mitigated by the documentation mentioning where the downloaded dependencies are put, so you can do all that manually. This still sucks balls.
> Yes, Cargo builds itself with Cargo, so you need to get a previous Cargo.
The binary is very much not 'random'.
It's fetched outside of the regular download you do for the tarball. You have no control over where it is downloaded from. It means it's pretty much random for your purposes. Heck, it would even be unexpected, if I weren't used to developers not understanding the difference between building and downloading.
But this is not the main point. Cargo should not need an already compiled Cargo to build itself. This is ridiculous. I understand that every good compiler is built with the very same compiler you're building, but look at how they do that: by bootstrapping. Not so much with Cargo. Whoever thought this was a valid idea?
This is just going to be a preference then. I much prefer not having to type two things to do what's conceptually one: please build my code.
"cargo fetch" means you don't have to do the download step manually. I.e., you _can_ do it in two commands if you'd like.
> It means it's pretty much random for your purposes.
We are just going to have to agree to disagree on this one. First of all, you _already_ need an equally 'random' binary: rustc itself. Which is also bootstrapped. Second, it's not clear to me why bootstrapping is valid for a compiler, but not for any other kind of project. Cargo is a build system, written in Rust, so it uses Rust's build system.
Related: we're wondering how to use Rust because currently all non-Rust dependencies in our org are pulled from:
- corporate source code control systems
- corporate central repositories
- caching/proxying immutable repositories
These ensure all projects are built from known sources. We _know_ we can get consistent builds.
When using Cargo:
- Project owners update projects and don't bump the version. New bugs / security problems could be injected even though we haven't changed a thing internally.
- crates.io isn't always up.
- Trust: we legally cannot trust the public crates.io repository (PCI compliance violations - 2015 rules (viral)). Besides PCI compliance, it's not possible for crates.io to guarantee perfect security (for many reasons, obviously).
* I'm hoping folks who have addressed this issue (or are addressing it, or are planning on addressing it) would comment.
> Project owners update projects and don't bump the version. New bugs / security problems could be injected even though we haven't changed a thing internally.
You can't update a crate on crates.io without bumping the version. Once a version is published, it cannot be removed. (It can be "yanked," but even yanking it does not make it completely inaccessible.)
> These ensure all projects are built from known sources. We _know_ we can get consistent builds.
Cargo isn't coupled to crates.io. You can run your own registry index. (Note the `[registry]` config section: http://doc.crates.io/config.html) --- All of the code that powers crates.io is open source. On top of that, crate dependencies can be specified via git URLs or locations on disk. Repeatable builds are well supported IMO.
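For reference, a sketch of what this can look like in practice. The URLs, paths, and crate names below are placeholders, not real endpoints:

```toml
# .cargo/config -- point Cargo at your own registry index instead of crates.io
# (see the [registry] section of http://doc.crates.io/config.html)
[registry]
index = "https://git.example.internal/crates.io-index"
```

And individual dependencies can bypass a registry entirely:

```toml
# Cargo.toml -- dependencies pinned to sources you control
[dependencies]
internal-util = { git = "https://git.example.internal/internal-util", tag = "v0.3.1" }
local-helper = { path = "../local-helper" }
```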
I think Burntsushi gave a good answer here. We want these features! It's a matter of getting the requirements correct, and then helping build them. We have some of this stuff already, and are working on what Firefox needs, which is very similar, but we would love for anyone who has a stake in this to tell us what they need, specifically. If that's you and/or your org, starting a thread on http://internals.rust-lang.org/ would be quite helpful.
You can use git or path deps if you want; or set up your own registry (there is code for it, but I don't think it's easy to set up yet. IIRC it's planned, thoughts welcome!)
> Project owners update projects and don't bump the version.
You cannot update code in a crates.io dep without bumping the version. And new versions only get pulled in when you do a `cargo update` or you update a package which bumps the version number of its dependency.
* It's not always that you have an internet connection when building your code.
* It's not always that you have access to the website the code is downloaded from (websites break, and companies have different policies about internet access), and even if you do, you don't always have it directly (e.g. a proxy).
* And lastly, people will build code for various purposes, including building packages. Building a package should be a repeatable operation and should always use only clean source (no artifacts from previous builds), and downloading random things on every build from clean source is a very easy way to break the process.
>> It means it's pretty much random for your purposes.
> We are just going to have to agree to disagree on this one. First of all, you _already_ need an equally 'random' binary: rustc itself.
No. I downloaded the Rust compiler myself (or had it installed from a package). I controlled that; it didn't hit me in the face unexpectedly.
> Which is also bootstrapped.
s/also//. Cargo is not bootstrapped, because it needs a pre-built Cargo to build itself.
Compare this to the Rust compiler: you start with nothing but a C(++? I don't remember) compiler, and end with the Rust compiler. No intermediate download involved.
> Second, it's not clear to me why bootstrapping is valid for a compiler, but not for any other kind of project. Cargo is a build system, written in Rust, so it uses Rust's build system.
It's not invalid, quite the contrary. It's just that Cargo doesn't bootstrap itself out of clean code. It's that simple. I wouldn't have as big a problem with it if its build process produced an intermediate, crippled Cargo binary. I would then complain about requiring external dependencies to be downloaded (in contrast to being included), but that would be a difference in strategy.
> * It's not always that you have internet connection when building your code.
Cargo does not require this.
> * It's not always that you have access to the website code is downloaded from (websites break, and companies have different policies about internet), and even if you do, you don't always have it direct (e.g. proxy).
Supporting local proxies is on the Cargo roadmap.
> Compare this to Rust compiler: you start with nothing but C(++? I don't remember) compiler, and end with Rust compiler. No intermediate download involved.
This is incorrect. Rust has been self-hosting for years. Before it was self-hosting, it was written in OCaml.
> It's not invalid, quite the contrary. It's just that Cargo doesn't bootstrap itself out of clean code. It's that simple. I wouldn't have as big a problem with it if its build process produced an intermediate, crippled Cargo binary.
Cargo is basically just part of the Rust compiler. Rust needs Rust to bootstrap itself, like tons of other languages. So Cargo needs Cargo to bootstrap itself. This really isn't a problem.
The fact that you didn't even know that Rust is self-hosted is proof that it really doesn't matter to the user -- it was so invisible you didn't notice it!
"Before it was self-hosting, it was written in OCaml."
That's been my exact recommendation for getting compilers started in a robust way. You people keep surprising me in pleasant ways. :)
Curious, did OCaml's clean syntax and ease of decomposing functions make the transition to Rust easier vs a language like C++ or Java? I predicted ML languages should have that benefit but I couldn't test it on small projects.
That's so funny. More support for my recommendation. Since you mentioned it, my main recommendation today, if someone wants to get somewhere, is to use OCaml but target LLVM. I'd like to see LLVM re-implemented in OCaml and developed in parallel. Doubt that will happen, but using OCaml to generate LLVM code seems quite doable.
> Both allow you to use a C compiler to compile them (even though they generally advise using pre-compiled compilers).
How many people actually do this, though?
We had this conversation early on in Rust's life and the consensus was that we could make a non-bootstrapping compiler in theory, but that'd be asking us to do a huge amount of work (writing a separate compiler!) for something that very few people are going to use in practice. There are so many more important things that we could be (and are) working on than something that's basically just for purism, because most people just "apt-get install rust" or "brew install rust" and don't care how it's built.
Enough to warrant this being kept eight years after I've seen it for the first time.
> We had this conversation early on in Rust's life and the consensus was that we could make a non-bootstrapping compiler in theory, but that'd be asking us to do a huge amount of work (writing a separate compiler!) for something that very few people are going to use in practice.
Note that it's not necessary to have a fully blown compiler. Just a subset of the language would be enough, if it allowed for compiling the compiler. And how many of the Rust features are used in rustc, anyway?
> There are so many more important things that we could be (and are) working on than something that's basically just for purism, because most people just "apt-get install rust" or "brew install rust" and don't care how it's built.
So you're basically saying "fuck you" to distribution developers and all the sysadmins that care about their systems, am I right?
I thought that rustc was bootstrapped from the old C-based compiler (allowing you to build clean from source) multiple times (C -> Rust 0.x? -> Rust 1.0)?
From what I last saw of Cargo, there's not a clean path that lets you build from source without pulling binaries (and it's really painful if you want to use Cargo on a non-binary platform like the RPi).
There's never been a C-based compiler. Rust started out as OCaml. A true bootstrap would take weeks of non-stop compiling. You'd end up compiling rustc itself about 900 times, ignoring all the different LLVM builds, I think. Rust bootstraps from a binary snapshot of itself, and so you would have to work through the list of every snapshot ever (about 300, but you need to build rustc 3 times for each snapshot to properly bootstrap that copy). The snapshots are listed here: https://github.com/rust-lang/rust/blob/master/src/snapshots....
In the past the snapshots were taken almost weekly as massive language churn dictated. Now they're quite rare.
The situation with Cargo would be exactly the same: before Cargo existed, it couldn't use Cargo, because Cargo didn't exist. You can bootstrap your own Cargo from that time period if you'd like, but that doesn't mean it's easy. But it's no different than rustc, where you can bootstrap your own snapshots from the OCaml compiler.
> I thought that rustc was bootstrapped from the old C-based compiler (allowing you to build clean from source)
What's with all the baseless assumptions of the initial Rust compiler being written in either C or C++ in this thread?
Contrary to apparently popular belief, C and C++ aren't some primordial lifeform compiler language which all language implementations have to be initially written in. Believe it or not, you can write a compiler in any (general purpose) programming language! Amazing, I know.
1. Funny. Last time I built Cargo it insisted on downloading Cargo, and I've never seen even a hint that it can build itself without that. I needed to download dependencies manually, compile them manually, and compile Cargo's code manually as well.
> 2. You are incorrect about Rust. Rust is written in Rust, not C++,
And how does that render me incorrect? GHC is written in Haskell, but you can build it with a C compiler (and then recompile again, now with the Haskell compiler).
But indeed I was wrong. I remembered incorrectly that I had built Rust cleanly and without network activity. I actually downloaded the sources and issued the compilation command, without any regard for packaging, so it could have done any stupid stuff like downloading things while compiling.
> and so building Rust involves downloading a binary of a previous Rust. That's what 'bootstrapping' means.
No. "Bootstrapping" means building a compiler with that same compiler when you don't yet have that compiler. As I said elsewhere, compare that to OCaml and GHC, which can be built without downloading OCaml or GHC, respectively.
Going to have to agree with dozzie here, he's right on almost all counts.
There's a ton of things I like about Rust, but the points he outlined above are things that I've seen very much impede the adoption of Rust in environments that have similar restrictions (no downloads during build, build from source without binaries) for security and reproducibility reasons.
> There's a ton of things I like about Rust, but the points he outlined above are things that I've seen very much impede the adoption of Rust in environments that have similar restrictions (no downloads during build, build from source without binaries) for security and reproducibility reasons.
They're all on the roadmap for the very near future, because Servo (and Rust-in-Gecko) needs them. There's nothing in the design of Cargo that prevents these from being solved.
Personally, I love Cargo. Autoconf is miserable and I would hate to go back to it.
As I said to the parent, if offline builds do not work, please file bugs. Integrating Rust code into Firefox requires the same restrictions that you're talking about, and is a use case we explicitly want to support, and already do, to a certain extent.
Reproducibility is the thing that bothers me most about auto-downloading build systems. On the other hand, Cargo (IIRC) doesn't do what Maven does: if you say you want version X, you get version X or anything newer (unless you jump through hoops that I've never seen anyone jump through).
[Security just scares the piss out of me in general; downloading binaries is probably the smallest problem.]