The point I'm about to make in this comment is so old and has been said so many times that we all are tired of hearing it. But why do we use a language like C++ to implement something where we don't wish to have bugs?
C++ is not memory-safe (no GC or whatever Rust does), it's not type-safe (in the ML sense), and it relies on writing to memory a lot (instead of having pure functions). Had this program been written in OCaml or Haskell or a strongly typed functional Lisp, this bug would have been impossible to make.
Why do we do this, still? Is it not irresponsible?
You are looking at a 1k LOC function in a 7k LOC file with a bug that can be easily spotted by -Wall and you think it's the fault of the programming language? The only thing irresponsible is putting your hard-earned money into the 100th Bitcoin fork maintained by some guy with an Anime avatar. Once you stop trusting the developers it essentially becomes an underhanded ... contest and no language wins.
Didn't try, but I could imagine that running -Wall over that codebase produces huge amounts of noise, rendering it practically useless until somebody gets around to fixing all the warnings which are not bugs.
> -Wall over that codebase produces huge amounts of noise rendering it practically useless
As a C++ developer I would nope the hell out of there or spend a month fixing the warnings, depending on pay. No way would I work with something in that state long term.
> until somebody gets to fix all warnings which are not bugs.
Code that is filled with ignored warnings generally gets worse, not better. Having little to no quality standards does that.
I think that's kind of OP's point... Use a more modern language that actually treats these kinds of bugs (and many, many others) as errors rather than as optional warnings.
There's a reason practically no one uses a memory-safe C++ implementation.
Yep. I'd add that if it's a warning you're absolutely certain of, use the warning control pragmas to suppress them, and document why the warning's suppressed. Keep the compiler output clean so you can see where you need to examine.
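With GCC and Clang that looks something like this (a minimal sketch; the specific warning and the names are made-up placeholders):

    // Exact float comparison is intentional here: the value is copied
    // verbatim from the sentinel, never computed. See tracker issue.
    #pragma GCC diagnostic push
    #pragma GCC diagnostic ignored "-Wfloat-equal"
    bool isSentinel = (reading == SENTINEL_VALUE);
    #pragma GCC diagnostic pop

Clang understands the GCC pragma spelling too, so one set of pragmas usually covers both compilers.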
Worth noting you should probably disable -Werror for distribution if you expect users to compile your code. They aren't going to fix your error for you, and you can't know what warnings will be added in future compilers, nor what previous compiler versions treated as errors. Among the many other people saying the same thing, see [0] if you don't trust me.
Also IMO you often want to explicitly disable a few of those from -Wall and -Wextra -- looking at the last C++ project I wrote, I turned off a few things with flags like `-Wno-unused-parameter`, for example.
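Concretely, something along these lines (illustrative only; the exact set is per-project taste, and all flags shown are standard GCC/Clang ones):

    g++ -Wall -Wextra -Wno-unused-parameter -Werror=return-type -c foo.cpp

That keeps the firehose on, silences the one warning we've decided we don't care about, and promotes the warning we never want to ship to a hard error.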
As you get further into a project you tend to need subtler tools. But for the rough work when starting out, my preference is to be stricter, which can be relaxed as necessary, rather than starting out lax and reaching a point where tightening becomes Sisyphean.
Clang's -Weverything does actually turn on all warnings (so you'll probably want to add things like -Wno-unused -Wno-padded).
Under gcc, '-Wall -Wextra -Wpedantic' does not enable all warnings. There are still more than 40 flags you have to give if you really want all of them.
Zcash, for example, is built with `-Werror` (edit: not `-Wall`, but we're working on that). So this is absolutely feasible on a Bitcoin-derived codebase.
In any trusted codebase, fixing every -Wall warning is the least you should aim for, possibly with a few intentional exceptions. Not doing so is very sloppy.
What is even worse is that those guys are not able to use C++ properly. They created a function with more than 1000 lines and so many branches! It will be almost impossible to write tests that exercise all the code paths. I do not think that changing the language would help with such programmers.
The bug happened in an "inlined constructor", so they should check all uses of this class/struct to verify it is not copied anywhere else (and probably do the same for all other structs/classes).
Because pretty much any language that's been touted as a replacement has problems which need to be solved before it would really be viable. They're either single-vendor with no independent standardization, like Rust or D, or have lower performance, like Go, or both. Once both problems are fixed, it'll be a lot easier to start looking at alternatives. Until then, there's not a language that settles into a lot of the niches C and C++ operate in.
In this case specifically, a lot of the issues that allowed this to happen were logic errors which would have been valid code in most languages. A functional language would not have helped very much here.
C and C++ are still used because they work and have huge ecosystems around them. There's a lot of tooling around them, the compilers build code fast, and they are available for pretty much every platform under the sun. In this particular case, it would have helped a lot if the developers had looked at the warnings on the questionable code they had written, but they seem to be of the "it compiles, ship it" type that would have let those things fly regardless of the language.
Ada has multiple implementations, is pretty safe, and produces fast code. Yet it isn't popular, so I guess there are other factors that are more important than number of compilers and speed.
Languages suffer from the same network effect as social networks. You could argue that Ada is a better programming language, or that Ithkuil is a better spoken language, but unless other people are also using them you'll struggle to get help or hire people (or get contributors).
Ada isn't my favourite language to code in. But if I was tasked to build something safety-critical, I would strongly consider Ada SPARK [1] among the possible tools to use.
Haskell has nowhere near as many libraries, sure. I don't know Zcoin, but I have implemented a (simple) Bitcoin wallet. For that you need a good crypto library, but otherwise it's not magic. Mining likewise.
I think the smaller selection of libraries and losing the chance to fine-tune performance (but also the risk of screwing up your performance badly) is a very low price to pay for it being provably impossible to make a mistake like this.
I'm just thinking back on my career, which has been full of bugs of all magnitudes. If I categorise my past bugs, there are many common patterns. The famous old-timers have names, like buffer overflows or memory leaks. Just not having the possibility of making those errors is such a freeing experience.
Also what's the big problem with no independent standardisation?
With code like this, there were just as many, if not more, logic errors that wouldn't be caught as there were mechanical problems that would have been caught by a more diligent compiler. There's just not the discipline there for good code, period. Too many corners cut, too much code commented out, etc.
I think with the types of errors you mentioned, that part of it definitely is ecosystem. Look at OpenBSD, for example. They've built things like a safer malloc and fairly extensive patches to gcc that have caught a lot of errors and bugs. I think C/C++ can definitely get safer without changing that much, the ecosystem just needs to get a bit angrier about warnings, etc.
Standardization, and multiple implementations, means you can have multiple sets of compiler eyeballs looking over things. The Zcoin bug, for example, emits a warning with the default settings of clang, but doesn't emit a peep with the default settings of gcc. For someone working with a lot of embedded code, it also means there's a better chance that an implementation of the language is ready to go when you need to move to a different platform.
This bug had nothing to do with C++ and its ergonomics, let alone its safety.
> Unfortunately when populating the denomination member variable, an equality operator was used instead of assignment, resulting in the denomination always being zero, as set in the class constructor.
That is absolutely a bug in C++'s ergonomics. Languages that use Pascal-style := for assignment would not have this problem. Languages that don't have the surprising value behaviour of = would not have this problem. Hell, even as crude a language as Java would not have this problem, because it doesn't allow an integer to implicitly decay to a boolean.
Building with warnings on would also have caught it, but the fact that someone started a new project without this warning turned on shows that C++'s defaults around warnings are bad, which is an ecosystem ergonomics issue.
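For anyone who skipped the article, the bug class boils down to something like this hypothetical reconstruction (made-up names, not the actual Zcoin code):

    #include <cstdint>

    struct CoinSpend {
        int64_t denomination = 0;  // the zero "set in the class constructor"

        void InitFrom(int64_t nDenomination) {
            // Intended: denomination = nDenomination;
            denomination == nDenomination;  // comparison result silently
                                            // discarded; denomination stays 0
        }
    };

Clang flags the no-effect comparison out of the box; GCC only says something once -Wall (via -Wunused-value) is on.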
I was using C++ in 1994. I am fully amused today to hear 'ergonomics' attributed to the language, even with the qualifier 'bad.' I'll spare you the rest of this "back in my day" post. I just can't fault the language for a project creator not including -Wall in the Makefile or whatever they used to build.
Indeed, just as in every other ML-family language since 1973. (I'm glad these things are finally getting mainstream attention, but I'm not sure why people get so excited about "Rust features" that are almost all taken straight from Ocaml)
Hey, I'm just glad to see 1970's PL research finally making it to the mainstream(ish), regardless of which language gets the marketing credit for it.
I think the main advantage Rust has over OCaml is that it is syntactically more like C. But, hey, if it's what gets HM-style type inference to the masses, I'll take it.
This doesn't have anything to do with the value behaviour of =. The statement in question was intended to be a simple assignment, throwing away the value of the = operator and only using its side-effect. Instead an equality operator was used, which produced no side-effect.
So if you want to blame anything, blame the concept of expression statements, I guess?
The idea that you form a value by creating an uninitialized one and then filling in its fields is dumb. (I suspect the field value wasn't even 0 as such but rather undefined, and the implementation happened to make it 0?). Better languages have you initialize the value directly and you can't access it until it's fully initialized. C/C++ are just really bad at expressing nontrivial values, even value literals.
(Discarding a value ought to be an error too, though that's more cumbersome to work with)
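In C++ itself, const members plus initializer lists at least approximate "initialize directly" (a sketch):

    #include <cstdint>

    struct Denomination {
        const int64_t value;  // must be initialized in the init list,
                              // and can never be reassigned afterwards
        explicit Denomination(int64_t v) : value(v) {}
    };

Forgetting the ": value(v)" is a hard compile error, and the ==-for-= typo has nowhere to hide because there is no later assignment to mistype.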
You can rightfully blame having state-setting be part of your language. It is easier to program without state than with it, just like it's easier to juggle 2 balls than 3. Without state (or with the absolute minimum of it) you cannot make mistakes with it. Also, if your state-setting operator is super similar to your equality-checking operator, you can fall into this trap (like the devs in question did).
Programming without state is something that has helped me personally make my code much better. I produce far fewer bugs since I stopped relying on state for so many things. A lot of micro-optimisations are not doable for me. In exchange I get to run things in parallel very easily. But mostly, I get fewer bugs, which makes my life and the lives of my customers better.
Amen to that! If Rich Hickey is reading this - it is to you I owe my thanks for the many hours I have not spent on fixing state bugs. Clojure was my first functional love story.
So, just as in C++, there is a different operator for assignment and for equality, which requires you to type a different thing. User error and testing failure.
Yes it is. Assignment is another word for writing state. If your programming environment doesn't include a way to set state, you will never do it. Why is doing the dangerous task of writing state so easy that you just have to hit one key to do it in C++?
Let's collectively put our seat belts on, colleagues!
Even in Haskell, writing state is as easy as += or <- and this is used in nearly every actual program.
A cryptocurrency implementation like this doesn't need a language that makes state impossible. It needs static analysis, model checking, and proofs. This would be true even if it were written in your favorite functional language.
Sorry for being unclear. In my timezone it's pretty late.
If there was no state, the coder wouldn't have tried to set it. I don't think a language needs to have the `=` operator so easy and close at hand. Had that not been the case, this bug (and oh so many others) would never have existed.
Zcash was built upon the Bitcoin codebase, and inherits a lot of its bad decisions. Moral purity, demanding they start over again from scratch, just isn't practical.
The bug in question could have been caught had they simply compiled with minimal static analysis -- by which I mean -Wall.
C/C++ is memory safe if you turn on dynamic checking. Sure, it's twice as slow as plain C/C++, but still tons faster than nonsense languages like OCaml or Haskell.
This bug had nothing to do with Zcash. Please correct your comment to say Zcoin, if that's what you meant.
In any case, Zcash is also derived from Bitcoin and builds with `-Werror` (edit: not `-Wall`, but we're working on that). That kind of minimal static analysis is certainly not sufficient to catch the majority of bugs, though.
C++ is not memory safe in any meaningful sense. There have been efforts to define a memory-safe subset, but typical large codebases, including Bitcoin, do not come close to falling within that subset.
> C/C++ is memory safe if you turn on dynamic checking.
What's the option to turn that on? Which compilers is it in?
I know there were several fat-pointer patches to GCC back in the day, but I didn't think anything remotely similar had ever gone mainstream. There's just too much existing code that relies on undefined behavior last I checked.
Yes, this bug is not about memory safety -- but having the state-writing operator so ready at hand was probably what led to the mistake in the first place. In my GP comment I went on to rant about other classes of bugs which have been obsolete since the 1970s, but still happen.
Companies exist to make money so the most responsible thing to do is make as much money as possible. If breaking the law means you will make more money then do it.
See: the A x B x C = X formula from Fight Club, or Uber's entire business model.
Your assumption that we don't want bugs is probably why you are asking the question. Companies don't care about bugs; they care about losing money. So if a bug doesn't cost money, it's not worth fixing until it costs more than the fix.
I think this is actually very short sighted but that's just how the incentives align in our current system.
I make no claim to the philosophical implications of any of this.
> Why do we do this, still? Is it not irresponsible?
One reason is security - if cryptography isn't done in constant time then there is the potential for timing attacks. This is why BitCoinJ ditched BouncyCastle for their own JNI wrapper around Bitcoin's libsecp256k1.
There is a Rust client for Ethereum, called Parity - they do FFI calls to libsecp256k1. Similarly, rust-crypto vendors most of its crypto in C that it calls through FFI. Similarly, go-ethereum also uses libsecp256k1. However, I know that Tendermint uses Ed25519 instead of secp256k1, and in turn they use NaCl, which does not involve FFI.
Mining for Ethereum is done on a GPU, so the miner code is necessarily written in OpenCL or CUDA.
There are a few blockchains written in functional languages, but they are new to the scene. There were a couple of Haskell Ethereum clients, but they appear to be abandoned. I know Kadena[1] and Cardano[2] are written in Haskell. There is also Aeternity[3], which is the only currency I know of written in Erlang.
In the early days Charles Hoskinson apparently wanted Ethereum to be written in Scala. I don't think any of the core devs know Scala, but then again Charles Hoskinson was booted from the project so his language choice was dropped.
Synereo was to be written in Scala, but the founders (the CEO and the CTO) split in December. The CEO doesn't want Synereo to make a blockchain anymore; he wants to make some kind of escrow for ponzi schemes. I hear from the devs that they want to code the thing in JavaScript. The CTO is making something called RChain; I heard he wants to use some pi-calculus interpreter he wrote in C90 back when he worked at the MITRE Corporation in the '90s, but maybe he's ditching that.
From job postings I can see MaidSafe is looking for Rust programmers. MaidSafe has been in R&D for a decade, so they must have had yet another pivot recently.
To be frank, while there are functional programmers in the scene, these guys are struggling to establish themselves. There's a lot more talent in procedural programming, which is why so much code is written in those languages. I'm not 100% sure that functional code is going to have fewer bugs which result in financial exploits.
I really hope amateurs with "a {} is a {} is a {} is a {}" level skills aren't working on cryptographic or financial software in C or C++. Write Rust, though, which is {}, and at least the box of hand grenades you're giving them doesn't have all the pins pre-pulled.
Luckily there are some Blockchain/Crypto currency startups that do understand the value of using something like Haskell. See https://iohk.io/projects/cardano/
These pump-and-dump schemes are getting ridiculous. Implementing a cryptocurrency client in Haskell isn't enough to justify creating an entirely new currency. Whether or not the protocol itself is sound is (almost) completely orthogonal to the language in which it's implemented, and at this point there are dozens of Bitcoin clients written in many languages (including Haskell [1]), so it's extremely unlikely that Bitcoin as a whole is still at risk from unsafe language choice.
Using 'unmanaged' languages for 'performance' reasons is no longer justified, because the likes of C# have shown multiple times that there is no reason to choose C++ over C# if you look at performance alone (the difference is negligible).
Sure, these are silly benchmarks, so I wouldn't take the results as gospel. But the fact is, nobody who writes C# has managed to produce a benchmark yet that beats a C++ program.
Fast languages like Rust have managed to at least match the performance of C in some benchmarks (and even beat it in others).
Properly unmanaged code will probably always be faster than managed code. But the point still stands: code in a managed language, identify the hot spots, and inject highly performant unmanaged code there. There is no need to go full C++ for 99.999% of all projects.
> nobody who writes C# managed to produce a benchmark yet that beats a C++ program
No, but I've seen Haskell programs that were much faster than the C++ program they replaced, which is what matters in the real world.
Microbenchmarks measure what happens if you have the time to polish every single line to perfection. If you have infinite developer time then C++ will be faster than most languages. But infinite developer time is a negligible use case.
It's worth pointing out that the majority of large C++ projects contain non-deterministic memory management (reference counting) and cache-inefficient object structures, just like C#.
Absolutely. But note that if you don't have cycles, reference counting of SOME objects is faster than having a GC for everything. The RAII idiom when used in C++ helps with statically determining that some objects can be released at defined points without having to reference count them.
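A minimal sketch of that last point:

    #include <memory>
    #include <vector>

    void process() {
        // Sole ownership: no reference count is ever touched.
        auto buf = std::make_unique<std::vector<int>>(1024);
        // ... use *buf ...
    }   // destroyed deterministically right here, at scope exit

The compiler proves the lifetime statically; a tracing GC would rediscover it at runtime, and shared_ptr would pay for atomic counter traffic it doesn't need.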
Actually the type system catches that particular mistake:
    error[E0308]: mismatched types
     --> src/main.rs:5:6
      |
    5 | if x = 1 {
      |    ^^^^^ expected bool, found ()
      |
      = note: expected type `bool`
      = note:    found type `()`

    error: aborting due to previous error

    error: Could not compile `playground`.

    To learn more, run the command again with --verbose.
I haven't yet reached the level of Rust mastery where I'm confident in this answer, but as far as I can tell it's very hard to make that kind of mistake. I'd be very interested to hear what others have to say on it though.
The problem isn't in the implementation, it's a problem with Zcoin itself. One should not be able to create Zcoin via software, whether that's through a bug or deliberate.
Sorry, that wasn't clear. With bitcoin you could create it at will, but it requires doing a certain (constantly increasing) amount of computational work. Work that can be verified. There is no way for a bug to accidentally create a bunch of bitcoin, and no way for anyone to deliberately create any without doing the work.
HN's law: all problems with C can be solved by writing a sufficiently high-brow post containing snide observations only a sophomore is smart enough to come up with.
Au contraire. I've had a long career in the field (various domains, I'm always the "code guy") and it's actually useful to me to hear people besides myself expressing the level of surprise/frustration/bemusement they experience. It keeps the trade from being utterly dry.
First of all, there is no need to be mean, even if I understand you mean well.
Isn't it true, though, that with some techniques certain bugs are impossible to make? If you have not experienced it, I recommend you seek it out and try it.
Hello world in Rust is still 2.5MB if you statically compile all the libraries in, since it doesn't strip unused functions. The tools and ecosystem are nowhere near what they are for C++.
To be clear, you think that 829kB is the size of a statically linked C program that uses printf to print 'hello world'? In Visual Studio it is 25KB - 40KB. All of musl compiles to under 500KB.
A purely functional statically typed one, like OCaml or Haskell, because there would be so much less to review, and an insane amount of the reviewing work is automatic.
Zero bugs around state, zero bugs around memory mgmt, zero bugs about error conditions not being handled, zero bugs due to some data being expected but not being written, etc.
Well, a couple months back I attended a talk on a Haskell implementation of the Noise protocol.
The programmer admitted that he was a cryptography novice, and in fact a Haskell novice.
As a result the code he wrote is needlessly abstract - for one thing, the guy uses free monads and in turn ropes in Template Haskell as part of his state model. I really have no idea what code he's generating.
The code has other features that make review challenging - for instance he doesn't qualify any of his imports so it's hard to tell where to look for the functions he's implementing.
Maybe you are better at auditing Haskell than I am. As D. J. Bernstein writes in various places, one common exploit is to construct an elliptic-curve Diffie-Hellman shared secret from input that isn't a curve point. I really can't tell if the guy is mitigating this attack or not, but here you can have a look:
"We" don't. But yeah, using C++ is stupid, and anyone paying attention knows this by now. At this point the only thing is to point and laugh and raise the new generation better - I fear that, like science, programming practice will have to advance one funeral at a time.
This indicates that the code in question is not covered by tests. I work on a database, for which correctness is paramount, so it's unnerving to see any code fixes not associated with test additions or changes. I would have thought that cryptocurrency had similar standards.
We've modified the dlang GitHub tester to run coverage tests over pull requests, and uncovered changes get marked with a scarlet bar. While some changes can't be tested with a test suite, the result has been a marked improvement in the quality of pull requests (including mine), because it is embarrassing to submit untested code.
It should be an error, not a warning (default or not). There are a number of things like this that should just not be accepted anymore. Here's another one:
    if (condition);
        dothis();
No compiler should accept that, regardless of switch settings.
> It should be an error, not a warning (default or not).
In C++ you can overload '==' to do anything you want, including things that have side effects[0]. Without deeper introspection into the code, it's tricky to check if it's an error or not.
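A contrived sketch of why a bare `a == b;` statement can't simply be rejected outright:

    #include <cstdio>

    struct Chatty {
        bool operator==(const Chatty&) const {
            std::puts("side effect!");  // legal, observable behaviour
            return true;
        }
    };

    int main() {
        Chatty a, b;
        a == b;  // an expression statement that actually does something
    }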
-Wall sure (also, it should really enable all warnings, just as the name implies. Currently it doesn't)
-Werror, I'm not sure. During development and on CI systems it makes sense. But if you intend to ship source code, it's probably best left off. Other people might be using different compilers that emit warnings for things yours didn't. You don't want the build to fail for those people.
> Other people might be using different compilers that emit warnings for things yours didn't. You don't want the build to fail for those people.
Actually, you do. That way those things will get fixed, and hopefully your inclusion of their fixes is a well-motivated PR away.
Switching -Werror off should be a decision made with great care and understanding of what's going on under the hood. If there are platform specific issues then people on those platforms will be the ones in the best position to determine if such a warning is actually an error or not.
I want to agree with you, but in practice I can't really imagine leaving -Werror on outside of dev builds. The problem is that different compilers have different warnings, and if your code is meant to be portable it's going to be a serious pain to avoid all the warnings all the time.
Having compilation break for a user because of a spurious warning due to a different compiler (or different compiler version) would make it a lot more painful to maintain.
For instance, I started maintaining an old C++ codebase that generates a bunch of warnings when built with a modern g++, code like this:
    #define M "toto"
    const char *s = "tata"M;

Generates warnings like:

    warning: invalid suffix on literal; C++11 requires a space between literal and string macro [-Wliteral-suffix]
Is it helpful? Sure. Is it worth breaking the build for anybody using a modern g++ until I manage to fix all occurrences and release a new version? I don't think so.
Another example is building code with -Wextra, which warns you when some comparisons become "nops". For instance:
    int toto(int i) {
        if (i <= 0xffff)
            return 1;
        return 0;
    }
On most modern machines this will build without a hitch, but if you compile on a 16-bit machine you'll get a warning saying:

    warning: comparison is always true due to limited range of data type
In my experience this particular warning is generally not a problem. On the other hand the following code:
    int i = 0xabcdef;
Will generate the following warning on 16-bit architectures:

    warning: integer constant is too large for its type
And that's potentially a lot nastier and should be an error IMO.
So I'd say the main problem is that C and C++ are way too permissive by default; things like comparisons with no side effects ought to be errors, not warnings. And there are a bunch of others. For instance, why on earth are these not errors:

    warning: no return statement in function returning non-void
    warning: ‘i’ is used uninitialized in this function
So -Werror is just killing a fly with a bazooka IMO.
> warning: comparison is always true due to limited range of data type
Surely this is a serious problem on 16-bit builds; the comparison implies that larger values are expected to be stored in this variable, and the warning is a strong hint that the (sizes of the) types are incorrect.
In my experience that's generally not the case; you usually write these tests when you're about to truncate a value or do a sanity check. If the value is too big to fit in the variable, it should have been caught earlier, when the value was loaded in the first place (i.e. strtol or whatever).
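I.e. code in the vein of this (hypothetical helper):

    #include <cstdint>

    uint16_t to_u16(unsigned int v) {
        // On a 32-bit int this check does real work; on a 16-bit machine
        // it's always true (hence the warning), which is harmless here.
        if (v <= 0xffffu)
            return static_cast<uint16_t>(v);
        return 0xffffu;  // clamp out-of-range input
    }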
It gets worse for this particular project. They do build with -Wall and -Wextra, but they ignore many warnings that the compiler does emit, and a lot of them are fixed by adding/removing a single word. Also, compilers have different default warnings: gcc doesn't emit a warning on that code, but clang does, for example.
Our user base is steadily growing. Coming soon is a nice upgrade to the language designed to eliminate stray pointers into expired stack frames.
I feel it is my duty as a language designer to make it hard for programmers to commit common bugs. It's just too expensive for companies to ship these kinds of things.
Any C++ static analyzer worth its salt would catch this. I don't think this is so much of a C++ language issue, but more of an issue with C++ environments/compilers not doing static analyzing by default, and/or the authors of this software not being aware of the fact that they need to use static analyzers.
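Even the zero-setup free tools would have done (invocations from memory; check against your versions):

    cppcheck --enable=warning,style src/   # flags suspicious no-effect statements
    scan-build make                        # clang's analyzer wrapped around the build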
You're right. My contention is that the language itself should no longer accept such code, it shouldn't be relegated to an ignorable warning or a 3rd party tool that isn't run.
Indeed, the next standard should ban such things, and modern compilers should reject these patterns by default unless a --legacy_security_unsafe flag is set. Nobody is going to type that in by accident, and nobody is going to want to defend it in the code review process. And the compiler vendor will still get a gold star for being 100% language compliant :-)
Yes, major improvements could be made by changing the defaults of the tools that beginners use. I edited my original comment to be more clear on this as well.
I think we're going to have to disagree. It isn't just an issue for beginner code. Experts make such mistakes, too. Expecting them to not make mistakes is unrealistic, and sending them back for more training isn't going to work, either.
I can't think of a reason any modern language should accept such code.
If you're interested, I recommend the series "Air Disasters" on TV. Each episode analyzes one crash, where investigators determine the cause of the accident and what corrective action should be taken. Often it comes down to "how could a pilot with 10,000 hours of experience make such a mistake?". While it would be easy for investigators to blame the pilot, often they find that the mistake was just too easy to make, and they recommend changing the design or the procedures or both.
It gets more fun when you're dealing with things like memory mapped I/O, where reading from a register is a destructive action, so even a non-overloaded == can cause side effects.
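A sketch, with a made-up register address:

    #include <cstdint>

    // Hypothetical UART: reading the data register pops a byte off the RX FIFO.
    volatile uint8_t* const UART_DATA =
        reinterpret_cast<volatile uint8_t*>(0x40001000);

    bool next_byte_is(uint8_t expected) {
        // Even this innocent-looking == consumes a byte from the hardware.
        return *UART_DATA == expected;
    }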
There's even room for the marginally technically skilled to rake it in. I think Karpeles is still out on bail. And there are a few that did well with just marketing skill and vaporware mining gear.
I'd take that as not even needing to sell what's needed, just approximately what's needed. Your snake oil only needs to make their cough hurt a bit less, not cure their tuberculosis.
I am genuinely curious as to what the arguments would be one way or another. People are taking a risk with these things and they know they're risky. So it's not really stealing.
I believe it is a crime, both in spirit and on paper. In spirit, you took something that was not yours and not meant for you, and by selling the coins you decreased the value of everyone else's.

On paper, let's say that the developers behind whatever coin you hack decide that they are going to take no action. You still run afoul of countless computer security laws designed, however poorly, to protect people from hacking, so at least in the first world you could find yourself in a legal bind if your identity is ever discovered. Even if the devs don't go after you, the rest of the community may, or the government may of its own accord.

Now, in keeping your identity private, you have to launder the money, putting you in violation of even more laws, and ones that the government will most definitely care about. The biggest player left is probably the IRS, who will take all kinds of interest in either your newfound, undisclosed wealth or your half million dollars in untraceable gambling wins. It's an extremely reckless move unless you are effectively a pirate living in some lawless part of the world with good connections to foreign banks.
The OP's point was about morality, and it is as immoral as cheating in any system. Whether it's a crime is entirely different. Crime necessitates a judicial system, and the whole framework of cryptocurrencies has been built outside the purview of any government or judicial system, so the word "crime" itself doesn't mean anything in this context.

Secondly, your argument relies heavily on the gravity of this act, which is entirely derived from the implied value of the coin. In isolation, this is no different from someone cheating in a computer game. Note that you cannot easily make the computer-game analogy with other markets and banking systems because of their dependence on other inputs; this, in contrast, is completely isolated from the rest of the world. As such, the guy may not do anything with the coins, and they would cease to exist (in their universe, much like how they came into being). Would your argument still hold? The fact that people have created a secondary market out of the coins, which came into existence out of thin air in the first place, doesn't automatically determine the criminality of the act.
This doesn't really address the morality issue. Suppose you could get away with it. The question remains, did you hurt people who didn't expect it etc.
When someone comes to play poker with chips, is it immoral to beat them and take their chips? No, they factored it in.
However, is it immoral to figure out a new collusion strategy and beat them?
Was it immoral to figure out a card counting strategy to beat casinos?
In other words, is it immoral to figure out new strategies in what is essentially an experimental game where participants assume there is risk?
In well established games like poker, which explicitly have gentlemen's agreements of "no collusion", it may be immoral.
But in a new experimental system such as a cryptocurrency whose software may be buggy, is taking advantage of the bugs really immoral?
When is it "buyer beware" and "a fool and his gold are soon parted"?
Isn't there any threshold for taking the money of a risktaker?
Well, crypto/bitcoin businesses in general tend to be quite lucrative, but there are lots of risks as well. For a security researcher there isn't as much downside risk.
You're never beyond the reach of a civil suit. If you identify yourself with the scheme and you're in the same country where there's adoption, you could be easily targeted.
Plus, the language in 18 U.S.C. § 1030(e)(2) (courtesy of CFAA) refers specifically to "any computer, when [it affects] use by or for [a] financial institution." And later it mentions "...affecting interstate or foreign commerce or communication...". This statute is one that gives so much leeway to prosecutors that it's frequently abused.
An EULA ("End-User License Agreeement") has to do primarily with copyright (17 USC [1]). As such it's pretty much orthogonal to the criminal statutes related to this act of "misusing" cryptocoins.
> next generation of cryptocurrencies can have a simple EULA
This is anathema to the nature of cryptocurrencies, all of which (pretty sure all that matter, anyway) have open-source implementations.
Really, most cryptocoins which are worth anything have already had a big market cap "bounty" that has effectively proven their safety.
We don't need a EULA. Like I said, you could maaaybe sue for damages. Unfortunately there's no easy way to undo the impact to a coin's reputation once it's shown to have a weakness like this. Even once the bug's patched and the nodes all get back on board, exchange rates will suffer for a long time.
You are getting attacked for poor wording.
It's reasonable (and there is precedent) for something like a blockchain fork that voids the value of bad coins. You'd need some implementation of majority vote, though (a jury of your peers, weighted by hashrate?)
And paper notes have the unfortunate characteristic of being... paper. The risk:reward logic is much different if you can pick your own jurisdiction to operate from and hide behind 7 proxies. It's also a very modern business model to avoid big capital expenses like printers and special paper.
No. Zcoin implementation != Zerocoin. Also, I wouldn't say that Green is a "mastermind behind Zcash". He is one of the team members. Zooko Wilcox is generally seen as the "main guy" in the Zcash team.
People complained a lot about 10% of Zcash going to the founders (in the form of a 20% draw on the first half of mining). But that funding and expertise are used to ensure bugs like this Zcoin attack do not occur.
Not if they cash out during "IPO." I don't like the idea of a founder/developer pool, but if they are going to have one, there should be a vesting period, such that the granted coins cannot be moved for X years after receiving them. This will keep the incentive alive, and I think Ethereum would have a lot more potential if something like this was employed.
This is a really interesting idea, but I'm genuinely curious if you have an idea as to how it might be implemented. I'm not an expert on cryptocurrencies, but I don't know how I would encode the condition "has a vulnerability been discovered this year" into the protocol.
I'm curious. Is this actually illegal? Is there any requirement on how valid cryptocurrency is created?
I understand the network could implement a fix and prevent future transactions, or even roll back old ones if there's enough desire - that's what democracies are about, right?
But was there anything illegal about creating and then trading them? A rollback would hurt the exchanges, but if they didn't already have something in their T&Cs about currencies needing to be created legitimately (and what legitimately actually means), then is it anything other than morally wrong?
I recently spoke at an event where we talked about the crossover of cryptocurrency and the law with a bunch of attorneys present. Many attorneys spoke up during the Q and A to remind everyone that at the end of the day the legality of anything will be decided by either a jury or a judge. It's going to come down to the arguments both sides make, and I think any defense would have a hard time explaining how what they did was legal in the face of the CFAA. Analogies will be drawn to hacking of other systems, banks, etc. How people feel about it culturally will also affect the outcome.
One important distinction to make is that it really is also going to depend on the jurisdiction of those involved. Is the defendant in the US or somewhere else? Who is prosecuting and what are the charges?
I think if you explained this in layman's terms to "a bloke down the pub", he'd consider it theft. The law generally feels the same way. (IANAL.)
Not the same, but it reminded me of something from years ago: I had £50 deposited into my account by mistake, a couple of months running. I asked my girlfriend's dad - who actually was AL (WAL?) - if I could keep it. He said it would be like someone parking their car on my driveway and leaving the keys in the ignition. Inconvenient, yeah, and I don't want it there - but it doesn't make the car mine.
Depends on the country. In Poland, if a company sends you money, once you spend it they have no right to get it back. The reasoning being that you have a right to assume companies know what they are doing.
If a private person sends you money, you have to give it back.
This seems very surprising to me. And it also seems criminally exploitable.
So if a bank in Poland sends you a million dollars by mistake, is it yours?
What would happen if someone stole it and transferred it to your account, does that make you a criminal?
And even more interestingly, if a malicious bank programmer (who's in league with you) introduced a bug on purpose (let's suppose we cannot prove the intent here, it's very difficult) and the bank mistakenly deposits you the money, do you have a duty to return it?
I don't know about Poland, but in Austria there is a similar law (IANAL but my lawyer friend told me about it), and banks are excepted from it. So if a company sends you money by mistake you can spend it and don't have to give it back after that, but if a bank makes a mistake in your favour, you have to return it in any case. Happened to my lawyer friend's mother.
I'm under the impression that the general philosophy of cryptocurrencies is that the code is the law. You can do anything that the code will allow you to do, because it's designed so that you can trust the system and not people. It's a small step to say that it's also not the programmers' intent that matters, but the code itself, because it's the code that matters, not people.
This isn't true, given how frequently hard forks happen in crypto. This means that the code said one thing, but the community decided that it was the wrong thing and everyone moved on to the new fork.
This could probably be spun as a violation of the CFAA. IANAL, but the practical interpretation of that seems to be "don't do bad stuff using a computer", kind of like mail fraud but worse.
It does. It seems like the devs are of the "it compiles, ship it" sort. Building the source, I'm seeing all kinds of warnings -- the code trying to return consts, warnings about sign differences, etc. Looks like lots of potential places for naughtiness to occur.
Indeed. I was hoping that the blog post would conclude with a section describing how they had learned an important lesson and would now be much more strict about dealing with this compiler warning in particular, and indeed compiler warnings in general, but there was nothing like that. Which implies it will quite likely happen again.
I always wonder how these "five conditions ANDed together in an if" make it past code review in general (I know that's not where the actual bug was). I don't care how brilliant a programmer you are, mistakes in those are very hard to spot.
I'm definitely not advocating for unnecessarily complex or long conditionals on a single line or any other hard-to-grok code, but this comment reminded me of something I just learned recently! Someone on my team introduced me to property-based testing, which generates ranges of test cases that would otherwise be very repetitive to write manually. We've used http://hypothesis.works/ in a few places recently and caught bugs that we might have missed if we had just written tests the usual way.
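The same idea exists for C++ too, e.g. RapidCheck; this sketch follows its documented rc::check/RC_ASSERT style:

    #include <rapidcheck.h>
    #include <algorithm>
    #include <vector>

    int main() {
        // RapidCheck generates the inputs and shrinks any counterexample.
        rc::check("reversing twice yields the original vector",
                  [](const std::vector<int>& v0) {
                      auto v1 = v0;
                      std::reverse(v1.begin(), v1.end());
                      std::reverse(v1.begin(), v1.end());
                      RC_ASSERT(v0 == v1);
                  });
    }

You still have to think up the properties yourself, but the machine does the tedious enumeration of inputs for you.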
As convoluted as they can be, what alternatives are there other than nested conditions, which are equally convoluted? Surely if you need to logically compare five variables then you'll be doing it in some variation of five chained operations.
There are a couple of ways to do it, from storing your conditions in meaningfully named variables outside the conditional, to pattern matching, and more. If conditions aren't the only branching construct.
> I always wonder how these "five conditions ANDed together in an if" make it past code review in general
Easy. Either there is no code review; or the code review is done by someone who has far too much on his plate and works on the theory that, as long as it compiles and runs, it's probably fine; or the code review is done by someone who likes writing code with five conditions ANDed together in an if and thinks it's fine.
I'm not too familiar with cryptocurrency or blockchain implementations generally, so I wonder: shouldn't it be normal that at least the reference clients of any such protocol are proven correct and follow a rigorous scheme for accepting contributions?
I mean, all the mathematical rigor and proofs of the underlying theory and protocols are basically useless if the rigor isn't carried over to at least the reference implementations. (If people choose to use alternative, non-proven clients, fine, that's their own decision and risk.)
That's not really how the whole scene works. There's little upfront demand for such rigour, because the potential userbase for any given coin largely don't know to demand it. Instead, poor implementations fail and either get discarded or patched, and eventually, hopefully, whatever remains will be solid.
What I'm trying to say here is, the whole cryptocurrency field is much, much messier than you seem to expect.
"Programming today is a race between software engineers striving to build bigger and better idiot-proof programs, and the Universe trying to produce bigger and better idiots. So far, the Universe is winning" --Rick Cook
The whole world runs on duct tape and bubblegum, and it's a small miracle that satellites aren't falling out of the sky and everything isn't on fire all the time.
I spent quite a bit of time and effort on Zcoin during the launch, including setting up the first block explorer and mining with hundreds of EC2 instances to the point where I possessed more than 30% of the money supply.
It was a disaster. Some founders started selling their Zcoin as soon as it was listed on an exchange. The mining algorithm was changed several times.
It's a shame. Zerocoin, the underlying technology, is interesting.