I have coded Solidity a bit, and it's pretty obvious the language is not up to the task. I mean, it's 2018 and we have a brand-new "business language" that can have overflows and the like. It's like making all the same mistakes over again. Many serious vulnerabilities and lost funds (e.g. the Parity freeze) are due to this language not being the right choice. If "code is law", we need a better language, not just tooling on top of a broken one.
I don't know, how about Cobol? I'm serious, that was at least designed as a business language.
They probably should've started with a Wirth-style language plus Design-by-Contract and static analysis. That would knock out tons of errors at compile time and runtime. If you're looking for a mature language, the best one for predictability is SPARK Ada, hands down. It was designed for easy analysis in automated provers, and its automation level is a lot higher than it used to be. Folks wanting provable programs should consider either using it directly or prototyping their programs in it, with a semantics-preserving translation to the blockchain language/VM.
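To make the Design-by-Contract point concrete, here's a toy sketch in Python, with runtime asserts standing in for contracts (all the names here are made up; in SPARK the same pre/postconditions would be Pre/Post aspects discharged statically by the prover, so the checks never need to execute at all):

    # Toy design-by-contract sketch: Python asserts standing in for
    # SPARK's statically proven Pre/Post aspects (hypothetical example).
    def transfer(balances, sender, receiver, amount):
        # Precondition: sender can cover the amount, and the receiver's
        # balance won't wrap past a 256-bit ceiling.
        assert balances[sender] >= amount
        assert balances[receiver] + amount < 2**256

        total_before = balances[sender] + balances[receiver]
        balances[sender] -= amount
        balances[receiver] += amount

        # Postcondition: funds are conserved -- nothing minted or burned.
        assert balances[sender] + balances[receiver] == total_before

    balances = {"alice": 100, "bob": 50}
    transfer(balances, "alice", "bob", 30)
    print(balances)  # {'alice': 70, 'bob': 80}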
Frama-C and SPARK both build on the Why3 platform and its intermediate language, WhyML. Work built on any of these three can usually be ported to the others. So I'll drop a recent example in Frama-C that's really math-heavy, with a style amenable to high automation.
Cobol has all of C's memory pitfalls, and a few extra. It can be truly painful.
A lot of devs avoid dynamic allocation altogether because of the issues you can face (a sketch of how the first two combine follows the list):
* Integer overflow
* Buffer overflow
* Free does not mean the memory is freed. It means the memory is freed at some point in the future.
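Here's the classic way the first two combine: an allocation-size overflow. Python integers don't wrap, so this masks to 32 bits by hand to mimic a C size_t (a simulation of the semantics, not real allocator code):

    # Simulating a 32-bit size_t overflowing in an allocation-size
    # computation (Python ints don't wrap, so we mask explicitly).
    count, elem_size = 0x40000001, 4
    alloc_size = (count * elem_size) & 0xFFFFFFFF
    print(hex(alloc_size))  # 0x4 -- the buffer ends up 4 bytes, so
                            # writing 'count' elements overflows it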
If you want something safe, time-tested and performant for business logic, then something like Ada would be a better fit.
If you don't mind the safety problems, then just about nothing beats FORTRAN in the numbers game (though the memory pitfalls are huge). It was one of COBOL's main competitors, and half of the world's banking infrastructure still runs on it.
Yea, Solidity is not the only language that compiles to EVM bytecode. There are others that do so as well: Serpent, LLL, and Viper. "Better" is subjective, but it looks like Viper is intended to be more auditable, though it's also still a work in progress.
Serpent and LLL are apparently lower-level than Solidity.
Bamboo looks quite interesting. I think it's mainly focused on modelling more explicitly the state changes a smart contract undergoes.
https://github.com/pirapira/bamboo
It would (probably) be fairly trivial to get any modular compiler to spit out bytecode for the Ethereum VM...fairly trivial for someone with the gumption and skills, I mean.
Or someone could add some analysis stages to the Solidity compiler, since all the information you need is floating around the web in various papers -- I actually spent a good part of the last week reading about abstracting abstract machines and static analysis for my next investigative coding project, for whenever I quit being lazy.
I feel most of these vulnerabilities should not exist in the first place. When you create a language specifically made to handle money, where deployed code is immutable (so there is no possibility of a rollback), it should generally be more secure than the average language.
The VM should be protected against integer overflows, and functions should be protected by default; I don't really get those design choices.
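To spell out the overflow point, here's the unchecked 256-bit wraparound modeled in Python (an illustration of the semantics, not the actual EVM implementation), next to what checked-by-default addition could look like, in the style of the SafeMath wrappers people bolt on today:

    # EVM-style unchecked 256-bit addition, modeled in Python.
    MOD = 2**256

    def evm_add(a, b):
        # The EVM wraps silently: no trap, no exposed carry flag.
        return (a + b) % MOD

    def checked_add(a, b):
        # What "protected by default" could look like: abort on wrap.
        s = a + b
        if s >= MOD:
            raise OverflowError("uint256 addition overflow")
        return s

    print(evm_add(2**256 - 1, 1))      # 0 -- silent wraparound
    print(checked_add(2**256 - 1, 1))  # raises OverflowError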
There is quite a lot to criticize about Solidity, but this is a bad example. Operator precedence is often different between languages.
And because some languages lack a boolean type, you can't deduce whether a result of 1 came from the bitwise-or operator or the less-than operator.
Let's take a look at (2 | 0 < 1) instead.
C and JavaScript result in 3.
Ruby and Python return False.
Java throws a compilation error: error: bad operand types for binary operator '|'. first type: int, second type: boolean
Now, which of these behaviours is the universally right one? (Hint: none of them; you should always know the rules of the language you program in, or if you don't, just throw in some parentheses to be on the safe side.)
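If it helps, here's the grouping difference spelled out in runnable Python, with the C/JavaScript grouping reproduced via explicit parentheses:

    # Python: '|' binds tighter than '<', so this is (2 | 0) < 1.
    print(2 | 0 < 1)    # False, because 2 < 1 is False
    # The C/JavaScript grouping, written out explicitly: bool is an
    # int subclass in Python, so 2 | True evaluates to 3.
    print(2 | (0 < 1))  # 3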
There's a, what, 50-year tradition in C of boolean ORs evaluating left to right. I don't know the truthiness rules in Python, but the REPL just told me 1 is not False. So we have something that is not False ORed with True. What is your thought process?
There is only a single bitwise OR here, and no logical ORs. Whether they are evaluated left to right or right to left makes no difference in the execution of this expression.
Swap the left and right sides:
(1 > 0 | 1)
And you'll see that C still gives you 1, and Python still gives you False.
>we have something that is not False ORed with True
No. We have (in the original Python expression) 1 ORed with 0. That results in 1. Then that resultant 1 is compared ('<') with 1. That comparison results in False.
The difference between Python and C is that in Python '|' has higher precedence than '<', but in C '<' has higher precedence than '|'. Dennis Ritchie admits that this is a mistake in the C language[1]. Python (and Solidity) have corrected that mistake.
The reason that '|' should have higher precedence than '<' is the same reason that '+' should have higher precedence than '<'. Namely, people want to do arithmetic calculations ('|' and '+') then compare the result of those calculations ('<' and '=='). There is essentially no situation where it is useful to have '<' have higher precedence than '|'.
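A small example of why that default is the useful one (the FLAG names are made up):

    # Compute-then-compare reads naturally when '|' binds tighter.
    FLAG_A, FLAG_B = 0b01, 0b10
    mask = FLAG_A
    # Python parses this as (mask | FLAG_B) == 0b11.
    print(mask | FLAG_B == 0b11)  # True
    # C parses the same tokens as mask | (FLAG_B == 0b11), which is
    # almost never what you meant -- hence Ritchie calling it a mistake.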
What bad language design... I thought they mostly copied JavaScript, but not even that expression works the same in JavaScript, so who knows what they were thinking when designing the operators.
You shouldn't be creating a new language for that anyway. Even implementing smart contracts on a self-rolled blockchain is already pushing the risk ceiling of dealing with new tech that handles people's money. Doing it with a new VM and language is just reckless.
There are a LOT of things to cover here; this toolchain is just the tip of the iceberg. The EVM is open source on steroids - not only is the code open source, but every CPU cycle, memory access, and I/O activity is executed in the open and replayed across thousands of nodes, by anyone, anywhere, as a necessity.
This is probably one of the most exciting paradigm shifts in recent times: being able to call services/functions that are cryptographically verified is an incredibly powerful tool, and it may one day become a standard part of the web stack. There is still a long way to go, but what a time to get involved!
Are there no frameworks for formally validating smart contracts or using theorem provers? The small code size, non-Turing-completeness, and extremely high penalty for failure all seem to align.
Ethereum contracts are effectively Turing-complete (well, technically linear-bounded-automata complete). But you get the same state-space explosion issues as with formal validation of any other language.
Hopefully these and other smart contract analysis tools can become a part of a CI/CD system for Ethereum. It seems like that would have prevented some of the massive losses due to simple programming errors.
While working at the level of code can undoubtedly reduce the number of errors, it is not enough to address the significant risk of logical errors in contract design.