The main logic is a few hundred lines of code: https://github.com/tlrobinson/tomcoin/blob/master/src/node.j...
1. Open https://tomcoin.herokuapp.com/ and https://tomcoin1.herokuapp.com/ in two browser windows (the private key is stored in localStorage so using different domains ensures each gets a different wallet)
2. Click "Start Mining" on one or both nodes
3. Once you've mined some TomCoin (should just take a few seconds unless more people start mining) copy the public key from one node to the other's "publicKey" field, enter an amount, and click "Transfer"
4. Wait for another block to be mined and you should see the balances transfer
(There's no persistence so the chain will reset when all the Heroku nodes idle out and other nodes close)
(It also just uses WebSockets to connect to the instances on Heroku, or in theory other server instances people are running. I meant to implement WebRTC for P2P between browser nodes but never got around to it)
With my post, I tried to boil down the concepts as much as possible to save some mental bandwidth and make the overall picture easier to understand. We'll see how that turns out
} = require("./util");
And will this really run in a browser? I think browsers don't have require()?
So yes, everything can be polyfilled or transpiled.
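For example, with a bundler like esbuild (hypothetical entry file name; the repo may be wired up differently):

    # rewrites require() calls and bundles the deps into one browser-ready file
    npx esbuild entry.js --bundle --outfile=bundle.js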
But that does not really match my understanding of "Runs in the browser".
This is a common explanation of PoW, but is actually incorrect. If you think about it, this would mean that the PoW difficulty could only increase (or decrease) by a factor of two. In reality, the block hash is simply interpreted as a (very large) number, and this number must be less than some other very large number (the "target"). So you do end up with a lot of leading zeros -- but these are just a side effect, not the thing being measured.
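In code, that check looks roughly like this (a minimal sketch, not TomCoin's or Bitcoin's actual code; the target value here is made up):

    const crypto = require("crypto");

    // Hypothetical target: any hash below 2^240 counts as valid work.
    // Lowering the target makes mining harder by arbitrarily fine amounts.
    const target = 2n ** 240n;

    function isValidPow(header, nonce) {
      const hash = crypto.createHash("sha256").update(header + nonce).digest("hex");
      // Interpret the 256-bit hash as one big number and compare to the target.
      // With this target the first 4 hex digits happen to be zero -- the
      // leading zeros are the side effect, not the thing being measured.
      return BigInt("0x" + hash) < target;
    }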
I try to keep these concepts rather simplified to ease the reading and learning process. Perhaps I should add a note for situations like this.
(I should have probably used 32 octets so that it fits neatly in 256 bits)
The "mapping" you may be referring to happens outside the number itself, and is subject to how your brain / your browser interprets it. (How do you know that when I posted the comment, I didn't supply the raw bits of the number?! ;-))
Anyway, it was just a joke for low level-programmers I guess. Have a good day ;-)
> The proof-of-work involves scanning for a value that when hashed, such as with SHA-256, the hash begins with a number of zero bits.
Performance as in "it takes less time to verify the PoW," or something else? Because the "leading zeros" and "less than target" approaches would both take a negligible amount of time.
btw, the whitepaper probably used "number of leading zeros" because that's what Hashcash does.
I don't know who told you that, but they were weirdly confused.
It's not unlikely that sometime early in development (long before publication) it was bits-based: not only is this how the standard Hashcash code works, but the Bitcoin code calls the relevant field that encodes the difficulty "bits".
But Bitcoin itself never worked that way, not from the first instant. Changing it would have been a consensus incompatible change, and Bitcoin is still more or less consensus compatible all the way back. (There are outright bugs that make the old software get stuck, but if you fix those, it follows along).
The bitcoin whitepaper told them that, in more than one place too, so it's not a simple typo, e.g.
> ...we implement the proof-of-work by incrementing a nonce in the block until a value is found that gives the block's hash the required zero bits.
No bitcoin software was ever released that worked another way.
That is, a threshold H corresponds to -log_2(H / 2^256) = 256 - log_2(H) leading 0s, if you allow for a fractional number of leading 0s.
The nBits header field closely resembles a 32-bit floating point representation of this number (I say resembles because it's really a floating point representation of the threshold H value).
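For example (my own sketch of decoding the compact nBits field; 0x1d00ffff is the nBits of the genesis block):

    function nbitsToTarget(nbits) {
      const exponent = BigInt(nbits >>> 24);       // first byte: base-256 exponent
      const mantissa = BigInt(nbits & 0x007fffff); // low three bytes: mantissa
      return mantissa * 2n ** (8n * (exponent - 3n));
    }

    const H = nbitsToTarget(0x1d00ffff);
    console.log(256 - Math.log2(Number(H))); // ~32 fractional leading zero bits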
I'm quite surprised I wasn't aware of this, if this is actually how it's implemented; the leading zeroes seemed right (and so did the difficulty changing by factors of 2!)
Equivalent check in btcd, an alternative Bitcoin implementation: https://github.com/btcsuite/btcd/blob/991d32e72fe84d5fbf9c47...
At 1 block every 10 minutes, the 2016 blocks between adjustments should take two weeks to mine. After each 2016-block period, every node recalculates the difficulty based on how long those blocks actually took. If they were mined 25% faster than the two-week target, simply increase the difficulty by 25%.
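A rough sketch of that rule (my own JS, not Bitcoin's actual code, which also clamps the adjustment to a factor of 4 in either direction):

    // Target: 2016 blocks in two weeks, i.e. one block every 10 minutes.
    const TARGET_TIMESPAN = 14 * 24 * 60 * 60; // seconds

    function retarget(oldDifficulty, actualTimespan) {
      // Blocks mined faster than intended (shorter timespan) raise the
      // difficulty proportionally; slower blocks lower it.
      return oldDifficulty * (TARGET_TIMESPAN / actualTimespan);
    }

    retarget(1, TARGET_TIMESPAN / 1.25); // blocks came 25% fast => 1.25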
Once embedded in the header, everyone will agree on past timestamps as they agree on the chain with the most accumulated difficulty.
(+) For the people with background in CS: it might be that the video at some point talks about the number of zeros instead of the comparison, but really, it does not detract a 'bit' from its value.
However: when it comes to implementations, the really hard part of the whole Bitcoin affair is where the gnarly details hide, namely consensus, and that is missing at this point.
- Acts as the primary mechanism that makes the data in the chain immutable, as the accumulated work (the chain of valid blocks) becomes harder and harder to reproduce.
- Anti-spam, i.e. you need to put something at stake (electricity and hardware cost) to participate in issuance and transaction ordering
Mining is about authorship of the blockchain. The system is simple enough; people broadcast transactions, and they get collected and entered into a shared ledger. But the details are tricky. Order is one major issue; does A or B come first? A and B may be mutually exclusive, so whichever is first is critical. Because the network is imperfect, and network users may be malicious, it is very important that we come up with an extremely robust way of deciding how transactions are entered into the blockchain. Authorship - that is, mining a block - is everything.
The simple solution is to make it random. If it's random, then no individual miner can reliably control authorship, and thus cannot engage in attacks like double spends (unless they control a majority of the mining power, i.e. a 51% attack).
POW is a way to achieve randomness in a robust way. The only way to mine a block is to solve a problem that can only be solved via brute-force hashing. This is costly to do, so incentives are offered to get people to do it via coinbase rewards and transaction fees. This adds to the security model because miners now have strong economic ties to the POW algorithm and blockchain with capital and marginal costs.
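In sketch form (simplified; real miners double-SHA-256 an 80-byte binary header, but the shape is the same):

    const crypto = require("crypto");

    function mine(header, target) {
      // Brute force: nothing to do but try nonces until the hash is under target.
      for (let nonce = 0; ; nonce++) {
        const hash = crypto.createHash("sha256").update(header + nonce).digest("hex");
        if (BigInt("0x" + hash) < target) {
          return { nonce, hash }; // valid block found; broadcast it
        }
      }
    }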
Of course, there are other ways to achieve randomness, but can you think of one that will force a decentralized network to achieve robust consensus (i.e. to not fork)? Much of the 'magic' of Bitcoin comes when you realize the incentives under which miners operate. Once a valid block is broadcast on the network, miners have extremely powerful economic incentives to accept this block and begin mining on top of it.
Consider what happens if, instead, the miner continues to mine at the 'old' block height. They may come up with a valid block fairly soon (it's random, so they have no way of knowing), but each moment that passes, the other miner's valid block is being transmitted over the network. Any subsequently transmitted valid solution is at a disadvantage in becoming the consensus, and that grows as time passes. Basically, any time spent mining on the old block once a valid one has been seen is pure waste. Since the entire network operates from these same incentives, consensus grows very strongly as no one wants to mine on a minority chain.
Difficulty is a different issue; this is just a way to keep block times fairly consistent, even as the network grows and more mining power comes online. Because it is calculated directly from the block chain, difficulty rules follow the same strong incentives that build consensus on the majority chain.
Is that so? The code actually signs with the private key. Although I might have misunderstood what that sentence refers to.
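The usual pattern is something like this (a minimal Node sketch, not the actual TomCoin code):

    const crypto = require("crypto");

    const { privateKey, publicKey } = crypto.generateKeyPairSync("ec", {
      namedCurve: "secp256k1",
    });

    const tx = Buffer.from("send 10 TOM to <some public key>");
    const signature = crypto.sign("sha256", tx, privateKey); // sign with the private key
    console.log(crypto.verify("sha256", tx, publicKey, signature)); // verify with the public key => true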
I already qualify for the "blockchain expert" tag, having worked in the space for 2 years, though I don't find the tag that desirable. Maybe a few years ago I would have.