Somewhere, someone is probably working out a way to exploit this right now.
For Windows, the CA improperly revoked the certificate in response to a renewal request, backdated the revocation, and is refusing to issue a new certificate (my understanding is that the CA now believes open source collaborations aren't eligible for code signing certs, or some such nonsense... it's being worked out).
The releases are still PGP signed by an official project key which has been in use for many years and is cross-signed by many parties-- just as they have been for the last decade-- as well as by another 16 parties that independently reproduced the binaries through the deterministic build process: https://github.com/bitcoin-core/gitian.sigs/tree/master/0.21... which has also been used for the last ten years. Anyone who already has an older faithful copy has the scripts to automatically download and verify the other signatures.
By comparison, the Windows code signing provides relatively little security assurance-- not too dissimilar from the HTTPS used on the download site. It, like the HTTPS, does have the advantage that it is widely checked, while many users don't bother checking the actual release signatures. It's useful to have for sure, but it's not the only protection and, unfortunately, it exists only at the whim of a third party that isn't accountable to the users of Bitcoin. And if someone can spoof the download site, the executable signing won't do much to stop them from causing problems.
Nobody actually checks the executable signatures, they're buried in the file properties.
If you have a quick, trusted way of getting an EV code-signing cert trusted by SmartScreen, please, let me know.
What do you expect the results would be, if within the next few days it were discovered that there were two different files, with the same SHA-256, each claiming to be the Bitcoin-0.21.1 release?
And if you wanted to profit from it, via submitting the transaction or not, you could likely make far more by speculating on the effects of that announced collision on the Bitcoin price – likely sharply negative.
"It's always signing a hash," sure, but S(m) and S(h(m)) are different operations with very different properties.
Also, inside a block itself, transactions are part of a Merkle tree, so if you could find a SHA-256 collision, you could also create confusion about which transactions were actually included in a given block.
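To make that concrete, here's a minimal sketch of Bitcoin-style Merkle root computation (double SHA-256, with the last entry duplicated on odd-length levels, as Bitcoin does). If two distinct transactions collided under SHA-256, swapping one for the other would leave the root-- and hence the block header-- unchanged:

```python
import hashlib

def sha256d(b: bytes) -> bytes:
    # Bitcoin uses double SHA-256 for txids and Merkle nodes
    return hashlib.sha256(hashlib.sha256(b).digest()).digest()

def merkle_root(txids: list[bytes]) -> bytes:
    """Simplified Merkle root over a list of txids; the last entry is
    duplicated when a level has odd length, matching Bitcoin's rule."""
    level = list(txids)
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [sha256d(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# toy transaction ids
txs = [sha256d(f"tx{i}".encode()) for i in range(5)]
root = merkle_root(txs)
```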
You cannot really “find a solution around it” for two reasons:
1- the bitcoin blockchain is immutable, so the old blocks can't really be rewritten to work with another hashing mechanism. What you could do is reboot the bitcoin blockchain with another hash algorithm, with a genesis block summarizing the last known state of the old chain. But that would need consensus among the entire bitcoin community, which will never happen because of reason #2
2- Bitcoin miners own millions of dollars of custom hardware designed for SHA-256 mining; there's no way they'll just say, "OK, let's change the hash function, I'll throw this useless stuff away and buy new gear."
PureEdDSA does not break when a hash function without collision resistance is used.
And the kind of theoretical attack that non-prehashing protects you from requires that you sign an attacker chosen message -- relevant in some applications but kind of contrived.
I never understood how this is useful... if someone replaced the executables, wouldn't they also replace the SHA-256 hashes that you're getting from the same source?
In Bitcoin's case what is actually provided is a PGP signature. The relevant key isn't provided along with the download files for the reason you mention. Though there are instructions on obtaining and validating the key on the download site. ... and if you're depending on reading those instructions, they're vulnerable to substitution.
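For readers wondering what the hash-list half of the check mechanically involves, here's a hedged sketch (file names below are made up). Note the hash alone only ties the download to the list; the PGP signature on the list itself, verified out of band, is what provides the real assurance and is not shown here:

```python
import hashlib, os, tempfile

def sha256_hex(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def check_against_sums(sums_text: str, path: str, name: str) -> bool:
    """sums_text is the contents of a SHA256SUMS-style file, one
    '<hexdigest>  <filename>' per line. This only ties the file to the
    list -- the signature on the list must be checked separately, or
    the whole thing can be spoofed along with the download."""
    expected = {}
    for line in sums_text.splitlines():
        parts = line.split()
        if len(parts) == 2:
            expected[parts[1]] = parts[0]
    return expected.get(name) == sha256_hex(path)

# toy demonstration with a temporary "release" file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"pretend this is a release binary")
    tmp = f.name
sums = f"{sha256_hex(tmp)}  release.zip"
ok = check_against_sums(sums, tmp, "release.zip")
os.unlink(tmp)
```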
Though those checksums are not extremely reliable and have a small chance of failing to catch corruption.
It uses a 16-bit one's-complement addition. At the packet sizes used on the internet it's extremely weak-- much more so than you'd expect just from it being 16 bits, especially on data that isn't uniformly random.
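A minimal sketch of that checksum (the RFC 1071 style one's-complement sum) shows one of its blind spots: reordering aligned 16-bit words doesn't change it at all:

```python
def internet_checksum(data: bytes) -> int:
    # RFC 1071 style: 16-bit one's-complement sum, then complement
    if len(data) % 2:
        data += b"\x00"
    s = 0
    for i in range(0, len(data), 2):
        s += (data[i] << 8) | data[i + 1]
        s = (s & 0xFFFF) + (s >> 16)   # fold the carry back in
    return (~s) & 0xFFFF

a  = b"\x12\x34\xab\xcd\x00\x01"
b_ = b"\xab\xcd\x12\x34\x00\x01"   # same 16-bit words, reordered
# Reordered words sum to the same value -- one reason the checksum is
# far weaker than its 16 bits suggest.
assert internet_checksum(a) == internet_checksum(b_)
```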
The question is rather why distribute a separate file with hashes and their signatures, rather than just the signatures themselves, seeing how all useful cryptographic signatures contain the hash anyway?
I suspect the answer is that it is somehow more convenient for the build and release process to distribute one file with all hashes, rather than lots of individual signatures for each artifact. The release process is quite elaborate with several parties independently reproducing that exact build. But someone would have to correct me on that.
That's probably also why they don't consider the broken Windows certificate important enough to stall the release. Relying on a single CA does not provide them with the type of security they are looking for.
that is what needs to be fixed, and ideally it would become part of the archive format so that it can be automatically checked (though there should also be an option to disable it).
At least that was the original point. These days you usually just host stuff yourself (on your domain or somewhere you pay, like a paid CDN) rather than having people mirror stuff independently. And it's all HTTPS, so it's mostly moot like you say-- but old habits die hard, though it's already far less common than in the past.
Whenever you still see one, look at where the download really comes from. Often it's third party whereas the hash is first party. If not, then it's probably just a page with old habits.
With http, you can man-in-the-middle the file and the hash.
If the checksum is hosted in a different place than the file, it requires an attacker to compromise two separate systems.
In the typical system, though, the hash sitting next to the file isn't particularly useful.
Someone should tell the nixos devs that. Not being able to verify the authenticity of something as important as your OS does not seem very safe.
A number of years back openssl changed their pgp key and for a month didn't publish it anywhere. AFAICT I was the first person to complain about it-- which also suggests that all the distros that had already shipped the update hadn't checked it.
But maybe the next one.
Last effort to improve Bitcoin (Segwit) was basically a giant shitshow.
Let's hope the ecosystem has matured and this one will be a little more orderly.
Thank you for continuing the discourse all these years and keeping people informed.
The monetary incentive for Bitmain was without dispute last time. Such a thing doesn't exist this time.
Taproot will pass without any drama.
The reason segwit got involved was mostly that it was framed as "either we do segwit, or we do bigger blocks"
Well, there were people saying it; how many fell for their false claims is another matter.
Rick Falkvinge, for example, put out a series of articles and videos falsely claiming that segwit was patented.
People falsely claimed that segwit removes signatures from the blockchain.
People falsely claimed that segwit would make it impossible to prove ownership of coins.
People falsely claimed that segwit would make bitcoin less secure and that users funds would be stolen.
People falsely claimed that segwit would reduce scalability by using 4MB of data for 2MB of transactions. (other people claimed that they were 10% larger, which is also untrue).
People falsely claimed that segwit would not fix the malleability attacks.
People falsely claimed that segwit would require everyone to upgrade, and that it required "all bitcoin software to be partially rewritten".
People falsely claimed that coins using segwit would be less valuable.
People falsely claimed that segwit didn't solve the quadratic computational cost of signature hashing.
People falsely claimed that segwit was not a softfork.
People claimed segwit was activated in spite of majority opposition (in fact, when segwit activated, over 90% of nodes were running supporting software, along with 95% of hashpower).
People claimed that segwit was a "poison pill that would result in a network suicide".
People falsely claimed that segwit was a plot by Bilderberg to give some shadow "jew" coalition control over Bitcoin. (I really had no idea how much anti-semitism was still a thing online until I was getting a bunch of fucked up shit sent to me ... and I'm not even a jew, but the facts didn't stop the people hating on segwit).
I'm actually copying these arguments out of published articles; I just don't want to give that drivel traffic. This doesn't even include the more hyperbolic nonsense that got taken offline as it was embarrassingly disproved.
By the time segwit activated, bcash already existed-- so the bigger block people had their thing and were basically in full-on sabotage-bitcoin mode.
>the bigger block people had their thing and were basically in full-on sabotage-bitcoin mode.
Failing to raise the block size and the resulting loss of real economic activity on the chain were what sabotaged BTC, exactly as the "bitcoin should be like gold" people wanted. Twisting that around and blaming the bigger block people for "sabotaging bitcoin" by warning that this would happen is beyond insane.
Segwit was a good change. But it was used by wall street to kill bitcoin and it worked.
This upgrade will boil down to "will mining pools bother to upgrade their nodes?"
[Perhaps you think you'd name Gavin Andresen? but he had stopped contributing years before blockstream even existed. ... and left the space entirely after an infamous incident where he vocally endorsed a pretty obvious scammer.]
The developers that left bitcoin aren't the developers putting commits on the protocol implementation. It's the developers that made tools on top of bitcoin itself: from less-than-legitimate betting sites to apps like openbazaar, retailers/payment processors, etc. After the 2018 debacle and the ever-growing mempool/fee market, most developers put bitcoin in legacy mode and started adopting other, more frictionless coins like ethereum, bcash, litecoin, etc.
I have a multireddit set up with the big crypto subreddits (r/btc, r/bitcoin, r/bitcoincash, r/ethereum, r/monero). I check it once or twice a week when I'm bored, and I barely hear anything new coming from the bitcoin camp: no development discussion, no cool new services, no adoption talk. It's all HODL memes, value talk, and the occasional lightning network update. I don't see development talk in the bitcoin community anymore.
I'll repeat what I said: the developers that cared about bitcoin have left; the only people left that have a say are people building layer 2 solutions (blockstream mostly) and miners. This upgrade doesn't affect miners, so I doubt they will contest it.
There were no block size disagreements going on when gavin wound down his involvement-- but now you've moved from "blockstream" to block size disagreements.
Regardless of the history-- bullet dodged there, considering what eventually happened.
> Mike Hearn are the big ones yeah
The sum total of hearn's contributions to Bitcoin Core was a dozen commits, mostly one-line string changes. To have left, he would have had to have started-- rather than just making a couple of drive-by tweaks.
> the developers that cared about bitcoin have left
Just repeating it doesn't make it true. Pretty ironic to say you rarely "hear anything new coming from the bitcoin camp, no development discussion" on a thread exactly about such a thing.
I dunno if it's you or I who inhabits a weird alternative reality, but one of us does, because there is constant exciting Bitcoin news... and sure, a lot of memes too; they're often pretty funny, even if a bit much. But over time the character of the news will change as Bitcoin becomes more ubiquitous.
I think you're being disingenuous.
We're talking about the speed and volume of new tech being added to the Bitcoin ecosystem, not just the fact that one such feature is trying to be added.
The reality is: if you compare the speed at which and the amount of new stuff being built in and atop the ethereum ecosystem to the innovation in the Bitcoin ecosystem, the difference is absolutely striking.
And BTW, IIRC, Vitalik walked away from Bitcoin and went on to create ETH for exactly the reasons outlined by the GP: a power grab by a small clique of fanatics that led to stagnation.
You've been fed a marketing pitch that doesn't have a lot to do with reality. The only 'bitcoin development' Vitalik did before creating eth was running an investment scam for developing "quantum miners" -- no joke.
What you were led to believe here is simply false, a convenient excuse for an embarrassingly massive premine.
Here's a list of technical developments related to Bitcoin (including LN) grouped by month and topic as covered by a single publication over the past three years: https://bitcoinops.org/en/topic-dates/
I'd be interested in seeing a similar such list for Ethereum or any other cryptocurrency that shows a similar pace of development.
Edit: removed paragraph based on accidental misattribution.
I can't seem to find one for ethereum, the closest probably would be going through their blog entries and filtering by "research and development". https://blog.ethereum.org/category/research-and-development/
I did find a similar list for bitcoin cash: https://cash.coin.dance/development
Note that both lists are a bit misleading because they include proposals not yet included/implemented. The bitcoinops list also seems to duplicate entries if you don't sort them alphabetically, so take that into consideration if comparing raw feature counts.
Also, the coin.dance list does not include layer-2/sidechain implementations. Stuff like SmartBCH would probably over-inflate such a list; it would also be unfair because SmartBCH is based on work the ethereum devs did (EVM/web3).
Both lists seem to start around the same date so I'd say they're comparable.
Nice and well-formed ad hominem, but before proceeding, I'd suggest re-reading the name of the user that made the claim about crypto sub-reddits.
> Your earlier post seemed to indicate that you only scan "the big crypto subreddits" for news; may I suggest that maybe popular subreddits aren't the best place for news about research and development.
Is that in any way an ad hominem? Check out his link; it stands on its own.
If you're surprised about that level of activity it might just be that you're not reading the places where that sort of thing is discussed. But there is nothing bad or insulting about that suggestion.
You are both essentially speculating and accusing me of being ill-informed without providing a shred of evidence.
And even if you were correct, you're both attacking the messenger and not the message. That pretty much fits my definition of an ad-hominem.
I claim (I might be wrong, please provide counter-evidence) that there is more innovation in the Ethereum ecosystem than in the Bitcoin ecosystem.
As proof, I offer, however misguided these efforts might be (you can't innovate if you don't try "silly" things), sorted by more to less silly:
- scalability via layer 2 solutions
- scalability via sharding
- actually working smart contracts instead of a vague promise of their eventual feasibility based on a yet-to-be-deployed piece of infrastructure
- a concerted effort to move to PoS
- ongoing work on integrating ZK (zcash, beam, grin, monero)-type transactions.
The list provided by the GP (https://bitcoinops.org/en/topic-dates/) IMO very much falls into the "polishing the turd" category: many tiny improvements that very few people care about, which shows the disconnect between Bitcoin development and what the market wants.
The fact is: ethereum is here, it has real smart contracts, it is well on its way to PoS, and it's slowly gaining market share over Bitcoin because they move faster.
Don't get me wrong, I'm a big fan of Bitcoin and its ecosystem, but the pace at which innovation occurs in there is probably my number one peeve about it: if Bitcoin doesn't get its act together it will get the rug pulled from under it by Ethereum.
colored coins existed on bitcoin before ethereum was a thing
meaningless buzzword that vaguely gestures at scripting possibilities, which again, were in bitcoin before ethereum was a thing
> scalability via layer 2 solutions
was on bitcoin before ethereum was a thing
> scalability via sharding
not a thing for at least a few years yet, and it will be either a spectacular failure or multiple separate chains with atomic swaps between them
> actually working smart contracts instead of a vague promise of their eventual feasibility based on a yet-to-be-deployed piece of infrastructure
smart contracts were a thing on bitcoin before ethereum existed
> a concerted effort to move to PoS
a gigantic failure that undermines security of ethereum and market will not be merciful
> ongoing work on integrating ZK (zcash, beam, grin, monero)-type transactions.
happened on non-ethereum ecosystem
there's really not a lot to be excited about in ethereum these days. years and years of promises of a global scalable computer, finally an admission that it was all a giant lie, and now again years and years of promises of ethereum 2.0 that will not actually, for real, be a global scalable computer-- except there are no proposed solutions for any of the problems, because it's easier to part fools from their money with empty promises than with serious discussion about tradeoffs.
What could go wrong I guess. Are there distinct advantages to this?
This softfork also provides the necessary tools to make pay-join economically advantageous.
It improves fungibility (if that's even a word).
nullc seems to be saying that revealing (part of) the script isn’t necessary in case of cooperation: https://news.ycombinator.com/item?id=27023391
Am I misunderstanding you or nullc?
The root pubkey could be an N of N of parties to the transaction (e.g. everyone cooperates), or any other condition you can express with pubkeys alone.
So say that you have a coin that can be spent by 2 of 3 Alice, Bob, Charlie or it can be spent by Alice alone after one year has passed. The A/B/C 2of3 could be made the root, it would look like a single pubkey, and be indistinguishable from a ordinary single key wallet.
The timeout+alice condition would only get revealed in the event that Bob and Charlie won't cooperate and the timeout gets used.
One of the big improvements is that multi-signature transactions, where both Alice and Bob have to sign to spend some money, used to require adding N signatures (one per signer) to the blockchain for a valid transaction. So it improves efficiency and should allow squeezing in more transactions, but it also improves privacy, because there's no way to tell how many parties are involved in a transaction.
Merkelized Abstract Syntax Trees (MAST) are the other big feature, allowing more complex scripting to be represented in a compressed and obfuscated way (example in the link).
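A simplified sketch of the Merkle-tree mechanics (plain SHA-256 here; the real construction uses BIP341 tagged hashes): spending via one script reveals only that script plus one 32-byte sibling hash per tree level, keeping the other conditions hidden.

```python
import hashlib

def h(b: bytes) -> bytes:
    # BIP341 uses tagged hashes; plain SHA-256 keeps this sketch short
    return hashlib.sha256(b).digest()

def root_and_proof(scripts: list[bytes], idx: int):
    """Merkle root over script leaves plus the sibling path for leaf idx.
    Revealing one script costs ~32 bytes per tree level."""
    level = [h(b"leaf:" + s) for s in scripts]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        path.append(level[idx ^ 1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        idx //= 2
    return level[0], path

def verify(script: bytes, idx: int, path: list[bytes], root: bytes) -> bool:
    node = h(b"leaf:" + script)
    for sib in path:
        node = h(node + sib) if idx % 2 == 0 else h(sib + node)
        idx //= 2
    return node == root

scripts = [f"cond{i}".encode() for i in range(8)]   # 8 hidden conditions
root, path = root_and_proof(scripts, 5)
assert verify(scripts[5], 5, path, root)            # 3 levels: 96 bytes revealed
```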
See the full implementation here by Delft University of Technology scientists and students. 
Security needs work; functionality works. (Disclaimer: I'm the responsible professor.)
If accepted by the miners, Bitcoin will get a bunch of low-level upgrades:
- more compact signatures (decreasing blockchain size)
- paving the way to smart contracts
- improved transaction privacy
Usually the rule is just "provide a digital signature with key X", or some threshold of keys. Though they can be much more complicated, and even simple scripts can have powerful applications (e.g. https://bitcoincore.org/en/2016/02/26/zero-knowledge-conting... )
The flexibility is nice but there are some gotchas: your rules are included in the transaction, storing and transmitting them expends system resources, costs you fees, distinguishes your transactions from other users', and in doing so discloses information about your business practices.
Taproot addresses this by doing some relatively simple elliptic curve crypto magic to effectively hide the conditions inside a public key without impairing it or making it larger. Then, if the parties transacting cooperate, they can keep their fancy conditions private and just sign using a single jointly controlled key. If the parties don't cooperate, some of the conditions may need to be exposed, but only the absolute minimum to show the transaction is allowable. Cooperation is likely, however, because non-cooperation won't create a benefit.
The effect is that instead of being limited to a few thousand bytes of operations for the rules governing your coin, you could have gigabytes of rules and yet never expose them (and burn network resources and your privacy)-- or if you do need to expose some of them, you only need to show a small amount. ... and as a bonus have your transactions look just like a really boring, maximally simple wallet's transactions.
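A toy sketch of how the conditions hide inside a key, using throwaway pure-Python secp256k1 arithmetic (BIP341 actually uses tagged hashes and x-only keys with even-y normalization, all omitted here; this is illustration, not production crypto):

```python
import hashlib

# secp256k1 parameters
p = 2**256 - 2**32 - 977  # field prime
n = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFDEB2FD25E8CD0364141
G = (0x79BE667EF9DCBBAC55A06295CE870B07029BFCDB2DCE28D959F2815B16F81798,
     0x483ADA7726A3C4655DA4FBFC0E1108A8FD17B448A68554199C47D08FFB10D4B8)

def add(a, b):
    if a is None: return b
    if b is None: return a
    if a[0] == b[0] and (a[1] + b[1]) % p == 0:
        return None  # point at infinity
    if a == b:
        lam = 3 * a[0] * a[0] * pow(2 * a[1], -1, p) % p
    else:
        lam = (b[1] - a[1]) * pow(b[0] - a[0], -1, p) % p
    x = (lam * lam - a[0] - b[0]) % p
    return (x, (lam * (a[0] - x) - a[1]) % p)

def mul(k, pt):
    acc = None
    while k:
        if k & 1:
            acc = add(acc, pt)
        pt = add(pt, pt)
        k >>= 1
    return acc

def taproot_output_key(internal, merkle_root: bytes):
    # Q = P_internal + H(P_internal || root) * G: the output key commits
    # to the script tree yet is indistinguishable from a plain key.
    t = int.from_bytes(
        hashlib.sha256(internal[0].to_bytes(32, "big") + merkle_root).digest(),
        "big") % n
    return add(internal, mul(t, G))

internal = mul(12345, G)                        # toy internal key
Q = taproot_output_key(internal, b"\x00" * 32)  # commits to some tree root
```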
There are also some small efficiency gains: somewhat smaller signatures and making batch signature validation possible.
It's a little difficult to estimate the impact this improvement will have: Most users only use relatively simple rules, and as a result they will just become a little more private and a little more efficient. But if taproot was marketed like most altcoins are the headline would be "Taproot increases script expressiveness FORTY ORDERS OF MAGNITUDE while reducing resource usage!", because-- indeed-- you could make a script that was 1.3e41 times larger than the current rules allow if only you had a computer powerful enough to handle such data locally. :P So for common uses it's a small to moderate efficiency and privacy improvement, but it could enable some new and interesting applications.
... and you might not ever know about some application that benefited from it, because unless you're a party to its transactions there may be nothing identifying published about them. :) Not the best for marketing, but better for human rights.
If I have e.g. a 1GB script, how much of it will I need to reveal if my counterparty doesn’t cooperate?
For example, is it realistic to have 1GB of script and only needing to reveal, say, 1KB in case the other party doesn’t cooperate?
So, for example, if you create a coin which could be spent by any Californian (assuming you knew a key for each), your full script would be 1.17 GB, but when someone spends the coin their signature would be 896 bytes, assuming you set it up to be equally efficient for all people. That 896 bytes plus the 32 byte public key would be all that ever hits the blockchain.
You're not constrained to assume equal probability, however: if you have a probability model for how likely each condition is, you can construct a Huffman tree, and the overhead for any particular condition is just the number of bits in its Huffman codeword times 32 bytes. If your probabilities are accurate, this minimizes the average size.
So for example, say your directory had a list of 100 known bitcoiners that were overwhelmingly likely to spend the coin: their signatures could be 288 bytes, while being just a bit larger for the other residents. If you, correctly, assumed that I would be the most likely to spend the coin, you could stick my key at the root and I could spend it with just 64 bytes-- the same as an ordinary wallet controlled by one person, rather than by 40 million people.
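The sizes quoted in these examples follow from simple arithmetic: one 64-byte signature plus one 32-byte sibling hash per level of a balanced tree (ignoring script and control-block overhead for simplicity):

```python
import math

def reveal_bytes(n_leaves: int, sig_bytes: int = 64, hash_bytes: int = 32) -> int:
    """Rough spend size for one leaf of a balanced script tree:
    a 64-byte signature plus a 32-byte sibling hash per tree level.
    Script and control-block overhead are ignored for simplicity."""
    depth = math.ceil(math.log2(n_leaves)) if n_leaves > 1 else 0
    return sig_bytes + depth * hash_bytes

print(reveal_bytes(39_500_000))  # ~40M Californians -> 896 bytes
print(reveal_bytes(100))         # 100 likely spenders -> 288 bytes
print(reveal_bytes(1))           # a single key at the root -> 64 bytes
```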
For some usages there is a key creation vs spending size tradeoff-- e.g. you can expand the tree further to make the revealed nodes smaller, at a cost of an exponential blowup in the time it takes you to construct the script.
One thing that some protocols should be able to do (at least eventually) is set things up so that in the event of non-cooperation the additional transaction fees get mostly covered by the non-cooperating party.
This sounds amazing. I’m really looking forward to seeing what can be built on top of Bitcoin if this activates.
Question though: why is it called Taproot?
One is that if you imagine the probability-of-usage weighted tree over the DNF form of your script, taproot is most efficient and private if most of the probability mass is clustered along a central path-- like a botanical taproot-- particularly the "all participants agree" case that almost always exists.
The other is that when you spend using one of the hidden scripts, you tap into the public key to expose a hash tree root.
But really, the name exists because it's easier to talk about stuff when you have a name for it; the idea long predates having a formal spec for the construction, and taproot was the name I picked. Today we could also call it BIP341.
Semi-off topic: In another comment you said you're no longer developing BTC, what on earth could be more appealing right now?
read: other chains are centralized databases that have no business being blockchains in the first place and could be replaced by a single mysql instance.
the trick is not to have high TPS-- bitcoin could have unlimited TPS too, it's just a one-line code change. the trick is to have a decentralized system that functions at saturation on commodity hardware.
the conservative minds prevailed during the scaling debate in 2015-2017, creating selection pressure for scaling solutions that optimize the limited resource-- chain space-- rather than populist, simplistic "solutions" pushed by cheap propaganda slogans like "we can do this many TPS!".
This, too, has been a thought of mine, but I'm not exactly sure how I would block diagram the 'locking' mechanism for the MySQL instance.
I've come to believe the locking mechanism is the nonce produced to sufficient solution parameters that appears to use the difficulty of finding said solution as its defensibility.
So how do you get the 'strength' of the miners throwing ExaHashes at a solution with a single instance? We could easily snapshot the DB and sign them, however, the point of failure is no longer in a 50% attack, but in losing said key... which intuitively feels far less secure?
Curious to hear your thoughts
blockchain and miners only make sense if you need and actually maintain decentralization. if you don't and/or can't - miners (and validators in PoS systems) are useless overhead.
To build on the discussion, I'm more amazed at blockchain's capabilities as a cooperation engine outside of the normal channels. To that end, I'm really surprised we haven't seen significant attrition in the current legal system to smart contract systems. I guess the lack of adoption more highlights just how 'customized' a 'standard' contract or dispute is in the real world.
i don't believe the hype of using blockchains to track some logistical or supply chain data will prove useful simply because all these international corporations already have a system to resolve conflicts and enforce contracts - law.
And thinking out loud, it sounds like if blockchain were used to run financial operations for a company, we could make the infamous audits for Luckin Coffee's and GSX's a thing of the past. Or at least after one mistake (the LC coupon issue was an interesting hack to avoid detection), update the contract, then all future instances are robust to the same issues.
GAAP Rules and Arm's Length Transactions could be factually monitored too... interesting
We have easy-to-use wallets, easy ways to run lightning nodes.
And if you look at the average transaction value, you can see that the blockchain itself is acting more as a settlement layer rather than handling "coffee transactions" as it should.
That is a sign of failure.
Many of the cryptos that claim to scale to thousands of tps on layer 1 are all sacrificing decentralization to achieve it, at which point, I might as well use Paypal/Visa. There's no point in being able to scale to those levels in theory if in practice no one uses them.
it is, if the original goal was to be something else.
If you start a marathon and stop to eat the greatest hot dog ever made, it wouldn't be considered a successful run, although you can say "but it was the best!" and be happy about it.
Payment channels aren't in the white paper but they were in the first public release and had dedicated opcodes. That first implementation wasn't very practical nor was it secure.
The fact that modern payment channels are implemented with other opcodes may be a pivot in implementation but not in concept.
Bitcoin was supposed to be much more than it currently is, and the "success" of it is a product of continuous goalpost shifting.
There's nothing wrong, though, with someone taking the open source code and creating LightningCoin from it, though. That would be the equivalent of a pivot.
I just checked https://mempool.space
A low-priority transaction cost $2.36; medium is $4.39 and high is $7.48.
To put this into context: you can send $10,000 worth of BTC anywhere in the world in an hour for $2.36 right now.
A bank wire transfer for $10,000 is going to be at least $35; often it's more, depending on which country you're sending to and if you have to use an intermediate bank to get to the bank you're ultimately are attempting to reach.
Bank holidays, weekends, etc. don't apply to bitcoin.
There have been periods where the prevailing feerate has remained over 150 sat/vB ($12 - $24) for 24 hours.
And unless you're already in $10K of BTC, you will lose exchange fees and are subject to the forex (equivalent) rate. Your receiver will have the same burdens.
Wire transfers/SWIFT/FedWire etc are effectively instantaneous, the fees are 100% predictable, and they are accepted everywhere worldwide for all legitimate business. As you note, they can be inconvenient or impossible on overnights, holidays, and weekends.
I'm not disagreeing with you -- just noting that the quoted Bitcoin fees at any point in time are not reliable, and that the whole process is more complicated than it might appear.
 (background for other readers) Bitcoin fees are a function of the transaction size in (virtual) bytes (minimally either ~140 vB or ~240 vB, depending on the type of addresses used), multiplied by the feerate in satoshis per virtual byte. Converting to USD, you have to consider the BTC-USD exchange rate which has been bouncing around $55K lately. So, e.g.:
240 vB * 250 sat/vB == 60_000 sat
60_000 sat == 0.0006 BTC (100_000_000 satoshis per BTC)
0.0006 BTC == 33.00 USD (at $55_000 USD per BTC)
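The arithmetic above as a tiny helper (function and parameter names are mine, for illustration):

```python
def fee_usd(vbytes: int, sat_per_vb: float, btc_usd: float) -> float:
    """Fee in USD: transaction size in virtual bytes times the feerate,
    converted at 100,000,000 satoshis per BTC."""
    return vbytes * sat_per_vb / 100_000_000 * btc_usd

print(fee_usd(240, 250, 55_000))  # the worked example above, ~33 USD
print(fee_usd(140, 5, 55_000))    # a quiet-mempool day is under a dollar
```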
You cannot send $10,000 anywhere in the world via Bitcoin for $2-7, you have to first convert the $10,000 to BTC at a local exchange, wait 3 days, pay 1-2%, suffer slippage, pay $2-7, wait an hour, pay 1-2% at the remote exchange and wait for the money to deposit into a bank account. This is true because you can't spend Bitcoin for goods and services - generally speaking anyways. Bitcoin in this context is just the intermediary unit which is elided in a wire.
Re: wires, domestic US wires are offered free of charge by many institutions, and are instant during regular business hours. Obviously delays apply outside. For reference, an ACH transaction costs banks $0.002 in bulk to the depository institution. A FedWire costs $0.033 in bulk to the depository institution. 
Transfers outside the US cost more and take longer because of AML and KYC.
Not to mention that the move in the US from ACH to RTP makes ~free and instant domestic transfers 24/7. No blockchain needed. Because of course there isn't, the current system was based on policy not technical limitations of MySQL. 
>> you can send $10,000 worth of BTC anywhere in the world
> You cannot send $10,000 anywhere in the world via Bitcoin
This is why the post you responded to said "worth of BTC".
Most payments-- imagine hiring a contractor or buying a Starbucks gift card-- don't need to go through instantly.
Check rates quoted on the final statement versus those prevailing at that time.
For a few limited cases of going USD -> USD (in foreign jurisdiction), and the USD is not converted to anything else ever, there may be a net win.
In all other cases (the majority), you, or the receiver will at some point be paying far more than the wire fee for any transfer over a grand or so.
Schwab operates these retail banking services as a loss-leader for their investment products.
I've never experienced a bank that hasn't silently gouged on forex, and this is the first I've even heard of one that didn't.
Definitely good to continue to be vigilant about such claims.
As for BTC, I wouldn't use it for anything other than a store of value - there are far better crypto options for transferring value. Some cost fractions of cent and take seconds.
Others cost single-digit dollars, but are fiat-pegged so there's no volatility risk at all.
The problem I see is bitcoin devs refusing to be pragmatic. Segwit is just an accounting trick.
Would 2MB be preferable? Would it be useful? Merely matching the transaction volume Visa settles would require blocks slightly above 500MB, and that's without any more sophisticated transactions such as atomic swaps, which could potentially be useful.
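A back-of-envelope check of that figure (a sketch; the Visa throughput and average transaction size here are assumptions for illustration, not numbers from the thread):

```python
# Rough check of the "Visa-scale needs ~500MB blocks" claim.
# Assumptions (illustrative): Visa averages on the order of 4,000 tx/s;
# a typical transaction is ~250 virtual bytes.
TX_PER_SEC = 4_000
VBYTES_PER_TX = 250
SECONDS_PER_BLOCK = 600   # Bitcoin targets one block every ~10 minutes

block_vbytes = TX_PER_SEC * SECONDS_PER_BLOCK * VBYTES_PER_TX
print(block_vbytes / 1_000_000, "MB per block")  # 600.0 MB per block
```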
After all, if a transaction had an invalid spend script, it would not have gotten 100 blocks worth of confirmations.
They only exist today as a mechanism to prevent nodes from being flooded with low-difficulty fork blocks forking off before height 230k, because Bitcoin's initial difficulty (2^32 hash operations per block) is far too low relative to multi-TH/s ASIC mining devices.
There were a couple of distinct activities. One is rolling UTXO hashes, which have no major engineering hurdles and can enable a reduced-security "bootstrap from a UTXO set".
The other is schemes that allow nodes to validate without holding the UTXO set at all-- these have historically had unfavorable I/O costs, and the bandwidth/storage tradeoff hasn't seemed that appealing-- e.g. would you find reducing storage from 10 GB to 1 MB at the cost of a 10x increase in bandwidth appealing? In some applications it would be, in others not.
I believe work related to both has been ongoing, however.
could you elaborate what's compromised about including a sha256 of utxo set in every block and allowing users to choose how far back they want to bootstrap from?
isn't it strictly better than current situation with assumevalid?
If you're happy with the SPV security model-- perhaps you should be using SPV? :) This is a little trite, I know, because it's not quite identical with respect to the past; but the vast majority of the sync time is in the last two years in any case, and practical considerations mean you couldn't just arbitrarily choose how far back to sync from (you need to be able to get the UTXO set as of that height).
In the Ethereum world effectively almost all synchronization is done using 'fast sync', which is essentially the committed-UTXO, blindly-trust-miners model. Performance and storage considerations mean you can't go back more than a tiny amount of time (I believe it's normally 4 hours). Many commercial entities operate multiple nodes, and if they detect they've fallen behind they just auto-restart and fast sync to catch back up. Effectively this means that if miners commit invalid state, these nodes will just blindly accept it after a couple hours' outage.
All assumevalid is doing is asserting that the ancestors, two weeks back and further, of a specific block hash all have valid signatures. When you get a setting there as part of the software you're running, you're assuming the software isn't backdoored (e.g. because of a public review process, or your own review). Assumevalid is strictly easier to review than pretty much any other aspect of the software's integrity: there are 100 places where a one-character change would silently bypass validation completely, while reviewing AV simply requires checking that the value set in it is an accepted block in some existing running node. AV as implemented also requires the blockchain to agree and to have two weeks of work on top of it, so it's in every way harder to undermine validation by messing with AV than by changing the code some other way.
On a technically pedantic point. It takes a minute or so to sha256 the UTXO set, so doing literally what you suggest would utterly obliterate validation performance. (fortunately rolling hashes accomplish what you mean without the huge performance hit.)
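A minimal sketch of the rolling-hash idea (a toy MuHash-style accumulator; real MuHash works in a 3072-bit group and maps elements into it differently):

```python
# Toy multiplicative set hash: each element maps to a group element, and
# the set's digest is their product. Adds and removes are O(1), so the
# node never re-hashes the whole multi-gigabyte UTXO set.
import hashlib

P = 2**127 - 1  # a Mersenne prime; far too small for real use

def h(elem: bytes) -> int:
    # Map an element into the multiplicative group mod P.
    return int.from_bytes(hashlib.sha256(elem).digest(), "big") % P or 1

class RollingSetHash:
    def __init__(self):
        self.acc = 1
    def add(self, elem: bytes):     # new UTXO created
        self.acc = self.acc * h(elem) % P
    def remove(self, elem: bytes):  # UTXO spent: multiply by inverse
        self.acc = self.acc * pow(h(elem), -1, P) % P

# Order-independent, and add/remove cancel exactly:
s = RollingSetHash()
s.add(b"utxo-a"); s.add(b"utxo-b")
before = s.acc
s.add(b"utxo-c"); s.remove(b"utxo-c")
assert s.acc == before
```

Because updates commute, two nodes that processed the same blocks in any internally consistent way end up with the same digest.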
> Depending on a utxo state in blocks is effectively the SPV security model, -- it's an utter blind trust in miners to set the value honestly
if it was hardforked in as part of the consensus protocol, miners wouldn't be able to set an invalid utxo set hash any more than they are able to "produce" blocks with invalid signatures, or am i missing something?
as for storage and performance, maybe it would make sense to take the performance hit of maintaining a persistent immutable set such that you would be able to travel back as far as you like with minimal overhead.
do you know of any active PRs/branches where utxo commitment work is/has been happening?
> as for storage and performance, maybe it would make sense to take the performance hit of maintaining a persistent immutable set such that you would be able to travel back as far as you like with minimal overhead.
The cost of supporting that arbitrarily would be extremely high over and above the cost of having the complete blockchain. I don't see why anyone would choose to run a node to serve that. I certainly wouldn't-- it's obnoxious enough just to have an archive node. But having some periodic snapshots would probably be fine ... but not that many since each would be on the order of 7GB of additional storage.
No; there is ongoing work, but I haven't been following it closely. It sounds like you're more interested in the assumeutxo style of usage, so search for that and muhash.
in number of transactions?
I do not think so, and according to the first few results I've found it's not true.
On what source do you base your claim that they are?
Perhaps a pruned node can get somewhere in the area of sub-10G, but a "full node" is actually at around 328G: https://www.statista.com/statistics/647523/worldwide-bitcoin...
"Full nodes download every block and transaction and check them against Bitcoin's consensus rules." (https://en.bitcoin.it/wiki/Full_node#Archival_Nodes)
Pruned nodes do just that.
The blockchain data structure means that once verified, you don't need to store old blocks. You only need the most recent blocks to verify new ones.
The nodes you're talking about, that store the entire blockchain, is called an archival node.
Anyway, encouraging people to install full nodes, pruned or not– which enforce the rules– improves Bitcoin's decentralization.
At best it's a gateway drug to get people more involved.
Interestingly, running a lightning routing node actually is a way an individual can run a full node to not only generate some fees to cover the cost, but actively and consistently participate in the cyber economy and have a marginal voice in enforcing the rules. The nodes you have channels with are incentivized to care about your vote if they want to stay connected to you.
From the parent:
> A Bitcoin full node only takes 5GB of disk space to run, and 256MB of memory.
A Bitcoin pruned node only takes 5GB of disk space to run, and 256MB of memory.
Edit: a “full” node is one which fully validates blocks, meaning it checks that blocks meet all of bitcoin’s consensus rules. All data in bitcoin is either validated once, the first time it is seen (like witness data), or at most twice (an output and its later spend). Pruned nodes garbage-collect this data once it can provably never be referenced again. This does not diminish the node's ability to check bitcoin’s consensus rules, so it is still a full node.
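The validate-once-then-garbage-collect idea can be sketched like this (the block format and outpoint strings are hypothetical simplifications, not Bitcoin's actual data structures):

```python
# Sketch of why a pruned node is still a full validator: it keeps the
# UTXO set (what can still be spent) and discards raw blocks once checked.

def apply_block(utxos: set, block: list) -> None:
    """`block` is a list of (spent_outpoints, created_outpoints) txs."""
    for spends, creates in block:
        for op in spends:
            if op not in utxos:
                raise ValueError(f"double spend or unknown output: {op}")
            utxos.remove(op)      # the consensus check happens here, once
        utxos.update(creates)
    # A pruned node can now delete the raw block from disk; the UTXO set
    # alone is enough to fully validate the next block.

utxos = {"coinbase:0"}
apply_block(utxos, [({"coinbase:0"}, {"tx1:0", "tx1:1"})])
print(sorted(utxos))  # ['tx1:0', 'tx1:1']
```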
If you have a "full" version that is missing some info, people will be confused.
One thing that made me laugh regarding the Ethereum “archival node” semantics debate was that equivalent functionality (immediate lookup of any transaction by address) in Bitcoin requires an additional index, like ElectrumX. Apparently many of the loudest Bitcoiners would benefit from understanding the distinction between a block explorer and a full node.
I thought the majority of the hundreds of gigabytes consisted of dormant and throwaway addresses that still have some value on them, but it sounds like I had the wrong impression: by removing transaction history for empty addresses and other historic block data, it's only a few gigabytes?
The need to track previously used stuff in some other systems to prevent transaction replay is a design flaw that Bitcoin avoided.
The vast majority of the blockchain data is digital signatures which you don't need anymore once you've validated them (except to help other people sync up).
It’s perverse you are proud of that and chose a tiny hard drive size over becoming actually useful and doing more than seven transactions per second. Time marches on, technology improves, but Bitcoin is proud they can run a node on a computer from 1995.
Once you've accepted that premise, the next question is how do you achieve the most decentralization? The answer is by having the most nodes. How do you get people to run nodes? Make it super cheap to run. It's simple really, people are not paid to run nodes (miners != nodes). It's not about storage space, you also have to consider network latency and propagating those blocks across a large adversarial network.
The signalling mechanism is a way for the miners to vote ahead of time whether they will follow to the new fork or not. If more than 10% of the hash power of the network does not signal that it will follow, then the fork is called off and won't happen.
Bitcoin's rules specify what isn't allowed and violating that breaks compatibility, but the system can become more restrictive without becoming incompatible.
Huge chunks of the space of possible inputs were set aside for future use by allowing them in blocks without imposing any structure (while disallowing them in loose transactions, so nodes won't inadvertently relay transactions that are invalid under rules they don't understand).
In that sense all future features already exist, latent, inside the system; to make them useful the community need only chip away at the permitted inputs which aren't compatible with the desired new functionality. :)
At this point bitcoin is like an economic black hole. Most technology allows us to do more with less. So far, bitcoin doesn't follow that pattern. The higher the price, the more resources are needed to secure the network.
I guess the next major release is going to blow your mind.
Bitcoin currently emits as much carbon as the entire country of Sweden.
Bitcoin currently creates 10000 tons of electronic waste and associated pollution a year.
Bitcoin is currently enabling a massive epidemic of ransomware, which is not only crippling businesses but also critically important societal services.
At this point, it is neither ethical nor moral to use bitcoin, or do anything that helps it thrive.
Bitcoin is a crime against humanity, and must be destroyed as soon as possible.
Please understand there's a difference between what the technology provides and what it's being used for. Malicious people using the technology is true of many many technologies. I keep getting phishing calls every other day to give my SSN, does that mean the social security program should be canceled for this very reason?
> Please understand there's a difference between what the technology provides and what it's being used for.
No, but according to some people's logic it's a reason to ban telephone calls.
bitcoin doesn't consume any energy at all, you have a severe misunderstanding of what PoW is and how it works.
the answer is that bitcoin doesn't require any amount of energy beyond a single computer to churn out blocks. all the energy usually attributed to being "used" by bitcoin is in fact energy purchased by users to acquire and/or maintain certain level of security of their financial assets.
bitcoin is simply an instrument through which security of financial assets can be expressed in form of pure energy, which is completely transparent and can't be faked. that security comes for a price and as long as users are willing to pay this much money for that amount of security - miners will continue producing it, consuming energy in the process.
But of course you know that. I wonder why the trollish comments.
i'm sure you would agree that amount of energy required to find signatures at 0 difficulty is negligible
> And every year it gets harder, by design (to control the volume of new btc).
no, it doesn't get harder "by design", it gets harder because bitcoin is an instrument that has created a market where financial security can be purchased in form of energy, so it gets harder because that kind of security has value on the market and users keep purchasing it.
> But of course you know that. I wonder why the trollish comments.
turns out you were wrong in your assumptions, yet you proceeded to call me a troll. how about not doing that in future?
So get off your high horse and do something that actually helps the environment rather than fighting windmills.
What would happen if a Bitcoin fund like Grayscale said that its customers are only entitled to the "original" Bitcoins in the fund, so investors would be left with pretty much worthless stuff?
What would happen if they don't, but the original coins keep being traded at say 10% of the value of the new ones. Who pockets those 10%?
If somebody like Grayscale decided to outright scam their own investors (why?) there are a hundred excuses they could invent to justify it; they don't need to blame a protocol upgrade.
Segwit was activated years ago. There were some conspiracy theories about Segwit UTXOs being worth less than legacy UTXOs but AFAIK they were totally unfounded. So far Bitcoin is Bitcoin and I don't see any reason why that would change.
I may be mistaken, but I don't think taproot is a hard fork.
Edit: Isn’t this also what happened on some exchanges with the previous big forks?
The question he's trying to bring up is an interesting one, but this is the wrong thread to bring it up on.
On this subject-- check out Archer v. Coinbase, where Coinbase kept a large amount of a user's forked coins and prevailed in court.
No; after activation of Taproot, coins can be spent in fewer ways than they could before. The new 'taproot outputs' will look like "anyone can spend" transactions to old nodes. It is only new nodes that will enforce the new rules on these transactions.
To make future soft forks easier, the Segwit data has a version byte, and unknown future versions are also considered as "anyone can spend". So taproot only had to increment the version from 0 to 1 to work as a soft fork.
But what happens if a miner interprets a Segwit transaction as "pay to anybody" and creates a block that violates the "additional validation" performed by newer miner/node software?
They don't because segwit (or taproot, for that matter) script is flagged using input-space that the old software knows is "from the future" so that it knows it doesn't know how to validate it. As a result, they won't relay it, won't mine it, and won't display it in their wallets until confirmed. But if someone else puts it in a block they'll accept it.
If someone goes and lobotomizes their own software so that it'll mine stuff it knows it doesn't understand, then they could produce an invalid block. But anyone that is upgraded just ignores their block like it never happened.
This is why the activation of such things is normally triggered by a super-majority of hashpower being upgraded: It's for the benefit of parties that haven't upgraded so that even if there are some bad-data-maniacs out there pulling expensive stunts, their invalid blocks are quickly left behind and even non-upgraded nodes won't see many confirmations before the bad block is removed. (And upgraded nodes won't see any at all, of course.)
In the case of taproot 90% of the hashrate has to signal support during any one of several two week signaling periods to trigger activation. If that doesn't happen, people will figure out why and try again (potentially with different activation criteria).
> anyone that is upgraded just ignores their block
Nah, because the post-taproot software will have more hashpower (due to how it activates), the taproot enabled blocks are acceptable to old nodes, and Bitcoin follows the valid chain with the most hashpower.
Bitcoin has introduced changes like this a good dozen times in the past, it doesn't create two separate chains in practice.
The only way it can create two separate chains is if the unupgraded side has a super majority hashpower, one of them modifies their software to mine something invalid under the new 'future' rules... and no humans intervene to prevent that outcome (e.g. by getting hashpower to move). In practice this doesn't happen because the activation is triggered by 90% hashpower indicating that it will enforce. (and won't turn active until November, giving people plenty of time to upgrade)
To old nodes taproot transactions are valid when they've been included in a block, but are invalid when they are relayed on the network.
This is accomplished by taking all the parts of the transaction that are reserved for future extensions and making their use invalid for the purpose of relay, mining selection, or wallet display in advance.
So you can make a transaction that is invalid to new nodes, but valid-in-blocks to old but it'll still be invalid to them for other uses. It's sufficient that new stuff be considered valid in blocks by old nodes for the system to still come to consensus, since only the blocks are included in consensus.
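The consensus-vs-relay split described above can be sketched as two separate validity checks (the version numbers and function names here are illustrative, not Bitcoin's actual serialization or code):

```python
# Sketch of the "valid in blocks, invalid for relay" distinction.
KNOWN_VERSIONS = {0, 1}  # this node understands segwit v0 and taproot (v1)

def consensus_valid(output_version: int, script_ok: bool) -> bool:
    # In a block: unknown future versions count as anyone-can-spend, so
    # old nodes accept them and stay in consensus with upgraded nodes.
    if output_version not in KNOWN_VERSIONS:
        return True
    return script_ok

def standard_for_relay(output_version: int, script_ok: bool) -> bool:
    # For relay/mining/wallet display: refuse anything "from the future"
    # so the node never propagates transactions it can't actually check.
    if output_version not in KNOWN_VERSIONS:
        return False
    return script_ok

# An old node (say KNOWN_VERSIONS = {0}) would accept a v1 spend once it
# is in a block, but would never relay or mine such a transaction itself.
assert consensus_valid(7, script_ok=False) is True
assert standard_for_relay(7, script_ok=True) is False
```

Only `consensus_valid` participates in agreement about the chain, which is why the two rule sets can differ without splitting consensus.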
[Thanks for all the questions, BTW, it's been an interesting discussion.]
Both will hold value.
And my question is how funds like Grayscale (And 21shares, exchanges, etc) will handle this.
This longest chain is valid for both upgraded and not-upgraded participants. So that's the consensus that everybody follows.
Taproot is backwards compatible in that non-taproot nodes will just ignore things they don't understand without affecting consensus.
While low quality comments are easy to spot, low quality votes are not.
I often think HN should have a requirement to explain a downvote. I suspect downvotes like these would come with low-quality explanations like "Shut up, idiot!", so those low-quality votes could be dealt with and the voter's future votes discounted.
For every person that won't take my word for it there are probably 100 people here reading and learning something they didn't know.
You're always very calm and communicate quite clearly even with aggressive people suffering from the Dunning-Kruger effect that probably don't deserve it.
But like you say, there are a lot more people reading the interactions so your efforts aren't wasted.
Everyone has an equal voice, but certainly not all voices are equal.
Hopefully the author of the GP comment spends 10 seconds trying to work out whether you are correct.