Nostr is a stupid simple P2P protocol that works, built by builders
248 points by kdragon on Nov 25, 2022 | 128 comments
I have been seeing a lot of shilling for mastodon lately, so I thought I would step in and shill Nostr for a bit.

https://github.com/nostr-protocol/nostr

Fun facts about Nostr:

* Nostr stands for "Notes and Other Stuff Transmitted by Relays". It is an odd acronym, but I like it.

* Nostr uses websockets and relays to build a really simple P2P network. We also steal a few ideas from bitcoin (secp256k1 key pairs as ids, Schnorr-signed events).

* Relays are simply dumb data stores for events that clients publish and subscribe to.

* Clients don't trust relays to be honest, so all events are self-signed. Your pubkey is your userid.

* It is stupid simple to build a Nostr client. You can easily do it in less than 400 lines of JavaScript. And it runs in the browser.

(shameless self plug) https://github.com/cmdruid/nostr-emitter
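For a sense of how little a client has to do: per NIP-01, an event's id is just the SHA-256 of a canonical JSON serialization of its fields, which is then Schnorr-signed. A minimal sketch in Python (the pubkey below is a placeholder, and a real client would sign the resulting id):

```python
import hashlib
import json

def nostr_event_id(pubkey: str, created_at: int, kind: int,
                   tags: list, content: str) -> str:
    """Compute a NIP-01 event id: the SHA-256 of the canonical
    serialization [0, pubkey, created_at, kind, tags, content]."""
    serialized = json.dumps(
        [0, pubkey, created_at, kind, tags, content],
        separators=(",", ":"), ensure_ascii=False)
    return hashlib.sha256(serialized.encode()).hexdigest()

# Placeholder pubkey; kind 1 is a plain text note.
event_id = nostr_event_id("a" * 64, 1669400000, 1, [], "hello nostr")
print(event_id)  # 64 hex characters
```

Everything else a basic client needs is sending and receiving JSON arrays over a websocket, which is why sub-400-line implementations are realistic.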

* Nostr is powerful enough to host chat apps very easily. Here is a rip of Telegram, running on Nostr:

https://anigma.io

* There's a lot of fun things you can do with Nostr. Check out all these cool projects!

https://github.com/aljazceru/awesome-nostr

* We are constantly discussing how to improve the protocol. Come join the conversation here:

https://t.me/nostr_protocol https://anigma.io https://damus.io https://github.com/nostr-protocol/nips

Thank you for reading my nostr shill post. I did not create nostr, nor do I get any monies for promotion. I just think it's really cool and I have a lot of fun building stuff that punches through NATs.

If you have any questions about nostr please feel free to ask.

Also, Happy Thanksgiving to everyone! I hope we're all feeling fat and sassy today. :-D




So how does nostr propose to solve the problem where there is, in fact, quite a lot of content that you want to filter out, whether because it makes for a better experience for the people using this protocol to talk to each other, or because there are some pretty solid laws about things that various governments require people to filter out?

https://abovethelaw.com/2022/11/hey-elon-let-me-help-you-spe... is a pretty decent rundown of a mix of these things; it is specifically pointed at Elon Musk's decision to buy Twitter and make it a haven for "free speech" but it is a glimpse at what is in the future for anyone setting up a "free speech" platform.

My experience as someone who has been running a Mastodon server since 2017 is that while "we are all for FREE SPEECH, we only block what the government ABSOLUTELY requires us to block!" sounds noble, in practice nodes of the Fediverse that say this become havens for people who are only there to be assholes to other people, and any sane admin will sigh and block the whole server, because it's just going to be a continual source of rude nasty bullshit.


That's a great article, and balancing free speech with censorship is a difficult problem. But it becomes a constant headache only on centralized platforms, where no amount of resources can realistically monitor and filter all content. This scales in complexity as the platform grows, which is the goal of any centralized service. And if the business model depends on advertising, it becomes even messier and crucial to its existence.

P2P services OTOH work on a decentralized and pull model. Users share and only subscribe to the content they're interested in. Censorship is distributed, and it's a problem for people who don't wish to see specific content. It's the way the internet works, and the existing approach of removing sensitive content applies to P2P services as well. Since there are no advertisers to appease, it's not an existential problem.


You didn't answer the question.

The question was, to wit, >How do you keep CP off it?

Your answer was: >I don't, just don't look at the CP.

The problem really in question here is:

You've just created a new distribution method for this type of thing and punted the consequences for someone else to deal with.

(Which is totally cool imo, but newsflash, expect to be the subject of a hit piece some time in the foreseeable future. It shouldn't take long, either for actual criminals to set up on it, or for LE to do it to "snare unsophisticated actors").

Welcome to the Internet, where we can't have/make nice things anymore.


How do we deal with objectionable content already on the internet? It gets taken down by law enforcement of some country that finds it objectionable, and we handle the case in courts.

A new protocol that makes this content more accessible isn't an issue with the protocol, but with society and how we decide to deal with it. If CP is found to be served by nginx over HTTP, is that a problem with nginx or HTTP?

If anything, centralized services only make the problem more difficult to address, since they're expected to serve the demands of governments, companies and law enforcement agencies worldwide, while somehow being the arbiter of free speech. Those are impossible goals to reach, and go against the original design of the internet.

This discussion is as old as P2P protocols. I'm sure that if Nostr became a popular way to share copyrighted content, governments would try to fight it, just as they've done before. But at the same time, censorship is not something a protocol should care about, and just like BitTorrent thrives today, with enough interest, Nostr would also find a way to persist.


In many European jurisdictions, you are criminally liable for anything that makes it onto your systems, regardless of how it got there.

So, the OP has just given people a whole new way of making themselves liable for illegal content, just by running the P2P software in question.

In most cases, the police aren't interested in hauling a service provider into jail, they want to haul in the person who put that content there. But this does make a nice little lever that the police can use against any service provider.


Nostr lets you specify who and what you subscribe to, so unless you subscribe to everything and everyone this isn't really a problem. You can also subscribe to followers' followers and expand that way.
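Concretely, "specifying who and what you subscribe to" is just sending a filter object to the relay over the websocket. A minimal NIP-01-style sketch (the pubkey here is made up for illustration):

```python
import json

def req_message(sub_id: str, authors: list, kinds: list) -> str:
    """Build a NIP-01 REQ message: the client asks a relay to stream
    only events whose author pubkey and kind match the filter."""
    return json.dumps(["REQ", sub_id, {"authors": authors, "kinds": kinds}])

# Subscribe to kind-1 (text note) events from one placeholder author.
msg = req_message("my-feed", ["a" * 64], [1])
print(msg)
```

Anything outside the filter simply never reaches the client, which is why an unfiltered firehose is opt-in rather than the default.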


Contrary to what OP says, Nostr relays are not dumb: they can have their own policies, and to me they look like a better version of Mastodon servers. They can have identities, "themes", and policies as they wish. On Nostr it's totally fine for one relay to allow only certain kinds of content and block everything else. Users can just connect to multiple relays if they want to read/write about different things.


What bugs me about it is their naivety about solved technical problems. For instance, they answer the question of "why hasn't this been done?" with:

> I don't know, but I imagine it has to do with the fact that people making social networks are either companies wanting to make money or P2P activists who want to make a thing completely without servers.

Except it has been done. In fact, that's literally what KaZaA was with its "superpeers". And what they realized was that by making a semi-decentralized system, they just combined the weaknesses of both systems (slow downloads from peer latency and network limits, plus easy censorship by killing relays/nodes). In addition, this is exactly how IRC works, even though it's mostly run on a few nodes these days.

I'm not against semi-decentralized systems. They're great and help deal with some scalability problems; but they don't solve the number one issues most people moving to decentralized systems are seeking (anonymity, privacy and free speech), so it's not fair to compare it to platforms/protocols that do offer those features.


They may be young and unaware of "solved technical problems," but they are full of energy, and every generation has to relearn the same things.

But as for peer latency issues or easy censorship by killing nodes, I don't see it. Nostr has fan-out, but not as much as RSS does and I don't expect superrelays.

I also don't follow you on the issue of anonymity or privacy. The guy who started it, fiatjaf, is anonymous. We don't know who he is. And you can be too. Just make up a key, create and sign an event, and push it to whatever relay takes your fancy... through Tor if that's your thing.


I don't know about KaZaA but I remember it was very popular for a time, so it might have done some things right? What was its fate? Was it censored to death?

And I disagree very much that IRC is "semi-decentralized". IRC is completely centralized, it is just chat rooms on a server. You have to register on each server and each server has full control over its rooms and users.


I'm sure all the script kiddies who loved to take over channels in netsplits are gonna be disappointed that they never actually did that now.

More seriously, this is the second time I've seen someone on here characterize IRC in this (very wrong) way in the last day. Where is this coming from?

IRC networks are made up of servers that relay (hence Internet Relay Chat) with each other. You connect to one server and you can communicate both with people local to that server and people on other servers that are part of the same network (including ones that server is not directly connected to). Channels prefixed with # are shared across all servers in the network, while channels starting with & are local to that server (though rarely used).


I think you may be confused because you decided to rely on this loose concept of "semi-decentralization". IRC providers may use multiple servers, but that doesn't mean decentralization. They are closed networks, not very different from any sufficiently big internet business that runs multiple servers behind a load-balancer.

See https://drewdevault.com/2021/07/03/How-does-IRC-federate.htm...


Part of it is that for users IRC definitely presents as centralised. You don't usually connect to a specific server, but rather to a network that does some load balancing in the background. Like you typically connect to irc.efnet.org and not one of the sixty servers specifically.


KaZaA was also riddled with spyware.


Scale is hard, but for small-to-medium sized clusters you can use your own node, or a friend's. Now you have your own copy of the data, and control over latency. The key insight with nostr is the multi-master architecture.


Relays can censor as much as they want, since they're only stewarding a small part of the network, likely replicated elsewhere. As to building in ways to curate/moderate content in a distributed way, there are lots of ideas out there as to how that might happen. Here's an issue I opened just the other day: https://github.com/nostr-protocol/nips/issues/75


Yes this is the crux of any social media application. I don't know if there will ever be a perfect solution.

I like that nostr abstracts this problem away from the relays. Relays only focus on storing data and handling subscriptions. They can choose to censor and/or curate content if need be, but it's not their concern.

It's up to the client to come up with a solution, and that client can be a platform or a protocol of its own.

edit: it also feels really great to work on this problem from the application layer. I can come up with a solution that isn't confined to the parent protocol.


I'd like such a protocol to be designed so servers, relays, etc. are unable to censor content at the protocol level (if someone creates it, it is available) and the filtering is done on the client side.

Commonly filtered things (account block lists, post flag lists, filter rules, etc.) could be shared via the same system — indeed there could even be competing versions and everyone could follow their preferred filter source.

Users would also likely run statistical and machine learning based spam and content filters locally (perhaps on a personal relay/server of some sort, or an account on a shared one) configured to their preferences.

I would expect the infrastructure running such a network to be in the same position as Signal, who do not know the content of messages and can't censor them, leaving individual clients to figure out blocking etc. (albeit the client-side options, as well as ways to share configurations etc., would need to be much more advanced for a social network than for a messaging app).


Just mute them. Or just follow who you follow with no suggestions of other people. Or relays can have a censorship policy based on the law or community standards or anything else they want... and the people will use whatever relays work for them (typically multiple relays to follow multiple crowds). Some people want censorship, some don't, the protocol is totally agnostic on this point.


I noticed you haven't shared your email address in your profile. Surely if muting is sufficient you would accept that it's a risk-free action?


> or because there are some pretty solid laws about things that various governments require people to filter out?

There are solid laws protecting copyright everywhere, yet it is still trivially easy to find copyrighted content available for free online. Laws don't mean anything unless they are or can be enforced.


[flagged]


Letting other countries run intelligence ops on you isn’t actually beneficial, even if you enjoy it at the time.

(No, the president’s son didn’t actually leave a laptop at the repair shop of a blind man in a different state and then not come back for it. And while some emails were verified through DKIM, the one you specifically mentioned wasn’t.)


What stake do you have in either of the debates that you're referencing? When was the last time a politician's family member or a trans person did anything good or bad for you, personally?

If you didn't allow social media to get you invested in these manufactured pseudo-political pseudo-events, you'd be free to realize that Trump and Biden are both obvious idiots and crooks; that people's gender identities, biochemical makeup, and surgical history are obviously completely irrelevant to complete strangers; and that many other things that you waste your time constructing your tribal identity around are obviously complete bullshit that someone artificially planted into the discourse for their own benefit. That someone couldn't care less about your best interests as a person, anon291.

The problem isn't that Twitter and Facebook censor one side of those debates and promote the other. The problem is that people are using them in the first place! That way, social media is the thing that forces those debates to exist in the first place - by making you feel like you have a stake in events that are completely remote from your life. Whose agenda is being promoted is tangential to the fact that these unelected, unaccountable corporate bodies have entrenched themselves in a position where they can basically replace our individual world-models with this sort of outrageous nonsense. That's what's fucking democracy up, and in the most elegant way, too: your right to make free choices between alternatives is preserved, but only inconsequential choices are ever presented.

On the Internet, nobody knows you're a dog; on corporate social media, nobody cares that you aren't a dog.


> that people's gender identities, biochemical makeup, and surgical history are obviously completely irrelevant to complete strangers

At least in the UK, this became a matter of renewed public interest for two reasons: firstly, a proposed reform of the law to remove all gatekeeping from the process of changing one's 'legal sex', and secondly, an appalling case of several imprisoned women being sexually assaulted by a man (Karen White) who had been incarcerated alongside them due to having a 'legal sex' of female.

It was left-wing feminist groups, who organised largely offline to begin with, that reignited this debate. This wasn't some artefact of social media raging, it was a grassroots effort to halt and reverse a change in the law due to its clearly negative effects on women.

This is also a debate that has been ongoing for decades, long before social media websites even existed. Janice Raymond wrote what turned out to be a quite prophetic book on this topic in the 1970s, for instance. Renée Richards was stirring controversy in women's tennis at around the same time. Much of what you'll hear on this topic these days has already been covered by radical feminists for many years prior.


You make valid points - but I don't think the author of the post that I was replying to would be able to appreciate the nuance.

By bringing up the Karen White example, aren't you basically saying it would somehow be less appalling if it weren't a trans person perpetrating the assault? Because I thought this sort of thing was abhorrent regardless of the salient details of a particular case?

Social media only makes it easier to focus on "which cage should we use for transgender people", and so much harder to ask ourselves "why are we putting people in cages". Or, as per the other example, "did Hunter Biden really lose his incriminating laptop?" vs "why are we letting ourselves be governed by people with familial ties to criminals?"

Social media and the polarizing meaningless debates that it enables serve the purpose of precluding people from focusing on the latter kind of question. (Which is already hard enough as it is, because it involves actual thinking.) If the public conversation is retreading ground that was already covered in the 1970s like you say, doesn't that mean that our society is regressing? Shouldn't be worrying first and foremost about that, since that's where we'd find the root cause of all the more specific issues?

How many people feel good for having the correct in-group opinions, while their contribution to e.g. the trans rights debate only goes as far as canceling JK Rowling, or, conversely, going to a Jordan Peterson talk? How many people have even heard of the actual examples you mention, as compared to the number of people who only know "uhh, so there's a debate on the Internet about some abstract hot button issue, and I'm required to pick a side in order to participate in society"?

A couple years from now the topics may be completely different, they'll just find another scapegoat or another thorny bioethical edge case, but the medium of debate will still be largely the same ol' Internet, and the AIs will only have become more effective at sowing discord.


Wow, nice "people are saying" play. Every single time the laptop story comes up it's in a butthurt way that is hilariously selective in its indignation.

Misinformation/disinformation is a thing and is not considered healthy for discourse. It also seems to be willingly embraced when it serves one's biases. Pity.


The laptop story would not have survived for two-plus years were it not for the fact that Twitter and Facebook both took unprecedented steps to censor it. Like literally... if you want something relegated to ignominy, don't do what Facebook and Twitter did.

Plus, at the time of the censorship, Facebook and Twitter claimed that the misinformation was that the laptop did not belong to Mr. Biden. At this point, every news outlet of repute has admitted the laptop belongs to Hunter.


> Misinformation/disinformation is a thing and is not considered healthy for discourse.

The stupid laptop being a Russian forgery was misinformation/disinformation that affected an election. The NYT and WaPo have faced up to it by now, but middle-aged extreme partisans will post "cope" like they're millennials from 2015.

Or "butthurt."


I don't agree that Nostr is P2P, since clients must connect to relays and there's currently no provision for client-to-client connectivity.

I found out about Nostr and, due to the simplicity of the protocol, was able to start building right away. I implemented a relay in TypeScript: https://github.com/Cameri/nostr-ts-relay

I also wrote this dead simple SMTP-to-Nostr gateway: https://github.com/Cameri/smtp-nostr-gateway


How does your client (or any nostr client) deal with discoverability?

I'm not piling on nostr here; that's an issue with ActivityPub as well as most decentralized platforms/protocols.


You can look someone up like bob@example.com and if he publishes his nostr information in a well-known file, you'll know where to follow him and what his public key is: https://github.com/nostr-protocol/nips/blob/master/35.md

Of course this doesn't work for people wishing to remain anonymous.
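The well-known-file lookup amounts to fetching a small JSON document from the user's domain and reading the name-to-pubkey mapping out of it. A minimal sketch (the pubkey is a placeholder, and the fetch itself is omitted):

```python
# Shape of the document served at
# https://example.com/.well-known/nostr.json?name=bob
# (the pubkey below is a placeholder, not a real key).
doc = {"names": {"bob": "f" * 64}}

def resolve(document: dict, name: str):
    """Map the local part of bob@example.com to a hex pubkey,
    or return None if the name is not published."""
    return document.get("names", {}).get(name)

print(resolve(doc, "bob"))
```

The client then verifies that events claiming to be from bob@example.com are actually signed by that pubkey, so the domain only provides discovery, not trust.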


I'm not working on a client. But some of the ones I've used have a Global Feed/Explore section like Astral.ninja and Damus.

I wrote a simple directory you can use to find users and channels here: https://nostr-fzf.netlify.app


A bit disappointed about that actually. For a moment I thought someone had done a sane version of Scuttlebutt, but alas :)

Though at this point I probably wouldn't use that either.


Which part of Scuttlebutt do you find not sane?


Last time I checked it out it depended on JS-specific behaviour which had to be replicated in other languages. It seemed like something was written first, and then they decided to extract it into a spec.

Maybe this has been cleaned up in the interim, but at the time a bunch of apps for it already existed, so it was probably too late to go back.


That, and the insistence on a 1:1 device-to-identity mapping (so you can't use the same account on multiple devices). And verifying an append-only log, so "show me Jane's recent posts" can't happen until the client has downloaded and verified signatures for Jane's entire post history going back years.


I'm beginning to think that ActivityPub will never add anti-censorship features to the core protocol (for example, independent IDs that don't live on a relay server). A couple of years ago I spoke with a core ActivityPub developer about the issue of servers going offline and losing your account, and he said he was working on it.

But I wouldn't be surprised if he was discouraged from continuing this work because ultimately people like ActivityPub because server admins can be little dictators that censor and ban at will.


> because ultimately people like ActivityPub because server admins can be little dictators that censor and ban at will.

ActivityPub isn't the only distributed thingy to learn that lesson. There's a reason the various ultra-free-speech things hosted on the dark web never take off.


I think the architecture is the issue. Nobody wants their account owned by a random guy running a server as a hobby.


Of course they won't add anything, and even if they did, no one is going to implement it.

Mastodon, Pleroma et al. are not really ActivityPub implementations. Each of the fediverse clients implements some random 10% of AP and just tries to stay compatible with the others using dirty hacks.


A huge advantage in many large projects is separation of concerns, particularly protocol (or API, etc.) vs. policy.

From the sounds of it, you have the protocol but not the policy. That they are separate is by itself huge, but now the clients (?) need flexible policy, no? Otherwise it's just going to turn into a billion people all talking in the same room, or you're going to have a ton of tiny rooms with no activity. The discoverability of interesting rooms will be difficult. It's sorta the IRC problem in a nutshell (or Discord, etc.). Balancing the noise vs. the quiet is the difficult part (AFAIK).


Still trying to understand how it all works, but maybe there could be some agreed way of creating rooms, like naming them after subreddits.


So reading the github page this sounds basically like a variation on RSS.

* There are a number of web servers that host content (either for free or for money), called relays.

* Clients download recent posts from relays.

* Identity is based on a public key, allowing users more control and the ability to easily change relays.

So is RSS + pubkey based identity the right way to think about this?


That's roughly my first thought too: the project seemed like pubsub with message authentication and without federation. I was about to bring up Atom and RSS, mailing lists, NNTP, XMPP, ActivityPub, and Matrix as examples of existing protocols capable of that (to different extents, and usually with more functionality), and thought to add Mastodon to the list too. But the README already says Mastodon is undesirable because of the dependence on domain names and administrators (and elsewhere mentions the undesirability of users running servers, or of having many servers, while apparently judging simplicity from the point of view of in-browser JS), that the servers aren't run well, and so on. The author seems pretty opinionated and enthusiastic about this protocol, so I guess those existing protocols wouldn't satisfy all those requirements either.

I initially held off on posting this, since it's possibly rather grumpy and the authors seem to be having fun with the protocol, but here it is.


The story around relays seems to me like the most suspect part of this when it comes to scaling.

The protocol doesn't want users to run their own servers, but there also aren't really any incentives to run relays, so what happens when it gets big enough that running a relay is expensive or non-trivial? I feel like it would just fall back to users running their own servers, like in Mastodon.


That's not a problem if the server owners don't have any power over users.


They still have to exist. The usual reason for existing is to exploit their power over users. When it's cheap to host, people will do it out of the goodness of their hearts; once it hits scale, the goodness tends to disappear.


That is how I think of it. RSS with user-created public key identities. Lots of client fan-out to lots of relays. And a straightforward way for anyone to post (unlike RSS)

Relays have to figure out how not to get smashed with too much data. I predict they will require an account/login at some point. But you can post to multiple relays and drop relays that don't serve you well at any time.


I like to think of it more like (old) Usenet with pubkeys. It gets more at the distributed database factor.


Not saying this is the future, but something like it is. All of the core decisions here are solid (pub key identities, signed events, dumb relays).

There are still features that many apps will need such as tying multiple devices to an identity, abuse prevention for relay operators, etc.


“All of the core decisions here are solid (pub key identities…”

I agree, except for the bit about public keys as identities.

I think public key identities are a step in the right direction, but there’s still a gap between that and what the ultimate solution is going to wind up being.

We need to have some layer of indirection between user identities and public keys so that users can do things like rotate keys, have multiple keys, and recover their identities.

I don’t know what the right solution to that is; I think it’s an open problem and probably one of the most important ones to solve. Keybase probably came closest to a good solution, but it wasn’t decentralized.


I was reading about algorand rekeying today, as well as DIDs and atproto/bluesky.

Both seem to use a "signed rotation" approach. Algorand keeps your public key stable while adding metadata that your spend key has changed and links the two. Atproto similarly uses the recovery key to sign a rotation op which can regenerate your signing key, additionally resetting the tree to its pre-attack state (by setting the rotation's prev to the last pre-compromise state).

This seems like an improvement of some kind, but still leaves gaps for lost keys. Keybase style approach, or multisig social recovery may also help.


Until Algorand can remove the CGO requirements and libc JS dependencies, I hope it won't ever be considered for something like this. Let's also not forget about their terrible management.


I wasn’t suggesting either of the technologies wholesale. The “signed rotation” commonality seemed interesting, with subtle differences. I’m curious to see where DIDs go, I’ve seen those crop up a few places.

UCAN also seems interesting, JWT with extra steps and attenuation. But orthogonal to this issue for the most part.


Is there a way to do what you're suggesting with identities? I don't think there is. How are you going to rotate keys without a master key?

And even if you're ok with the master key, the only way to solve this without centralized providers is with blockchains. A blockchain for rotating keys doesn't make sense.

But I do want to know if you're ok with a master key and subkeys that can be rotated.


“Is there a way to do what you're suggesting with identities”

There are certainly solutions, but I don’t know what the best solution is, hence why I called it an open problem.

An example solution would be something like having your identity be a hash of your initial public keyset, making each key have a set expiration date, adding new keys by signing them with one of the existing keys, and then storing all of the rotation operations in a transparency log.
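The transparency-log idea above can be sketched as an append-only hash chain. To keep the sketch stdlib-only, the "signed by an existing key" step is only checked as a membership test; a real system would verify an actual Schnorr or Ed25519 signature over each rotation record:

```python
import hashlib

def h(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Identity = hash of the initial public keyset (keys here are toy strings).
initial_keys = ["key-A", "key-B"]
identity = h(",".join(sorted(initial_keys)).encode())

log = []  # append-only rotation log

def add_key(new_key: str, signing_key: str, current_keys: list):
    """Append a rotation record authorizing new_key; each record chains
    back to the previous one (or to the identity itself)."""
    assert signing_key in current_keys, "rotation must be authorized by a live key"
    record = {"new": new_key, "by": signing_key,
              "prev": log[-1]["digest"] if log else identity}
    record["digest"] = h(repr(record).encode())
    log.append(record)
    current_keys.append(new_key)

add_key("key-C", "key-A", initial_keys)
print(len(log), initial_keys)
```

Because each record commits to its predecessor, anyone replaying the log from the identity hash can detect tampering or forked histories, which is the property a transparency log provides without a blockchain.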

“the only way to solve this without centralized providers is with blockchains”

That’s not true; you probably want a transparency log, but that doesn’t require blockchains.


Vitalik wrote a bit about ways blockchains can help with identity systems here: https://vitalik.ca/general/2022/06/12/nonfin.html


My vote is for extended keys, something based off HD Wallets:

https://github.com/bitcoin/bips/blob/master/bip-0032.mediawi...

Easy rotation and recovery of individual keys, but you do have to protect your master seed.

Nostr also supports user verification through DNS hostnames.

https://github.com/nostr-protocol/nips/blob/master/05.md


How can you rotate that? No one knows the second key is related to the first. You still need to publish your second key somewhere along with an invalidation certificate for the first key.


They do with the extended key as it includes the chain code.


Rotate keys: old key signs an event indicating that it is being rolled into a new key.

Multiple keys: nothing to change. Works like that now.

Recover your identity: well, if you want a well-known identity, use NIP-05/NIP-35 and just change your .well-known/nostr.json file to point to your new identity, the one that hasn't been stolen. Hopefully the nostr clients of your followers will respect that (who knows what programmers will actually do).

I think these problems are easier than you think they are.


Something like passkey?


A couple of ideas that have been tossed around for relay abuse prevention:

- Proof of work: computing some hash, which is not enough to be onerous but enough to reduce spam

- micropayment over Bitcoin lightning network
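The proof-of-work idea is classic hashcash: grind a nonce until the hash of the message clears a difficulty threshold, so one post is cheap but a flood is expensive. A toy sketch (the message format and difficulty here are made up for illustration, not any Nostr spec):

```python
import hashlib

def leading_zero_bits(hex_digest: str) -> int:
    """Count leading zero bits of a hex digest."""
    bits = bin(int(hex_digest, 16))[2:].zfill(len(hex_digest) * 4)
    return len(bits) - len(bits.lstrip("0"))

def mine(message: str, difficulty: int) -> int:
    """Find a nonce so sha256(message:nonce) has at least
    `difficulty` leading zero bits."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{message}:{nonce}".encode()).hexdigest()
        if leading_zero_bits(digest) >= difficulty:
            return nonce
        nonce += 1

nonce = mine("hello relay", 12)  # ~2^12 hashes on average
print(nonce)
```

The relay only needs one hash to verify, while the sender pays the grinding cost, which is the asymmetry that makes it an anti-spam measure rather than wasted work per se.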


Proof of work via computation should be a non-starter for anyone who cares about limiting environmental damage.


Bitcoin is carbon-negative believe it or not. Not that CO2 emissions are a meaningful measure of environmental stewardship.


I absolutely, unequivocally do not believe it.

Bitcoin consumes 111TWh annually, the power consumption of the Netherlands, and emits 62Mt of CO2 per year, the same as Belarus. It also yields 42kT of e-waste per year.

Each transaction produces 650kg of CO2, consumes 1160kWh of power (as much as 40 days consumption for the average American home) and produces 450g of e-waste (about the same as hucking your iPad into the garbage can each time you transact on-chain).

97% of all Bitcoin mining hardware will never successfully produce a single block in its entire useful life, going from factory, to space heater, to garbage can - while about 60% of all the power consumed comes from oil, natural gas and coal. So whoever sold producers the offsets I assume you must be alluding to better have replanted the entire Amazon rainforest by now. (Quick spoiler, carbon offsets are also a scam, generally speaking).

To say you're going to need to get some sources is an understatement about as large as Bitcoin's environmental footprint.

You can find all this in [1] or you can just reverse it yourself from the specs of the latest AntMiner and the current hash rate. Some napkin math is all you need.

I'm honestly amazed people still believe something so trivially falsifiable, but with everything else going on in 2022...

[1] https://digiconomist.net/bitcoin-energy-consumption
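The napkin math alluded to above can be sketched directly: derive network power draw from hash rate and ASIC efficiency, then divide by throughput. The input figures are rough late-2022 ballpark numbers of my own choosing, not authoritative data.

```javascript
const hashRateHs = 250e18; // network hash rate, ~250 EH/s (hashes per second)
const joulesPerTh = 30;    // ~30 J/TH for a recent AntMiner-class ASIC
const txPerSec = 3;        // rough on-chain throughput

const watts = (hashRateHs / 1e12) * joulesPerTh; // TH/s * J/TH = J/s = watts
const twhPerYear = (watts * 8760) / 1e12;        // watt-hours per year -> TWh
const kwhPerTx = watts / 3.6e6 / txPerSec;       // kWh consumed per transaction

console.log(twhPerYear.toFixed(0), kwhPerTx.toFixed(0));
```

With these inputs you land at tens of TWh per year and hundreds of kWh per transaction, the same order of magnitude as the figures quoted above; the exact numbers move with hash rate, fleet efficiency, and throughput assumptions.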


> 97% of all Bitcoin mining hardware will never successfully produce a single block in its entire useful life, going from factory, to space heater, to garbage can

This is a really disingenuous point. Mining works probabilistically, and mining pools pay out based on smaller units of work that each have some probability of finding a block for the pool. The fact that a block itself is a large parcel does not make the system less efficient.


The point is a centralized equivalent of bitcoin could run on a raspberry pi. Heck a proof of stake bitcoin could run on a raspberry pi. It’s a ridiculous system even Hal Finney thought was unsustainable 10 years ago.

Value judgement aside the question was “is it carbon neutral” and the answer is a resounding no.


That's an entirely different topic/point. I was responding to your ridiculous point that 97% of bitcoin miners never find a block. You didn't even try to defend that point.

What is your source around Hal Finney? I have not heard that.


> That's an entirely different topic/point.

A system where 97% of all the equipment is never used to do anything useful, and instead just "increases security", is an insanely wasteful system. Nobody has quantified what level of "security" is required. There's no feedback mechanism to pull back based on need, because the need is fundamentally unquantified. It's a grey goo style uncontrolled positive feedback loop.

Each time more miners come online it's lauded as "more security is better" - but how much security do you need? The answer isn't "as much as you can afford, period, no follow-up questions."

You don't have 75 seatbelts in your car because "security." You don't use a dump truck to take your kids to school because "security." And you don't use a global army of computers consuming 100+TWh/yr to process 2-3 tx/sec because "security."

Ultimately it's a half-baked security model, and Bitcoin is a half-baked proof of concept that escaped the lab and gained a cult following.

> What is your source around Hal Finney? I have not heard that.

My source is Hal Finney [1]

[1] https://twitter.com/halfin/status/1153096538


You didn't understand my point. If bitcoin had 1 minute blocks instead of 10 minute blocks, more than 3% of miners would find a block in their lifetime. But that change alone would not change any meaningful properties of bitcoin. You think bitcoin is a total waste, I get it. But that statistic doesn't help your argument.

> The answer isn't "as much as you can afford period no follow-up questions."

You are right, and this isn't how bitcoin works. The amount of mining should be determined by the block reward, market price, and fees per block. Mining in excess of this is not economically rational, and shouldn't happen. So it is not "unlimited security at any cost".

Where in that tweet does Hal characterize bitcoin as ridiculous or unsustainable?


> But that change alone would not change any meaningful properties of bitcoin.

Sure it would, it would change that resource cost per transaction.

> You think bitcoin is a total waste, I get it. But that statistic doesn't help your argument.

I disagree, if 97% of servers at AWS were there for some hand-wavey notion of 'providing security' without quantification I think Amazon would be roundly mocked.

> You are right, and this isn't how bitcoin works.

Yes it is. You just re-stated my position with slightly different wording.

I said the security model was "as much as you can afford." You can afford block reward plus fees times price. The security model is "spend as much of that as you can without regard for what you need to achieve security." The issue is that block reward plus fees times price is not a function of how much security is required.

> Where in that tweet does Hal characterize bitcoin as ridiculous or unsustainable?

I never said 'ridiculous' - obviously I don't think Hal would make that claim, so let's stick to what I did say :) I think it is not unreasonable to extrapolate from his tweet that he believed that, at the limit, CO2 would be an issue. CO2 itself is an issue of sustainability. You can disagree with that interpretation, but I do not think that an average unbiased observer would find my reading unreasonable.

You seem to be trying to win an argument at all costs, putting words into my mouth, running with uncharitable interpretations and arguing in bad faith - in this thread and the other. If you read carefully you'll find I paid close attention not to move goalposts. I'm going to cut it off here. It's particularly silly because you've already admitted you agree to my premise that it is not carbon neutral.

Have a good evening though.


> I disagree, if 97% of servers at AWS were there for some hand-wavey notion of 'providing security' without quantification I think Amazon would be roundly mocked.

Again, you fail to understand that the number of machines that find a block could be changed by making blocks more frequent (and correspondingly smaller, to keep the total number of transactions the same). But the larger point is that mining pools make it so that mining machines are paid for partial work, even without finding a block. This means the utility of mining is measured by hash rate, rather than by number of blocks solved.

You called it a grey goo style positive feedback loop. It is not: there is an upper bound on how much will be spent. Nor is there a feedback loop; if the price goes up, more will be spent on mining, but that does not feed back to make the price go higher, as a loop would require. Moreover, the block reward is exponentially decaying.

Your reading of the tweet is that a person saying a system should use less CO2 implies that person believes the system is unsustainable. I don't think that is reasonable; e.g. an airline CEO launching an initiative to reduce the airline's CO2 usage would not believe that the airline is unsustainable. I did misread the other part of your statement -- you said bitcoin is ridiculous, not that Hal said that.


So much FUD.

1) miners profit from energy surplus. They do not create new power plants. Nobody is firing up a new coal plant to mine bitcoin. That is absurd.

2) miners have incentives to find and use wasted energy. For ex, flare gas recycling. Which actually helps the environment.

3) miners can turn on/off at will, acting as a flexible load on demand. Which means they can help balance out the energy grid, especially with erratic energy sources like wind and solar. This is already being deployed in some states.

You are fighting on the wrong side buddy.


(1) They literally reopened a natural gas plant in New York, where were you? [1] The broader point is that this fossil capacity could just be turned off if we stopped wasting it on a bitmask lottery. You don’t have to bring on new capacity to cause harm (although they are doing that too).

(2) Burning flared methane is better than venting it, but not flaring at all would be best. There’s no such thing as stranded power, there’s only missing transmission infrastructure, which is being disincentivized from coming online by profitable waste at the point of generation.

(3) Raising the baseline usage level through waste, then turning it off when there’s a brownout-level crisis, is just an asinine plan to manage load that costs everyone money. Active demand management is an actual solution. So is grid storage. Also, “some states” is Texas, famously the worst energy grid in the US, home of the blackout. Meanwhile New York is banning wasting fossil fuel energy on miners.

Anyways you seem to bring no data to the table other than a “nuh uh” and some obviously flawed apologism.

No word on the ewaste?

It appears to me you’re using the “facts u dislike” acronym expansion for FUD.

[1] https://grist.org/technology/bitcoin-greenidge-seneca-lake-c...


If the cost of the transmission infrastructure costs more than the economic value of the energy it would transport, then you have stranded power.


Fine, I disagree, but even if I were to concede, surely there’s a better solution we can think of than hooking up a big resistor.

The question is how much so called stranded power is there, what percent of Bitcoin energy usage does this account for and is there really nothing useful we could do with it? Finally with “stranded coal” is it better to just not use it?

This is just a talking point used to distract from the actual issues. In reality the percentage of renewables wasted on bitcoin mining is at an all time low. The question at issue was whether all this is carbon neutral, and no, it’s definitely not.


The original comment that bitcoin is carbon negative is ridiculous, I agree. But you are posting empty/wrong rebuttals as well.

In the case of flare gas, it is an unavoidable byproduct of extracting oil. The oil is economically worth collecting, but the extra gas is not. I think this can be more finely split into natural gas vs methane-- sometimes the natural gas is worth collecting, but the methane never is.

You could argue that there should be laws to force the methane to be collected and used elsewhere as a condition of accessing the oil. That might be overall better. But prior to bitcoin, that perfect solution was not really implemented. In this sense, bitcoin mining of flare gas is a step forward.

Note that the argument is not just that economic value is created by mining bitcoin, but that running the gas through a generator leads to cleaner combustion and thus less greenhouse gas emission than simply lighting the gas on fire. In this sense, the bitcoin flare gas mining operation by itself could be considered carbon negative (again, surely all bitcoin mining is not). The bitcoin mining pays for the generator. So bitcoin mining is giving the industry a subsidy that enables reduction of environmental harm. It is not perfect, but it is an improvement.


> But prior to bitcoin, that perfect solution was not really implemented. In this sense, bitcoin mining of flare gas is a step forward.

Prior to bitcoin they flared it - or didn't. Now there's an economic incentive not to find another solution. Now the oil and gas companies are incentivized to prevent that better solution because there's something in it for them not to. So we make it harder to solve 100% of the problem because this gets us 5% of the way there and pays us not to.


Cool, so you moved the goal posts from saying "there is no such thing as stranded energy", to saying "yes, bitcoin is reducing the environmental impact of flare gas burning, but we should be doing something else instead (no concrete ideas) that can solve this problem even better". I have explained to you how bitcoin can have a net positive impact on the environment, given existing practices. You now say that since this is not a perfect solution, we should go back to harming the environment more, presumably since you have already made up your mind that bitcoin is evil?


Digiconomist’s model is flawed and doesn’t represent actual mining electricity use or carbon output.

Also he’s a central banker, so not exactly motivated to fix it.


> Bitcoin consumes 111TWh annually, the power consumption of the Netherlands, and emits 62Mt of CO2 per year, the same as Belarus. It also yields 42kT of e-waste per year.

it only consumes as much energy as the difficulty of finding new blocks demands. As the price goes down, so does the energy spent finding new blocks. It is not static.


I gave current numbers. Surprisingly (and obviously unfortunately) the price change hasn't had an impact on energy consumption commensurate with the drop. There's a lot that goes into it - for instance cost basis of power: theft of power, graft, corruption, etc. If you're able to steal the power then it doesn't matter much so long as you can afford new miners.

tl;dr: price per coin represents something of a bounding function on consumption of resources (including mining hardware and electricity) but it's not as tightly correlated as hoped.

My point though is the idea all that consumption has been papered over by what I assume is buying some offsets - and now it's magically carbon neutral - is silly and obviously wrong.


Or wanting to operate in New York State as of this week!


Is it forbidden to calculate in New York now? Had no idea.


I suspect you're being intentionally obtuse, but it bears repeating that there are plenty of simple activities that are forbidden by policy, and it would be just silly to argue that there shouldn't be any.


Link to nostr NIPs (nostr improvement proposals): https://github.com/nostr-protocol/nips/blob/master/README.md

One of the neat things about nostr is that while it has already been used to build a decentralized Twitter like social network, the protocol could also be used to build encrypted P2P chat, traditional discussion forum, alerting/push style notifications, and numerous other applications.
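All of those applications ride on the same primitive. This is roughly what a NIP-01 event looks like on the wire; each use case is just a different `kind` plus conventions for `tags` and `content`. The field values below are placeholders.

```javascript
const event = {
  id: '<sha256 hash of the serialized event, hex>',
  pubkey: '<author public key, hex>',
  created_at: 1669400000,  // unix timestamp, seconds
  kind: 1,                 // 1 = short text note; other kinds cover metadata, DMs, etc.
  tags: [['e', '<id of the event being replied to>']],
  content: 'hello nostr',
  sig: '<schnorr signature over the id, hex>',
};
console.log(event.kind, event.content);
```

Since every event is self-contained and self-signed, relays can stay dumb: they store and forward these objects without needing to understand the application on top.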


Might be worth going with "enhancement proposal" instead so you don't call your governance documents "nips".


They should go all in to find an acronym for the full word.



I was pretty bullish on Nostr back then, and still am.


If everything goes through relays then is it really P2P? Why not even try to have a direct connection of any sort, such as WebRTC?


You can argue that it is not true P2P, since you rely on a public-facing intermediary.

A lot of p2p protocols cheat with relays; it is really hard to traverse NATs otherwise.

Nostr can be used for peer discovery to bootstrap a direct p2p connection.

You could also use a client/relay hybrid application, similar to other p2p networks. That would be fun to build. :-)


Is email p2p? Can you configure multiple relays like MX records for email? Can a receiver be its own relay?

Relays are important for two reasons: peer discovery and communicating when one of the parties is offline. Same as with other p2p networks.


I mean, the pure P2P solution would be supernodes.

Edit: I RTFA. Sounds like relays can be run by anyone. That sounds p2p enough to me.


Cool! How do you discover users to follow on nostr?


Great question! Relays aren't involved in curation or discovery, so that falls on the client.

You can request very broad subscriptions from relays! For example, here is a site that subscribes to everything, showing you a god's-eye view of events streaming into a relay:

https://nostr.info/relays

Events have different "kinds", so you can filter this based on the type of traffic you are looking for (like public posts or user profiles).

Platforms like damus.io are more user-friendly, and offer better tools for discovering users and content.

You can subscribe to a user's feed via their pubkey, so discovery methods typically revolve around learning pubkeys.
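A sketch of what that subscription looks like in practice (NIP-01): the client sends a ["REQ", ...] message over a websocket with a filter object. The relay URL and pubkey below are placeholders.

```javascript
const filter = {
  authors: ['<64-char hex pubkey of the user you follow>'],
  kinds: [0, 1], // 0 = profile metadata, 1 = short text notes
  limit: 20,
};
const req = JSON.stringify(['REQ', 'my-sub-id', filter]);

// In a browser you would then do roughly:
//   const ws = new WebSocket('wss://relay.example.com');
//   ws.onopen = () => ws.send(req);
//   ws.onmessage = (ev) => console.log(JSON.parse(ev.data)); // ["EVENT", ...]
console.log(req);
```

Dropping the `authors` field is how the "subscribe to everything" firehose view above works: the filter simply matches every event the relay has.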


Makes sense, thank you!


How do you discover users to follow on Twitter?

I doubt anyone has ever successfully signed up on a social platform and just followed the big names that are suggested automatically at the beginning, or based on some "key interests" you select.

But hey, if you want that, it's easy for a third-party website to grab a ton of public Nostr data and build custom recommendation lists and whatnot.



What does "built by builders" mean?


Crypto people talk like this to distinguish real projects from ones that were purchased off the shelf and launched by marketers/hustlers. I’m guessing the guy who wrote this is a crypto person who doesn’t realize the slang is so local.


Guilty >_>

It feels nice when a protocol has its core devs on the same wavelength as application devs. That's the feeling I get with nostr.


Even if you bought something off the shelf, it was still built by someone (presumably “builders”), even if that someone isn’t you.


True!


shill: One who poses as a satisfied customer or an enthusiastic gambler to dupe bystanders into participating in a swindle.

It's not shilling, it's recommending. Shilling is a bad thing; recommending isn't. It's that simple.


I was being a bit dramatic with my choice of words but you are absolutely right :-D


> A relay doesn't talk to another relay, only directly to users.

Can you elaborate on this point? It would seem that meshing relays would facilitate the dispersal of updates.


People generally post to 5 or so relays for censorship resistance. If you want to follow them, you need to query at least 1 of those 5 relays.

Nothing in the protocol specifies relay-to-relay communication, but nothing stops them either.
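The fan-out described above can be sketched like this: the same signed event is written to several relays, and readers only need to reach one of them. The URLs and event object are placeholders.

```javascript
const relays = [
  'wss://relay-a.example',
  'wss://relay-b.example',
  'wss://relay-c.example',
];

// Prepare one ["EVENT", ...] publish message per relay. Actually sending it
// would look roughly like:
//   const ws = new WebSocket(url);
//   ws.onopen = () => ws.send(msg);
function fanOut(event, urls) {
  const msg = JSON.stringify(['EVENT', event]);
  return urls.map((url) => ({ url, msg }));
}

const jobs = fanOut({ id: '<event-id>', kind: 1, content: 'hi' }, relays);
console.log(jobs.length); // one publish job per relay
```

Because the event carries its own signature, it's byte-identical on every relay, so deduplicating by event id on the client side is trivial.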


The inability of people to spell Mastodon correctly will always surprise me. Also, isn't « nips » slang for nipples?


It is, but it also means other things too: nip in the bud, a nip of alcohol, to nip off with something, to nip by, etc.


fixed >_<


I don’t understand -- this isn’t a P2P protocol, since it isn’t peer to peer by definition. And the “relays” don’t even relay anything ... a “relay” is a server acting as a drop box that you have to query.


https://github.com/nostr-protocol/nostr literally says "it is not P2P".


How do I discover people and follow on https://branle.netlify.app/ ? I can’t figure it out


Try astral.ninja, it has support for a global view.


So, Usenet built out of modern tech? Nice:)


Check out more-speech (a Nostr client) being built by Robert C. Martin (Uncle Bob).


At this point I don't see much point in adopting something that doesn't support ActivityPub. I'd rather just use Mastodon/Pleroma/Akkoma with some heavy blocklists.


I've been looking at alternative implementations since I don't want to run the entire Mastodon app for just myself. The fact that every single ActivityPub implementation runs into so many interoperability issues, such that they have to be fixed to work with each other system by system, is a sign to me that the protocol is either too complicated or not robust enough (probably both).

There is tremendous value in a much simpler protocol, especially if it can deal with the identity migration issues that Mastodon has faced since day one.


> so many interoperability issues [...] the protocol is either too complicated or not robust enough

From what I've read, it sounds like the interoperability issues come from not implementing AP the same way as Mastodon rather than anything specifically protocol-related. Hence, e.g., Pleroma going out of its way to be Mastodon compatible as a design goal and GoToSocial having lots of issues because it hasn't yet gone that route[1]

[1] e.g. they have an issue where some client apps don't work with GTS because GTS doesn't advertise a sufficiently high version number - because the apps are looking for "v > 2.10, since that's when Mastodon introduced feature Z".


I wouldn't say that. In practice, everyone tests Mastodon first since it is the most popular implementation.

Here are the ones gotosocial has issues with: https://github.com/superseriousbusiness/gotosocial/projects/...

Not to say I don't believe you about Mastodon, I am sure they have their quirks as well.


A stupid simple message relay protocol can be used for stuff other than social media.

OTOH websockets are hard outside the browser :(


Maybe years ago, but now WebSockets are dead simple to use in virtually every language. Heck, you can even pipe to a WebSocket server from the shell using websocat.


Websockets are almost trivial now if you just use a library


Actually, websockets are trivial, full stop. I implemented a WebSocket proxy in C in a day by reading the excellent RFC (sure, there's weird shit in there, but it's not hard).


Isn't this just Urbit with extra steps?


Cool



