Hacker News new | past | comments | ask | show | jobs | submit | rys's comments login

Just in case anyone reads the parent and the takeaway is you can’t: you can still operate UniFi that way if you want to. The cloud connection and apps are optional.


Interesting. Did that change at some point in the past? I'd seen reports from folks suggesting that you were stuck either using the cloud management bits or using their app, and that a plain local web connection didn't work. Does Ubiquiti equipment work entirely with a local web connection now, with no app or cloud, for all their functionality (except, obviously, cloud management functionality)?


The answer depends on which piece of equipment, usually by product family. More and more is getting the cloud push over time though.


Ah. That kind of variability is the kind of thing that makes me want to avoid the entire brand.

Any recommendations for comparable equipment that doesn't have any kind of push towards cloud or app?


Agreed. As to the world lived in: most people are so enthralled with money that any risk to earning more in the future is unacceptable to them, sadly. Companies then bank on that behaviour to help sweep things under the rug.


The mention of OLED is sideways to the network stuff; it just so happens that the device he chose has an OLED screen.

The protocol is FreakWAN, so there’s no TCP/IP involved. LoRa devices send messages opportunistically and listen out for other messages they can understand and then maybe act on if they choose to. It’s about as simple as you can imagine.
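The receive side of that kind of flood network really is that simple. Here's a rough sketch (names and details are made up for illustration, not FreakWAN's actual code): each node remembers message IDs it has heard recently, and only acts on or relays ones it hasn't seen.

```python
import time

class OpportunisticNode:
    """Toy sketch of a FreakWAN-style node: no TCP/IP, no routing tables.
    Every message is broadcast; nodes relay what they haven't seen yet."""

    def __init__(self, node_id, seen_ttl=300.0):
        self.node_id = node_id
        self.seen = {}            # message id -> time first heard
        self.seen_ttl = seen_ttl  # forget old ids so memory stays bounded

    def on_receive(self, msg_id, payload, now=None):
        """Return the payload to act on / re-broadcast, or None to stay quiet."""
        now = time.monotonic() if now is None else now
        # drop entries older than seen_ttl so the seen-set stays small
        self.seen = {m: t for m, t in self.seen.items()
                     if now - t < self.seen_ttl}
        if msg_id in self.seen:
            return None           # already heard it: don't echo
        self.seen[msg_id] = now
        return payload            # new message: relay it opportunistically

node = OpportunisticNode("n1")
print(node.on_receive("msg-1", "hello", now=0.0))   # relayed: hello
print(node.on_receive("msg-1", "hello", now=1.0))   # duplicate: None
```

No connections, no acknowledgements, no addressing beyond what's in the payload; duplicate suppression is the only state a node has to keep.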


Isn’t DW state-owned by Germany?


They use commodity LPDDR4X or LPDDR5 depending on SKU, for M1 and M2.


Thanks! I missed that.


Do we know how he cheated in the in-person over the board games? That’s the most fascinating aspect of the story now.


One thing that struck me as I read through was that the computational cost of TLS 1.3 is so high that it becomes infeasible to run on the slower machines, resulting in timeouts and failed connections while setting them up.

Yet those old machines had full GUI OSes and performant apps and snappy interactivity in their day. Is the sheer amount of computation required to facilitate TLS 1.3 really more than what was needed to run entire interactive GUI operating systems of that era?

Kind of mind boggling if so.


My Amiga back in the day had a very snappy GUI. But I remember it took a minute for a 512px JPEG to decode using FastJPEG, which was written in assembly language, and I'd leave my computer running all night to draw a Mandelbrot set or perform a simple 3D render.

Not to mention that if you wanted to use floating point arithmetic, you had to do it in software via a library, or use fixed-point routines. Floating point units were expensive hardware add-ons that were eventually integrated with the CPU.

It is to the credit of the GUI toolkits on those machines that they seemed so fast, until you tried to do serious computation. Then you became fully aware that you only had a 7 MHz CPU!
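For anyone who never had to do it, here's a rough sketch (Python, purely for illustration) of the kind of integer-only fixed-point math those software routines used, applied to a single Mandelbrot pixel. Multiply that inner loop by a few hundred thousand pixels and it's easy to see where the all-night renders went.

```python
# Q16.16 fixed point: 16 integer bits, 16 fractional bits, integer math only.
FRAC = 16
ONE = 1 << FRAC

def to_fix(x):            # float -> fixed (only used to set up constants)
    return int(x * ONE)

def fmul(a, b):           # fixed * fixed -> fixed, rescaled after multiply
    return (a * b) >> FRAC

def mandelbrot_iters(cr, ci, max_iter=256):
    """Escape-time count for one pixel, using only integer ops --
    the kind of loop a 7 MHz CPU without an FPU would grind through."""
    zr = zi = 0
    four = 4 * ONE
    for i in range(max_iter):
        zr2 = fmul(zr, zr)
        zi2 = fmul(zi, zi)
        if zr2 + zi2 > four:
            return i              # escaped the |z| <= 2 circle
        zi = 2 * fmul(zr, zi) + ci
        zr = zr2 - zi2 + cr
    return max_iter               # assumed inside the set

print(mandelbrot_iters(to_fix(-0.5), to_fix(0.0)))   # bounded orbit: 256
print(mandelbrot_iters(to_fix(1.0), to_fix(1.0)))    # escapes at iteration 2
```

Every "multiply" is an integer multiply plus a shift; do that two or three times per iteration, a few hundred iterations per pixel, and the arithmetic alone adds up fast on a machine with no hardware multiply-accumulate, let alone an FPU.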


I think it's forgotten just how simple a basic GUI is, graphically. Of course things look nicer now, but still, it's not a very complex thing.

On the other hand, it also always makes me question what we're really doing these days. Some tricks are nice and make things better, but how much is wasted across billions of devices every day...


That things look nicer now is highly debatable. More complex, perhaps, but that's about it. How many pages have I seen scrolling past on HN showing how the proper "toggle button" needs to work and animate in order not to be confusing, where a checkbox still looks infinitely better to me? Or overlay scrollbars, which save essentially no space (it's given to padding instead!!!), hide the position, and are more difficult to grab?

I still think the W95/Motif-CDE/late Amiga years had very decent UIs even by today's standards. Especially on the Amiga, everything was expected to respond with minimal latency. Everything was absolutely super-responsive. The UIs were legible, functional, and had tremendous information density.

That is the total opposite of today.


And they had hardware acceleration for graphics (not a GPU as we understand it today, but the graphical composition was done in hardware).


You can have a complete TCP/IP stack and an embedded web server in an 8-bit micro using only a handful of kilobytes of RAM. This used to be (still is?) a good way to serve a super-simple configuration page to the user without having to connect through a serial cable and without too many additional parts.

There's absolutely no chance that 8bit can do TLS though, and with browsers refusing to connect without https what is often done in the embedded space is to have a separate mcu which is just doing wifi+tcp+webserver. That mcu is 32bit or better (any of the espressif families for example), often eclipsing the power of the rest of the entire system.


I think a better conclusion is how much more resource intensive entire interactive GUI operating systems have become since. Cryptography inherently relies on algorithms that are as secure as possible given currently reasonable hardware requirements. More security means bigger keys, more cycles, more work.

Windows 95 could boot on 4 megabytes of RAM. It could work comfortably on 16 megabytes of RAM! Visual Studio 5, an entire systems IDE, ran comfortably with 32MB of RAM and a Pentium chip! Even if you ignore the extended screen buffer, you'll be hard pressed to find even a basic text editor that will run that lean. What's notepad.exe even doing with all that memory?

You don't need all that much for a GUI if you optimise well. Today's GUIs are built to be quick to develop, not easy to run, because everything needs to be done quickly and only optimised later if people complain about it. Real sad, in my opinion.

Sometimes I wonder what my system is doing with the 2 to 3 gigabytes of RAM, and the five percent of a quad-core CPU running at several gigahertz, that it's using right after boot; it's certainly not running 500 times the complexity of Windows 95!


> Yet those old machines had full GUI OSes and performant apps and snappy interactivity in their day. Is the sheer amount of computation required to facilitate TLS 1.3 really more than what was needed to run entire interactive GUI operating systems of that era?

Yes.

Some is raw speed - we are talking about going from 10 MHz, 16-bit (on a 286 or 68020) to 2.4 GHz+, multiple cores and 64-bit. This probably sums it up best: you needed a processor capable of about 2 MIPS (386sx, 68020) (yes, MIPS is a naive benchmark) to run a color GUI nicely. Your bog-standard modern i7 laptop does around 100,000 MIPS.
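Back-of-envelope on those figures (both MIPS numbers are the naive benchmarks quoted above, not anything rigorous):

```python
# ~2 MIPS for an early colour-GUI machine (386sx/68020-class)
# vs ~100,000 MIPS for a modern i7-class laptop.
old_mips = 2
new_mips = 100_000
print(new_mips // old_mips)   # 50000 -> ~50,000x raw instruction throughput
```

So even before counting wider registers, vector units and multiple cores, the raw throughput gap is around four and a half orders of magnitude.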


Yup, the post makes it very clear that running modern crypto with reasonable speed requires quite a beefy machine by early-1990s standards. 40 MHz clock speed is mentioned as a reasonable minimum.


What if curvecp had won? Perhaps it still has a chance.


The IMEI is trackable and triangulatable geographically as it moves around the network. So it still helps determined actors to find the phone, and therefore maybe the user.


The same can be done with just a SIM though.

What IMEI gives you is the ability to track the same phone across SIMs.


Yes, but so is the IMSI, from the SIM card, which is actually the primary identifier used for most things within the protocol. The IMSI determines which phone number gets connected to the phone, who gets billed for phone calls, etc.

The main scenarios where the IMEI should be interesting to law enforcement are: stolen phone situations, criminals that keep getting new burner SIMs while continuing to use the same phone, or people that keep prank calling emergency services without having a SIM installed.

In nearly all other cases, the IMSI is a far more useful identifier. And obviously the phones in question all have unique IMSIs, or the network would be having, err... great difficulty with all of those phones having the same phone number and being able to route calls.


Determined actors? Like Tom Cruise? He seems pretty determined.


Maybe it wasn’t meant to be taken literally, but it’s difficult to believe he listened to tens of thousands of tracks, which would amount to almost 10 weeks of non-stop music on the conservative end of an estimate.
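Rough numbers behind that estimate, assuming 20,000 tracks (the conservative end of "tens of thousands") at an average of 5 minutes each:

```python
# Back-of-envelope: how long would listening to "tens of thousands"
# of tracks actually take, non-stop?
tracks = 20_000
minutes_per_track = 5
total_minutes = tracks * minutes_per_track
weeks = total_minutes / 60 / 24 / 7
print(round(weeks, 1))   # 9.9 -> almost 10 weeks of non-stop music
```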


I was unsure and looked it up, so in case it helps someone else: the person that died in Costa Rica is Mircea Popescu.


Yes, and our friend Pomp left this delightful message on Twitter to mark his passing.

> Mircea Popescu, a Bitcoin OG, has passed away. He likely owned quite a bit of bitcoin. We may never know how much or if they are lost forever, but reminds me Satoshi said: "Lost coins only make everyone else's coins worth slightly more. Think of it as a donation to everyone."

Disgusting. [Edit] To be clear, I'm reacting negatively to his choice of framing Mircea's passing in terms of his own enrichment, not quibbling with the mechanics. I've nothing against the idea that supply going down would benefit the community, just the way Pomp chose to address Mircea's passing. I suspect Pomp realized the optics himself, as he has since deleted the tweet.


> Disgusting

Weird to hear you say this. Many of your HN comments on Bitcoin demonstrate a rather collectivist take on crypto.

Yet this feature of Bitcoin (lost Bitcoin are effectively fairly shared among all participants) is probably the most "fair" feature of the system.

It'd sure help if you gave some sort of rationale.


I'm sorry, I edited to clear up what I was referring to.

I was taken aback by Pomp choosing to comment on Mircea's life in terms of how his death would personally enrich him. I've edited to leave this note. I suspect he realized the optics himself as he has since deleted the tweet. To me it read as "Mircea died huh? Well good news, our bitcoin's more valuable now."

I've nothing against the mechanism Pomp was pointing out, per se, other than I suspect any random saunter down the timeline results in all coins eventually being lost. I'm also aware of the counter-argument that infinite divisibility mitigates this.

Thanks for calling out the lack of clarity.


Thanks for clarifying, your "disgusting" makes more sense now.


> Mircea Popescu, a Bitcoin OG, has passed away. He likely owned quite a bit of bitcoin. We may never know how much or if they are lost forever, but reminds me Satoshi said: "Lost coins only make everyone else's coins worth slightly more. Think of it as a donation to everyone."

So, people are literally betting on the deaths of other Bitcoin holders? Is this supposed to be a joke? If there was a Bitcoin nation then it would execute its richest members...


You follow bitcoiners on twitter? That’s interesting

I would have assumed you had completely segregated yourself from them

Yeah, I also believe the coins are lost and the principles of scarcity do apply. This can be acknowledged with more tact than what Pomp chose to say; it's an assumption that can just go unsaid


I do; when I feel really strongly about something I always try and challenge my opinions. I like to hear both sides of the debate and make up my own mind. I'd like to think I have more than a passing knowledge of Bitcoin and crypto and the ecosystem in general. Not as much as some, to be sure. I am open to being wrong, and of course it goes without saying that I've learned a lot from you and our various back-and-forths! :)

I work in payments, and have for a long time now so it's important I have at least something of an informed opinion on cryptocurrency in general. I've also got a few friends who did very well in crypto (and some who haven't done very well at all).

I follow and often engage with Paolo Ardoino too! :) I think in a different life he and I would be great friends.

I agree, Pomp is right about the mechanics, I was referring to the tactless way in which he chose to honor the life of Mircea - in terms of his own enrichment.


Many people die intestate every year and the state takes their net worth. That money, in many ways, is a donation to everyone, since it marginally reduces the overall tax burden.

