Hacker News | cdata's comments

I had the pleasure of meeting Mikeal on a few occasions, but mainly I've benefited from his work over the years (initially via the JavaScript ecosystem, and later through the Protocol Labs community).

PouchDB was way ahead of its time, and I'm just now coming around to how crazy cool it was and is compared to most other tech in its space.

He made a great deal of positive impact on technical areas I care about. Rest in peace.


Just learning about PouchDB now. Why do you think it didn't take off?


Around 2016 sometime, a small team (me included) built a "mini" version of our main product (Typeform) that used PouchDB for syncing forms/answers between the backend and the mobile app (written with PhoneGap/Cordova, if I remember correctly), mainly so we could have offline capabilities.

Everything worked fine, and it was cool to launch something like that since I'm not a mobile developer by any measure. But PouchDB required using CouchDB for the syncing, which was both the first document DB we deployed in our production infrastructure and the only use case for having CouchDB at all, so we didn't have much expertise with it.

I think managing CouchDB ended up being the biggest maintenance hassle at one point, as it was kind of an extra piece compared to the "real" setup that hosted the other production data. AFAIK, there were no experts on CouchDB at the company either.

So I guess in the end, if the "frontend sync library" you want to use also ends up dictating the backend storage/engine, make sure you can "afford" a completely new and standalone piece for just that. Unless you're already using CouchDB, in which case it seems like a no-brainer.

Probably today I'd cobble together something "manually" with Postgres and WebSockets/SSE instead if I were looking to do the same thing again.
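The core of what PouchDB/CouchDB replication does (and what a hand-rolled Postgres + SSE version would re-implement) is checkpoint-based change-feed sync. Here's a minimal sketch of just that logic — the names (ChangeFeed, Client, pull) are illustrative, not any real library's API, and a real system would add revisions/conflict handling:

```python
# Sketch of checkpoint-based replication, the idea behind CouchDB's _changes
# feed. In a Postgres version, the feed would be a table with a monotonically
# increasing sequence column, streamed to clients over SSE or WebSockets.
from dataclasses import dataclass, field


@dataclass
class ChangeFeed:
    """Server-side log of document changes, ordered by sequence number."""
    seq: int = 0
    changes: list = field(default_factory=list)  # (seq, doc_id, doc) tuples

    def write(self, doc_id, doc):
        self.seq += 1
        self.changes.append((self.seq, doc_id, doc))

    def since(self, checkpoint):
        """Return all changes after a client's last-seen sequence number."""
        return [c for c in self.changes if c[0] > checkpoint]


@dataclass
class Client:
    """Offline-capable client that pulls incrementally using a checkpoint."""
    checkpoint: int = 0
    docs: dict = field(default_factory=dict)

    def pull(self, feed):
        for seq, doc_id, doc in feed.since(self.checkpoint):
            self.docs[doc_id] = doc  # last-writer-wins; real sync needs revisions
            self.checkpoint = seq


feed = ChangeFeed()
feed.write("form:1", {"title": "Signup"})
feed.write("form:2", {"title": "Survey"})

client = Client()
client.pull(feed)  # initial sync: pulls both documents
feed.write("form:1", {"title": "Signup v2"})
client.pull(feed)  # incremental sync: pulls only the new change
```

The nice property is that a client that has been offline for a week and a client that is a second behind use the same code path: replay the feed from your checkpoint.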


I remember 2017, at Offline Camp, when I proposed talking about using offline-first libraries with existing backends. Nobody was interested. It seems the people interested in such tech were pretty much sold on CouchDB.

Just now, almost a decade later, we get libraries like TinyBase and SignalDB.


In addition to the sync issues mentioned, I personally think overcoming the browsers was the real issue. Nobody wanted to support this, and the security would have been a convoluted nightmare.


I entered the workforce in 08/09. At that time things seemed really dire. It felt to me like the whole house of cards was coming down, and I told myself that I would take any job that I could get.

I ultimately landed a job with an odd startup, eccentric founders, working out of an attic. In hindsight I couldn't have asked for a better start to my career. But, my expectations were rock bottom at the time.

Anyway, keep your mind open to all possibilities. You never know where an unlikely choice may take you. And, good luck!


I wonder if you could pair this with nix e.g.,:

    - shell: nix develop --command {0}
      run: ...


In my experience, the default runner VM is so slow that you probably don't want Nix on a workflow that doesn't already take minutes.

Even with a binary cache (we used R2), installing Lix, Devbox and some common tools costs us 2 1/2 minutes. Just evaluating the derivation takes ~20-30 seconds.


You can use a self-hosted runner with an image that has anything pre-loaded.


Is there a way to cache the derivation evaluation?


You can cache arbitrary directories in github actions, but the nix package cache is enormous and probably bigger than GH's cache system will allow. Restoring multi-gig caches is also not instant, though it still beats doing everything from scratch. Might be more feasible to bake the cache into a container image instead. I think any nix enthusiast is still likely to go for self-hosted runners though.
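Baking the store into an image could look roughly like this — a sketch only, with the image tag and build steps assumed (a pre-built custom image layered on top of nixos/nix would carry your own closures):

```yaml
# Hypothetical job running inside the official nixos/nix image, so the Nix
# store ships with the container instead of being restored from the cache.
jobs:
  build:
    runs-on: ubuntu-latest
    container:
      image: nixos/nix
    steps:
      - uses: actions/checkout@v4
      - run: nix build .#default --extra-experimental-features 'nix-command flakes'
```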


The default cache action also has issues with anything that isn't owned by the runner user, and caches are per-repository, so you can't just have one cache like you can with binary caches.


Yes, we do this, although you need to do `nix develop --command bash -- {0}` to make it behave as a shell.
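Put together, a step using that custom-shell form might look like this (a sketch; the install action, its version, and the commands inside the shell are assumptions):

```yaml
# Illustrative workflow excerpt: run steps inside the flake's dev shell.
steps:
  - uses: actions/checkout@v4
  - uses: cachix/install-nix-action@v27
  - name: Run tests inside the dev shell
    shell: nix develop --command bash -- {0}
    run: |
      node --version
      npm test
```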


This is so close to validating my expectations that I'm almost skeptical of its veracity. I'm a regular Quest player, but I couldn't tell you how to launch Horizons if you held a gun to my head.

The leaders of corporate initiatives like this often tell themselves that they are building an ecosystem. They also seem convinced that an ecosystem will manifest from a highly curated, centrally developed silo that they have total control over. I guess it sort of worked for Facebook back in the day, but they were surfing on a lot of good will when it happened (and look at it now).

Things that are much closer to a metaverse than Horizons will ever be:

- Minecraft (Bedrock)

- VRChat

- Any popular multiplayer game that includes a free level editor

- The open web


Definitely agree about VRChat (and Minecraft to a lesser extent)!


But consider the opportunity cost: open content can be used to train an AI that may approximate the works of the author!


I used a Framework 13" as my daily driver for 3 years. I still have it, and now I also have a Framework 16", which has been my daily driver for the last six months.

The user serviceability and upgrade stories are real. The hardware isn't as svelte as Apple's, but mine has traveled all over the world and has yet to have any major issues. The one hardware failure I had was that the USB-C half of the charging cable on my 13" eventually broke after a few years of abuse, but that used to happen to me with Apple charging cables, too.

Framework has an active initiative to do outreach to different Linux distro communities and give them free hardware to help shore up compatibility. And, on that note, I haven't run into any Linux hardware compatibility issues (not with Pop!_OS, or more recently NixOS).

Speaking for myself, they have a loyal customer for as long as they continue to make this kind of hardware.


I love mine too (have owned every 13" they have made, either personally or at work, plus the new 16"). Having the actual USB port recessed, with a sacrificial USB-C module as the one you use, has taken me from breaking USB-C ports at roughly one a year to zero. The upgradeability and serviceability are real as well.

That being said, my complaints about them are: they are a few hundred dollars more expensive than comparable hardware most of the time.

They were pretty slow releasing bios updates, although they seem to be getting faster at that.

There is no kensington lock.

After seeing the Linus tour of the factory, where they fully assemble the DIY edition for testing and then take it back apart for shipping, I'm kind of annoyed. Find a different way to discount home users; you're spending more labor to get a lower price for your product.


Tbh it probably is cheaper for them to test that it powers up ... There would be nothing worse than building a laptop to find it was an RMA deal.... I would expect that the social media backlash could kill the product...


Yeah, it does, but just leave it assembled to the point I would have had to assemble it anyway, rather than making me put the RAM/GPU/NVMe/whatever back in, putting wear on the insertion slots and taking my time. I get that they are trying to put up a barrier big enough that people are willing to pay the few-hundred-dollar convenience tax, so maybe just leave the NVMe out and avoid the wear and tear on the GPU and the RAM. It's basically the same barrier, because you are removing the keyboard and undoing a bunch of scary screws in both cases, but you get to spend less on labor at the factory and I have to install less crap myself (I was always buying the self-assembled one; I like that stuff).


I fully expect that the intention is to force you into opening up the laptop to install the RAM. RAM is so easy to install that there's basically no risk of the customer messing up, and it exposes you to how easy it is to open up your laptop and how high quality the build is. Worked very well for me, I knew I would not accept buying anything of a lower standard before I even powered it on for the first time.


If it's actually an advertising expense, then it's likely priced wrong. They should give the real price in the business section (where people don't want to have to install parts because they are buying multiples, and where that price is obscured from the consumer) and have an even higher price for the fully assembled one (and a slightly lower price for the unassembled one). Now, if 80%-plus of their consumer business is the DIY one right now, then I'm wrong, but I doubt I am.


Glad to hear it! I'm interested to know, though:

How rigid would you say the frame is for Frameworks? Do you feel any flex at all when typing? Screen shake?

Over time, some of the laptops I've tried (cheap and expensive, many different brands) just feel like they start to fall apart. Either the screen hinges are junk and fail, leading to screen shake/nod whenever I type...or the frame is too weak, and the laptop itself starts to bend inward over time because I type hard.

If I could get something with an incredibly rugged frame, and excellent hinges, it'd be wonderful. I've seriously considered Toughbook's in the past, but the keyboard feeling for them is atrocious and the specs are always too weak.


It is overly generous to describe this as "privacy first." This looks like it's one ToS change away from being a privacy violating service.

In Apple's case, they are putting some amount of work into making their privacy claims verifiable. Good will is no longer good enough. Verifiability should be the bar for trust in 3P privacy claims.


This might be true for any run-of-the-mill service, but I do give Mozilla upfront credit as an entity, given Firefox's privacy-leading track record. I haven't read the fine print, but I would be very surprised if there weren't a robust layer of privacy/anonymisation involved. (Side note: I think the future is in-browser LLMs (a la Gemini Nano), so I suspect they will eventually move there.)

Also consider that Apple has the big pockets to build their own server hardware, to claim multiple layers of privacy - but also remember that when they first introduced "differential privacy" and claimed it would be totally anonymous, privacy researchers soon found out that Apple set the epsilon so high that after even a few requests to their service, the user could be de-anonymized.

source: "Apple has boasted of its use of a cutting-edge data science known as "differential privacy." Researchers say they're doing it wrong." https://www.wired.com/story/apple-differential-privacy-short...


Mozilla's privacy-leading track record includes making Google the default search engine, running enabled-by-default privacy-violating experiments, such as the Mr Robot fiasco[1], and enabled-by-default collaboration with advertisers[2].

I still use Firefox, but I try to stay aware of changes, precisely because of Mozilla's privacy-leading gaffe record.

1. https://itsfoss.com/firefox-looking-glass-controversy/

2. https://www.pcmag.com/news/firefox-mozilla-data-collection-f...


If those are the only examples of privacy tarnishing they've done, I think that would speak well for Firefox and Mozilla.


They literally have Google Analytics which sends telemetry data to Google integrated into the Firefox UI.


Can you substantiate this a bit more? Do you have a link?


https://github.com/mozilla/addons/issues/3145

Worth noting that the comment closing the issue mentions:

> You can disable Google Analytics in about:addons by setting your Do Not Track status to on.

> Again: this only affects users who visit the page with Tracking Protection on (which automatically enables DNT) or who manually set their DNT status to on.

but Firefox removed the DNT control last month (https://bugzilla.mozilla.org/show_bug.cgi?id=1928087), though it kept the Tracking Protection control and privacy.donottrackheader.enabled is still available in about:config.


That's not even getting to the fact that Apple is also running a display ads business: https://searchads.apple.com/


Indeed. Apropos of this: new features[1] to insert ads into videos in native apps.

[1]: https://developer.apple.com/videos/play/wwdc2024/10114/


Such a lazy take. Yes, they show ads based on what you search for in the App Store. They will also show apps based on location if the customer opts in to that feature. No other data is used. No browsing history, no purchase history, nothing like what other companies are collecting.

https://searchads.apple.com/privacy


Glancing at your comment history I can't help but notice that most of your comments are related to defending Apple, even at points where the consensus on HN is that Apple is obviously in the wrong. I applaud you, sir.


Eventually the addressable market for iPhones will saturate, but the growth imperative will remain.

If I were king of Apple and I truly valued user privacy, I would be careful not to tie any revenue streams to products that entail the progressive violation of user privacy.


This elides an important detail: outside of iOS a browser implementer can choose the code to compile and link against. Even if they build on Chrome, they can patch it to their heart's content or even fork and carry on. On iOS the browser runtime is set by fiat.

Also worth observing that most web browsers in the history of the web (including Chrome and Safari) started out as a fork of some other browser.


I might think to suggest that unchecked capitalist exploitation of our natural world has an awful lot to do with the inequities you brought into the discussion. And, if you grant me that this is possibly true (even if unconvincing or unprovable), surely you would agree that someone who believes such a thing would be prosocial and certainly not racist to scoff at the notion that somehow ever-more-unchecked-exploitation is the path to absolution.


I don’t know how absolution figures into this or how providing an API for simulations of earths weather systems is some escalation of unchecked exploitation. In fact there was a HN headline article recently that extreme weather hurts the poorest nations because they lack the infrastructure to predict the weather well. Wouldn’t having such an API be of assistance? Is it possible that technology that’s three orders of magnitude more efficient (as they claim) be a boon to the poorest countries that are roughly three orders of magnitude less capable of affording advanced infrastructure like this? Or is grinding political ideology axes more important than actually making important things possible for the poorest countries?


> Wouldn’t having such an API be of assistance?

It's probably going to come down to:

1. How accurate is it?

and:

2. Are people using it for things that it suits, vs. things that it doesn't?

For example, if it turns out to be decently accurate at predicting weather (say) "3 days from now" while being seriously wrong at timescales of 6 months, then only using it for weather forecasts for the next few days seems good.

But some people tend to believe anything an "AI" computer says regardless, so we'd need to make sure those kinds of people aren't getting it adopted where it's harmful.


I wish I could agree, but the hard truth is that decisions that imply heavy amounts of energy consumption have to be able to sustain active critique.

Simply waving the upsides around and shaming anyone who questions the cost is an age-old posture that has helped bring our society to the brink of disaster.


Consuming energy in itself isn’t problematic, what’s problematic is certain types of energy production are destructive. The question isn’t whether the use of energy itself is moral, it’s are they making moral choices with how they source their energy. That’s why clean and renewable energies are so important - there’s no reason to reduce energy consumption, there’s just reason to stop being destructive when producing energy.


I partially agree. It's actually both questions, because...

- We don't have a 100% renewable energy regime (and we are very far from it)

- Even if we did it would be possible (and likely) for bad actors to arbitrage cheap energy for yet-more-harmful use cases

- As we grow the portion of grid demand that is served by green, renewable energy, the price of coal and natural gas is being driven down

- Until we have 100% deployment, one company's "green" energy consumption will simply drive the less-well-capitalized to cheaper, not-green energy sources

Cheaper, more pervasive energy will be used efficiently in the ways that are enabled by our legal framework. If we aren't critical of those usages, there is nothing to say that abuses won't become exponentially more conspicuous even as we adopt more and more "clean" energy. And in fact, that's what we see happening in the world today. Look no further than:

- Bitcoin mines exploiting cheap spot power and demand response policies, and burning natural gas directly from the well because the speculative price of crypto is more appealing than selling the same gas on the market

- Generative AI in general, but especially when you see hundreds of thousands of GPUs deployed without so much as a business case (Stability AI, Inflection, probably many others surfacing in the weeks to come)


> capitalist exploitation

Ah ok

