KV Storage, the Web's First Built-In Module (developers.google.com)





I'm still in awe of the arrogance of calling this "the Web's first built-in module" and promoting it to developers when it's a non-standard, Chrome-specific feature backed by a spec written entirely by a single Googler without overt evidence of outside collaboration.

Between this and last week's "Skype for Web," are we collectively done with the pretense that the Web is anything other than what Google dictates?


Hi, I don't work for Google, but I do work on web standards.

In WHATWG, single vendors typically work on features until they believe they are good enough to be standardized, which is when they are submitted to the WICG. At this point, other vendors and spec editors can take a serious look at the proposal and decide if it should move forward, what changes should be made, etc.

kv-storage has in fact gone through this process, and has overwhelming support from vendors and developers.

Aside from this, import maps have been in the pipeline for even longer, and have broad support, extending even to Node.js being involved.

I know it's tempting when a new web feature is announced to claim that the vendor is destroying the internet, but please take the time to consider all the work that is put into making the web a better place for you.


Hey! Spec editor here.

KV storage is being worked on in a standards body, the WICG. It has healthy cross-vendor collaboration, including Mozilla (who influenced the API design and naming significantly) and Microsoft (who encouraged us to work on it in the WICG and then eventually merge it into the IndexedDB standard, which they are one of the editors of). It also has a decent amount of collaboration from various web developers. I'd encourage you to check out the issue tracker and the minutes of the W3C TPAC standards meeting for more!


Hi Domenic! First: I'm outside the edit window for the original post, but in hindsight it reads as more of an ad hominem than intended. My apologies.

I don't mean to malign the technical aspects of KV Storage, nor WICG / TPAC / TC39 / etc. collaboration, but rather the specific positioning of KV Storage in the article. There's no acknowledgement of collaboration nor explanation of where this proposal sits in the standards process. In fact, it gives the opposite impression:

- "the first one we're planning to ship"

- "do we have to wait until all browsers support built-in module [...] no!"

- "you can actually deploy your changes today!"

- "Chrome 74+ users won't have to pay any extra download cost"

That reads a lot like an Intent to Ship and counseling developers to begin relying on it, which feels wholly inappropriate given that the entire notion of having built-in modules at all is still a Stage 1 proposal.

Similarly, I do not believe any vendor should claim an in-house implementation of their own proposal as being inherently part of "the Web." It may very well end up that way, but can we at least wait until the proposal hits Stage 3 before using that kind of language?


Hi callahad,

I think there's some confusion about the standards venue this is taking place in. This work is being done in the WICG, at https://github.com/WICG/kv-storage. The TC39 staging process, while relevant for any built-in modules TC39 may want to work on such as temporal, is not a blocker for web built-in modules.

Just like web standards bodies and TC39 already collaborate on built-in globals, we're also collaborating with TC39 on built-in modules. Thus the links to the stage 1 proposal in the article and explainers. But in the end, KV storage is a web feature, and goes through the web standards process, which doesn't use the Ecma staging process. Instead, it uses the process others are discussing elsewhere on this thread: incubation, shipping to web users through early implementations or origin trials, eventually to stable, and then promotion to a standards body like the W3C or WHATWG. (In this case, to the W3C, as part of the IndexedDB specification.)

Hope this helps!


Just to be clear, is this article an intent to ship?

That's the concern I have; not that it's going through the WICG, but that whatever process it's going through, this blog post sounds like Google announcing user-facing shipment of a feature that hasn't finished its standardization process.

We've already been through this before with Web Bluetooth and the Shadow DOM specification. Once Chrome turns a feature on for end users, it is effectively standardized -- it is very difficult to justify changing or evolving a spec once real sites on the actual web have already started to rely on it. Again, look at Shadow DOM as a good example of why this is a problem.

I'd feel more comfortable about this process if there was some clarification that this is an intent to ship only behind a flag, or an intent to ship only to dev/beta versions of Chrome.


Article author here,

I probably should have been clearer in the article. I was trying to strike a balance between:

- presenting what I believe to be a compelling and exciting possible future for the web (especially considering it has a viable polyfill story)

- getting developers excited about this future and thinking about how it could integrate with existing tooling

and:

- Asking for feedback on the KV Storage and Import Maps APIs themselves

- Encouraging developers to experiment and/or sign up for the origin trial

It's not an easy balance to strike, and in this case I probably should have emphasized more that this is still in the experimentation phase.

I can update the article to make that more clear.


It's got to be frustrating to write something like this up and immediately see people jump to, "Google is taking over the world again."

To be clear, this looks really promising. I particularly like the way that import maps are polyfilled. I kind of wish standard modules were flat-out required to be imported in versioned form, since that would open the door to better API versioning on the web in general, but... whatever, that's what the standards process is for, and it looks like that's something people are at least already talking about.

But the unfortunate side of things is that because of Chrome's history, it's really easy to read posts like this as, "here's a new thing, and btw we're shipping it tomorrow." That's not your fault, it's just what the environment is like right now.

> All your users should benefit from better performance, and Chrome 74+ users won't have to pay any extra download cost.

To me that sounds like, "post Chrome 74, we will have this feature turned on for production sites." If that's not the intent, and Chrome isn't planning to turn this feature on early, then I'm much more excited about the proposal.


It seems a bit unfortunate to quote a sentence out of context and then misinterpret it like that. Let's expand the quote by one sentence:

> If your site currently uses localStorage, you should try switching to the KV Storage API, and if you sign up for the KV Storage origin trial, you can actually deploy your changes today! All your users should benefit from better performance, and Chrome 74+ users won't have to pay any extra download cost.


I'm not sure what that extra sentence changes. I would still (and in fact, did when I read the release) interpret both sentences together as, "you can enable this today via an origin trial, and in Chrome 74+ it'll be enabled permanently." I certainly wouldn't read it as, "you should sign up for an origin trial and then provide us with feedback so we can continue to evolve the spec."

I realize that to no small degree that's my own fault for not understanding the specifics of origin trials, but how many people reading this article already understand origin trials?

I don't want to derail things -- I just wanted to get clarification that this wasn't going to be HTML imports again. At the end of the day, I don't care how the announcement is worded as long as it's not actually being shipped in the next release as on by default for every site. The spec itself looks really interesting and I'm excited to see it develop.


I've updated the article to emphasize that this is indeed something we're experimenting with, and explain a bit more how built-in modules go through the standards process (in general and in Chrome).

An intent-to-ship has to be sent to blink-dev, and none has been, so per the Blink dev cycle this cannot be an I2S.

There has, however, been an intent-to-experiment [0], and as the article says, it'll be available as an origin trial in Chrome 74. (For those who don't know, origin trials are essentially a per-origin, opt-in, time-limited way of shipping. The feature won't work by default, and in this case it won't work beyond Chrome 76.)

[0]: https://groups.google.com/a/chromium.org/d/msg/blink-dev/sEw...
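
For anyone unfamiliar with the mechanics: signing up for an origin trial gets you a token scoped to your origin, which you then serve to the browser. A minimal sketch (the token value is a placeholder):

    <!-- Token comes from the origin trial signup form; placeholder below. -->
    <meta http-equiv="origin-trial" content="YOUR_TOKEN_HERE">

The same token can alternatively be sent as an Origin-Trial HTTP response header.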


Thanks. That makes me feel way better about this proposal.

Intent to experiment is exactly where I would want a feature at this point in development to be -- gives normal people the ability to build things with it and figure out any pain points, but forces them not to use it in a production site quite yet.


From RFC 2026:

> A specification from which at least two independent and interoperable implementations from different code bases have been developed, and for which sufficient successful operational experience has been obtained, may be elevated to the "Draft Standard" level.

Standards are created first by someone implementing them, then by others coming out with their own implementations, and finally by going through a standardization process.

Even the IETF does not accept standards that lack at least two independent implementations.


Implementation is great. It's not the same thing as shipping a feature to end users.

Once you ship a feature to end users and start advertising it, changing it becomes very, very difficult because you risk breaking the web just by evolving the standard. If you're trying to create a joint standard, you can't start by flipping the switch and turning the feature on for regular people and production-facing sites.

That would be like passing a law and enforcing it months before Congress voted on it, simply because, "without real-world trials, how do we know the law will work?"


Most specs have a small number of authors (1-3) and a much wider group of people who have influenced the spec. This is as true for Import Maps and kv-storage as for any other. FWIW I think "the web's first built-in module" is a reference to the name of the spec, which is called "built-in modules", not a claim to originality.

Hi! Thank you for your work on the polyfill. There is a healthy inter-vendor discussion in the spec's GH issues, and I believe there is absolutely technical merit to this specific proposal. That's all great and above-board.

I'm mainly and strongly objecting to this degree of external promotion for a Stage 1 proposal, along with positioning it, fait accompli, as part of "the Web."


Mozilla has a storied history of roadblocking Google on principle. This outburst has all the hallmarks of an ideological objection rather than a technical one, another entry in Mozilla's longstanding crusade against Google. If the proposal is a net benefit to developers and users, maybe the energy is better spent on including this in Firefox than on demonizing the semantics of a blog post from the competition.

For the curious, below is a non-exhaustive list of promising tech clearly beneficial to developers and users, killed by Mozilla, on principle rather than technical merit:

1. WebSQL - A web port of SQLite, arguably the world's second most used (and loved) tech. Instead, we have IndexedDB, which is such a damn hassle that a million wrappers exist to deal with its shortcomings and complexity, including the tech presented in this blog post.

2. NaCl/PNaCl - A precursor to WebAssembly which already had threads, SIMD, permissions, security, etc. figured out. By design, most NaCl code will run faster than WebAssembly, as NaCl's sandbox only excluded certain processor instructions deemed a security threat to its sandboxing model.

3. HTML5 Storage - A file-system-like storage API for the browser! Killed by Mozilla...


Sure, we don't want any single entity--including ourselves--to monopolize the foundations of the Web. To allow that would diminish the Web's power as a democratized public resource.

But let's not kid ourselves: Mozilla hasn't been able to unilaterally kill things for years. We finally caved on H.264 in 2014, acceding to a patent-encumbered Web: https://andreasgal.com/2014/10/14/openh264-now-in-firefox/ The following year, we reluctantly added DRM to Firefox: https://blog.mozilla.org/blog/2015/05/12/update-on-digital-r....

Google was the only vendor that supported NaCl or the Filesystem APIs, and WebSQL similarly failed to gain traction outside of WebKit and Presto. If we were wrong in our assessments of those proposals, we weren't alone. And that's what killed them.


FWIW, I don't believe Google was pushing a patent-encumbered web, or H.264. Their VP8/VP9/VP10 funding speaks for itself.

No one wants a monopoly in the browser space, and it seems like Mozilla has some other bone to pick with Google. As stated before, developers and web users come before Mozilla's philosophy, and many of the objections to Google's proposals don't seem to further that mission. Maybe some of it is in Mozilla's self preservation interests, I don't really know. But a lot of Google outrage seems feigned, and contrary to developer and user interest.

If you haven't noticed, most devs here haven't raised any technical objections; instead they seem pleased with the offering. That's a hint in itself.


For a while I was ambivalent about Chrome dominance, my line of reasoning being "well they're still innovating, it's not going to be like it was in the IE days".

My mistake was in assuming that IE dominance was only bad because it wasn't innovating.

What we seem to be developing now is a kind of tunnel vision, where, to your point, we simply assume that Google is the web and the web is Google.


I'm not sure that there is a difference in your two scenarios.

We wouldn't think "Google is the web and the web is Google" if we were seeing a similar string of innovations coming from Firefox and IE.

This will only be bad if the others aren't innovating.


You're thinking about this incorrectly; don't worry when Google does propose new standards, worry when others don't.

Indeed.

At this point, a Gopher revival is looking pretty enticing: serving content, not lock-in, vendor wars, or piles of straw stuck together with poo.


The Skype for Web thing is completely blown out of proportion. There are many soft launches of apps that limit browser support to focus on priority browsers. Skype for Web works on those other browsers, and I'm sure they'll eventually open it up to them, seeing as ReactXP (what they use to develop it) does support other browsers.

Unfortunately, as other sibling commenters here have pointed out, this "process" has been how web standards have been done for some time. We've been collectively done with the pretense that the Web is anything other than what Google dictates since ~2004.

It's not just the Web, it's how the internet was created: "rough consensus and running code". People have always crafted their own specs and initial implementations first, and then sought out peer review.

It's how the IETF worked with RFCs.

It's rarely been the case that successful standards have been developed whole-cloth by committee; they are almost always the result of a single inventor doing 80% of the work, making lots of proposals and shipping experiments to early adopters, with the standards groups refining the result before it ships.


Is Mozilla interested in built-ins?



The bigger story here appears to be a "Javascript Standard Library" which is summarized [1] without any list of proposed modules/packages...

A key-value store isn't quite the first module that leaps to mind for such a toolkit :-)

EDIT: found a collection of module proposals! [2]

[1] https://github.com/tc39/proposal-javascript-standard-library

[2] https://github.com/tc39/proposal-javascript-standard-library...


Agreed: a JavaScript Standard Library is a simple but brilliant idea. Optional, so nobody is forced to use it, but if a library has useful functionality, using it seems a no-brainer, since it's likely to be higher quality and more performant. Let's hope they can bring back WebSQL as such a library; Microsoft and Mozilla's appalling decision to drop the amazing SQLite from browsers and force in the inferior IndexedDB needs to be reversed. A local RDBMS is required to enable PWAs to more easily match desktop capability.

> Let’s hope they can bring back WebSQL

Agreed, although having the implementation defined exclusively by SQLite (which is why IndexedDB became the standard browser database) remains a rather large sticking point.

Aside from the possibility of the standards group reverting the deprecation of WebSQL: Firefox adoption has dwindled to dangerous lows, and Microsoft now bases its browser on Chromium, so in the not-too-distant future one could simply target the dominant browsers, Chrome and Safari, and use WebSQL, deprecated or not (if Chrome and Safari haven't removed WebSQL in 10 years, when will they?).

It's unlikely WebSQL will make a comeback though; frontend devs seem to like NoSQL-style structured data, which maps well to JavaScript object notation. It would be great to write the same relational SQL on server and client, but that's a pipe dream unless a drastic shift in standards committee direction happens.


Maybe it will be possible to compile SQLite to wasm and then use IndexedDB as a backing store for it, in which case perhaps a good-enough polyfill would be accessible to Firefox.
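
A rough sketch of that idea, assuming the real sql.js (an Emscripten build of SQLite) and idb-keyval libraries, top-level await, and an illustrative persistence scheme:

    // SQLite compiled to wasm (sql.js), persisted into IndexedDB (idb-keyval).
    import initSqlJs from 'sql.js';
    import { get, set } from 'idb-keyval';

    async function openDb() {
      const SQL = await initSqlJs();         // fetches and instantiates the wasm
      const saved = await get('sqlite-db');  // Uint8Array from a previous session
      return saved ? new SQL.Database(saved) : new SQL.Database();
    }

    async function persist(db) {
      await set('sqlite-db', db.export());   // serialize the whole DB back out
    }

    const db = await openDb();
    db.run('CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)');
    db.run('INSERT INTO notes (body) VALUES (?)', ['hello']);
    await persist(db);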

The issue there is that each client/user has to download the wasm binary, unlike WebSQL, where SQLite comes bundled with the browser for free. There are issues there as well (versioning, security), but not having to download anything is pretty nice; like the KV storage option, it's there to use immediately: no download, no parsing overhead.

Does anyone use non-mobile safari?

On a fairly broad-interest site I just checked, Safari is the second most-popular browser behind Chrome. Even if you exclude iOS, which doesn't make sense for most sites, it's in the same range as either Internet Explorer or Edge.

Yes, and even though you haven't owned a TV in 15 years, people still watch TV too...

WebSQL is an interesting case: I like the capabilities it offers, but it wasn't standardized as anything more than “whatever a particular version of SQLite does”, and that led to things like https://blade.tencent.com/magellan/ because the attack surface was broader than anticipated.

I'd like it to come back but also in a more considered form with some API improvements — exactly the kind of thing you'd get out of it going through the standards process with more than one implementation.


I recommend reading through this issue [1]. It's amazing how hard the JS community is pushing back against a bigger standard library, just because stuff like lodash exists and everyone can make their own packages anyway.

[1] https://github.com/tc39/proposal-javascript-standard-library...


As a (mostly) C dev this is like bizarro world to me, but I've noticed on HN and elsewhere that JS devs are very quick to move from one framework to another, declaring a library dead if it hasn't received a significant update in mere months. I'm sure that there are plenty of JS devs out there who have fundamentally changed the way they code several times over the past few years.

With this mindset I can understand that there might be some reluctance to consider a standard library that would be "set in stone" and shape the language basically forever (and potentially create a burden of legacy features that need to be maintained, C++ style).


That's going to happen whether they expand the stdlib or not. Witness Array.prototype.flat() (because .flatten() would break websites using MooTools). I think the staged process they use and compilers like Babel work to get the proposed API in use so it can be evaluated against actual in-the-wild use, without contributing to ossification.
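
For reference, the compat-driven naming in action:

    // .flat() rather than .flatten(), purely to avoid breaking MooTools sites:
    [1, [2, [3]]].flat();          // [1, 2, [3]] (default depth is 1)
    [1, [2, [3]]].flat(Infinity);  // [1, 2, 3]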

You know what they say: If it ain't broke, let's move to a brand new framework.

Yeah, why get a uniform experience when you can have a different hard time in each new code base?

Why have basic features defined, such as namespaces, hashing, basic data-structure manipulation, and UUIDs, when you can have the unpleasure of rewriting them every time?

Why avoid making your users download primitives they have a hundred copies of on their machines in all those .dll and .so files?

It's not like those are solved problems in all other languages that don't have an accidental monopoly on the Web.


Somewhat cynical/defeatist: because those languages came along before the web era and were often ruled by a single entity, whereas in the web era the XKCD rule of standards applies: https://xkcd.com/927/

I wonder how many of those people railing against a standard library ever had to support a legacy codebase where all the design decisions were made by someone else. It doesn't sound like they have that kind of experience at all.

Funny thing is, eventually Google will decide that they do need a standard library, and all those freedom fighters will do a 180 in terms of what they support. This has already happened so many times in the community. (Classes are the most notorious example. I'm not saying Google was behind them, but there was a definite 180 on whether classes were needed.)

Maybe this is the first step towards Google establishing a standard library.


Just to be clear, the "JavaScript Standard Library" and the "Web Standard Library" are completely separate. (they are worked on by a lot of the same people though)

The more APIs browsers have, the more expensive it is for browser vendors to implement and maintain them. Opera and Edge already gave up.

W3C and TC39 are constantly publishing new APIs and features. They are pushing the Web towards a single implementation (Chromium). It will become too expensive to have separate implementations.


Built-in modules will actually make it cheaper to create a new browser, because modules can be easily polyfilled.

A new browser will not have to implement kv-storage itself; it can either let the user polyfill it or quite easily copy another browser's implementation.
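
A sketch of that fallback with dynamic import, assuming top-level await (the polyfill path is made up):

    // Try the built-in module first; fall back to a polyfill if it's absent.
    let storage;
    try {
      ({ storage } = await import('std:kv-storage'));
    } catch (e) {
      ({ storage } = await import('/kv-storage-polyfill.mjs')); // hypothetical path
    }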


Kind of like when Google decided to ship a partial Shadow DOM polyfill to production that just so happened to make YouTube load 5x slower in browsers that weren't Chrome. https://twitter.com/cpeterso/status/1021626510296285185

Polyfills aren't a panacea, and we shouldn't blindly excuse anti-competitive behavior or attempts to monopolize the direction of an otherwise open platform that we all share and depend on.


It is a bit funny though. Our problem used to be "OMG there are so many browsers with their own little gotchas, and they don't follow the spec"

Now the problem is "Oh no we're converging to a single implementation, we need to diversify it!"

I don't know which is better but it is kind of funny to note.


We've actually gone full circle twice now, with a little Möbius strip thrown in for good measure.

1. Early IE and Netscape era: OMG there are so many browsers

2. IE6 dominance: Oh no, we're converging to a single implementation... [and they don't follow the spec]

3. Rise of Firefox and Chrome: OMG there are so many browsers

4. Chrome dominance: Oh no, we're converging to a single implementation... [and they dominate the spec]


IE6's problem wasn't that it did or did not follow some spec. It was that it was never going to change. It won, and MS was never going to make another one, because they wanted the internet dead.

We do not have that problem with Chrome.


MS wanted the internet captive to protect their cash cows; Google wants the internet captive to protect their cash cows.

The core problem remains the same.


No, it's a different problem; you're being too reductionist, and cries about IE6 are a distraction from that.

Should the only browser be from one company? No.

Was IE6's problem that it was from one company? No.


There are so many broken or outdated APIs in browsers, and there is still so much we need in the front end.

- Rich text editors

- Drag and drop

- Native data binding and reactive primitives

- Modern SVG support

Etc.


The KV Storage API appears to be almost identical to the extension Storage API in Firefox, which is a joy to use.

https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/Web...
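
For comparison, the extension API in question looks roughly like this (promise-based in Firefox; usage shown assumes an async context):

    // WebExtension storage: async get/set of structured values, no JSON step.
    await browser.storage.local.set({ greeting: { lang: 'en', text: 'hi' } });
    const { greeting } = await browser.storage.local.get('greeting');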


Ohhh, it's key-value like localStorage, but it allows you to store arbitrary objects, not just strings. That would actually be really nice. JSON parsing can definitely become a bottleneck when getting stuff out of localStorage.
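
Roughly, the difference looks like this (kv-storage API as described in the article and spec, assuming top-level await):

    import { storage } from 'std:kv-storage';

    // localStorage: strings only, so objects pay for JSON both ways.
    localStorage.setItem('prefs', JSON.stringify({ theme: 'dark' }));
    const viaLocalStorage = JSON.parse(localStorage.getItem('prefs'));

    // kv-storage: async, and values are structured-cloned; no JSON step.
    await storage.set('prefs', { theme: 'dark' });
    const viaKvStorage = await storage.get('prefs');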

The proposal doesn't seem to take into account the versioning of the included modules.

> import { Date } from "std:Date";

> import { Date } from "std:Date+2.1.6-alpha.1";

Which version is the first line going to include? Latest? First?

Import maps for backward compatibility / polyfills are very neat though.

I would love to see lodash/underscore included as a standard library.


This is not different from existing web APIs, which are also not versioned.

...and this has always been a unique pain point of development in the browser.

I assume whatever is shipped with the browser version it is running on.

Does versioning imports imply that the browser has to ship with all versions of a stdlib Module?

It could instead imply "if the browser didn't ship with a version that satisfies my requirements, use the polyfill"

If they want these libraries ever to be used by default, they need a stable API, so that means versioning, the ability to link to a specific version (or a version range), and backports of security fixes across releases within a reasonable timeframe.

Why would anyone develop against a library that doesn't provide those basic guarantees?

There's also no guarantee that your web app gets deployed to an up-to-date browser, so either you can specify which API level your app wants, or we're all back to specifying which browser versions an app runs on.


A step on the perhaps unintended path to first-class library installation into browsers, rather than relying on CDN caching for shared code? Will deploying on the web look like deploying to the desktop, just one step abstracted? :-)

I for one am looking forward to DLL Hell 2.0

Haven't seen anything from either the WHATWG or the W3C.

Is Google out on another solo raid?

Extra bad because the headline makes it sound like a standard.

Or am I missing something?


There are links to all the standards work in the article. All of the APIs mentioned are going through the standards process.

At the W3C TPAC 2018 meeting, the spec editors for the IndexedDB spec agreed that after incubation (in the WICG, which is another less-formal standards group), KV storage should "graduate" by being incorporated into the W3C IndexedDB spec itself. (Indeed, they were enthusiastic about the possibility.)

This is a JavaScript module, not a web standard. It goes through TC39.

Not quite! Like built-in globals, built-in modules are a shared workspace between all standards bodies. In this case, KV storage is a web standard, for a built-in module that will be a web feature. This is similar to how IndexedDB is a web standard for a built-in global.

This is from TC39, the committee of Ecma that publishes JavaScript standards.

I first thought it would be the response to Cloudflare's Workers KV (which I find an interesting concept).

Same. They'll probably end up being a nice pairing: fall back to Cloudflare Workers KV when the local one is empty/stale.

When trying to port my web app into a Chrome app, I had to rewrite all localStorage access to be async, because the Chrome app localStorage was async... After a few weeks I gave up, as there was too much work to make it into a Chrome app, but I kept the storage async in the web app. It seems that CPU cache is the new RAM and RAM is the new HDD, so it makes sense to make everything that accesses RAM async... But then, when everyone uses async abstractions like "futures", it becomes a net negative in performance.
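
A sketch of the kind of facade that makes such a rewrite survivable: keep call sites async from day one, even while the backing store is still synchronous (names here are illustrative; usage assumes an async context):

    // Async facade over synchronous localStorage; swapping in an async
    // backend later doesn't change any call sites.
    const store = {
      async get(key) {
        const raw = localStorage.getItem(key);
        return raw === null ? undefined : JSON.parse(raw);
      },
      async set(key, value) {
        localStorage.setItem(key, JSON.stringify(value));
      },
    };

    await store.set('user', { name: 'Ada' });
    const user = await store.get('user');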

Might as well build an OS in the web browser at this point and then you can have an OS within your OS. A sandbox within a sandbox. Next step is getting three levels deep.

We are already there today. There is very little to distinguish a modern web browser from an OS.

You just have to look in chrome://gpu for evidence of this. So much effort to deal with hardware differences to give a uniform experience. Isn't that exactly what an OS is for?

And then there are (dying, because Chrome) JavaScript libraries to abstract away differences between browsers.

Isn't that the idea behind Chromebooks?

Something like that. I never understood why you can't just run graphics-accelerated Android apps on Linux using native tools. Linux already has most, if not all, of the sandboxing/permissions components needed for apps. I wish apps were just distributed via containers.

Or the opposite: there is more and more hardware that supports Android but not Linux. That's why Termux and other solutions to run Linux on Android are so popular.

I assure you that there is much more hardware supported by the Linux kernel than what has been added by Android. Termux doesn't run the majority of the Internet, and very few developers use Termux-based distributions for development.

I couldn't find anything on how bundled code or a simple script would be able to access std: modules. Does anyone know the answer to this question?

https://developers.google.com/web/updates/2019/03/kv-storage..., and especially the demo, show how you can use tooling for this use case.

My question is a bit different: how to load the native std:kv-storage from a webpack bundle. It seems the only options are either to load the whole bundle as a module, or to use async import like eval('import("std:kv-storage")').

Ah, yeah, unlike bundling tools like Rollup, webpack doesn't have the ability to output modern JavaScript module scripts, only classic scripts. The article links to a feature request on webpack to support this: https://github.com/webpack/webpack/issues/8896

In the meantime, there are lots of workarounds, such as e.g. creating a script to bridge from built-in modules (which webpack doesn't support) to globals (which webpack does support). E.g. `import { storage } from 'std:kv-storage'; window.kvStorageStorage = storage;` then use `kvStorageStorage` in your webpack code.
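
Spelled out, the bridge trick could look like this (file names and the global are made up; both scripts are deferred, so they run in document order):

    <!-- Bridge module: exposes the built-in module as a global. -->
    <script type="module">
      import { storage } from 'std:kv-storage';
      window.kvStorageStorage = storage;
    </script>
    <script src="bundle.js" defer></script>

Then the webpack-built bundle.js can call window.kvStorageStorage.get/set as usual (they return promises).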


Does anyone know how import maps work in terms of scope? Do they affect all JavaScript on the page, just scripts from the same domain, or something else?
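
For reference, an import map is a JSON script embedded in a document, and as currently proposed it applies to module resolution in the document that embeds it. The fallback-array form below is what the article's polyfill story relies on (the polyfill URL is illustrative):

    <script type="importmap">
    {
      "imports": {
        "std:kv-storage": [
          "std:kv-storage",
          "/kv-storage-polyfill.mjs"
        ]
      }
    }
    </script>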


Will the second module be jQuery??

Is there “import x as y” syntax?

Either it's so good that it's worthy of being standardized, or it's not and Google is flexing its monopoly muscle and trying to kill competition.

Many here hate Microsoft, and some probably just copy others without knowing what there is to hate, but the minority that remembers the 1990s knows what was so bad, and if they still use anything Google then shame be upon them.


In the 90s you would be right; these days web standards don't get standardized without having implementations first.

Tell that to WebSQL..

All vendors implemented (essentially) the same API, then the W3C subsequently abandoned it due to lack of independent implementations [0].

0: en.wikipedia.org/wiki/Web_SQL_Database


As far as I can tell, only two browser engines ever implemented WebSQL: Safari and Chrome's WebKit (which hadn't yet been forked into Blink), and Opera's Presto (which hadn't yet been abandoned in favor of Chromium). Firefox never implemented it, and neither did IE (Edge hadn't been announced yet), so it certainly wasn't implemented by "all vendors". And both implementations were just thin wrappers around the same library, SQLite, so the W3C was correct in saying there were no actual independent implementations.

> And both implementations were just thin wrappers around the same library, SQLite, so the W3C was correct in saying there were no actual independent implementations.

Why should there be?


Not a great example, since "independent" is the important bit there: browsers just shipped sqlite, and you can't have a spec that says "just ship sqlite".

So if Google loses its case against Oracle and the court finds that APIs can be copyrighted, other browsers may not be able to offer the same functionality as any of Chrome's built-in modules under the same APIs. So could this be an attempt by Google to kill all the other browsers? Create a feature that will lock in developers and make most modern web apps less usable in other browsers?


