
“The working group should not agree to freeze whatever syntax Chrome ships” - cylo
http://lists.w3.org/Archives/Public/www-style/2014Feb/0103.html
======
potch
The problem is the standards process exists to keep people from swinging their
weight to force rushed/poorly thought-out technologies into a platform some
already accuse of fragmentation and bloat. And to the people saying "But this
is Google!": throwing the process under the bus today because you think your
favorite tech company can do no wrong screws everyone later when management
changes. Management will change. The process is here to protect the web's
interests regardless of whether any one actor's intent is good or not. If
Google steamrolls everyone today, who will steamroll everyone tomorrow? The
MPAA is on the W3C too.

~~~
ajross
The last example seems implausible: the MPAA doesn't ship a browser and can't
control features by fiat.

I guess I'm not understanding the firestorm here (rather, I am, but in an
uncharitable way as YET ANOTHER !@#?! Apple vs. Google proxy war on HN).
Google developed a feature they like and want to ship it. They want it to be a
standard. Apple (at least this one Apple person) doesn't want it standardized
yet. Google is noting that they are going to want to ship this anyway
("threatening to ship it anyway" if you swing that way).

So what? Surely this kind of argument happens any time you have a proposed
standard from a single vendor. What's special about Shadow DOM that it can't
tolerate this kind of disagreement?

~~~
pornel
> MPAA doesn't ship a browser and can't control features by fiat.

DRM extensions (EME) have shipped in Chrome and IE already (with whole W3C
process steamrolled), so it seems that MPAA does control browser features — by
proxy (Google Play/Xbox video store).

~~~
ajross
Missed the point. All W3C committee members influence the set of features
covered by W3C standards. That's what the W3C committee _is for_.

But the allegation here is that Google is ignoring the standardization process
and shipping features before getting signoff via a committee draft, and that's
a bad thing. The MPAA can't ship features at all, so they would never have
access to this trick without convincing a browser vendor to do it for them.

------
bryanlarsen
Here's the reply: [http://lists.w3.org/Archives/Public/www-style/2014Feb/0138.h...](http://lists.w3.org/Archives/Public/www-style/2014Feb/0138.html)

the money quote: "We feel the standard is sufficiently advanced, the remaining
issues sufficiently small, and the benefit to authors sufficiently great to
justify shipping sooner rather than later."

the Shadow DOM is pretty awesome, so I'll agree with the "benefit to authors
sufficiently great" part. No comment on the rest.

~~~
randallu
I disagree that the Shadow DOM is pretty awesome. I think scoping style is
valuable, but building components that are exposed as new tags is not
appealing given the vast complexity of the implementation and the limitations
of tags.

Markup has a very weak type system (strings and children), which makes
building complex UIs more painful than it has to be (this also holds for
markup-driven toolkits like Angular and Knockout -- where the equivalent of
main() is HTML markup and mostly declarative). Markup isn't a real
programming language, and it's very weak compared to a true declarative
programming language.
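
For example (hypothetical element name, but the pattern is general): an
attribute can only carry a string, so structured data has to be serialized
and re-parsed, while a JS property can take real values:

    <!-- markup interface: everything is a string or a child node -->
    <my-chart data="1,2,3" color="red"></my-chart>

    <script>
      // JS interface: structured values, no stringly-typed round trip
      var chart = document.querySelector('my-chart');
      chart.series = [{ points: [1, 2, 3], color: 'red' }];
    </script>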

JavaScript, however, is a real programming language with all of the constructs
you need for building extensible systems. For building anything complex (which
is where Shadow DOM should shine) you will need extensive JS, and you will
need your Shadow DOM components to expose rich interfaces to JS... At which
point, why are you still trying to do markup first? It's something that's more
"in your way" than helpful.

~~~
curveship
Thank you! I thought I was the only one who felt this way. I truly do not
understand why Google feels that application composition should happen at the
presentation layer rather than the code layer, particularly when the
presentation layer is as weakly typed as HTML. This was tried and failed in
the very first round of server-side web frameworks back in the mid-late 90s.
More recently, the complexity of Angular's transclusion logic should have
clued them in that this is an unwieldy idea.

I agree that some kind of style scoping construct would be a good addition,
and far simpler than ShadowDOM. Simple namespacing would be a good start. It
would be a more elegant solution to the kludgy class prefixing that has become
common (".my-component-foo", ".my-component-bar", etc.)

~~~
cromwellian
Well, for one thing, div-soups are hard to read, create deeply nested DOMs,
and lack semantics or transparency. If you're trying to preserve the nature of
the web, which is transparent, indexable data, one where "View Source"
actually has some use to you, then having a big ball of JS instantiate
everything is very opaque.

~~~
curveship
The phrase "div-soup" makes me reach for my revolver. It seems to be a straw
man that means "Either you're for Web Components or you're for the worst of
web engineering."

- How does ShadowDOM (or Web Components more generally) make your DOM
shallower? It's still the same box model. Deeply nested DOM structures are
usually the result of engineers who don't understand the box model and so
over-decorate their DOMs with more markup than is semantically or functionally
necessary. Nothing in ShadowDOM (or, again, Web Components) changes this.

- Are custom elements really more transparent than divs? If "View Source"
shows <div><div><div>...</div></div></div>, do you really gain much if it
shows <custom-element-you've-never-heard-of-with-unknown-semantics><another-
custom-element><and-another>...</etc></etc></etc>? Proponents of Web
Components seem to imagine that once you can define your own elements, you'll
magically become a better engineer, giving your elements nice, clear semantics
and cleanly orthogonal functionality. If people didn't do that with the
existing HTML, why would custom elements change that? At least with divs, I can
be reasonably sure that I'm looking at a block element. Custom elements, I got
nuthin'. They're not transparent. They're a black box.

- Finally (and more importantly), we already solved the "div-soup" problem.
It was called XHTML. Custom elements in encapsulated namespaces! Composable
libraries of semantically-meaningful markup! How's that working out today?
It's not.

TL;DR: a common presentation DTD is the strength of the web, not its weakness.
Attempts to endow web applications with stronger composition/encapsulation
should not be directed at the DOM layer but at the CSS and JS layers above and
below it.

~~~
cromwellian
1. Shadow DOM scopes down what CSS selectors can match, so deep structures
can hide elements from expensive CSS rules (see the sketch after this list).

2. Custom Elements promote a declarative approach to development, as opposed
to having JS render everything.

3. XHTML was not the same as Shadow DOM/Custom Elements. XHTML let you
produce custom DSL variants of XHTML, but you still ended up having to
implement them in native code, since trying to polyfill SVG, for example,
would be horrendously inefficient.

4. The weakness of the web is the lack of composability due to lack of
encapsulation. Shit leaks, and leaks all over the page. Some third-party JS
widget can be completely fucked up by CSS in your page and vice versa.
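
A minimal sketch of the encapsulation in points 1 and 4, using the Shadow DOM
v0 API Chrome shipped at the time (createShadowRoot(); earlier builds prefixed
it as webkitCreateShadowRoot()):

    <div id="widget"></div>
    <script>
      var host = document.querySelector('#widget');
      var root = host.createShadowRoot();
      root.innerHTML =
        '<style>p { color: red; }</style>' +  // applies only inside the shadow tree
        '<p>widget internals</p>';
      // A page-level rule like "p { color: blue; }" won't restyle the
      // widget's <p>, and the widget's rule can't leak out to the page.
    </script>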

A further weakness is precisely the move to presentation-style markup. Modern
web apps are using the document almost as if it were a PostScript environment,
and frankly, that sucks. We are seeing an explosion of "single page apps" that
store their data in private data silos, fetch it via XHRs, and render it into
a div-soup.

The strength of the web was publishing information in a form where the URL
represented the knowledge. Now the URL merely represents a <script> tag that
then fires off network requests to download data and display it after the
fact. Search engines have had to deal with this new world by making crawlers
effectively _execute_ URLs. I find this to be a sad state of affairs, because
whether you agree or not, the effect is to diminish the transparency of
information.

You give me an HTML page, and I can discover lots of content in the static DOM
itself, and I can trace links from that document to other sources of
information. You give me a SinglePageApp div-soup app that fetches most of its
content via XHR? I can't do jack with that until I execute it. The URL-as-
resource has become URL-as-executable-code.

IMHO, the Web needs less Javascript, not more.

~~~
godojo
Both are needed! Javascript is great for the portability of apps that would
otherwise be done in a native environment (you wouldn't want to index these
anyway). Isn't there a standard MIME type to execute JS directly in browsers?
There should be if not. If you care about being searchable and having designs
that are readable on a variety of devices, powerful and degradable markup is
very useful.

------
ChuckMcM
So Google is taking their cues from the Internet Explorer playbook.

Having been in (and out) of standards wars at Sun and NetApp, it really brings
home how challenging doing "standard" stuff is. The original IETF standards
process was really powerful at keeping out cruft: it included "fully
documented" and "two or three interoperable implementations, of which source
code must be available for at least two of them." The goal being that, one,
someone could sit down and write an implementation, and, two, there were at
least two pre-existing implementations they could compare against for
interoperability issues/questions and testing.

But standards break down when it comes to captured market share. If your
market-share capture depends on your "value add", then you don't benefit if
anyone can implement the same thing and you have to stay compatible with them.

~~~
MichaelGG
When did the IETF process change? From looking at SIP (with all its combined
RFCs, it's one of the largest standards), there's plenty of insane cruft.
There's even an RFC that takes delight in coming up with crazy messages that
are technically valid and suggestions on how implementations should try to
guess the intent of the message. SIP goes back to the late 90s, so the process
must have been corrupted by then.

~~~
ChuckMcM
Understand I've got some emotional baggage here[1] :-)

It changed when it became more relevant than ISO; I put it right at the end of
the IP/ISO war, probably 1997, 1998. Basically, up until that point the people
who were motivated to subvert standards efforts pretty much ignored them: they
had their own transport (OSI), their own mail services (X.400), their own
directory services (X.500), and basically everything else. IP and its "hacked
together" stuff was pretty much universally considered "on its way out" by the
"players."

When it became clear that IP wasn't on its way out, and in fact it was the
burdensome and complex OSI standards that were going nowhere, the movers and
shakers switched tactics, invaded the IETF meetings (which did have an
abusable community participation process), and drove the organization off a
cliff in order to preserve their interests.

Really good examples of that were SIP and IPv6, both of which started out
pretty reasonably (because neither the phone companies nor the networking
companies thought either was going to be relevant in the 7-layer OSI world)
until, whoops, that's where the action is, so let's get in there and "fix" it.

[1] I sat in front of 500 engineers and explained how Sun was prepared to hand
over any proprietary interest whatsoever in the RPC and XDR stacks (which had
been "standards track" RFCs from before that term had a more explicit meaning,
and so had sat there implemented by everyone but not officially 'standard'),
only to have Microsoft and the guy they had hoisted out of Apollo/HP, Paul
Leach, invest thousands of dollars in derailing that offering, for no reason
other than to try to resist anything compatible being out there. It was
redonkulous in its pursuit of killing ONC/RPC at every venue. But like I said,
that just made it personal for me.

------
gress
Why is anyone acting the least bit surprised here? Google built Chrome for one
reason and one reason only. So that they could control the development of web
technologies.

This is the tale of the scorpion and the frog playing itself out as usual.

[http://en.wikipedia.org/wiki/The_Scorpion_and_the_Frog](http://en.wikipedia.org/wiki/The_Scorpion_and_the_Frog)

------
fidotron
This is because Google are trying to turn Chrome into an app platform to rival
Cocoa, while almost every other web player (except Mozilla) has a vested
interest in keeping the web a document viewer platform.

The recent arguments regarding Adobe's arbitrary region stuff were very
enlightening on this.

~~~
kenferry
I think you're thinking of entities like "Apple" when it's more useful to
think of "the people working on WebKit at Apple".

WebKit is what they do, all day, every day. They love the web, that's why they
chose to go get a job working on it.

I have never seen any sign that the people working on WebKit don't want the
web to succeed as an app platform. Go hang out in #webkit on IRC and see what
you think.

Edit: oh, right, so what IS going on? Exactly what it looks like: once you
ship an API, as Tab says, it DOES become very difficult to change due to
compatibility requirements. That's why you do API review. People are saying,
"why are you strong-arming this through?" It's not malicious.

I do also have sympathy for Tab. Progress on the web has a committee model,
and committees suck. I don't know how they ever get anything done. Maybe it
would be better if everything just fragmented, at least it'd move faster.

~~~
fidotron
Actions speak louder than words though.

Apple have already integrated and deployed the Adobe CSS regions stuff; Google
say it's just not their priority this year and would have performance
implications on mobile. This is the weasel way to say they don't care, much as
Apple are doing here. The devs may feel otherwise, but they are beholden to
their managers, who will be well aware of the strategic implications of their
work.

Google see Chrome as the best way to bust the iOS App Store, and as such Apple
will mysteriously make the e-books-in-WebKit vision more compelling in the
near future than the in-browser app development one.

I'm not saying I agree with Google here (though more for tech reasons), just
that we need to be clear that the web is experiencing a schism, and that there
are certain subjects which it is hopeless to expect agreement on. This is one
of them.

~~~
kenferry
Are you arguing that if you care about native app platforms you aren't
interested in stuff like the Adobe CSS regions stuff?

One of the biggest changes to Cocoa in iOS 7 was TextKit.
[https://developer.apple.com/Library/ios/documentation/String...](https://developer.apple.com/Library/ios/documentation/StringsTextFonts/Conceptual/TextAndWebiPhoneOS/CustomTextProcessing/CustomTextProcessing.html)

This is concerned with the same sorts of things as regions (though it's more
than that), and developers really, really wanted it.

~~~
fidotron
I'm arguing that different browsers mysteriously prioritise feature
development to be in line with the strategic objectives of the larger
organisations that they are in, and that this inevitably leads to conflict
where those objectives are directly opposed.

------
coldtea
Hah.

Remember when Blink was announced, how some people said "competing engines
would be good for the web"?

And how the Blink team/Google paid lip service to improving the web going
forward, better standards compliance, etc.

Firefox is ahead in some ways, but on the other hand too burdened by BS like a
plugin ecosystem (focus on browsing, damnit) and non-native UI skins.

IE is catching up but limping.

Webkit is not updating that fast anymore.

Opera, nobody but 10 people ever cared about. Besides, they're just Blink now.

Anybody holding their breath for full ES6 support?

~~~
matthewmacleod
This is a shallow misreading of the situation.

 _" competing engines would be good for the web"?_

They will.

_Firefox is ahead in some ways, but on the other hand too burdened by BS like
a plugin ecosystem (focus on browsing, damnit) and non-native UI skins._

Plugin systems are key to a good browsing experience; Chrome includes almost
entirely the same feature set, so your argument is void; and, overall, there's
no evidence that Firefox is particularly burdened by anything.

 _IE is catching up but limping._

IE caught up already, and I don't see any evidence that it's "limping".

 _Webkit is not updating that fast anymore._

Well, there are fewer contributors now. But it's kind of hard to draw any
conclusions about how fast it's updating given how little time has passed
since Blink forked.

_Anybody holding their breath for full ES6 support?_

Yep, we're making good progress, and Firefox 29/Chrome 33 are doing better
than ever. Still work to be done, but the ES6 spec isn't even finished yet, so
that's to be expected.

In other words, chill out. Everything's looking pretty good, there are loads
of excellent browsers available, and they're mostly getting better. It's not
perfect, but what platform is?

~~~
lelandbatey
If I may chime in on this:

    
    
        > IE is catching up but limping.
        IE caught up already, and I don't see any evidence that it's limping.
    

I actually kind of agree, and I'm surprising myself in saying that. Just from
a consumer's perspective, I found at least one place where IE offers something
I directly wanted that couldn't be done in any other browser.

I wanted to watch Netflix in HD on my laptop, but the Silverlight client had
no hardware acceleration, so it played terribly. _Turns out,_ there is a
hardware-accelerated HTML5 version of Netflix that's only usable in IE, since
only IE has the necessary DRM.

I was quite surprised. Ignoring the potential ethical quagmire here, I thought
it was interesting that there was something where IE gave me a better
experience.

~~~
fpgeek
I believe that hardware-accelerated HTML5 version of Netflix is also used on
ChromeOS (since Silverlight isn't an option).

It surprises me a bit to hear that the same doesn't happen with desktop
Chrome.

~~~
makomk
Not quite - it's used on (some) Chromebooks, as in the HTML5 version of
Netflix only runs on locked down hardware from Google partners. Unlock the
hardware so you can run your own software on the Chromebook and the media
decryption module refuses to decrypt anything.

The various content providers seem to have used the advent of HTML5 to insist
on stricter DRM requirements, ones that can only be met through control over
the entire hardware and software stack. I presume Microsoft's version uses the
long-standing GPU support for hardware decryption and acceleration of DRMed
video instead of whatever ChromeOS does. Apparently unlike the Google version
it's possible for other browsers to freely support HTML5 EME that's compatible
with Microsoft's DRM, but naturally only on Windows:
[http://msdn.microsoft.com/en-us/library/windows/apps/dn46673...](http://msdn.microsoft.com/en-us/library/windows/apps/dn466732.aspx)

------
sw87
Every day a little bit closer: Google, the _new_ Microsoft. Not surprising.

~~~
reuven
Last week, I was giving a lecture on HTML5, starting with the origins of the
Web, HTML, and the "browser wars." I indicated that the browser wars were an
attempt by two companies (Netscape and Microsoft) to add tags without checking
with anyone else, in the hopes of getting market share.

As I got to this point in the lecture, I paused, and realized for the first
time that Google is often playing a similar game with Chrome.

Everyone agrees that standards are time-consuming, bureaucratic, and prone to
all sorts of compromises. But the goal of standards isn't the speed of the
process, but rather the inclusiveness of the process. (And yes, you can make a
case for the W3C not being so inclusive...)

It's great that Google is interested in treating Chrome as a laboratory for
new Web technologies, but I think that some added humility would be in order.
It's one thing to say, "We think that this might work well, and are throwing
it out there to see what will stick, keeping the good stuff and throwing out
the bad." But instead, they seem to be saying, "We think that this is good
enough for a standard, never mind the process." And that can't be good for the
Web.

~~~
chc
That is roughly how all progress on the Web has ever happened. The existence
of HTML5 is a testament to that — the W3C was lost in the woods for years and
years trying to make XHTML 2, with nothing to show for it. Meanwhile, the
browsers were coming up with their own ideas. So the WHATWG came along and
specified _what the browsers were already doing_, as well as a few simple
improvements. I think we can all agree HTML5 was a good thing. Well, if you
think so, then you agree that progress coming from the browser implementers is
often a good thing.

The problem in the browser wars was that they were _trying_ to be
incompatible, which is not really the case with Google. (Remember, Microsoft's
model was not "Embrace, extend, evangelize.")

~~~
pcwalton
> The problem in the browser wars was that they were trying to be
> incompatible, which is not really the case with Google. (Remember,
> Microsoft's model was not "Embrace, extend, evangelize.")

This is sometimes the case for Google's technologies as well. PNaCl is pretty
much impossible to embed into other browser engines because of the dependency
on Pepper, which is very Chromium-specific and there has been no attempt at
all that I'm aware of to standardize it. (Pepper even has undocumented stuff
that Flash uses!)

~~~
cromwellian
Honestly, if there had been an attempt to standardize it, what would Mozilla's
reaction have been anyway? Mozilla seems completely invested in JS for
everything and hostile to any competing technology for execution in the
browser. It seems the result would have been pretty much the same.

It's not like Mozilla doesn't do things for expediency when they need the
ability to iterate quickly, e.g. the WebAPIs effectively ignoring the
existence of similar DAP efforts, and then having to turn around and
rationalize them with the previous work. If you need to ship a physical phone
with Firefox OS, and the manufacturers are waiting, are you going to block on
the W3C, or ship with proprietary or un-ratified device APIs?

~~~
bzbarsky
We pretty seriously looked at PPAPI back when Adobe announced the effective
end of life for NPAPI Flash on Linux.

If PPAPI were not as tied to Chrome's internals as it is, we might in fact
have implemented it.

As far as being invested in JS, I think it's more being invested in managed
code. Our experience with NPAPI is that you end up with the unmanaged code
depending on all sorts of implementation details because it can. The classic
example is that all NPAPI Flash instances across the browser have to be in a single
process, because they assume they share memory. And the browser can't do
anything about it, since it's not involved in the memory sharing bits in any
way. Similarly, the fact that NPAPI plug-ins can do their own network access
makes them hard to align with CSP and the like. Managed code can start to
depend on internals in weird ways too (e.g. sync access to cross-origin window
bits), but you have more chances to pull the wool over its eyes in some way
(e.g. blocking cross-process proxies).

Once you accept managed code, JS seems like as good a place to start as any,
with at least the benefit of being there to start with. ;)

There are, of course, obvious drawbacks to the all-JS approach, starting with
the fairly lacking parallelism story. At least now we've grown Workers, and
there's work on things like SIMD, ParallelJS, etc. Then again, the one major
language addition to a browser VM recently (Dart) didn't exactly address this
need either....

~~~
cpeterso
I worked on Adobe's Flash Player team when Chrome was pushing PPAPI. Adobe
engineers were strongly opposed to PPAPI because it was complex and incomplete
(and it was only implemented by Chrome). The only reason a Flash PPAPI plugin
exists for Chrome is because Google did the work.

------
kevingadd
The Blink team seems dead-set on repeating the mistakes they made with the Web
Audio API (shipping an immature, vendor-specific API to the web without third-
party feedback and then forcing it through a standards process, etc). Pretty
frustrating.

~~~
cromwellian
On the other hand, the Mozilla counter-proposal, which was trying to do low-
latency audio by using Javascript as a DSP, was pretty awful. It's like saying
you'll do 3D by handing a framebuffer to JS and doing rasterization in
software.

"Forcing it through the standards process" sounds like the kind of rhetoric
Republicans use when bills they don't like get passed.

~~~
kevingadd
Web Audio API has a similar mechanism for generating samples with JS and it's
just as bad. In fact, Web Audio is _worse_ for playback of JS-synthesized
audio, not better. And lots of use cases demand playback of JS-synthesized
audio: emulators, custom audio codecs, synths, etc.

The Mozilla API did one or two things and did them adequately; it did them by
extending an existing API (the <audio> element). Use cases it didn't cover
could have been handled with the introduction of new APIs or extensions to
existing APIs.
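
For reference, a sketch of that Mozilla API (the Audio Data API, which was
Firefox-only): raw sample playback bolted directly onto <audio>:

    var audio = new Audio();
    audio.mozSetup(1, 44100);              // channels, sample rate
    var samples = new Float32Array(4096);
    for (var i = 0; i < samples.length; i++) {
      samples[i] = Math.sin(2 * Math.PI * 440 * i / 44100);  // 440 Hz tone
    }
    audio.mozWriteAudio(samples);          // queue raw samples for playback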

The Web Audio API introduced an interconnected web of dozens of poorly tested,
poorly specified components that covered some arbitrary subset of cases users
care about. Most of the oversights still haven't been fixed and the spec is
still full of unaddressed issues and deficiencies.

~~~
cromwellian
>Web Audio API has a similar mechanism for generating samples with JS and it's
just as bad. In fact, Web Audio is worse for playback of JS-synthesized audio,
not better. And lots of use cases demand playback of JS-synthesized audio:
emulators, custom audio codecs, synths, etc.

Generating audio samples via JS is an escape hatch, the same way rasterizing
to a raw framebuffer is an escape hatch. If your system has audio acceleration
hardware, and many systems do (e.g. DirectSound/OpenAL), you want to leverage
that.

If you are deploying a game to a mobile device, the last thing you want to do
is waste CPU cycles burning JS to do DSP effects in software. This is
especially awful for low-end phones like the ones that Firefox OS runs on.
Latency on audio is already terrible; you don't want the browser UI event loop
involved, IMHO, in processing and feeding it. Maybe if you had spec'd it as
available to Web Workers, without needing the UI thread involved, it would
make more sense.
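
For context, the JS escape hatch being discussed is Web Audio's
ScriptProcessorNode, whose callback runs on the main thread; a sketch:

    var ctx = new AudioContext();  // webkitAudioContext in Chrome at the time
    var node = ctx.createScriptProcessor(4096, 1, 1);
    node.onaudioprocess = function (e) {
      // runs on the UI thread, so page jank becomes audio glitches
      var out = e.outputBuffer.getChannelData(0);
      for (var i = 0; i < out.length; i++) {
        out[i] = Math.random() * 0.1;  // toy DSP: quiet noise
      }
    };
    node.connect(ctx.destination);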

>The Web Audio API introduced an interconnected web of dozens of poorly
tested, poorly specified components that covered some arbitrary subset of
cases users care about

The arbitrary subset being those that have been on OS X for years? AFAIK,
Chris Rogers developed Web Audio based on years of experience working on Core
Audio at Apple, and therefore it represents at least some feedback from the
use cases of the professional audio market: at the very least, Apple's own
products like GarageBand and Logic Express, which sit atop Core Audio.

You assert that the other use cases could have been handled by just extending
existing APIs, but to me this argument sounds like the following:

"You've just dumped this massively complex WebGL API on us, it has hundreds of
untested functions. It would be better to just have <canvas> or <img> with a
raw array that you manipulate with JS. Any other functions could be done
[hand-wave] by just bolting on additional higher level apis. Like, if you want
to draw a polygon, we'd add that."

APIs like Core Audio, DirectSound, and OpenGL have evolved because of low-
level optimization to match the API to what the hardware can accelerate
efficiently. In many cases it's bidirectional, so that HW influences the API
and vice versa. Putting the burden on the WG to reinvent what has already been
done for years is the wrong way to go about it. Audio is a solved problem
outside JS; all that was needed was good idiomatic bindings for Core Audio,
OpenAL, or DirectSound.

Whenever I see these threads on HN, I always get a sense of a big dose of NIH
from Mozilla. Whenever anyone else proposes a spec, there's always a complaint
about complexity, like Republicans complaining about laws because they are too
long when, in reality, they don't like them anyway, for ideological or
political reasons.

Mozilla is trying to build a web-based platform to compete with native apps.
You can see it in asm.js and Firefox OS. And they are not going to get there
if they shy away from doing the right things because they are complex. Mobile
devices need all the hardware acceleration they can get, and spec'ing out a
solution that requires JS to do DSP sound processing is just an egregious
waste of battery life and cycles, IMHO, on low-end web-OS based mobile
hardware.

~~~
_delirium
> Generating audio samples via JS is an escape hatch, the same way rasterizing
> to a raw framebuffer is an escape hatch.

But that escape hatch is where all the interesting innovation happens! It's
great that canvas exists, and it's much more widely used than WebGL is,
because it's more flexible and depends on less legacy cruft. You don't _have_
to use it, but I do, and I'd like a "canvas for audio" too.

To make matters worse, the Web Audio stuff is much less flexible than OpenGL.
You can at least write more or less arbitrary graphics code in OpenGL: it's
not just an API for playing movie clips filtered through a set of predefined
MovieFilter nodes. You can generate textures procedurally, program shaders,
render arbitrary meshes with arbitrary lighting, do all kinds of stuff. If
this were still the era of the fixed-function OpenGL 1.0 pipeline, it'd be
another story, but today's OpenGL at least is a plausible candidate for a
fully general programmable graphics pipeline.

Web Audio seems targeted more at just being an audio player with a fixed chain
of filter/effect nodes, not a fully programmable audio pipeline. How are you
going to do real procedural music on the web, something more like what you can
do in Puredata or SuperCollider or even Processing, without being able to
write to something resembling an audio sink? Apple cares about stuff like
Logic Express, yes, but that isn't a programmable synth or capable of
procedural music; what I care about is the web becoming a usable procedural-
music platform. One alternative is to do DSP in JS; another is to require you
to write DSP code in a domain-specific language, like how you hand off shaders
to WebGL. But Web Audio does the first badly and the second not at all!

> Audio is a solved problem outside JS

Yeah, and the way it's solved is that outside JS, you can just write a synth
that outputs to the soundcard...

~~~
cromwellian
WebGL is less used than Canvas for the most part because 3D and linear
algebra are much more difficult for most people to work with than 2D. Also,
people work with raw canvas image arrays much more rarely than they do with
the high-level functions (stroke/fill/etc.).

OpenGL was still a better API than a raw framebuffer even when it was just a
fixed function pipeline. Minecraft for example is implemented purely with
fixed-function stuff, no shaders. It isn't going to work if done via JS
rasterization.

Yes, there are people on the edge cases doing procedural music, but that is a
rare use case compared to the more general case of people writing games and
needing audio with attenuation, 3D positional HRTF, doppler effects, etc.
That's the sweet spot that the majority of developers need. Today's 3D
hardware includes features like geometry shaders/tessellation, but most games
don't use them.

OpenSL/AL would work a lot better if it had "audio shaders", yes. But if your
argument is that you want to write a custom DSP, then you don't want the Audio
Data API; what you want is some form of OpenAL++ that exposes an architecture-
neutral shader language for audio DSPs, one that actually compiles your shader
and uploads it to the DSP. Or you want OpenCL, plus a pathway to schedule
running the shaders and copying the data to the HW that does not involve the
browser event loop.

That said, if there was a compelling need for the stuff you're asking for, it
would have been done years ago. Neither the professional apps nor the game
developers have been begging for Microsoft DirectSound, Apple, or Khronos to
make audio shaders. There was a company not too long ago, Aureal 3D, which
tried to be the "3dfx of audio", but failed; it turns out most people just
need a set of basic sound primitives they can chain together.

I have real sympathy for your use case. For years, I dreamed of sounds being
generated in games a la PhysX, really simulating sound waves in the
environment, and calculating true binaural audio, the way Oculus Rift wants to
deliver video to your senses, taking into account head position. To literally
duplicate the quality of binaural audio recordings programmatically.

But we're not there, the industry seems to have let us down, there is no SGI,
nor 3dfx, nor Nvidia/AMD "of audio" to lead the way, and we certainly aren't
going to get there by dumping a frame buffer from JS.

Right now, the target for all this stuff (WebGL, Web Audio, et al.) is
exposing APIs to bring high-performance, low-latency games to the web. I just
don't see doing attenuation or HRTF in JS as compatible with that.

~~~
_delirium
I agree that for games the market hasn't really been there, and they're
probably served well enough by the positional-audio stuff plus a slew of semi-
standard effects. And I realize games are the big commercial driver of this
stuff, so if they don't care, we won't get the "nVidia of audio".

I'm not primarily interested in games myself, though, but in computer-music
software, interactive audio installations, livecoding, real-time algorithm and
data sonification, etc. And for those use cases I think the fastest way
forward really just is: 1) a raw audio API; and 2) fast JS engines. Some kind
of audio shader language would be even better perhaps, but not strictly
necessary, and I'd rather not wait forever for it. I mean to be honest I'd be
happy if I could do on the web platform today what I could do in 2000 in C,
which is not that demanding a level of performance. V8 plus TypedArrays brings
us pretty close, from just a code-execution perspective, certainly close
enough to do some interesting stuff.

Two interesting things I've run across in that vein that are starting to move
procedural-audio stuff onto the web platform:

* [http://charlie-roberts.com/gibber/info/?page_id=6](http://charlie-roberts.com/gibber/info/?page_id=6)

* [http://www.bfxr.net/](http://www.bfxr.net/)

There are already quite a few interactive-synth type apps on mobile, so mobile
devices _can_ do it, hardware-wise. They're just currently mostly apps rather
than web apps. But if you can do DSP in Dalvik, which isn't really a speed
demon, I don't see why you can't do it in V8.

Edit: oops, the 2nd one is in Flash rather than JS. Take it instead as an
example of the stuff that would _be nice_ to not have to do in Flash...

------
donrhummy
Shadow DOM is a very, very bad idea.

[http://glazkov.com/2011/01/14/what-the-heck-is-shadow-dom/](http://glazkov.com/2011/01/14/what-the-heck-is-shadow-dom/)

It kills cross-browser compatibility and kills standards (since they're
unreachable, undocumented elements that can handle input and interaction and
affect other elements).

~~~
badman_ting
It's starting to dawn on me what this will _really_ be used for, and I can't
say I'm stoked about it. But the article you've linked to seems pretty upbeat
about the whole thing.

~~~
grayrest
Author of the article is the spec author.

------
chucknelson
Any "insiders" have comments on this? Is this the Chrome team just trying to
keep pushing forward, or are they technically sticking to the referenced
compatibility guidelines they published?

I'm just wondering if there has been a lack of progress with Shadow DOM and
their solution is just go-go-go to get things moving.

~~~
bzbarsky
From the shadow DOM spec author:
[https://groups.google.com/a/chromium.org/forum/#!msg/blink-d...](https://groups.google.com/a/chromium.org/forum/#!msg/blink-dev/ay9tVGRa8Rg/jPv0j06mP7EJ)

There's been a lack of progress in that it's taking a while, but there are a
bunch of still-open unresolved issues... The problem with big complex
features. ;)

------
rkangel
Forget the technical content, or the political point here - that thread is
worth reading just as an excellent example of how to conduct a civilised
discussion even in a situation with serious potential for strife.

Of particular note is how, even though things do get a bit stressed in the
middle, everyone very carefully allows the discussion to calm down and
restates their points with no emotional content.

------
youngtaff
Rather than link to a comment from someone at Apple part way down the thread
and start 'burn the witch' style threads on HN, perhaps the OP should have
started at the beginning -
[http://lists.w3.org/Archives/Public/www-style/2014Feb/0032.h...](http://lists.w3.org/Archives/Public/www-style/2014Feb/0032.html)
- and included the Blink thread that discusses Google's position and issues
in more detail -
[https://groups.google.com/a/chromium.org/forum/#!topic/blink...](https://groups.google.com/a/chromium.org/forum/#!topic/blink-dev/ay9tVGRa8Rg)
- read Adam Barth's comments if nothing else.

------
bsaul
Reading about Google shipping unpolished things too fast in Chrome makes me
smile, since I just decided to stop using it as of yesterday, after realizing
how bloated that software has become (CPU- and RAM-wise). Wonder if that's a
coincidence.

~~~
alimoeeny
What do you use instead that has a better RAM and CPU footprint?

~~~
bsaul
Well, actually Chrome was consuming about 1 gig of RAM (if you count all the
"Chrome Helper" processes) and a constant 2% CPU when I had absolutely no page
open. So pretty much anything else is better (I'm on Safari right now and it
feels extremely light). Note that it's not just me:
[https://productforums.google.com/forum/#!topic/chrome/y8VTVH...](https://productforums.google.com/forum/#!topic/chrome/y8VTVHmkuzU)

------
onethree
"hi relevant standards authority, you're not working fast enough for our
liking, so we're doing what we wan't and if you don't like it, too bad" if
this was anyone but google, there'd be a shitstorm...

------
ndesaulniers
I was happy to see this from the link in the email:

Vendor Prefixes: Historically, browsers have relied on vendor prefixes (e.g.,
-webkit-feature) to ship experimental features to web developers. This
approach can be harmful to compatibility because web content comes to rely
upon these vendor-prefixed names. Going forward, instead of enabling a feature
by default with a vendor prefix, we will instead keep the (unprefixed) feature
behind the “enable experimental web platform features” flag in about:flags
until the feature is ready to be enabled by default. Mozilla has already
embarked on a similar policy and the W3C CSS WG formed a rough consensus
around a complementary policy.
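
The hazard the policy describes, in miniature: content written against the
experimental name keeps working only as long as the prefix is supported:

    /* shipped experimentally, then hard-coded by web content */
    .box { -webkit-transform: rotate(45deg); }

    /* the standardized form; pages that only wrote the prefixed
       version break in any browser that drops (or never had) it */
    .box { transform: rotate(45deg); }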

I remember people getting worried that app developers might tell users to dig
through their about:config equivalent to access the "full version" of a site.
Glad that things are moving away from vendor prefixes. Kind of sad to see Web
Audio still vendor-prefixed (and the O'Reilly book shipped with the old API,
lol!), but I assume that was implemented before the WebKit/Blink split.

~~~
erichocean
_Going forward, instead of enabling a feature by default with a vendor prefix,
we will instead keep the (unprefixed) feature behind the “enable experimental
web platform features” flag in about:flags until the feature is ready to be
enabled by default._

For the features we're talking about (JS methods and CSS stuff), doing so
will mean that effectively no web authors use them. It's no different than
requiring a plugin to view content. Anything hidden behind a feature flag
might as well not exist, at least for those two categories.

For dev tools, stuff like that? Sure, feature flags are the way to go.

------
lmm
Funny how the biggest opponents of this kind of thing seem to also be the
biggest fans of Javascript, which was developed by freezing whatever syntax
Netscape came up with in three days. If this had been the process back in 1990
the web would never have got off the ground.

~~~
MDCore
It sounds like you're trying to say that fans of JavaScript don't like
freezing syntax, but that this is disingenuous because JavaScript is itself
the result of a frozen syntax thrown together in three days. Is that accurate
so far? So... is freezing something and throwing it out there good? Or is
finding broad consensus good? I can't tell which approach you're arguing for.
And which "this" process are you saying didn't happen back in 1990?

------
josteink
I've said it before and I'll say it again:

Chrome is the worst thing that has happened to the web, because of what it
allows Google to do.

MSIE ruined and fragmented the web with non-standards because it was in
Microsoft's interest at the time to hinder successful adoption of the web.

Google, however, is fundamentally a web company, and now they are using Chrome
to shoehorn anything which suits their company onto the open web without a
standards committee in sight, or accepting any sort of reasonable feedback.

This breeds monsters like HTTP2.0^W SPDY^W technogizzwizz kitchensink internet
everything protocol.

This is much worse than anything MSIE ever did.

------
ollysb
On another note, how great is it going to be to have components you can reuse
between Angular, Ember, jQuery and the like? A web-components.com repository
would be awesome!

~~~
josteink
And wouldn't it be awesome if multiple parties were able to read through a
_draft_ and make comments on it and provide constructive feedback before it
was finalized and pushed into production on the web, to be supported for all
eternity?

That way we could have a _good_ implementation of this whole thing.

Sadly, so far everything except the "rush this unfinished thing into
production" part is missing.

Obviously Google only sees their own implementation, thinks "all is good", and
doesn't give a rat's ass about anyone else and their opinions, but here on HN
we should have a wider perspective.

------
RyanMcGreal
A bit late to the party, but this seems relevant:

"[W]hy do we have an <img> element? Why not an <icon> element? Or an <include>
element? Why not a hyperlink with an include attribute, or some combination of
rel values? Why an <img> element? Quite simply, because Marc Andreessen
shipped one, and shipping code wins."

[http://diveintohtml5.info/past.html](http://diveintohtml5.info/past.html)

------
tiglionabbit
Oh, it's about Hat and Cat. I first saw these at a talk on Shadow DOM and the
future of the web by Rob Dodson.

[http://robdodson.me/blog/2013/11/15/the-cat-and-the-hat-css-...](http://robdodson.me/blog/2013/11/15/the-cat-and-the-hat-css-selectors/)

------
cromwellian
The current div-soup approach with massive JS enhancement is the actual bad
idea. It abuses the document model, makes information less transparent and
less semantically meaningful, and makes it hard to share and reuse components
between sites due to leakage. It also makes debugging and reading the DOM in
an inspector far messier.

Scoped CSS and Shadow DOM clear up the leakage problems and introduce proper
encapsulation to the web beyond heavyweight IFRAMES. The templating system
lets documents expose information in transparent ways again, rather than
executing a big ball of JS just to get the DOM in a state that is meaningful.
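
A sketch of what that looks like with the v0 APIs in question (<template>
plus createShadowRoot(); names here are illustrative):

    <!-- the data stays visible in the document... -->
    <div id="card">Ada Lovelace</div>

    <!-- ...and the component's structure is inert, inspectable markup -->
    <template id="user-card">
      <style>h2 { margin: 0; }</style>   <!-- scoped to the shadow tree -->
      <h2><content></content></h2>       <!-- v0 insertion point -->
    </template>

    <script>
      var host = document.querySelector('#card');
      var root = host.createShadowRoot();
      root.appendChild(
        document.querySelector('#user-card').content.cloneNode(true));
    </script>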

Web Development has been fundamentally broken for ages. The basic ideas to fix
it have been discussed for years. Every day delayed is a day for more native
encroachment.

------
CmonDev
Nothing wrong with trying to do something different. Committees are too slow.

