
Why Your Desktop Won't Be Running Mir/Wayland Anytime Soon - munchor
http://shnatsel.blogspot.ru/2013/03/why-your-desktop-wont-be-running.html
======
green7ea
This post seems to be slightly misinformed about Wayland. I can't say much
about Mir as I'm not as familiar with it.

1\. You can use OpenGL, OpenGL ES, software rendering, etc. to render windows
in Wayland, and most likely in Mir.

2\. 2D acceleration depends on the widget toolkit, not on Wayland. Wayland
doesn't do 2D acceleration, nor does it need to, since it is only responsible
for compositing windows (which is just as fast when done in 3D). If 2D
acceleration is used to draw those windows, it isn't affected by the use of
Wayland or X11 in any way.
Wayland has a public mailing list that has much more reliable info than
Phoronix and LWN.

3\. This is probably the most interesting point raised. Providing xrandr
functionality in Wayland is much easier than in X11, because X11 has to
support a lot of legacy features that Wayland doesn't.

4\. (I added this one.) Throughout the article, he seems to assume that no
drivers will support Wayland. Wayland was built to reuse most of the X11
drivers by way of DRI2. Mir seems to take this one step further by being able
to leverage existing Android drivers (there was a prototype of this in Wayland
a while back).

I would like to finish by saying that I agree with the premise but not for the
reasons given. The reason you won't see Wayland and Mir in a distro this year
is because X11 is tried and tested. You don't replace something that critical
to the user experience on a whim: you make sure it works and works for
everyone first. Having said that, you will see Wayland and/or Mir faster than
you'd expect.

~~~
beatgammit
I completely agree. Some interesting factoids:

1\. Arch already has Wayland in its official repositories.

2\. KDE already has preliminary support

3\. Intel is driving Wayland development, so integrated cards will likely work
without a hitch

4\. Systemd was another large change with lots of FUD, and it's already
default in high-profile distros (OpenSUSE, Fedora and soon RHEL 7). Even
Canonical's on board.

This article is mostly FUD. I think we'll see a relatively stable KDE on
Wayland this year. I doubt the Wayland team is that optimistic, but it's
moving at an incredible pace.

------
rayiner
Speed, to a certain extent, doesn't matter. What matters is intelligent
drawing. Double-buffering updates, not showing partially laid-out screens,
etc. Software rendering is plenty fast for the relatively simple drawing that
makes up a typical application's user interface, and EGL does support the most
essential bit of hardware acceleration--bit blit (using the image as a
texture).

~~~
randomfool
For touch interfaces the lag between input and render is critical, and
double-buffering, done wrong, can kill you.

Software-rendering an opaque rect may be fine, but add in some big
semi-transparent overlays and performance drops.

~~~
rayiner
Only to the extent that it's not fast enough, and it clearly is fast enough,
given that iOS doesn't hardware-accelerate CoreGraphics. Anything done wrong
can be bad, but flicker is really, really bad. Pretty much only Apple gets it
right in that regard (maybe Jelly Bean; I haven't used it much).

~~~
pcwalton
Apple gets it right because they cache prerendered textures so aggressively on
the GPU. In this regard software rendering versus hardware rendering doesn't
really matter much, since actual rendering happens so infrequently.

------
JonnyH
Point 1 is incorrect: you can use full-desktop (non-ES) OpenGL with EGL with
no problems - just pass 'EGL_OPENGL_API' to eglBindAPI.

I don't know how good the support for this is across different drivers,
however.

~~~
azakai
There is also Regal, a library that implements GL on top of GLES2. So it
should still be possible even with only GLES2.

------
wmf
AFAIK, OS X has been running fine (not super-fast, but acceptable) without 2D
acceleration for 13 years.

Also, doesn't Compiz still have some tearing or artifacting during window
resize? Wayland is supposed to eliminate that.

~~~
monkeyfacebag
I'm confused, I thought 2D acceleration was introduced with 10.4. I'm not a
Mac user, so I'm not sure, but Wikipedia
(<http://en.wikipedia.org/wiki/Quartz_%28graphics_layer%29>) seems to agree.

~~~
wmf
"Quartz 2D Extreme ... still remains disabled by default" Somebody correct me
if this is wrong.

~~~
glhaynes
IIRC, Quartz 2D Extreme (later referred to as QuartzGL) had to do with
hardware-assisted rendering/compositing of things like text and certain
primitives and, afaik, most of that is still not turned on in shipping OS X
and likely never will be. (Again, iirc: it turned out to not provide much
benefit compared to the substantial trouble of making the output the same
between different video cards.) Perhaps some of that has moved to OpenCL to
get the same effect but from a more appropriate layer? I really don't know.

But when you drag a window around or invoke Mission Control, etc, all of that
is "Quartz Extreme" (a marketing label that hasn't been used much in a while),
which is hardware-assisted and has been since 10.2. Is rendering of the shadow
cast by each window accelerated, and if so, how? Again, not sure.

~~~
wmf
I'm pretty sure that I'm using the same definition of 2D acceleration as the
original poster. We're not talking about compositing.

~~~
gilgoomesh
EXA and XRender (the 2D acceleration in X.org that the original poster is
talking about) are just 2D compositing. They are not about drawing line
primitives on the graphics card.

No mainstream 2D rendering system (not X.org, not Mac OS Quartz, not Windows
GDI/WPF+DWM) uses any graphics acceleration to draw 2D primitives. In some
cases, drawing may be done to graphics card memory (to avoid needing to copy
to graphics memory later) but that's it (see CALayers on the Mac). The lines
and other primitives are all CPU determined. This is still just a compositing
layer.

The intractable performance problems of the Mac's Quartz 2D Extreme are
exactly _why_ drawing is always done by the CPU -- it is actually faster to do
this work on the CPU. You'd think a graphics shader might help, but
performance metrics prove otherwise.

The article itself is badly confused (on this point and others). X11/X.org has
no advantage in this regard.

~~~
mfunk
"No mainstream 2D rendering system (not X.org, not Mac OS Quartz, not Windows
GDI/WPF+DWM) uses any graphics acceleration to draw 2D primitives."

That isn't entirely true. Windows has Direct2D for graphics and DirectWrite
for text, after all.

Direct2D and DirectWrite first shipped with Windows 7 (and were backported to
Vista) on top of Direct3D 10. Windows 8 extended them using new features found
in Direct3D 11.

You can see the high performance and fluid animations they bring in many of
the Metro applications. Desktop applications need to migrate their existing
code, of course; IE and Firefox have already done so.

[http://blogs.msdn.com/b/b8/archive/2012/07/23/hardware-accelerating-everything-windows-8-graphics.aspx](http://blogs.msdn.com/b/b8/archive/2012/07/23/hardware-accelerating-everything-windows-8-graphics.aspx)

~~~
anonymous
And there's no reason we can't have the same kind of library on Linux, except
one that would render using OpenGL. In fact, if I'm not mistaken, Qt already
supports rendering using OpenGL.

------
chipsy
I think the author mostly has a point with regard to backwards compatibility;
however, mobile app development is already centered on OpenGL ES graphics, so
I don't see it as a major stumbling block for new code. Of the three points he
made, the shift from XRandR to mode-setting sounds like the most potentially
troublesome.

There was a presentation on the mode-setting architecture at FOSDEM in
February; I'm going to look at it now:
[http://www.phoronix.com/scan.php?page=news_item&px=MTI5Nzc](http://www.phoronix.com/scan.php?page=news_item&px=MTI5Nzc)

------
qwerta
I think the article is completely missing the point. For now Mir is going to
use the X server as a backend. The priority for Canonical is to create a
single API portable between Android and the Linux desktop. Sure, the Android
compositor will eventually replace the X server, but that will be just a side
effect.

~~~
sciurus
I think everything you said is incorrect.

<https://wiki.ubuntu.com/MirSpec>

~~~
drivebyacct2
It is. I'm looking forward to having to deal with people talking out their
ass for months, like they did and continue to do about SecureBoot. If you
don't understand it, don't comment. It's kind of like how the anti-Wayland
falsehoods from the MirSpec doc continue to be parroted: started from
ignorance and perpetuated by it.

------
Zigurd
Re OpenGL: if I understand this <https://wiki.ubuntu.com/MirSpec> correctly,
libhybris is what is currently used to cobble together the use of
SurfaceFlinger as a compositor for Ubuntu Touch. Ubuntu intends to pull this
compatibility with Android's HAL (more specifically, the GPU drivers) into
Mir, which will make Mir binary-compatible with GPU drivers written for
Android, which support OpenGL.

Mir has specific goals for hardware and binary driver compatibility and
Canonical has demonstrated proof of concept by using SurfaceFlinger as a
compositor for Ubuntu Touch.

~~~
wmf
AFAIK Android drivers support only OpenGL ES, not regular OpenGL; that's the
problem he's talking about. As similar as they may be, apps written for one
will not run on the other.

------
cllns
Screen rotation is probably essential, but is screen resizing? I'm not sure it
is. I think we're all just used to it.

~~~
zanny
But this just points to an inherent trait of most software: if you are trying
to replace something older, you had better _at least_ have feature parity,
plus some killer feature to drive adoption.

In Wayland, I can easily see the ability to run an X server on top as a
motivator, but in the end the killer feature needs to be performance and a
tide of new software targeting EGL / WaylandInput instead of GLX / XInput.

~~~
npsimons
_if you are trying to replace something older, you better at least have
feature parity_

Which is why there has been so much fuss. I haven't kept up, but has the
Wayland project changed their tune on networkable graphics? I will grant that
X could use some uncrustifying, but I think the general gist of the OP is
correct: there is just so much that X offers that Wayland and Mir _will_ have
to catch up to (assuming they can, which requires not just time and effort,
but forward thinking on design and architecture).

And as for performance: my N900, a _four-year-old phone_, runs X just fine.
The sad experience I have had with software (including OSS) is that newly
born projects are _almost always_ designed without paying attention to
efficiency, usually through a lack of awareness, because people are so used to
fast hardware these days.

~~~
zanny
> has the Wayland project changed their tune on networkable graphics

You mean "provide a flag in ssh to forward a rendered window?"

They were talking about having hooks so a VNC server could act as a compositor
for windows that it can forward over the VNC protocol. That would be a much
better solution than raw pixel buffers like X forwarding had.

~~~
vsync
X forwarding is just a tunnel for the normal X protocol, which is both higher-
level and provides much more functionality than a raw pixel buffer.

~~~
zanny
What extra functionality is available in networked X that is not available
over a VNC pipe?

------
drivebyacct2
That's weird; Weston runs fine on my desktop with nouveau, and with Intel on
my MacBook Air. Oh, and non-Xlib GTK apps too. And apparently Chrome as well,
though I haven't had a chance to confirm that yet.

