Ubuntu may switch to Android technologies to keep the Linux desktop competitive (soltesza.wordpress.com)
94 points by mtgx on March 4, 2013 | 90 comments

There seems to be little worry about the direction Linux is heading with Android - and, to a lesser extent, the Raspberry Pi - both of which pretty much tie folks down to one version of the kernel, since all of the drivers are binary-only.

Ubuntu phone will support the "Android kernel" (as one Canonical guy called it on a recent podcast), since only Android builds of the Linux kernel work with the binary-only drivers that the manufacturers supply. This is just another form of Tivoization. You have the source, but you can't really change anything.

It seems to me that the infamous "Linux in a binary world" doomsday scenario is becoming more of a reality every day https://lwn.net/Articles/162686/ - but instead of servers vs consumer PCs, it is being played out on phones, tablets and laptops.

It is not Tivoization. Tivoization is when device manufacturers deliberately lock down their devices so the user cannot change the code even though it is open source.

What's the difference whether your kernel doesn't boot due to signature checks or due to lack of essential drivers?

You might be able to write the drivers without running afoul of the law. If it's a signature check, breaking it could be construed as a DMCA violation (in the US, probably some other law in friendly countries). Neither one's great, but there is a difference.

This is the Linux kernel developers' fault to begin with. Microsoft does not break the driver API for every minor update of the kernel. In fact, my Radeon X1900 runs under Windows 7 with Windows Vista drivers. It runs StarCraft II and a bunch of newer FPS games just fine.

On the other hand, Linux is a disaster. The kernel devs are so determined to break binary compatibility that I haven't been able to run ATI's proprietary binary drivers for years. AMD was a good open source citizen and released the specs. And yet the open source drivers for my card are useless for anything but 2D.

I am thankful that Google came along and is forcing a standard in the kernel. I welcome this "binary world".

The approach isn't without merit. Promoting open source is important. If there was a stable binary driver interface we would have far fewer open source drivers. With open source drivers we have added security and the ability to fix bugs. Not to mention the ability to maintain them ourselves if the need arises.

I've heard it argued that it's for this reason that Linux is more successful than other more liberally licensed open source kernels like the BSDs.

Regarding your Radeon card: I hold Linux support very highly in my requirements when buying new hardware. For this reason I've always bought Nvidia cards over ATI. You can't just buy anything and expect it to work flawlessly. Saving $30 on the purchase of a graphics card has hidden costs. ATI can't even make a decent Windows driver, let alone a Linux one.


I'm looking for a tablet device where the graphics drivers are open source, or there are decent reverse engineered open source drivers. So far the list is very short: buy Intel HD graphics hardware.

The problem you raise is exactly why I'm so picky for graphics accelerators: they are most often what is stopping me from upgrading the kernel on a system, and I want to upgrade the kernel/distribution.

I'm not sure about the state of open-source AMD drivers, so if anyone can testify about their graphics drivers, I would appreciate it.

From what I've heard, the radeon driver works well most of the time on hardware that isn't just a few weeks old, with slightly better 2D performance and slightly worse 3D performance than the proprietary driver.

I think there's also the Mali-400 chip in the "decent reverse-engineered" department, if we're talking about tablets.

AMD has made serious commitments to their open-source drivers. They've released documentation for their chips and have several people working full-time on the OS drivers.

As for supported chips, everything up to "Northern Islands", the last generation, works really well. The latest "Southern Islands" chips also work well but don't have all the features implemented yet.

I use an HD7970 in my workstation. 2D works great and is very fast. 3D is pretty slow, b/c many things aren't accelerated yet. So, for example, Minecraft runs at about 20fps.

The next generation of AMD GPUs, "Sea Islands", will use the same drivers as "Southern Islands", so the former will be supported as soon as the latter is.

Probably the biggest issue is power management. dynpm dynamically changes the clock but doesn't work with multiple heads. The "profile" power management flickers when changing the clock speed.
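For reference, on kernels of that era the radeon KMS driver exposed these two power-management methods through sysfs. A sketch, assuming the card shows up as card0 (this is system configuration, so it needs root, and the path may differ on your setup):

```
# Select profile-based power management (the method that can flicker
# when the clock changes):
echo profile > /sys/class/drm/card0/device/power_method
echo low > /sys/class/drm/card0/device/power_profile

# Or select dynamic reclocking (dynpm), which doesn't play well with
# multiple heads:
echo dynpm > /sys/class/drm/card0/device/power_method
```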

The reason for the incomplete "Southern Islands" support is that it required many changes that couldn't be released incrementally. Better support is expected when the "Sea Islands" chips come out. Check out this post by an AMD developer for a little more detail: http://phoronix.com/forums/showthread.php?73824-Bridgman-Is-...

Check out http://www.x.org/wiki/RadeonFeature for a table of chips and features and information about mapping engineering names to board marketing names.

If you have never experienced kernel mode setting (KMS) and in-kernel drivers, I would _highly_ recommend them. It's a much better experience. It's so smooth that switching from Xorg to a VT doesn't flicker. And having to upgrade proprietary drivers is a huge pain. I could never go back to UMS and proprietary drivers.

As for my recommendations, if the Intel HD3000 or HD4000 meets your 3D needs, they work extremely well and have great in-kernel drivers.

If you need faster open-source 3D or a discrete card, go for a "Northern Islands" Radeon. bridgman recommends a mid-range card for the biggest bang-for-the-buck with the open-source drivers. The high-end cards won't be slower, but the drivers aren't as optimized to get the highest performance out of them.

If you are patient and like submitting bug reports you should get a HD7970 ;)


Oh, and if you need OpenCL on the GPU, Intel doesn't support it yet. I believe it works on AMD "Northern Islands" and earlier.

For more info check out: http://software.intel.com/en-us/articles/opencl-sdk-frequent...



Dunno about new Radeons, but my experience with older series is very poor.

HD3000 - a very common IGP, yet whenever I enable it there are GPU lockups every few days. This has gone on for years, on two motherboards.

AGP Radeon 9600 - occasional crashes with bad kernel/X combinations.

The 9600 works fine on "good" driver versions. HD3000 boxes are unusable without some old 8-series GeForces running under nouveau.

And now, Intel. I've been using a laptop with GMA950 IGP for almost five years. Never a single crash, except for one time when I enabled KMS long before it was cool and everybody's dog was using it. And nowadays this driver runs some Wind0ze games faster than genuine XP on the same hardware.

EDIT: I've just checked that this Intel KMS bug was happening only on some -rc release.

What versions of Linux, Mesa, and Xorg were you having those HD3000 and Radeon 9600 problems on? The "radeon" driver is completely different from "radeonhd" and the old reverse-engineered Xorg ati driver from a few years ago.

And just to be clear, anywhere I said "HD3000" I was referring to the integrated graphics chip in various Intel processors, not the AMD chip.

The past couple of years AMD has directly supported driver development and the drivers have greatly improved.

9600: no idea, it sits in my parents' box, 300km away and no ssh.

3000: 760G chipset, many (most? all?) Linux/Mesa/Xorg versions released in the last two years. Completely random lockups, usually after a few days of uptime. I no longer use one of these mobos, so I'll have a look at this issue if I find a HDD to boot it from.

I have been running HD3000/HD4000 for years and never had a crash or lock up. You're not talking about AMD's Radeon HD 3k/4k are you? Because I believe both dmm and I are talking about Intel's HD range.

Thanks for such an in-depth reply! Very helpful :)

> (as one Canonical guy called it on a recent podcast)

Do you remember which Podcast that was on? Sounds interesting.

Yep, it was the most recent Ubuntu UK podcast http://podcast.ubuntu-uk.org/

But you don't need SurfaceFlinger to get multicontext GL out of Android GPU drivers... You only need to use the "gralloc" buffer allocator and give EGL an address from there and you can start rendering to the framebuffer or offscreen.

SurfaceFlinger does handle some things like allowing windows to draw directly to the framebuffer when they're unoccluded, but the Linux guys don't care about that (or they would unredirect frontmost windows in their X compositors, which still nobody does, ugh!).

SurfaceFlinger is also pretty immature and featureless for a desktop surface compositor:

1. Only handles 2x2 transforms (+translation), which it expands internally to 4x4 GL matrices.

2. Written entirely around GLESv1 -- no shaders, no complex geometry: no "genie effect" or anything like that.

3. Immature multi-display support. It was literally just added in Android 4.2. Try taking a screenshot of anything that's not display 0.

4. No IPC system, unless you count Binder, which has its own set of problems ("let's make every message be synchronous").
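To illustrate point 1, here is a toy sketch (in Python, not SurfaceFlinger's actual C++ code) of how a 2x2 linear transform plus a translation embeds into the kind of 4x4 homogeneous matrix GL works with:

```python
def expand_to_4x4(m2, tx, ty):
    """Embed a 2x2 linear transform plus an (x, y) translation into a
    4x4 homogeneous matrix (row-major here, for readability)."""
    (a, b), (c, d) = m2
    return [
        [a,   b,   0.0, tx],
        [c,   d,   0.0, ty],
        [0.0, 0.0, 1.0, 0.0],
        [0.0, 0.0, 0.0, 1.0],
    ]

# A 90-degree rotation combined with a (10, 20) translation:
m = expand_to_4x4(((0.0, -1.0), (1.0, 0.0)), 10.0, 20.0)
```

Anything expressible this way is limited to rotation/scale/shear plus translation in the plane, which is why effects needing real 3D geometry or perspective fall outside what such a compositor can do.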

I haven't kept up with Wayland, but I imagine it has a lot of these problems addressed (except for allowing unoccluded windows direct framebuffer access -- almost nobody bothers with that one, even though it's a big win).

Also, someone at Canonical wrote a library, "libhybris", which supposedly can load GL implementations compiled against bionic on a glibc system -- this seems like a much more sensible approach to me...

I'd agree that SurfaceFlinger is pretty minimal as a compositor. However, because of the power/performance limitations of mobile devices, you're losing if you're compositing with the GPU rather than, say, overlays, which SurfaceFlinger does facilitate with the HW Composer (as I'm sure you know).

I wouldn't recommend this approach for the desktop, but the reason I like it for mobile Ubuntu initially is that you will get more mature GPU integration with SurfaceFlinger/ANativeWindow than with generic Linux.

Edit: we also know the GLESv1 limitation exists because of problems with GLESv2 support in some emulator environments, so it's a trade-off. Agreed that the multiple-display support is a big problem too.

It may. But in general, an article whose headline says "X may Y" implies far more than reality. This article is pure speculation (apart from SurfaceFlinger).

The article doesn't present itself as anything else. But appearing in the context of HN, the title is not appropriate.

This is a complicated cost-benefit analysis for Canonical. All of the device manufacturers, with the exception of Samsung, are at the mercy of Qualcomm/Intel/Nvidia for the core drivers for their SoC. If a device manufacturer wants to experiment with an alternative OS, they are unlikely to make the commitments to the SoC manufacturers needed to get debugged drivers for a non-Android stack. By running on the Android kernel and using Android graphics drivers, Canonical has lowered the barrier to experimentation for device manufacturers. The cost is that it complicates their developer story considerably. The reality is that the developer story needs to be considerably fleshed out before we can judge it. All we know is that it will be different from the current desktop Ubuntu story.

Here's the original link showing that SurfaceFlinger (and other Android services) are running:


It's a very pragmatic choice in the short term. Most if not all ARM SoC GPU stacks are very much Android-first, with proprietary implementations. There's not a lot of ARM-based manufacturer backing for generic Linux implementations (DRM/GEM/TTM, etc.).

Maybe if Google sways towards something which works better in the generic Linux world, the manufacturers might move.

This is insane. SurfaceFlinger is quite different from X. They'd need to port all their Gtk/Qt apps to it or abandon them.

(Alternatively, Wayland/X running on top of SurfaceFlinger.)

They could port Unity to SF if they wanted, but what apps are you actually going to launch then, ones for Android?

I doubt they'll port all the apps they have to SF.

> They'd need to port all their Gtk/Qt apps to it or abandon these.

Both Gtk+ and Qt already run on non-X11 platforms like Win32. Canonical would simply need to add a SurfaceFlinger backend to the frameworks, and then the apps would just work.

The number of modern Linux apps that use X11 directly is small. Anything that makes substantial use of raw X11 is not going to feel "native" anyway by modern standards, so I doubt that Canonical would feel bad about abandoning compatibility there.

"simply need to add a SurfaceFlinger backend"...

Read: "simply need to write thousands of lines of code". Writing a GTK and Qt backend is no minor task.

Doable, though, and it's a one-time thing.

Qt already works natively with Android.

and Gtk+, well, it's a C GUI toolkit modeled around an image editor.

>They could port Unity to SF if they wanted, but what apps are you actually going to launch then

What apps do people actually use on Ubuntu today? All the Ubuntu users I know just use it to run a web browser, and maybe a few other things that can be trivially moved into the browser (IRC, text editing, ssh). Native software is dead. Move to Open Web Standards and you won't have to worry about platforms changing core libraries and breaking all your apps ever again.

Firefox, Emacs, Terminal, KeePassx, Skype, Empathy, Chrome, Clementine, Tomboy, gcalc, vlc, transmission, sound converter, steam, gedit, evince (pdf viewer), virtualbox, kolourpaint, gimp, libreoffice, thunderbird, dropbox, cheese, alarm, system monitor, VPN client, screencloud.

That's just from recent apps.

Don't forget Eclipse and vim :)

I'm sure there's a lot I didn't mention. I've also reinstalled Ubuntu recently (I "upgraded" from 12.10 to 12.04 LTS :), maybe some apps are just not installed yet.

p.s.: meld, kdiff3, pgadmin3, sublimetext2

> Native software is dead.

Apart from Hacker News and Reddit, I need nothing online, and in fact I hate using the browser when a more suitable native option is available.

Maybe it is different for the million social apps on phones, but I don't need them, nor do I care about them.

I think saying native software is dead might be a little premature. I could give you a list of the native software I use in Ubuntu, but it would be quite a long post.

> Native software is dead.

That's quite a leap considering that the big movement of late has been towards native apps (in the context of tablets/phones), and considering that everyone hates it when someone releases an iOS or Android app that's simply a web app in a wrapper. Also considering that much effort is being spent on things like PNacl to get native apps to run over the web.

Web apps still suck compared to their native counterparts. And as long as that's the case, native apps aren't going anywhere.

>That's quite a leap considering that the big movement of late has been towards native apps (in the context of tablets/phones)

That was a few years ago, the industry has moved on since the iPhone. I can't think of any exciting new native apps in the last few years. Were there even any native apps at all created by the last batch of YC companies?

>Also considering that much effort is being spent on things like PNaCl to get native apps to run over the web

PNaCl is not going to take off, and rightly so: it's Google's attempt to lock the web into their own proprietary platform.

>Web apps still suck compared to their native counterparts

I disagree. I don't have to worry about saving; I can use them from any computer anywhere in the world and continue what I was doing. I don't have to worry about updates. My system won't get compromised because I forgot to install a vital security upgrade for app X, Y, or Z. My app won't suddenly get deleted from my computer because the OS vendor decided to censor it. I don't have to worry whether it will run on my system, or work out what dependencies I need to install. It will always just work.

Since when is "created by the last batch of YC companies" the benchmark for the direction the industry has been moving? How about: Apple spent the last few years making a huge amount of money pushing a new platform for native apps (iOS). Google and Microsoft followed suit with Android and Windows 8. That's been the story of the last few years. The big movement has been towards tablets, and tablets have been a vehicle for native apps.

Re: web apps sucking--many of the issues you mention are solved on iOS/Android, and can be solved by backup mechanisms like Dropbox.[1] Some of the issues you mention are imagined (OS vendor deleting apps from your computer). Finally, you fail to address the actual point, which is that web apps suck at what they do. The most popular and commonly used web app is probably Google Docs, and it's total shit. It's so shitty that I'm pretty sure AbiWord in 1998 was a better product than Google Docs circa 2013. I can't remember font rendering shittier than Google's PDF viewer in the last decade, and that's going back to GTK 1.x here. It's a cute trick that's helpful in a pinch, but it still sucks compared to even shitty native Office competitors.

[1] It's telling that arguably the most well-known YC company's product is fundamentally based on making it easy to integrate native apps with cloud storage.

Native will never die, and this is because there is a fundamental problem with webapps -- they are built on the wrong abstraction, a document delivery system. Because of this, they will always require huge amounts of hacks to overcome that abstraction's restrictions (security issues, browser overhead and bugs) and to emulate things native apps get for free from the OS, like file systems, cross-process communication, i18n, networking, device access, compositing, UI look & feel, and so on.

About breaking cross-platform apps... there is simply a better solution -- open, solid protocols. If a protocol solves some problem, it will get great apps on all platforms and will be supported for years, outliving dozens of closed solutions (think of e-mail, JPEG, SSH, PDF, FTP, CSV or BibTeX).

The web has totally changed compared to just a few years ago. The things you mention were once issues, but are solved now :)

>security issues

web apps run in a secure sandbox. Native code is insecure by comparison. Users don't trust native apps.

>browser overhead

irrelevant in the age of multi GHz CPUs in your phone

> bugs

native platforms have bugs too

>file systems

most people don't need a file system at all when they have the cloud, but it's there if you need it http://www.html5rocks.com/en/tutorials/file/filesystem/

>cross-process communication



libraries for that will come


Websockets, WebRTC etc

>device access

anything common or mainstream is already available (cameras, microphones, gps, accelerometers etc)

> compositing


>UI look&feel

The freedom to create your own UI without being constrained by platform conventions is going to open up many new and exciting UI paradigms now that we finally get a true open and free market in interface ideas.

>About breaking cross-platform apps... there is simply a better solution -- open, solid protocols

protocols and file standards won't save your app from breaking when the OS vendor kills the framework you are using

"web apps run in a secure sandbox. Native code is insecure by comparison."

Of course it is safe. You have at least two options: software sandboxing (NaCl, Vx32, OS-based sandboxing...) or proof-carrying typed assembly. Of course, the first option is complicated by the horrible state of the most prevalent hardware architecture available today, but that's something to be fixed in the future anyway.

"irrelevant in the age of multi GHz CPUs in your phone"

Oh, thank you very much. You're perpetuating the trend of "What Andy giveth, Bill taketh away" with this kind of thinking.

"native platforms have bugs too"

And studies confirm that the number of bugs is largely proportional to the number of lines of code. The web-based approach, however, adds a few million lines of code to something that would have, say, tens of thousands at most on its own.

Browser sandboxes are regularly escaped, both across origins and into the browser's surroundings. Native apps are usually verified cryptographically; mobile webapps almost never are. Webapps shamelessly reveal a lot of information about your usage patterns and put even your cached data at risk of being revealed to third parties.

Relevance of overhead depends on what you do. Not all people restrict themselves to decoding cat videos.

The browser and the web-app server are additional SPOFs, and the browser is an unauditably complex piece of software.

The cloud has storage limits, low speed, high latency, and zero privacy. Moreover, FSes are all about interoperability; this WebKit-only API has nothing to do with that, as it only gives access to a restricted lease of space.

Websockets only allow your app to talk with its mothership and its friends, not other apps working locally.


They won't allow me to connect two computers behind a NAT with SSH.

Nope, only a certain fraction of their output. There is no option to configure them, and sharing policies do not exist.

I was thinking about a way to squeeze a few things onto one screen; most webapps are designed to work full-screen, at most giving you a chance to open a tab. BTW, WebGL is a giant security hole, because for years of GPU development no one imagined that low-level access would be given to untrusted code.

Great, but 95 in 100 of such innovations are total failures, 99 in 100 break keyboard navigation and accessibility, finally 999 in 1000 have zero personalisation options.

That's not the point; they give me a chance to switch to other, working app and continue using the service/data.

With the exception of BBEdit and Creative Suite 6, I have just about every staple of my Mac OS X environment in my Ubuntu workspace.

Sure, I don't have cool things like Evernote or Writeroom or Text Wrangler. But when I need to hunker down and get the job done, I do have my Eclipse, Emacs, vim, Firefox, Chrome, Thunderbird, LibreOffice, Gimp, Gedit, VLC, etc.

For every app in my Mac OS X dock, there's at least one native substitute on my Ubuntu boot. And it's just more fun to work on sometimes. There are errors you sometimes encounter which are simply easier to debug on a linux terminal than a Mac terminal in my experience.

I use Sublime Text, Intellij, Eclipse, along with LibreOffice, Firefox, terminal, and a handful of other things. I'd be doing the same if I was using a Mac or Windows machine.

> Move to Open Web Standards and you won't have to worry about platforms changing core libraries and breaking all your apps ever again.

Instead the user has to worry about whether or not the many libraries bundled by the webapp are all kept up to date by each and every app developer. Gone will be the days of shared libraries and centrally managed security updates for them. And don't suggest that webapps are secure because they run in a sandbox. Just wait: as more and more people move their work to webapps, more and more flexibility will be demanded. Soon you'll have to pierce sandboxes because AppA needs to throw data to AppB with a little Python script you wrote in between. Then the security issues will come, as will the library incompatibilities, and we'll be back where we are now, except the heterogeneous world of today will be replaced by a JS monstrosity with browsers playing OSs.

Personally: web browsers, Emacs, Terminal, and LibreOffice occasionally. That's about it.

And what do you use in the terminal?

Everything that's not a web-browser ;)

Hence "native software" isn't dead.

Well, presumably they just need to provide SurfaceFlinger backends to Qt and Gtk. AIUI, SurfaceFlinger basically gives the app a buffer and an OpenGL library, and the app handles the drawing itself; this is pretty much how Wayland works, and how GTK and Qt on modern X servers work, too. So it's not clear to me that porting to SurfaceFlinger would be such a huge amount of work (no more work, anyway, than porting to Wayland).

A lot of the porting effort would be handled by simply developing a SurfaceFlinger backend for Gtk, in the same way there was (still is?) a DirectFB backend. I'm guessing Qt might be even better suited to such a situation.

Obviously, this doesn't cover particular applications that directly access X11 libraries, but porting the toolkit and recompiling the more 'simple' applications would probably be a decent start.

This article, while entertaining, is not true.

Canonical will not migrate to SurfaceFlinger.

Speaking on behalf of myself here, and making an individual decision to prevent the spreading of FUD.

EDIT: And here is the public announcement of what we are working on that I've been waiting for:




Ah, well this at least confirms my guess. After I read about Mir this morning, I figured this was wrong.

If this happened, surely the benefits would cut both ways: Ubuntu/Linux would unlock all of Android's application-sharing potential, while Google would suddenly gain the ability to run a desktop OS on the Android stack.

This is obviously purely theoretical, but would bring the ubiquity of Android to a whole host of new machines if used in a desktop context. That would make the whole experience quite interesting and exciting.

That would actually not be a bad thing. After using Android for a while, I really like how applications cooperate so seamlessly using intents. I find myself frequently wanting intents on my Linux desktop, both as a user and as a developer. It feels almost like shell pipelines for the whole OS.
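To make the pipeline analogy concrete, here is a toy model of implicit-intent dispatch in Python; every name in it is made up for illustration, and it is nothing like Android's real Intent API -- just the shape of the idea:

```python
# Apps register handlers for (action, mime-type) pairs; a sender names
# only the action and data type, never the receiving app -- much like
# writing to a pipe without knowing who reads it.
handlers = {}

def register(action, mime, handler):
    """An 'app' declares it can handle a kind of intent."""
    handlers.setdefault((action, mime), []).append(handler)

def send_intent(action, mime, data):
    """Deliver data to every registered handler for this intent."""
    return [handle(data) for handle in handlers.get((action, mime), [])]

# Two hypothetical apps register for the same SHARE intent:
register("SHARE", "text/plain", lambda text: "mail: " + text)
register("SHARE", "text/plain", lambda text: "chat: " + text)

results = send_intent("SHARE", "text/plain", "hello")  # both apps respond
```

The point of the analogy: like a pipe, the sender only describes what it wants done, and the system routes the data to whatever is installed to do it.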

The RANDOM bolded words ARE extremely ANNOYING.

They're subheadings, not 'random words'.

Not according to the HTML, they're not.

I only checked the comments for this article to see if I wasn't the only one... Indeed, very annoying!

It is hard to say whether Google will abandon Chrome OS for Android. The completely opposite phenomenon might be happening, with Chrome OS coming to phones.

I was expecting a few more positive comments. Forgetting all the caveats and difficulties - if this did happen wouldn't this be a huge win for the Linux desktop in general?

Both in terms of improving hardware support and adding an existing large app ecosystem if Dalvik gets ported too.

Here are a few questions for those in the know. What makes it so difficult to add, even as a non-default app, the GNU coreutils and bash to (non-rooted) Android? Even nice terminal emulators such as BTEP use BusyBox, which is a fine project for very limited devices.

But, as Canonical wisely reminds us time and again, we have some pretty amazing computing devices in our pockets by 2013.

Now, a second question is whether Ubuntu Phone's default terminal will also use BusyBox. If so, why? A user not interested in it won't launch it, and hence it won't take up RAM, right? OK, it will mean a few extra MB of flash memory...

Please, somebody tell me that Ubuntu Phone will make the solution as simple as apt-get install bash.

I think he is missing the point that Ubuntu Phone apps are QML-based Qt apps that work on SurfaceFlinger, but that's about it. It's not like Evolution will run on Ubuntu Phone unless it's ported to QML.

I bought into this at first; it seems plausible and a reasonable approach.

Looking at this today, what with the Mir planning (detailed here: https://wiki.ubuntu.com/MirSpec and here: https://lists.ubuntu.com/archives/ubuntu-devel/2013-March/03...), didn't Canonical just rebuff this analysis? I see that today they are using SurfaceFlinger, but it isn't listed after May 2013.

I, personally, am very excited about the merging of phone, tablet, and desktop OS and the possibilities it creates.

I am also looking forward to seeing how it would affect my life as a web application developer. Mainly I keep wondering whether the phone hardware (mostly ARM processors with limited RAM) will be able to run the servers and services we web developers usually have running on our desktops.

    The *Phone* and *Tablet* will be *Ubuntu*.
    The *Desktop* will be *Android*.
    That's a very risky prediction.

Not the other way around? o.O

And the proof that this is happening?

A critical factor allowing Wayland adoption on the desktop is that you can just run an X server as a Wayland client to use applications that are X only. For SF to work on the desktop they would need the same capability.

Symmetry, I am not sure if this is true. In the 1980s I relied on being able to use remote X servers for a lot of my work. But, it has been many years since I have needed to do that. In modern times, SSH, tmux, emacs, Clojure/Ruby repls, etc. just do everything that I need. I like Android a lot, and it might be very good as a general OS for the public and perhaps even for developers.

I was trying to refer to X servers in general, not just remote X servers. I can't remember the last time I used X over a network, but many of the programs I use every day don't use GTK or Qt, and so wouldn't work if I switched over to some windowing system that couldn't run X as a client.

> OpenGL ES driver comes with the Android kernel, released by the hw manufacturer, so SurfaceFlinger works right-away.

This doesn't make sense to me, a PC or laptop doesn't have a mobile phone GPU. Or am I missing something here?

No, you're not. A lot of this stuff doesn't make much sense. In the same vein:

> acceleration sensor, camera and GPS services would also become easily accessible for traditional Linux applications

What good does it do if the hardware isn't available to a PC ? An accelerometer isn't very useful for a tower PC, is it ?

Too many assumptions in this speculative article. Not that any of it is impossible to achieve, but some of it would require a lot more work than it seems, and some wouldn't make much sense for the PC.

Integration between the different flavors of Ubuntu doesn't have to be so low-level.

"An accelerometer isn't very useful for a tower PC, is it ?"

It is. You could use the tower on a wheeled table like the Valve guys are doing, or inside a machine like a car, truck, or robot (BTW, I do so every single day), where accelerometers become useful for protecting hard drives, roughly sensing the movement of the car, or detecting someone trying to steal the thing from the laboratory and sounding an alarm.

We use lots of accelerometers in the lab today, connected to computers. I'd love to use a common API integrated into the OS for accessing them.

I would be able to rotate the screens of my arm-mounted iMac without having to press complex key bindings like I do today.

Ok, true, you've got a point, having an accelerometer isn't completely useless. Granted.

But I was talking about standard PCs, not appliances, and in that sense the reusability of drivers _today_ isn't very high. Makers don't put accelerometers on mobos. So to me the argument is kind of moot, IMHO.

> An accelerometer isn't very useful for a tower PC, is it ?

It could help to detect if the kids thought it was a toy horse

Well, for starters, who says the sensors have to be in the device itself? Accelerometers would be handy in gaming devices, for instance.

I'm not positing this as a huge win for Linux, but it's not useless either. Built-in GPS would be fairly cool for a laptop, even if it's just for wardriving or something.

The last time Ubuntu replaced something, we ended up with Upstart...

Which is good.

If anything, Ubuntu should be able to emulate Android apps. However, "switching" to Android would be a mistake.

If it meant I could add items to that panel at the top like I could in Gnome 2 then I'm for it.

I doubt that will happen...

Moved to Linux Mint and away from Ubuntu, been pretty happy with Linux ever since.

I'm sorry, how is this relevant to the subject at hand?

I suspect it's referring to how many former Ubuntu users were driven away by Ubuntu's past UI changes. The switch to Unity was particularly disruptive, and alienated many staunch Ubuntu users very rapidly, for example. What's being discussed here sounds very much like the same kind of disruptive change, with the likely outcome of alienating yet more users.

Many of these users who have been forced out of the Ubuntu camp have moved to Linux Mint, because it tends to offer what were the benefits of Ubuntu, but without the painful points. The fact that they're better off now than if they had still been using Ubuntu shows that these types of changes aren't really helping the Ubuntu experience.

I have Linux Mint installed on this computer, in a virtual machine, and I never boot it, because it feels like Mandrake did a decade ago. It feels old. In contrast, Unity looks even better than Windows 7, and it runs faster.

As far as I know, lots of new users came to Linux precisely because of the switch to Unity. Another set of users went back to using Linux on their primary machine because of Unity, like me.

So some longtime users leave Ubuntu while a new set of users joins. The end result is probably a net gain, even with all the complaints.

I would love to see real numbers (as opposed to the old geezers' complaints) to clarify this issue.

While I've been a Linux user for several years on and off, I don't really keep up with the news and happenings in that world.

From my perspective, the move to Unity made zero sense at the time. It was a clear case of NIH (Not Invented Here). "It works better on small screens" was translated in my head to "Netbooks are the future!", so I went out and bought an Asus Eee PC. It wasn't bad, but it got old after a while and I sold it.

Fast forward 2.5 years, and we're seeing the convergence of mobile devices. I personally carry a 7" Samsung tablet as a phone. Others around me carry an iPad or iPad Mini in addition to an iPhone or Android device. Now "It works better on a small screen" translates to "We're building a single experience across phones, tablets, laptops and desktops."

Apple has been trending that way for about a year with the introduction of the App Store for Mac, and Microsoft is jumping on the bandwagon too with Metro and its touch-based UI features.

Suddenly, the much-maligned move to Unity as a desktop shell makes Shuttleworth look like a freakin' genius, as Ubuntu sits here with a unified experience for both users and developers, while Microsoft tries to push its heavyweight OS onto tablets and Apple tries to close the gap between iOS and OS X.

Add in that Ubuntu's primary development language seems to be Python, while your other options are Java (Android), Objective-C (iOS) or .NET (Win8)... I'm seriously considering dropping my side projects and going headlong into app development for Ubuntu Phone. It's a gamble, but if anyone can threaten the status quo right now, I think it's going to be Ubuntu.

App development for Ubuntu Touch is currently based on QML (JavaScript and maybe C++). I'd love to see them properly support Python in the future, but I've no idea if they will.

I agree with most of what you say. But it seems like there's still a big technical gulf between Ubuntu desktop (X-based, with apps in Python+GTK) and Ubuntu Touch (SurfaceFlinger, apps in QML). I'm sure there's a plan to bring them closer together, but so far I don't know what that will be.

EDIT: And just minutes after I post, at least part of the answer appears: they're making a new display server, Mir, for both desktop and touch devices.

My understanding is that QML describes the interface, but the actual application logic is written in something else.

From Ubuntu's developer portal:

> While Ubuntu and the applications that come with Ubuntu are written in many different languages, from C to Java to Haskell, when writing something new we recommend using Python. Many important parts of Ubuntu are already written directly in Python, and we work to make every important API and framework within Ubuntu available from Python. Python includes a rich standard library and a vast set of third party modules, so there are libraries available for just about everything you can think of.


I've not gotten far into it though, and likely won't for a month or two. My experience lies almost completely on the web; I've never written a native mobile app or released desktop software.

I can't help but feel that those people were using Ubuntu expecting it to be something that it was never intended to be.

To be surprised at evolutionary changes which shed the aesthetic of competing distributions and introduce simpler, more consumer-oriented user experiences is to fundamentally misunderstand what Ubuntu Desktop has always been aiming for - Ubuntu hasn't stopped delivering on promises in recent days, it's started delivering on them.

It always seemed obvious that Ubuntu's promise was aesthetics and simple consumer-oriented experience, but for some reason Canonical marketed it as "Linux for human beings" instead of "Linux for simple consumer beings".

Some people fell victim to this and now they are running away.

good for you!
