I like a lot of their design decisions, like the global menu, dechroming maximized windows, easy custom lenses, etc. But the compiz/custom code base just isn't up to the task. Worse, Unity is so tied to their non-mainline GUI stack that it is a herculean effort to get it to run on anything besides Ubuntu, preventing them from getting fixes from outside their ecosystem.
I know there is a lot of bad blood these days between GNOME and Canonical, but they really need to bite the bullet and rebase on Mutter, GTK3, and GNOME Shell. Canonical is great at a lot of things, but core performance-oriented software just isn't one of them. Let somebody else do the heavy lifting.
I haven't gotten a chance to try the GNOME Shell spin that they're producing this cycle. Can anyone comment on how well it's working? My experience trying to go vanilla GNOME 3 on Ubuntu in the past has been less than stellar, but I really appreciate the Ubuntu community and things like Launchpad PPAs.
You can't configure it without an absurd number of extensions, the default interface makes me want to punch kittens, and it wiped my group policies and network profiles, so I am still trying to repair my startup procedure; right now I need to manually start network-manager each time.
I won't blame that necessarily on Gnome Shell though. I will say I don't care how speedy it is, I can't use it for anything in its default state, and after a couple hours trying to get enough extensions to make it manageable, I just gave up. If I want a Gnome 3 desktop I'll use Cinnamon.
The only plus for Gnome Shell is that the system + search syntax that I love from many modern desktops works, but it also works in Unity (albeit slowly) and Cinnamon (albeit not as well, Cinnamon doesn't intelligently reorder results, it just lists them alphabetically).
I dunno, I just feel like the direction of "fork everything even when alternatives with our same mindset exist" seems like a waste of effort.
It seems like Mint is trying to execute a hostile takeover of GNOME. Of course, GNOME is free software and forkable, and branches aren't necessarily subservient to the lines they branch from, so "takeover" means "establish a reputation as the highest-quality fork".
If the majority of the community begins to respect and support Mint as a better GNOME than the current stewards of GNOME, and Mint "rescues" the good majority of the GNOME software, we can jettison the old GNOME management, welcome the migration of contributors from old GNOME, and carry on.
The switchover is a huge mess of confusion, though.
I still run Arch as my side distro for having a machine I'm prouder of. Upstart can suck it.
But I still think the "Lenses" thing is just awful. It's fine for starting a specific application by typing in the name (the most basic "Quicksilver" functionality). It used to be really slow, but I think that's better now. It still sucks for application discovery, which the old hierarchical menu excels at. And the other lenses are mostly just silly: you've got this simplistic interface, and people are trying to shoehorn all kinds of use cases into it. Look at  and . Some of those would be much more useful if they weren't limited by the restrictions of the Lens framework (others are just hopeless).
Fortunately the lenses are easy enough to ignore.
For example, the "expose" feature shows a bunch of windows and lets you browse between them with the arrows. However, they lowered the contrast between them so much that I really have no idea which window I'm currently selecting.
Similarly, with multiple windows it's barely visible which one is in focus, due to this weird fetish for near-zero contrast.
Also, I still miss my task list; I want to see what I've been using, to remind me where I stand on each desktop. The little arrows are much less helpful.
When I press the winkey, Unity waits something like a whole second before it shows me the unity-bar's numbering, and without those numbers, the keyboard is useless. I use the keyboard to work faster, not slower, but Unity is slowing me down here as well.
These are just the criticisms off the top of my head, but overall it's been a pretty annoying week or so with Unity.
I tried GNOME Shell twice, once with Fedora 15 and once with 3.5 on Arch Linux: I couldn't use it for more than a couple of hours, it was painful.
I just finished installing Ubuntu 12.10, and I must say that I like Unity, I like the global menu: I like the direction Canonical is taking.
For me, the problems with Ubuntu are the repos/APT, which I dislike and find somewhat broken, and mainly the community, which is not as helpful as the Gentoo community, for example.
Of course, this comment might be a trap. There's only one way to find out...
In fairness, after manually resolving the problem with removing gitosis and cleaning up a couple of other half-upgraded things, the upgrade process did recover and finish successfully, but it certainly wasn't painless.
Ubuntu is the best <3
However, it seems it only deleted its EFI GRUB installation, so it should be quite fixable.
But yes, for a smoother experience, it seems that a normal reinstall proves a lot easier.
EDIT (adding context): the article says, about the use case for Juju, "configuring and setting up complex services with lots of application components can still be a bit tricky"... then goes on to give a basic LAMP WordPress setup... the thing is that a LAMP WP setup is neither complex nor does it have many components; it's basically the "drop dead simplest" case of web app deployment, so they are clearly missing the point using this as an example :|
Here's why Ubuntu is on shaky ground these days.
The above idea is grand, but like everything that tries to be the future of something (in this case, computing), it leaks. And leaky things never take off for real, much like leaky abstractions don't hold: the truth leaks out sooner or later, and then the abstraction isn't worth much, since you can only trust it superficially.
The early Ubuntu, Windows 95, and XP had something in common: they were all built around how the computer actually worked. These operating systems tried to make the underlying computer available to the user, give or take a few sugar-coatings. And they all pretty much succeeded.
Conversely, I think most systems that pretend to be something they aren't will not succeed. Web applications won't become local applications just like that: the user will just see some visible glue that holds some parts together. You've seen it so many times: something comes with great features that only work until you really need something done, and then it turns out the system doesn't do its magic all the way through. You just see the one kind of magic that has been preprogrammed into it, and you've already observed that beyond an initial impression, that one kind of magic can't deal with everything you need from the system. Then you can't trust the system anymore, since you know there's more available than the system can agree to offer you.
Ubuntu is still basically a local installation: some stuff can originate from the cloud, but it cannot be a grand computing environment that unifies web and local services, because it matters that the user has his own local installation. You can't boot Ubuntu from a USB stick and have your environment seamlessly load from the cloud.
Systems like Android or iOS are much better positioned for seamlessly integrating local and web applications, and local and cloud services. Using a tablet interface you don't really have a sense of local vs. web at all: you just have apps, and once you sign in on another device your apps become available automatically. This is maybe what Shuttleworth is envisioning with Unity and his current plans for Ubuntu, but the downside is that the regular Ubuntu desktop will suffer.
Ubuntu suffers because it doesn't pay respect to its natural, physical environment that is a local computer. It can be a highly tuned system that takes the most out of your hardware or it can be an ethereal, ubiquitous cloud service that's available regardless of hardware. But not both.
"Ubuntu suffers because it doesn't pay respect to its natural, physical environment that is a local computer. It can be a highly tuned system that takes the most out of your hardware or it can be an ethereal, ubiquitous cloud service that's available regardless of hardware. But not both."
Emphasis added there at the end.
My claim is that "we", folks who engineer software and create products, have made those software products indispensable to folks who never used to care about computers, much less shell out money to own one. They never did want to buy a computer, and they don't want to buy one now; what they want is the function provided by some app or collection of apps. These people want to follow tweets, or Facebook, or chat, or see cat pictures; they buy computers to do that because they have to, not because they want to. What is worse, the 'computerness' of computers, their re-programmability, their flexibility, causes more problems for these people than it solves. They want turn-key, instant-on, instant-off tools.
And folks are making these tools for them: Chromebooks, and iPhones, and iPads, and Slates and Surfaces. They are marketed as tools that get a particular job done, not as a universal tool. Can you imagine a power drill where you take the motor off and use it in your mixer, then take it off and use it in your desk fan, and then take it off and use it to pump water and wash your deck? No, you get separate tools for those tasks, and they all have a motor in them, but the motor isn't universal; it's optimized for the tool. We are moving that way with processors. No more 'boot whatever you want', no more manuals describing the instruction set or peripherals, no more general-purpose tool chains or operating systems. Processors designed to do one thing well, like be a 'phone', with proprietary value-added parts (like a GPU) and special instructions (like Jazelle) which you only get to know about if you agree to buy a million a month and design them into your specialized product.
The needs of an operating system for one of those devices are very different from the needs of a programmer's or developer's operating system. Sure, they share some things in common (both render to a screen), but how or when they do that, and what API they use, are important to a developer but not to a tool/appliance user.
Yason is exactly correct that Ubuntu is standing astride this crevice while it widens underneath them. Soon, unless steps are taken, it may find itself neither fish nor fowl: an unacceptable environment for developers (too closed off) and for appliance users (too technical). Personally, I'd love to see two desktop distros, one "End user" and one "Developer", with very different design targets.
Worth mentioning: I've used many Linux boxes before, but after using Ubuntu 12.04 for the past 5 months, it's the most productive system I've ever used. And it's only getting better over time.
Many thanks to the team that's behind this change.
For example, the configuration changes: instead of gconf, you now have mateconf, which uses a whole other set of files that are not compatible with GNOME. So switching from an old GNOME desktop (say) to Mint+MATE means your settings are all lost, even though the forked code between gnome2 and mate is really close.
On a default LMDE install it adds gnome-terminal and mate-terminal, so on the menus you have 2 "Terminal" entries. Ditto for a load of other core applications that are in Gnome and Mate - stuff like Archive Manager, screenshot, and document viewer. Was there really a need to fork file-roller?
Debian has already pretty much said "no" to including MATE, the reason being that the duplication and resuscitation of crufty old code is not a good idea. It would be better, they say, to work with GNOME upstream to get a more Gnome2-ish feel in Gnome3. I think Cinnamon is in Debian, or will be shortly, and it probably has a brighter future.
Anyway, for now I'm happy with MATE: it works fine and didn't need any tweaking to get to a usable state. Compare that to the Gnome3 classic session on Ubuntu 12.04, which I spent weeks tweaking and patching (!) to get back to a state that worked like 10.04. But I think its days are numbered: the Mint devs alone can't keep a full-on Gnome2 fork alive forever, it has too many duplicated applications that are already fine in base GNOME, and eventually I think Cinnamon will be the only viable option for Mint.
The immediate rebuttals for my comment will be: (1) user studies, (2) the OS is fine, (3) we're making money, (4) customers told us.
The not so obvious rebuttals will be:
(1) someone told us to do these, (2) we've reached user and customer saturation, (3) desktops aren't the rage, (4) core customers/users moved to using the server only and we're reeling them back in to the desktop, (5) corporate entities cannot perceive enough value, thus all the new mashups.
It is this second set which I want to hear discussed and why they are or aren't the case.
Basically my guess is that core users are moving away from where these guys are spending money and they think spending more in the area will bring them back.
Take with a grain of salt, but the marketing theory is that people, when obtaining something (convinced or otherwise), will benefit if the product in context serves all their needs, and more if possible. That grey area, the boundary between core and nice-to-have, is what is in question here. The more they use it, and the more it serves them, the more likely they are to return to it (the product context).
Ars has its hands on 12.10 and we'll be bringing you a full review within the next couple of weeks.
Today, Canonical has released version 12.10 of its Ubuntu Linux distribution, code-named "Quantal Quetzal" after a ridiculously…
Looks like Ars jumped the gun.
This is using both the main and my local archive...
If not, I will not be downloading this.
Even Windows 8 religiously asks you these questions at first boot after installation.
In the case of Firefox, a default search page for Google is included in the Firefox chrome, as is a default search engine. The default page doesn't call home at all until you enter a search, i.e. until you use it. The rest of the browser is functional without using this feature, so you can type "duckduckgo.com" without transmitting data to Google.
The same is not true for the Unity shopping lens which transmits your searches for local data and applications on your workstation to Canonical and Amazon by default.
Unity: Nothing transmitted to Amazon until you perform a search
Sure, I'm not agreeing with the default out-of-the-box functionality, but other than the fact that the user expectation is that a local search does not transmit information to remote servers (the same issue as Chrome's omnibox for typed URLs), it is exactly the same. But maybe Canonical should make explicit what the default does (maybe they do, I don't know).
Install the OS, click on the gear icon (top right), Settings, Privacy, turn off online results.
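For those who prefer the command line, the same toggle can (I believe) be flipped with gsettings; the schema and key below are what I recall 12.10's shopping lens using, so treat them as an assumption and verify on your own system:

```shell
# Disable online ("shopping") results in the Dash.
# Schema/key assumed from Ubuntu 12.10's unity-lens-shopping; check with:
#   gsettings list-recursively | grep -i lens
gsettings set com.canonical.Unity.Lenses remote-content-search none
```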
Point is your specific criticism can be levelled at firefox as well
Confusing a private local search of your own files with an online search that sends information to a third party is a fairly substantial privacy issue, and the fact that Canonical is just blowing it off (by saying "we already have root") is incredibly concerning about their judgement.
If I enter analyze.com in the address bar and stop at anal, nothing happens. Nothing is sent.
If I enter analyse in the Unity launcher and stop at anal, I get a nasty surprise and permanently get my IP address tagged with anal.
And secondly, your IP address is not logged. The data sent to Amazon goes via Canonical and is anonymised. If you don't trust Canonical, why are you concerned about using a distribution from them?
Edit: I'm not saying there is no problem and I'm fine with all of this. Just that I wish better arguments were put forward (I'd personally like the functionality made explicit during install/upgrade)
We did that because we believed it was a better alternative. So some of us get passionate when advertisements creep in the default installation...
However, I can simply not use it, which is how I've decided to deal with this issue.
Won't this have an impact on existing Python installers designed to be used with Python 2.x?
If you look at the Mac, it comes with older versions of Python too, so it supports older Python projects (python2.7- and python2.5-based) without breaking the system.
The latest Ubuntu may well ship with older versions of Python too, as many Ubuntu apps and tools themselves use them. But the default python will be 3.2.3.
To use the right version of Python, we have to:
Manually install older versions if the system doesn't have one.
And symlink the installed versions into /usr/bin and set $PATH appropriately for the Python version we need. Then call them explicitly as python2.7, python2.6, or python2.5 when running your scripts.
Also see virtualenv.
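A rough sketch of the steps above; the install location for the extra interpreter is hypothetical, so adjust paths for your system:

```shell
# Assume python3 is the default "python" and a separately built 2.7
# was installed to /usr/local/bin (hypothetical path).
sudo ln -s /usr/local/bin/python2.7 /usr/bin/python2.7

# Call the interpreter explicitly instead of relying on bare "python":
python2.7 legacy_script.py

# Or isolate the project with virtualenv, pinning the interpreter:
virtualenv -p /usr/bin/python2.7 ~/envs/legacy
. ~/envs/legacy/bin/activate   # "python" now means 2.7 inside the venv
```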
I think Ars misunderstood, as well: 12.10 has both Python 2 and 3 by default. They're working towards dropping Python 2 from the default install, but it's not there yet. When they do, it will just be an "apt-get install python" away.
"Python 2 will continue to be available (as the python package) for the foreseeable future. However, to best support future versions of Ubuntu you should consider porting your code to Python 3."
"I felt a great disturbance in the Force, as if millions of voices suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened."
That said, the installation was bliss. The fastest OS installation I've ever done. Even the boot time is pretty fast, around 6 seconds on my workstation.
This is for my work machine, on which I run PhpStorm for web development, VirtualBox for CS5, Chrome, and various other apps.
I'll give this one a try again (sigh).
Without hardware OpenGL it is just crap.
unity-2d-shell eats tons of memory, more than Chrome. No wonder, when you look at the output of:
ldd `which unity-2d-shell` | wc -l
(But without running X11 it is nice and fast: gcc-4.7, linux-3.5.0, python3.2, etc.)
My choice is Xubuntu. I've been using Xubuntu on my laptop for the past 6 months, and the 12.10 beta for the past two.
The link below is my tasklist after install to really make Xubuntu shine. Oh! And now with updates to the Ubiquity installer, FDE can be done with the GUI instead of fiddling with the debian-installer text partitioner!
I guess I should give up trying to like Ubuntu and start using XFCE instead. Thank you for your list; I will use it to check out this desktop environment.
It took about 2 days to get used to it, but now I find myself typing super+(search term) for everything instead of trying to find stuff in an applications list, and that has spilled over into my Win7 use as well.
I still miss a few functionalities, mainly right-click->open terminal here, but I find myself reaching for my mouse MUCH less and I'm not wasting time trying to find something buried in a list or folder somewhere. Search is FAST (on both platforms).
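If memory serves, the "open terminal here" context-menu entry comes from a separate Nautilus extension rather than the core file manager; assuming the package name is still nautilus-open-terminal (worth verifying with apt-cache search), something like this restores it:

```shell
# nautilus-open-terminal adds a right-click "Open Terminal" entry in
# the file manager (package name assumed; verify before installing).
sudo apt-get install nautilus-open-terminal
# Quit Nautilus so it reloads with the extension on next launch:
nautilus -q
```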
I TRIED to like Ubuntu's Unity, but I can't either: (a) right-click things like panels and choose 'configure' or 'properties' to adjust everything to my liking [the Windows way, which I still prefer], or (b) use a goddamn config/settings center where I can actually adjust the UI settings I care about [the KDE and Xfce way, which works...]. I mean, I still don't know how to disable window grouping for the damn "taskbar".
The second release I tried again. It almost passed muster, but it was still just too buggy.
The third release I tried again.
I never went back to the GNOME desktop.
I love Unity. It's not perfect. In things related to keyboard shortcuts in particular it's still got a ways to go, but it is good. So much better than the GNOME 2 desktop. (I've never tried GNOME Shell, so I can't comment on it.)
It's not the perfect desktop, but I've got it almost how I want it, and it largely gets out of my way. I particularly enjoy the two-second login time.