Ask HN: What do you want to see in Ubuntu 17.10?
1374 points by dustinkirkland on Mar 31, 2017 | 1146 comments
Howdy HackerNews!

Dustin Kirkland here, Product Manager for Ubuntu as an OS platform (long time listener, first time caller).

I'm interested in HackerNews feedback and feature requests for the Ubuntu 17.10 development cycle, which opens up at the end of April, and culminates in the 17.10 release in October 2017. This is the first time we've ever posed this question to the garrulous HN crowd, so I'm excited about it, and I'm sure this will be interesting!

Please include in your replies the following bullets:

- FLAVOR: [Ubuntu Desktop, Ubuntu Server, Ubuntu Core]

- HEADLINE: 1-line description of the request

- DESCRIPTION: A lengthier description of the feature. Bonus points for constructive criticism ;-)

- ROLE/AFFILIATION: (Optional, your job role and affiliation)

We're super interested in your feedback! Everything is fair game -- Kernel, Security, Desktop apps, Unity/Mir/Wayland/Gnome, Snap packages, Kubernetes, Docker, OpenStack, Juju, MAAS, Landscape, default installed packages (add or remove), cloud images, and many more I'm sure I've forgotten...

17.10 will be our 3rd and final "developer" release, before we open the 18.04 LTS (long term support, enterprise release) after October 2017 (and release in April 2018), so this is our last chance to pull in any big, substantive changes.

Thanks, HN!



- FLAVOR: Ubuntu Desktop

1. HEADLINE: A way to have different scaling for external monitors hooked up to my HiDPI laptop.

Currently I can only set a single scaling factor, so I need to adjust my laptop screen resolution to match the scaling of the external monitor. If that's not possible, a way to automatically set resolution and scale for both screens once you hook one up would already save me a lot of manual switching and restarting LightDM!

2. HEADLINE: "Native" multitouch gestures like 3-finger swipe to change workspace.

There are some programs that can do this already, like xSwipe and Fusuma, but I'd expect this to be integrated, with a nice and easy settings menu.

3. HEADLINE: Better battery management.

Battery life under Ubuntu is often much worse than under Windows. TLP helps, but it's not enough.

User: I want hi-res apps!

Dev: Sure, here you go.

User: But why is it so small on my new shiny tablet high density screen?

Dev: (Shit, it worked okay for me.) Okay, now it detects the density and scales...

User: But when I move the window to my old good lcd screen it becomes way too big!

Dev: Okay let's see if I can dynamically adapt to a new monitor density, it's just one scale factor.

User: But when I put it on my big tv flat screen it is too small!!

Dev: (Oh shit, you gotta be kidding me, the pixels are actually a viewing-distance-relative unit??!)

macOS, in my experience, actually seems to handle all of this perfectly. On normal-DPI screens you choose the resolution, and dragging windows between monitors works as you'd naturally expect (it pops between DPIs).

Concerning DPI and trackpad integration, Ubuntu should strive to be as Mac-like as possible, at least imo. Macs absolutely win in the trackpad arena; multitouch works like a dream, configurable gestures aplenty to achieve whatever you want (of course, it could always be more customizable).

DPI scaling between monitors works exactly as you'd expect. Windows stay the same size when moving between high-DPI and regular monitors.

These two problems are two of the biggest reasons I don't use Ubuntu (or any Linux) desktop (I use a macbook with a headless Ubuntu Server box and, when necessary, X11 forwarding over ssh).

Does it understand "I want 2x magnification on my 15" 4k laptop display, but not on my 43" 4k monitor"? Windows 10 decidedly does not, so I have to switch manually every time I switch display (just forget using both together), and it then tells me to close all my work and log out and in again to make scaling consistent between UI elements on screen.

Yes it does. It even remembers different window sizes and locations for different monitor configs :)

Huh? On Win10Ent I've got both of my displays set to different scaling factors and changes take effect immediately (like, as soon as I release the slider).

It works for me, with one annoying caveat. I have a set of regular 1920x1080 monitors on my desk at 100% scaling. My laptop has a 4k screen, and when I plug it in, I have it set to turn off that screen.

I have to log out when I plug/unplug or the windows will end up blurry or the wrong size.

Have you ever tried shouting at Microsoft about this?

I know they have a bad track record of not listening, but I think things might be different now, they seem to be a bit more receptive to feedback, particularly with the beta updates.

Or maybe I'm remembering the prerelease "hai where r the bugz halp" back when Win10 was not yet RTM...

This is exactly what I mean.

Yes it does, sometimes. But you can configure it as well. I think they used predefined lists of hardware, though. E.g. if the name contains "TV" then the scale is 1.

Pretty much. Works just fine when I hook my 15" laptop up to my 130" projector.

It's a bit weird to drag things onto my 4k TV through HDMI and try to track my microscopic mouse pointer to the tiny window to maximize my video, but otherwise works well. I suspect I could fix that in settings somehow though.

On newer macOS, shaking the mouse back and forth makes the pointer get larger (so you can find it).


KDE has a similar feature: When you hold Ctrl+Win, it will draw revolving circle segments in black and white around the cursor to allow you to find it. It looks like this: https://imgur.com/a/67wfI (You'll have to imagine the cursor inside these circles. My screenshot utility won't include it in the image for some reason.)

Heh... I think Windows 3.11 already had that feature (Ctrl -> circle zooming in on the mouse) :)

have you considered swapping out to a BIIIG mouse pointer? that's what I do on my home 65" HTPC/ TVPC :)


I've always thought that arcdegrees should be what we measure UIs in: how much of a user's field of view does this thing consume? After all, what makes text "small" is how much of my FoV it consumes (or doesn't). Not inches, or points, or pixels.

(Admittedly, "points" are still likely a good measurement for print. Perhaps one can work backwards and fudge point as a measure of angle if you consider 12 point font at a typical viewing distance.)

I assume the really hard piece is figuring out the distance the display is going to be viewed at. Some sensible defaults exist (phones are typically about the same distance away, same with desktop monitors), but unique situations certainly can exist. (I'm also assuming that the monitor can report its physical size and resolution; combined with viewing distance, it should be possible to calculate FoV.) If you did this, you should be able to mostly seamlessly split a window between two displays and have it be of equal "size" in the FoV. (Of course, some displays have borders, so that fudges it a bit.)
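The angular-size idea above is just trigonometry; here's a sketch, assuming you know (or guess) the physical screen width and the viewing distance (the 0.5 m figures are made-up examples):

```shell
# Visual angle subtended by a screen: 2 * atan((width/2) / distance).
# Example numbers are assumptions: a 0.5 m wide monitor viewed from 0.5 m.
angle=$(awk -v w=0.5 -v d=0.5 'BEGIN {
  pi = 3.14159265358979
  printf "%.2f", 2 * atan2(w / 2, d) * 180 / pi   # horizontal FoV in degrees
}')
echo "$angle"
```

Feed in each monitor's width and typical viewing distance and you get a comparable "how big does this look" number, which is what a FoV-based UI scale would be derived from.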

> I've always thought that arcdegrees should be what we measure UIs in: how much of a user's field of view does this thing consume? After all, what makes text "small" is how much of my FoV it consumes (or doesn't). Not inches, or points, or pixels.

That is a good starting point for calculating the default "optimal UI scaling", but there are going to be adjustments needed for the FoV of the whole screen area (not per pixel) too.

With large screens, for example 24-30" on your desk, just the per-pixel FoV measure will probably be good enough. You have plenty of "space" for windows and content, and want to get the optimal scaling.

But once you get to very small screens like phones, there is a tradeoff between keeping font and UI sizes comfortable, and being able to actually fit enough content on the screen without endless scrolling. I am willing to strain my eyes with smaller font sizes on my phone than on my laptop, just so that I can see more than 5 sentences of text at the same time.

"CSS pixels" are actually supposed to be based on viewing angles: the CSS spec defines the reference pixel as the visual angle subtended by one pixel on a 96 DPI device viewed at arm's length.


If the OS (not an app) could allow you to tweak the native pixel resolution, scale, size of each display, even under "advanced settings" that would go a long way towards helping.

This is at the Operating System level, not like some random one-off application.

For me, the one feature I miss the most is a checkbox option for "Natural Scrolling"; did this really need to be removed?

X11 did — run xdpyinfo and you'll see its idea of screen dimensions and resolution. (It's unlikely they'll have been configured correctly, of course.) If you look hard enough, you can find some ‘outdated’ plain-X software from the workstation era that respects it. It was the ‘Linux desktop’ crowd that threw that away, since they couldn't think beyond building Windows clones for PC clones.

And the vector-based competition to X (e.g. NeWS, Display Postscript) would have done better.

Simple answer is don't auto-detect. Allow the user to set the scaling factor per screen and then just auto-apply that when using that screen. This just requires a way to uniquely identify screens and requires the user to set the scaling factor for that screen once when first used.

Initial autodetection and scale-factor setting is OK. Otherwise most regular users would just say "all my icons and text are too small on my new notebook". Windows detects the high DPI in that case and sets the scale factor to 200%, which gives a good starting point. Of course the user should be able to override this permanently if it isn't his preference.
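The per-screen approach described above can be sketched with xrandr (the output names and factors below are made-up examples; a real implementation would remember screens by their EDID rather than the connector name):

```shell
#!/bin/sh
# Sketch: "remember a scale factor per screen" -- map each output name to a
# stored factor and re-apply it whenever that screen is connected.
scale_for() {
  case "$1" in
    eDP-1)  echo 1x1 ;;  # HiDPI laptop panel (apps already render at 2x)
    HDMI-1) echo 2x2 ;;  # plain 1080p external monitor
    *)      echo 1x1 ;;  # sensible default for unknown screens
  esac
}

# Apply the stored factor to every connected output (only when X is around).
if [ -n "${DISPLAY:-}" ] && command -v xrandr >/dev/null 2>&1; then
  xrandr --query | awk '/ connected/ { print $1 }' |
  while read -r out; do
    xrandr --output "$out" --scale "$(scale_for "$out")"
  done
fi
```

Hooked to a hotplug event (e.g. a udev rule), this gives the "set once, auto-apply forever" behavior the comment asks for.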


As someone who loves singular they, I have a request: please don't do this. It is OK if grandparent uses he/him. Thanks!

The distance between the third and fourth formulations of the problem is very small. Once apps can be dynamically redrawn with a scale factor, simply make the scale factor customizable.

Simply introduce zoom in/out for the whole desktop, separately for each screen, like in a browser (you can zoom certain tabs/sites and have that memorised). Problem solved.

> 1. HEADLINE: A way to have different scaling for external monitors hooked up to my HiDPI laptop.

This would be awesome. Even when both the laptop and the external screen are 1080p, different scaling could be helpful if you want to use a dual monitor setup effectively.

Unfortunately, it's a tough nut to crack given current desktop behavior. For example, you can have a window that straddles both monitors. What should the scaling be? You need to switch at some point as you're moving a window back and forth - when? So it's a challenge, but solving it would be so worth it!

Windows 10 handles different scaling (zoom) between monitors far better than any Linux distro I have used. A window keeps the zoom of where it came from until it is entirely on the new monitor. Works pretty well.

While I get that it's uncool to like Windows on HN, I really like Windows 10. With WSL, all of the CLI tools I need for development are here along with better hardware support (including suspend / resume, high DPI monitor support, latest GPU drivers / etc).

Plugging two 4K monitors into my laptop (which has a native 1080p display) is an awful experience when booted into Ubuntu. You either have to set the DPI to make the laptop display unusable or set it to make the 4K monitors look like $hit.

Plus... you know... games.

Windows isn't the best at multi-DPI in general though either. Only recently did Firefox on Windows get multi-DPI support - not sure if Chrome does yet because I gave up on it and went to dual 4K because the scaling was easier. If you want to see really good multi-DPI support, OSX is really good at it with most apps supporting it out of the box.

Multi-DPI is kind of a hack in general though and is likely to cause issues unless applications have been tested for it very thoroughly, it causes serious issues on major frameworks like Electron and Qt - though both of their support for it is improving slowly. If you want things to work smoothly for now, try to stick to 1 DPI setting.

I think you're confusing Windows' DPI scaling availability with a lack of support from the apps you use.

It's not Windows' fault the apps don't take advantage of DPI scaling. You can also disable DPI scaling for individual apps.

You're right, but even many built-in Microsoft apps, while they supported DPI scaling, did not support multi-DPI switching; rather than scaling properly, they just scaled pixels and looked blurry.

It's "not Windows' fault", sure, but it certainly makes for a worse experience than other platforms like OSX, where multi-DPI is much more commonly supported.

I would much prefer the Windows behavior to what I see on Linux. Right now, if I open an app that doesn't support high DPI, it is just unusable because it is so tiny.

Couldn't you just manually reduce your screen resolution? Or is that too drastic to be worth it?

It's worth considering whether there is some flaw with windows multi dpi scaling such that apps don't use it. Firefox and Chrome have scaled properly on Mac for years now, while even Windows 10 ships with first party apps that don't scale right. (E.g. device manager.)

For sure, this has been an issue in the past with Windows. UWP helps make multi-DPI work by default in new applications.

Sure, but 99% of my Windows software isn't UWP. It's all good and well to say it's there, but that doesn't make the experience good for the user. Contrast to KDE and OS X where it just works for 99% of software.

I mean, I get it, same issue as Vista for Microsoft - people expect 100% backwards compatibility, but it turns out that terrible design decisions made many years ago tend to mean you need to break compatibility. Just like UAC, resolution scaling will be an issue that becomes less painful in Windows over time. Right now it's not great, however.

I mean, you say that, but on KDE, for example, every application except one on my system works with DPI scaling (the odd one out is Unity3D) - that's because DPI scaling is built in at the Qt level, so the toolkit supports it and the applications get it for free. Clearly this wasn't the case for the older Windows UI stuff, where they are literally just scaling the image of the window up (which means horrible looking text).

Actually, the really old windows stuff did support scaling - the 'Large fonts (120%)' option was there almost forever. I remember that original Delphi, circa 1995, supported it.

Just most apps chose to ignore it; the developers took the 'everyone uses 96 dpi anyway' attitude, and by the end of the '90s most applications started to suck at 120 dpi.

Yep, the Windows API already had support for logical pixels in the 16-bit days, and all the good books always preached converting between logical and physical pixels.

I guess people got lazy, as you say.

I think monitor pixel density stayed more or less the same for a very long time. It had only been increasing very slowly for 20 years, until a few years ago.

No point in spending time on logical pixels if it makes almost no relevant difference...


Anything running its own renderer doesn't get to benefit from component scaling since they don't use components.

That was my point - running KDE, this is extremely uncommon, running Windows, it's practically every application.

The problem isn't just scaling between two different resolutions, it's the inconsistencies (yes, apps don't take advantage, but that's not the only issue). For example, if I want 200% on my 4K (my main monitor) and 100% on my two 1080p side monitors, I have to choose between ultra-tiny text on the 4K (with regular text on the 1080ps) or regular text on the 4K (with blurry text on the 1080ps).


Is that Windows 10? On my Windows 10 Ent desktop I'm able to set the scale factor of each display independently.


This is Windows 10. How do I enable that option?

Erm...click on the display you want to change (1,2,3) and simply drag the slider?

This month's Windows update fixes DPI scaling for old toolkits.

Yes, it's true that there are issues. It seems like most Microsoft apps handle multi-DPI well. By comparison, on Fedora 25 (the latest release), the only program I have found that handles multi-DPI is Terminal. Firefox doesn't do it.

Yeah, Windows support is better than Linux's, but it's still pretty iffy. While IE and a few other things handle it, even stuff like Windows Explorer and OneNote doesn't handle multi-DPI well, or even just runtime DPI changes in general; I'll RDP into my box from a 100 DPI system and have my session screwed up when I come back to my machine.

If you're deciding whether to buy high-DPI hardware, don't unless you're prepared to switch everything at once. Stick to ~100 DPI until you can commit to going all in.

Chrome has had DPI scaling since 2015 on Windows. I remember having to report lots of initial bugs. Now it works fine.

DPI scaling yes, but not multi-DPI: when dragging from a 100 DPI monitor to a 300 DPI one, text should remain sharp, not blurred by scaling pixels. Or vice versa.

>While I get that it's uncool to like Windows on HN

That has not been my experience here at all. There is a rather active, and sometimes vocal, Windows fan base around here. Misconceptions about the current state of desktop Linux are commonly seen as it seems most people around here only use either Mac or Windows.

Agreed. While I see some MS/Windows hate (some of it technical, some political, and a mix of founded and FUD), there's been a fair amount of counter to that.

I mostly use Mac at work, mostly Windows at home, and a bit of Linux for servers and my HTPC (most of my casual browsing at home)... Each experience is fairly different, and they all have pluses and minuses. That said, more often than not, I prefer the Windows desktop/menu UI, but OS X & Unity app integration and the Linux/bash shell environment. I wish that Ubuntu/Unity would integrate more of the menu/taskbar features found in Windows. (And bring back the natural scrolling checkbox.)

Microsoft integrated Ubuntu instead.

My experience (currently running two 27" panels at 3840x2160 and one 27" panel at 2560x1440 in KDE for most stuff, Windows for gaming, and previously had one of the first edition retina MBPs with external non-retina displays):

OS X, years ago when the first retina MBP was released, did everything right. It was seamless from monitor to monitor, scaling done well.

Windows 10, now: OK, ish. Most applications scale badly with blurry text because it's just literally scaling the image afterwards. Newer applications are fine. The actual scaling isn't great - having a window half on one monitor and half on the other leads it to 'picking one' and looking weird on the other.

KDE, now: Pretty good. Correct scaling once you set it up. The autodetection can be dodgy, and the DPI scaling for text isn't linked to the rendering scaling for windows, for some reason. The GUI still only gives you a single scaling option for all monitors, but the autodetection can do different for each monitor, and environment variables can be set to solve it manually. The actual scaling is perfect for the vast majority of things. Things scale correctly and no blurriness. The only application that doesn't handle scaling is Unity3D, so everything is tiny (no fallback to raw image scaling).

In general, it's what you'd expect for interface stuff across the platforms - Linux does it right, but the interfaces around it are bad; Windows does it fine for new stuff, old stuff (which is most stuff) sucks, but the interfaces are OK for doing it; and OS X gets it all right.

Edit: Just to be clear, it's only the Unity3D editor that doesn't do scaling, the actual games work fine, as you'd expect they just get the full space and the game chooses how to render to it. To be fair to Unity about the editor, they support scaling on OS X, and the Linux build is still a beta. It is annoying though.

I use this on Windows 10: http://windows10_dpi_blurry_fix.xpexplorer.com/ and it works fine. If you have blurry text, disable DPI scaling in that app (right-click -> Properties -> Compatibility -> Disable DPI scaling) and this will take over and make it usable. There are a couple of applications that act wrong no matter what (Battle.net for example), but most of the time this fixes it well enough.

I only use Windows 10 for gaming, so fortunately I don't really need to worry. Useful for those who use Windows all the time, though.

Windows also gets my vote when it comes to the per-app volume mixer controls which have been awesome since Windows Vista.


PulseAudio provides this feature and actually provides more features and functionality than Windows. Ubuntu's default mixer isn't the greatest so I recommend this instead:

    sudo apt install pavucontrol
You can then find it in the application menu labeled "PulseAudio Volume Control". It lets you set the volume for individual applications (and with Chrome, individual tabs!) and also pick which output/input device will be used.

It lets you configure some neat tricks. For example, you can set up an audio device that forwards to another computer running PulseAudio, an RTP receiver, or a few other similar protocols, then set, say, Spotify to output to that device. So if you have some network-enabled audio receiver somewhere in your house/office/whatever, you can send audio from your Linux workstation to it.

You can of course also pass that audio through various filters/plugins to mess with the sound before it goes out to the remote receiver. For example, equalize it, remove noise, etc. PulseAudio supports LADSPA plugins, so if you wanted to you could set up a little Raspberry Pi audio receiver at your front door and yell at solicitors in a robotic voice from your desktop. All with a bit of PulseAudio configuration fiddling =)
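The network-forwarding trick above mostly comes down to loading PulseAudio's tunnel-sink module; a sketch, with a placeholder receiver address:

```shell
#!/bin/sh
# Build the command that creates a tunnel sink pointing at another
# PulseAudio host. 192.168.1.50 is a placeholder for your receiver's address.
RECEIVER=${RECEIVER:-192.168.1.50}
load_cmd="pactl load-module module-tunnel-sink server=$RECEIVER sink_name=remote"

# Only actually load the module when a PulseAudio daemon is reachable;
# otherwise just print what would be run.
if command -v pactl >/dev/null 2>&1 && pactl info >/dev/null 2>&1; then
  $load_cmd
  # Afterwards, move a running app's stream to the new sink, e.g.:
  #   pactl move-sink-input <stream-index> remote
fi
echo "$load_cmd"
```

The receiver side needs PulseAudio listening on the network (module-native-protocol-tcp); pavucontrol then shows "remote" as just another output device you can route streams to.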

I still remember the first time I was in a computer lab and I leaned too far away from my computer and my headphones that were blaring music popped out... and the whole room WASN'T subjected to the same loud music. And I opened up the Kubuntu audio controls and plugged in my headphones and the volume slider suddenly jumped up, then I unplugged again and it muted again. "Woah."

I remember trying it on whatever Windows computers were in the lab just to make sure I wasn't crazy and that this wasn't there all along, and sure enough, they kept the same volume no matter whether the headphones were plugged in or not.

One of the first PulseAudio victories I remember, at a time when I vaguely recall that it was a newcomer and people were really pissed at PulseAudio's bugs and recommending just straight ALSA instead.

+1, PA + pavucontrol are very flexible. You don't even need weird protocols to send your audio to another computer, I just used its tunnel module (enable it in the receiver, then configure its IP on the sender) to send my browser's audio output to my home server, which has a decent stereo attached. The latency is quite good too, the delay even over wifi is barely noticeable.

There's also pulseaudio-dlna[0]. It works as advertised.

[0] https://github.com/masmu/pulseaudio-dlna

Thanks for the heads-up! This is one thing I miss mightily on my Mac.

This feature comes by default with PulseAudio, maybe Ubuntu doesn't expose it well enough in their audio settings. I think Gnome Settings has it, KDE definitely does.

PulseAudio solves the same thing for Linux.

I get correct auto scaling-switching like this on Gnome 3 with Wayland, but only for a subset of programs (basically those that are fairly vanilla GTK+3), and at the cost of weird bugs with Wayland and program support thereof that still crop up fairly regularly.

I've always resorted to xrandr and can get the screen looking pretty good. Though I really think something like this should just work.

Weston does multi-DPI really well. When you drag a window between monitors, the half on the HiDPI monitor is scaled and the half on the LoDPI monitor is unscaled. So it looks perfect, without windows growing or shrinking when you move them to another monitor, as happens with GNOME on Wayland.

I'd love to read more about how this was done, if you have a link perhaps.

macOS handles that edge case. It just displays the window on only one screen - the one with the largest area of the window shown. There is no need to be held back by cases like this.

If you zoom in, you can sort of force parts of one monitor to be shown on the other monitor. You can see how everything's upscaled/downscaled from there.

> This would be awesome. Even when both the laptop and the external screen are 1080p, different scaling could be helpful if you want to use a dual monitor setup effectively.

Actually, this is especially true when both are 1080p, because laptop screens are never as big as desktop monitors, and we also tend to use them closer. I have this exact problem right now but I think I've just adjusted my eyes over time to squinting at 1080p at 14", or perhaps I turned on some display scaling and forgot about it.

> For example, you can have a window that straddles both monitors. What should the scaling be?

Intuitively, I feel like you should use the physical DPI of both screens to make sure that the window has the same physical dimensions on both. But that'd probably lead to weird scaling factors like 1.17 instead of nice round ones, and thus fuzzy scaling, so it probably couldn't quite work. I guess perhaps you'd just snap each display's DPI to the closest predefined value (eg. .25 increments which I think most systems use these days). Then you'd get a similar-sized part on both sides of the boundary.

But yeah, I think overall if you actually use physical DPI for scaling everything should work out close to nicely.
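The snap-to-0.25-increments idea is simple arithmetic; a sketch, assuming the monitor reports its pixel and millimetre width via EDID (the 3840 px / 600 mm numbers are made-up examples):

```shell
# Physical DPI from pixel width and millimetre width, then snap the
# resulting scale factor (relative to a 96 DPI baseline) to 0.25 steps.
scale=$(awk -v px=3840 -v mm=600 'BEGIN {
  dpi  = px / (mm / 25.4)              # ~162.6 DPI for these numbers
  s    = dpi / 96                      # raw scale factor (~1.69)
  snap = int(s / 0.25 + 0.5) * 0.25    # round to the nearest 0.25
  printf "%.2f", snap
}')
echo "$scale"
```

So the "weird" 1.69 factor becomes a clean 1.75, close enough that a window straddling the boundary stays roughly the same physical size on both sides.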

FWIW, macOS changes a window's DPI mode when the cursor that is moving the window passes from one screen to another. Just tried that out. :D

That's what happens when "Displays have separate spaces" is turned on. (With that setting on, windows are only present on one monitor at a time, and, when dragging a window, the transition happens when the mouse cursor moves between displays.)

With "Displays have separate spaces" turned off (so windows can be present on more than one monitor at a time), it looks like windows take their DPI setting from whichever monitor the majority of the window is on-- with my current two-monitor setup, the DPI transition happens at the halfway point of a window, regardless of where the mouse is as I'm dragging.

Leaving aside the implementation difficulty, the answer to "what should the scaling be" seems obvious? Use the monitor scaling for the part shown on that monitor. The switch should happen on a monitor level, not on a window level.

The painting happens at the window level; that's why it's handled there. The application paints the window whenever it receives a "paint me" event for it, and it cannot paint different portions of a window at different DPIs - from the application's POV, it is a single canvas. Another thing is that window resize and DPI change are separate events, so you cannot really fire them twice in a row with different DPIs and expect the app not to get confused.

Another approach would be to let the application render at higher DPI and the compositor would downsample the portion on the lower DPI display.

OSX handles this by upsampling/downsampling the parts of windows that are drawn on the other screen.

This isn't just external monitors! MBPs with "retina" screens are also unusable for Ubuntu :(

Fedora with GNOME Shell on Wayland already handles both 1 and 2, although power management is about the same as Ubuntu, and Wayland comes with its own set of issues.

I switched from Ubuntu to Fedora about a year ago and am quite happy with it.

Well 1 depends on Wayland actually detecting your external monitor, I normally end up having to drop back to X to get it to detect my secondary 28" 4K monitor :-(

I couldn't figure out where to change the scaling for the external monitor on my Fedora 25. My 1080p external monitor just looked huge compared to my Dell XPS 13 HiDPI display.

It requires Wayland features that are used by GNOME 3.24 (so F26).

really, there's a native multitouch support for touchpads? do you have more info about that?

Yes, there are native 4-finger swipes to change desktops on Wayland. And I wrote an extension to add 3-finger gesture support for an action of your choice. Check it out here: https://github.com/mpiannucci/GnomeExtendedGestures

I can't find much information, but things like scrolling, switching work spaces etc. worked out of the box for me when I was testing Fedora 25 a month or so ago.

Two-finger scrolling works really well on Fedora with Wayland, in fact at some point it appears to have become default behavior (at least on my machine running the latest version).

Fedora uses libinput. Of course, it is not without issues for those who would like to tweak every little setting. Libinput is designed to be as automatic and configuration-less as possible.

+1. I recently got a Dell XPS 13 and hooked it up to my external 4K monitor. Icons were way too small, so I adjusted those and the standard text size. But getting applications (e.g. PyCharm) to run at a reasonable size was frustrating (I had to google it and then modify some configuration file somewhere). With OS X, which I just came from, the external monitor "just worked" when I plugged it into my MacBook Pro.

It really is a mess. I connected a 4K XPS 15 to an FHD monitor; the only way to get it to work was via the open-source Nvidia drivers, using xrandr to scale the external monitor and then other settings to scale everything to FHD. That and some other things made me return the XPS and order the new MacBook.

Windows scaling should auto-configure and work in any modern application. It always works like that for me, and I only have issues with software written in 2003 in Java or really old versions of Qt.

> (I had to google it and then modify some configuration file somewhere)

I even had to do this with Chrome [1]. It's crazy how obscure this was when I was setting things up. Other apps, like GIMP, still look like shit because I can't find a way to do the same thing; their GUI just renders at a tiny scale and is difficult to use.

[1] https://superuser.com/a/1120078/103402

One way to solve the scaling issue is to set the external monitor to a virtual higher resolution while still driving it at its native resolution (with scaling down done in GPU).

Actually Linux/Xorg generally supports this out of the box; it is just the higher-level software that would need to make use of it. You can try it yourself:

xrandr --output <output-name> --scale 2x2

the result should be the given monitor will appear to have twice the resolution, so if applications believe they are running on a high-DPI display, they will look fine on the external monitor as well.

However due to lack of support and awareness in desktops doing just this might leave you with an unsatisfactory configuration, e.g. part of the desktop erroneously shown on both monitors - you might need to use further xrandr commands to setup the regions that each monitor displays.

I use the same approach to solve this issue on a Windows 7 system I am using, it is just slightly more involved (I need to setup a custom resolution in the Nvidia control panel).
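Putting the pieces together for a HiDPI laptop panel plus a normal 1080p external display might look like this (the output names and modes are assumptions; check xrandr --query for yours):

```shell
# Laptop panel at native 4K on the left; external 1080p monitor scaled 2x2
# so it exposes a 3840x2160 virtual area to 2x-rendered apps, positioned to
# the right so the two framebuffer regions don't overlap.
xrandr --output eDP-1  --mode 3840x2160 --pos 0x0 \
       --output HDMI-1 --mode 1920x1080 --scale 2x2 --pos 3840x0
```

The --pos offsets are in framebuffer pixels of the scaled layout, which is exactly the "setup the regions that each monitor displays" step the comment above mentions.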

Unfortunately, this scales after drawing. The entire point of hiDPI is to have a crisper image. To achieve that, the scaling must be done at the drawing level.

Unity and GTK would scale everything up, so things are properly drawn before being re-scaled.

So the quality will be there.

For normal/low-DPI screens instead, you'd scale everything down, so you'd lose some memory and CPU power, but you'd still get the quality result.

Battling xrandr is not for the faint of heart. It is tedious to get the right behavior, and it differs from one display to the next (the precise dimensions, etc.).

#1 is absolutely the biggest one for me and #3 is a solid second.

I have a Macbook Pro with retina and stopped using linux simply because I couldn't get a good resolution on my laptop and monitors. And then when traveling (flights etc), ubuntu chewed through battery probably 3 to 4x as fast as OSX so I wasn't good for that either. As a result, I have been on OSX for a couple years now but would love to be back on ubuntu some day.

HiDPI is still a huge problem on the Linux desktop; I can't count the number of hours I've spent researching and fiddling with it. Wayland is the answer, but it's slow moving, and Sway currently looks terrible when scaling double.

The biggest issue with Wayland is video drivers. Try getting Wayland to work with any proprietary blob, and see your efforts fail miserably.

I have the dual problem of #1: I've got a HiDPI notebook and suffer when I have to connect it to a common 1080p screen.

Ubuntu Unity Developer Here...

I'm mostly replying to point 1, as it's closest to what I do...

I know we should offer a UI for that, but until then you can just work around this.

Well, as said unity supports scaling, although it's not possible to scale toolkits per monitor.

However... There's actually a good workaround for this, that works fine for multiple monitors.

The idea is that you scale everything up to 2x, or your maximum scaling (including window contents), then you scale the non-HiDPI monitors down using xrandr --scale.

For example, if you want to use normal resolution there, you just have to do something like:

xrandr --output <OUTPUT> --scale 2x2

In this way it will be scaled down, and everything will be readable and almost 1x1.

You can test this in normal resolution monitor as well, and you'll see things should be pretty good.

I should find some time to implement this directly inside UCC / USD, so that users will get this for free...

Notice that there's also a bug in X causing some mouse trapping, so you'd probably also need X to be patched as explained in this bug: https://pad.lv/1580123 (we'd like to include this upstream, but we're waiting for X upstream approval for that)

Better out of the box HiDPI support would be great.

Autodetection would be nice, but just being able to set the scaling option in one place and having it apply not only to my desktop but to the login manager as well would be very useful.

Also, afaik there is no documentation on changing the scaling factor in the login manager, or at least not in the official docs.

I would not buy a standard resolution monitor at this point, so having simple support for it in Linux is very important to me.

On 1.: Seriously, I was gonna write the exact same thing. Just today I researched this once again, since it's quite a hassle, and nowadays it seems pretty common to have a HiDPI laptop screen in combination with a standard-DPI external screen.

I had the same problem yesterday. I use Fedora, but we share the same pain from this missing feature. It would be awesome to have this setting. Being able to set different scaling for external monitors is a must-have feature.

Thanks for mentioning TLP - I hadn't heard of it before

3. Agree on the default WM; battery life with i3wm is no issue (same as or better than Windows). In my experience, of course.

More work on gestures!

Including the ability to configure what gestures you want in a GUI interface!

if multi-monitor support was as solid as it is on macOS, i'd likely switch

Would love support for #2

I would absolutely be in favor of #1 and #3.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Please, please, please fix space issues with /boot.


I'm constantly running out of space in /boot, due to kernel updates. It drives me so incredibly batty. If I had to guess, this is due to poor defaults in the installer for folks that opt to encrypt their whole disk. Even still, this system was setup back on 14.04 (don't think it started on 12.04), and I have no intention of reinstalling from scratch just to fix it.

Publish something official on how to fix this problem! Make it easy and stress free! Yell at the people who didn't catch this bug before it went out! Sorry, but this is just a really bad problem: it leads to folks like me wasting time, and probably a whole bunch of other folks just not being able to install updates, and no idea why.

- ROLE/AFFILIATION: software developer in the federal government

+1 -- This is the one and only problem I have to regularly help my non-technical Ubuntu friends (and their friends) with. Every few months they cannot install updates anymore because their /boot fills up and apt fails to install a new kernel package.

The simplest fix would probably be to make /boot large enough by default (in the order of 10GB or 20GB or so -- the current size is 512MB IIRC).

A better fix would be to purge old unused kernels automatically but as far as I understand there were some difficult edge cases around that.

> The simplest fix would probably be to make /boot large enough by default (in the order of 10GB or 20GB or so -- the current size is 512MB IIRC).

Sure, I'll just use 1/6th of SSD to store 60 megabytes.

  $ du -hs /boot/
  56M	/boot/
If 512M is not enough space for /boot you're doing something wrong.

>If 512M is not enough space for /boot you're doing something wrong.

I don't know what planet you're living on but it's certainly not this one. Between a Ubuntu desktop, a laptop and personal server with multiple Ubuntu VM's on it, all of which are kept up rigorously to date, I fix this problem at least three times a year, every year.

The command line process to fix it[1] is a multi-stage mess of dense bash-foo that comes with a 140 word, two paragraph explanation so that /ubuntu veterans/ can figure out what is going on without resorting to scouring the man page for flags. The friendly GUI process to fix it relies on a third party tool that is no longer maintained[2].

It is not possible to explain to non-technical users what is happening here, which means the only thing they can do when they see this is call their technical friend and cry for help. This is exactly the kind of user experience that makes people think Linux is not ready for widespread desktop use.

This is definitely something the OS should take care of itself. I'm ignorant of the challenges that caused it to be this way in the first place, but in my ignorance I would advocate that:

a) the partition be made larger by default

b) the OS auto-purge any kernel package more than three revisions old

[1] https://askubuntu.com/questions/89710/how-do-i-free-up-more-... [2] https://launchpad.net/ubuntu-tweak/

Here is my old-timey one-liner personal solution[0] for it that has worked flawlessly so far, obscure theoretical edge-cases be damned, because the non-edge case situation is just awfully worse and practically impactful.

(warning, rant inside)

[0] https://gist.github.com/lloeki/520acee8ba3b44c532c7

Um, isn't the fix `sudo apt autoremove --purge`, which autodetects unused kernels? What am I missing?

If you do not run that command before /boot fills up, and you have a full /boot with a partially installed kernel, then that command fails. So this works fine if you remember to call it regularly, but it does not solve the problem once it occurs.

Interesting. I haven't encountered that edge case. I've many times filled /boot and resolved by doing an auto remove.

It seems silly to me that I need to manage this myself. Why do I need to be worrying about different kernel versions? I just want to make websites.

Following the chain of links and answers and explanations, we come to the conf file that says in its comments that it commonly results in two (2) kernels being saved, but can sometimes result in three (3) being saved.

IOW, it does automatically remove old kernels, it just keeps the last 2-3.

So, yes, run "apt-get autoremove", that's it.

I think it has solved the problem for me, but still is not a good solution for anyone who would answer "What's a terminal?"

I love having a terminal with bash and use it constantly, but I don't think it should be needed for the system to just go on working.

I've been using Ubuntu either part or full time since 2007. I've literally never encountered this.

Which is not to say you're lying, I'm just sort of flabbergasted that this is an issue for so many people. Do you run autoremove much? Maybe that would solve it for you?

I run ubuntu 16.04 on a laptop, desktop and a TV streamer and I get this all the time. My boot partition on the desktop is 15gig and it gets plugged every now and then.

I've hit this before, but honestly do not think it's a big deal. Sure the installer could default to a larger boot, but it's manually configurable during install. And cleaning it up once in a while is just good sys admin practice.

sudo bash -c "apt autoremove --purge; apt update; apt upgrade" is what I usually run.

Prefer they focus engineering cycles on actual engineering problems.

Sorry, but Ubuntu is doing something wrong here, not me. This should be handled automatically. Ubuntu wants to be the system for everybody, but you can't expect people to open the terminal and fix this manually. Making boot 20GB is ridiculous, but 1GB should be no problem, and for me 2GB would be OK if that means that this problem will disappear forever.

And I believe my boot partition is only 256MB, and I didn't set it to that. That was a system default.

Ubuntu is absolutely doing something wrong here and we'll get that fixed. Thanks!

Yup! That "something wrong" is installing every single kernel update for two, three, four years and not deleting any of the old kernels.

Super common in enterprise deployments. I ran into this a bunch on my $EMPLOYER-issued workstation.

The installer should handle this. When you apt-get upgrade anything besides the kernel, does it leave the old version lying around?

I understand that it may be wise to keep the old kernel around so the system can be booted in case there is a hardware incompatibility or breakage in the new release, but that justifies only one additional kernel. Ubuntu keeps those kernels sitting there until you `apt-get autoremove`, and that means that unless you're running that command routinely, the boot partition is going to fill up at some point, no matter how big you make it.

This is especially a problem for people who use the unattended-upgrades package. I've autoremoved and had it clean up almost a gig of old kernel images before.

If you're running updates weekly it will fill up on Ubuntu. This is a recent problem, and I've only experienced it on my laptop with full-disk encryption.

The update process generates on the order of 100mb/month.

It's not new. It's been happening to me since I started using Ubuntu in the 8.x range.

Doesn't `apt-get autoremove` remove those old kernels? Not that it's a solution; it should of course be done automatically! Here's what I get when using it:

    > apt-get autoremove
    Reading package lists... Done
    Building dependency tree
    Reading state information... Done
    The following packages will be REMOVED:
      linux-headers-3.19.0-79 linux-headers-3.19.0-79-generic
      linux-image-3.19.0-78-generic linux-image-3.19.0-79-generic
      linux-image-extra-3.19.0-78-generic linux-image-extra-3.19.0-79-generic
    0 upgraded, 0 newly installed, 24 to remove and 39 not upgraded.
    After this operation, 1,732 MB disk space will be freed.
    Do you want to continue? [Y/n]

Whenever a kernel is updated autoremove should be called immediately afterwards. It should be called before the restart now / restart later dialog box of update-notifier appears.

Currently, Ubuntu installs a new kernel and update-notifier tells the user a reboot is needed. The autoremove notification only appears when using the terminal which explains why users are running into this issue. Also, update-notifier informs the user another reboot is needed after autoremove is run.

To avoid this mess I’ve commented out the lines of /etc/apt/apt.conf.d/99update-notifier, wrote my own updater using bash and zenity, and incorporated needrestart. It’s not pretty but it works.

Absolutely not; automatically running autoremove may lead to bad things. On occasion autoremove flags other, more useful packages for removal.

For example, I'm using LVM with my installation on my Ubuntu laptop, and after updating the kernel and running "apt autoremove" it removed the LVM package, leaving me scratching my head shortly on reboot as to why it wouldn't find my root filesystem (frankly I have no idea how it became "unneeded").

A more sensible approach is how Red Hat does it with YUM/DNF: allow a certain number of versions of the same package to be installed, via "installonly_limit" in yum.conf. This means that when a new kernel gets installed, the oldest is removed to keep the system at the specified limit.

On my RHEL/CentOS machines I tend to narrowly provision /boot to around 250-500MB, set "installonly_limit" to 2, and the system will keep the most recent kernel and one back. It works for me.
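For reference, the setting is a single line under the [main] section of /etc/yum.conf (DNF reads the same option from /etc/dnf/dnf.conf); the value 2 here matches the "most recent kernel and one back" setup described above:

```ini
[main]
installonly_limit=2
```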

I see your point, though I too use LVM and haven’t seen that happen... weird. I could have been more exact with my response, as autoremove does more than just remove old kernels. Anyway, it would be nice to see Canonical resolve this.

Care to share it? Maybe it could help others...

I thought about sharing it but like I said it’s not pretty. It involves editing sudoers and holding back config updates for sudoers and update-notifier-common which might cause problems in the future if you’re not aware. I’d much rather see Canonical address it properly.

>Doesn't `apt-get autoremove` remove those old kernels?

Of course it doesn't! Why would you assume such a silly thing? /s https://askubuntu.com/questions/563483/why-doesnt-apt-get-au...

I confirm that it doesn't autoremove. I had to empty /boot on some servers lately.

Anyway, sometimes one wants to keep old kernels. I have an old laptop that runs OK with a 3.something kernel and has weird video sync problems with any newer ones. Ubuntu 16.04 keeps running with that old kernel, so I keep booting from that, maybe once or twice per year.

However the proper solution would be pinning a package and autoremoving the others.

Yes, it does remove old kernels. Read the very link you posted.

> It's better to err on the side of saving too many kernels than saving too few

But Muh Freedoms! I hate to be subject to one man's opinion of things /s

It's also kind of a garbage argument.

People who know they have broken kernels don't keep upgrading them, they stop and fix them.

People who don't know they have broken kernels also don't know they can boot with an older kernel, so they get nothing from the "backup".

We want to leave some time for people to realize their kernel is broken, so keeping three is probably just fine. Honestly, it would probably be adequate to just bump the oldest one off the queue whenever a newer one is requested. If you've got a tiny boot partition, maybe that means only two revisions. If you've got a huge boot partition it could be 20.

But just keeping them all and making people manually uninstall them gains you nothing, it's user-hostile for no reason.
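That queue policy is simple to sketch. With sample version strings (made up here; a real script would read them from `dpkg --list` and use GNU head):

```shell
# Keep the newest $keep kernel versions; everything printed is a purge candidate.
keep=3
printf '%s\n' 4.4.0-66 4.4.0-70 4.4.0-72 4.4.0-75 4.4.0-78 \
  | sort -V | head -n -"$keep"
```

With five installed versions and keep=3, this prints the two oldest, which is exactly the "bump the oldest off the queue" behavior.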

Except, it would be nice to keep a few of the (recent) older kernels, in case things go awry with the new update.

This already happens: apt autoremove won't remove the package for the running kernel. It'll clean up "old" (N-1 and lower) kernels, but installing kernel N+1 won't allow kernel N to be deleted as long as kernel N is still executing.

Once you reboot/kexec into the N+1 kernel, it'll let you remove the N (now N-1) kernel, bringing you down to one. But at that point you've proven the new kernel works—at least well enough to get to a shell you can run apt autoremove from.

This is why autoremove isn't so auto: if it happened automatically after reboot, it might be running on a now-wedged system (e.g. one that can't bring up the display manager), removing the last-known-good kernel and leaving you with only the broken one.

I think the right middle-ground solution would just be for installing kernel updates to touch a file, and for Desktop Environments to notice that file and trigger a dialog prompt of "you've just rebooted into a new kernel. Everything good?"—where answering "yes" runs apt autoremove. On a wedged system, you can't answer the prompt, so the system won't drop the old kernel. (In other words, just copy the "your display settings were changed. Can you read this?" prompt. It's a great design!)
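A minimal sketch of that flow (the marker file name is made up, and the zenity prompt and the apt call are replaced by stand-ins so the logic is visible):

```shell
# A kernel postinst would touch this marker; the desktop checks it after login.
MARKER=./kernel-update-pending
touch "$MARKER"        # simulate "a new kernel was installed before this boot"

if [ -f "$MARKER" ]; then
    answer=yes         # real version: zenity --question --text "New kernel OK?"
    if [ "$answer" = yes ]; then
        echo "would run: apt-get -y autoremove"   # real version runs it
        rm -f "$MARKER"
    fi
fi
```

On a wedged system nobody answers the prompt, the marker stays, and the old kernel survives.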

Fedora/RHEL yum has a much better solution: installonly_limit, defaulting to 3. Kernels which have been updated will only be kept up to this depth. The excess are automatically trimmed during update.

Wouldn't a good solution then be to run autoremove before installing a new kernel?

That way, you have kernel N running, first autoremove wipes kernels N-1 and older, then it installs kernel N+1, so that when you reboot into N+1, you'll always have known-good kernel N if it doesn't work.

It's a very similar solution to how a good programmer solves an off-by-one error, doing a shift/rotate shuffle on a for/while loop.

What happens when you have a high-uptime system where you repeatedly "apt dist-upgrade" and end up installing packages for kernels N+1, N+2, N+3, etc., all without rebooting into any of them?

I agree that if the user manually runs an apt [dist-]upgrade—or really any manual apt command—that that's a good time to do apt maintenance work. (Homebrew does maintenance work whenever you invoke it and there haven't been any complaints so far.) But kernels usually get installed automatically, so it can't just run then.

Now, if there was a specific concept of a "last-known good kernel" (imagine, say, the grub package generating+installing a virtual package when you run grub-install, that depends on whatever kernel you specified as your recovery kernel, ensuring it remains around), then your approach could work—you'd always have two kernels, the LKG for a recovery boot, and the newest for a regular boot.

Exactly what happens on Fedora.

I agree.

I'm running Ubuntu 16.10 currently. A kernel upgrade hosed my setup yesterday, and having an older kernel available saved my butt. I was able to do another `apt-get update` and things eventually worked with the latest kernel.

For Ubuntu Desktop, it may make sense for the package manager to keep only the latest 2 or 3 kernels, and automatically purge the rest.

I had the /boot filling up problem but had thought it was fixed, I'm on 16.04+. I'm pretty sure the last two kernel updates I did removed older kernels leaving me with the current one and previous one ... ?

You can configure apt unattended upgrades to autoremove by default, perhaps you did that?

Nope, still doesn't do it without manually invoking autoremove.

This is the main problem that keeps me from wanting to set up less technical family members on Ubuntu. It's possible to get in a spot where even a simple command won't solve this.

Solus uses https://github.com/ikeydoherty/clr-boot-manager now, which purges old kernels and modules, but keeps the modules for the currently running system so HW still works

> The simplest fix would probably be to make /boot large enough by default (in the order of 10GB or 20GB or so -- the current size is 512MB IIRC).

What? This is ridiculous and unacceptable. I don't use Ubuntu anymore, can someone tell me what is filling up the boot partition?

I'm currently on ArchLinux and mine is 200MB and it's 14% full! I can't fathom what could occupy so much space.

It's the way kernel updates come in apt. A kernel update is a new package, not an upgrade of a previous kernel package. Thus the old kernels are left in place and the new ones are installed alongside. After about 3 kernels have accumulated in /boot, the previously recommended size for /boot is full and the attempted update to a new kernel fails.

It can be manually fixed by removing older kernels ("sudo apt purge ...").

Perhaps I'm mistaken, but I thought a fix was in place for this; maybe it was something third-party, but apt definitely offered to remove unused kernel packages for me recently.

In other words: integrate purge-old-kernels


Maybe add some stats to know which kernels have booted successfully, so one knows which old ones can be safely deleted, and keep the last 1 or 2 good ones (not always the latest!).

$ apt-get install purge-old-kernels

    The program 'purge-old-kernels' is currently not installed.
    You can install it by typing: apt install byobu
"byobu"? packages.debian.org to the rescue...

https://packages.debian.org/jessie/byobu "Using Byobu, you can quickly create and move between different windows over a single SSH connection or TTY terminal, split each of those windows into multiple panes, monitor dozens of important statistics about your system, detach and reattach to sessions later while your programs continue to run in the background."

Uhm... what?

So I'm the author of both Byobu and purge-old-kernels. It's in Debian, because I help push it there after I push it to Ubuntu.

It's completely wrong that purge-old-kernels is in Byobu, rather than in the kernel or directly handled by dpkg/apt. Thanks for all the feedback here -- we'll get that cleaned up in 17.10!

Are you sure you typed exactly that?

For `apt-get install purge-old-kernels` I get

    E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied)
    E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?
And for `sudo apt-get install purge-old-kernels`, I get

    Reading package lists... Done
    Building dependency tree       
    Reading state information... Done
    E: Unable to locate package purge-old-kernels
But I can reproduce your error by `purge-old-kernels` alone. That is indeed strange.

EDIT: I straced bash by `strace -o bashlog -f -s 10000 bash` and found the culprit to be /usr/lib/command-not-found. Indeed, if you run `/usr/lib/command-not-found -- purge-old-kernels` directly, you get that same message about byobu.

EDIT2: I assumed this was some kind of bug with the database, but now I actually tried installing byobu and it does make purge-old-kernels available.

The first of the two errors has nothing to do with the package itself - packages need root permissions to be managed, which is exactly what the error is telling you:

    E: Unable to lock the administration directory (/var/lib/dpkg/), are you root?
In your case, you didn't prepend `sudo`.

Regarding the second, which is instead the core of the problem, see the man page (http://manpages.ubuntu.com/manpages/xenial/man1/purge-old-ke...):

    Provided by: byobu_5.106-0ubuntu1_all bug
Therefore the correct (and full) command is:

    sudo apt-get install byobu
As pointed out in another comment, such a package is arguably a poor placement for this type of utility.

I'm aware that the first errors don't have anything to do with the package itself, but this was the command line given. I was wondering what kind of setup might lead to an apt-get command (without sudo even) to display that error message.

One case (which may be irrelevant in this context, but still a valid scenario) is auto-updates being executed in the background.

> "byobu"? packages.debian.org to the rescue...

You don't need to use packages.debian.org to look up what packages do.

    $ apt show byobu
    ...
    Description: text window manager, shell multiplexer, integrated DevOps environment
     Byobu is Ubuntu's powerful text-based window manager, shell multiplexer,
     and integrated DevOps environment.

Quite right. One uses packages.ubuntu.com:

* http://packages.ubuntu.com/yakkety/all/byobu/filelist


byobu is basically an abstraction over screen and tmux, letting you use either with some common keybindings for spawning new windows and a common toolbar at the bottom telling you disk usage and load average and the like. I am not sure why purge-old-kernels is in the byobu package, that seems like a really poor placement for it.

It is a bad place for it. But it's there because I wrote both of them, and Byobu is always everywhere I want it to be, and I generally always want purge-old-kernels there too.

But yes, you're very right. It needs to be moved out.

Forgive the intrusion, just want to say thanks for Byobu. I use it regularly, and it's much nicer than bare tmux or screen.

:-) Thanks!

It shouldn't be needed. AFAIK part of the post-installation script of a new kernel already marks the oldest kernels for removal with `apt-get autoremove`. They just need to run it automatically after an upgrade.

Unfortunately, that's not enough. There are many corner cases where kernel packages hang around much longer than they should, for odd reasons.

Another related issue: I have helped get a few co-workers set up with Ubuntu on their laptops. Inevitably, once every few months, one comes to me and says "I just ran an update and the 'Restart' popup came up, so I restarted, now my laptop says 'No bootable devices found.'" This happens when Ubuntu is installed in UEFI mode. A kernel update sometimes wipes out the boot image. To fix it, I have to get into the BIOS and reselect a bootable UEFI image. This should never, ever happen.

I have a similar experience, but with VirtualBox in UEFI mode. After any restart, UEFI complains that it cannot find anything bootable, so I run the bootloader from the UEFI shell (VirtualBox does not have BIOS menus), run efibootmgr in the booted system to register it, just to have it lost at the next reboot and do the dance again.

Only Ubuntu does that, other linux distributions don't have this problem.

+1 - I didn't even know this was an issue. Usually I just use apt to update and run autoremove after I get a new kernel and verify it is working.

I recently installed Ubuntu on some old computers for my relatives and they really like it. If they keep updating and after a couple of months their system fails to work that will be a disaster for any good will they will have developed for Ubuntu.

This needs to be fixed NOW! How can Ubuntu even pretend to be a viable desktop operating systems if normal updating renders the system unusable?!

+1 then I can kill my custom "remove oldest kernels, except the running one, and leave at least two other kernels" script.

Yes, I had this same problem when I was on Xubuntu. Fedora and CentOS(and I'm assuming many other distros) seem to handle kernel updates just fine without forcing me to manually clean out old images from /boot periodically (was always too lazy to write a script).

What Fedora and CentOS do is a generic yum/dnf option: installonly_limit. It lets you keep a limit on the number of packages with the same name and different versions, kernel included.

would you mind sharing this in the meantime?

I've been using this for years. I believe it only keeps one old kernel version though, not two like you requested. Tested and working for weekly use since 12.04 though 16.04:

echo $(dpkg --list | grep linux-image | awk '{ print $2 }' | sort | sed -n '/'`uname -r`'/q;p') $(dpkg --list | grep linux-headers | awk '{ print $2 }' | sort -n | sed -n '/'"$(uname -r | sed "s/\([0-9.-]*\)-\([^0-9]\+\)/\1/")"'/q;p') | xargs sudo apt-get -y purge

I guarantee there's a way to get that pipe to purge all packages in your system.

This is the new way to do it:

    sudo purge-old-kernels --keep 3 -qy

I agree, I've had to deal with this issue many times. I think it has happened at least 4 times in the last year. A few times I had to manually go in and start deleting old kernels because it had completely run out of space, so apt couldn't do anything without crashing.

I eventually set up a cron script to regularly delete everything except the last 3 kernels. I think this should really be the default behavior. "Save all the old kernels until you run out of space and everything crashes" doesn't sound like a very sane default.

+1 as well. This made me brick my entire installation (entirely my own fault: trying to fix it with insufficient knowledge). It made me switch to another distribution.

Despite having cleaned out old kernels before, I spaced one day and accidentally removed the kernel I was running. This is fixable: if you boot a live distro, you can mount the relevant volumes (your root disk to /mnt/foo, and your boot partition to /mnt/foo/boot); then, after a chroot to /mnt/foo, you can re-install the kernel. Here is an article describing it (but it misses mounting the boot partition): http://askubuntu.com/questions/28099/how-to-restore-a-system...

+1. This is one of my major annoyances w/ Ubuntu. Been using it 3 years now and this seems like a fundamental issue that should be fixed.

It's user error on your part. The proper way to upgrade a Debian/Ubuntu system is:

  $ apt-get dist-upgrade
  $ apt-get autoremove
dist-upgrade installs new kernels, and autoremove will automatically remove old kernels (and keep the last 2 most recent ones.)

I like to use apt-get dist-upgrade --auto-remove, all in one step. Though for kernels they will be removed on the next invocation after reboot as the default logic keeps the last installed kernel version as well as the current one.

As for the original issue of not cleaning up the kernels, this is fixed in xenial/16.04 but not in trusty/14.04, see bug report here: https://bugs.launchpad.net/ubuntu/+source/update-manager/+bu...

Xenial/16.04 has other issues regarding kernel cleanup involving DKMS leaving files behind:



autoremove followed by dist-upgrade to 16.04 for me somehow decided to build multiple kernel versions and ran out of space on /boot mid-build. That was really annoying to fix.

There is a lot of confusion in this thread.

dist-upgrade is NOT (in most cases) meant to upgrade to a newer distribution (eg. Ubuntu 14.04 to 16.04). dist-upgrade is just like upgrade except it also installs additional packages if necessary (eg. linux-kernel-4.1 AND linux-kernel-4.2).

Most people should always run "apt-get dist-upgrade" and never "apt-get upgrade" in order to simply keep their packages up-to-date.

An actual distribution upgrade is triggered by a different command. On Ubuntu: do-release-upgrade

dist-upgrade shouldn't try and move you to another release version, do-release-upgrade alone does that, as far as I know. Things get a little more confusing using apt update/upgrade vs. apt-get update/dist-upgrade, they don't seem to be quite the same in all cases for me. But I agree with the general frustration that it shouldn't be necessary to run apt(-get) autoremove frequently to keep /boot from filling up with old kernels.

Run autoremove after dist-upgrade.

I quit using Ubuntu as a Desktop OS because this was so obnoxious. I would gladly return if they fix this.

This will get fixed. Mark my words ;-)

Awesome! thanks, looking forward to it!


This is a feature (kernel update) that is used by everyone, including newbs. They don't have to understand anything to enjoy the update, and they shouldn't have to understand anything to avoid the space filling issue.

Lots of ways to fix it, I don't claim to know which is best:

* Always leave N% /boot available, and delete or move old kernels to satisfy that.

* Move old kernels to /old_kernels, outside of /boot. Driven either by satisfying N% space, or no more than K number of kernels kept in /boot.

* Opt in, opt out, ask this on installation and for non-server installs, default to "never have to think about this again."

* Easily configurable in the "whatever that box was called when I last used Ubuntu years ago" box.

sounds like you're keeping too many old kernels; I've had this problem too in the past. Try this nice little one-liner:

    dpkg --list | grep linux-image | awk '{ print $2 }' | sort -V | sed -n '/'`uname -r`'/q;p' | xargs sudo apt-get -y purge

really, this is what your package manager should do automatically before/after installing new kernels

for your reference: http://askubuntu.com/questions/2793/how-do-i-remove-old-kern...

That's not a good one liner, because it doesn't do anything about the linux-headers packages (which are hundreds of thousands of small files) or the linux-image-extra packages.

Just run "apt-get autoremove".

This would be fantastic. This problem has no official or easy solution for non-technical people and delays updates which could cause security issues. Please, please provide a fix for this!

-FLAVOR: and Ubuntu Server +1

Bit late to the party, but I believe this was fixed in xenial/16.04 but not trusty/14.04

The relevant bug report is here: https://bugs.launchpad.net/ubuntu/+source/update-manager/+bu...

Side note: give up on a separate /boot in the future; it's not needed 99.9% of the time.

Indeed this is due to missing options in the installer: with some manual fiddling, it's long been possible to make grub boot from encrypted disk (thus removing the need for a separate /boot and its space issues): https://bugs.launchpad.net/ubuntu/+source/grub2/+bug/1062623

You could probably fix this on your system without installing from scratch, but it would take some careful planning (mostly wrt backups!). You'd need to boot from a recovery cd or such, copy /boot into /, edit /etc/fstab accordingly, and then follow the last few steps in the bug report above (from the chrooting onwards). I'd probably test this in a VM first.

This might be aggravated by this DKMS bug, which was found in v2.2.0.3 (the version Ubuntu 16.04 LTS ships, and maybe other versions too). It doesn't remove old initrd files in /boot, which leads to a full /boot and subsequent update problems.

I manually removed a bunch of old initrd images the other day.



They've got a bug reported upstream. We need this fixed in Ubuntu

Ubuntu Launchpad bug here regarding DKMS leaving old files to fill /boot:


You can add yourself to the list of people affected to increase the Bug Heat and get this issue fixed.

Why do you have a /boot partition?

  $ mountpoint /boot
  /boot is not a mountpoint
I don't know if this is the default, but my KUbuntu machines have been fine for many years without a separate /boot.

If you want to install with full disk encryption, you normally want a /boot partition.

In theory, GRUB can load kernels off of a LUKS-encrypted partition, but in practice I've never managed to set that up without having two passphrase prompts, one from GRUB and one for mounting the root filesystem under Linux.
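The rough shape of the single-prompt setup, as I understand it (all details here are assumptions; the grub2 Launchpad bug linked upthread has the authoritative steps, and I'd test this in a VM first):

```shell
# Hedged sketch: device names, paths, and the keyfile approach are
# assumptions and vary by release. Do not run blindly.

# 1. Let GRUB itself unlock the LUKS container (first passphrase prompt):
#    in /etc/default/grub add:
#      GRUB_ENABLE_CRYPTODISK=y
#    then run: sudo update-grub

# 2. Avoid the second prompt by giving the initramfs a keyfile:
#      sudo dd if=/dev/urandom of=/crypto_keyfile bs=512 count=4
#      sudo cryptsetup luksAddKey /dev/sda2 /crypto_keyfile
#    and point /etc/crypttab at it, e.g.:
#      sda2_crypt UUID=<luks-uuid> /crypto_keyfile luks
#    then run: sudo update-initramfs -u
#    (How the keyfile gets copied into the initramfs differs between
#    releases. The keyfile lives on the encrypted root, so it only
#    becomes readable after GRUB has already unlocked the disk.)
```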

I try to do the same, but sometimes there's no way around it. LUKS comes to mind: can't boot an encrypted kernel because EFI/BIOS has no decryption facilities. I'm sure there are other cases, but this is the only one that comes to mind.

Ubuntu mounts my EFI partition as /boot.


I don't use Ubuntu, but in Fedora it keeps only a handful of old kernels (i.e. one or two back) and deletes the rest. I guess Ubuntu just holds onto every kernel you've ever had since the beginning of time?

Why is /boot still a separate partition? In most cases it doesn't need to be.


omg, I am on 14.04 LTS and didn't believe until now it is not fixed in newer releases! :)

I've also encountered this issue, and I assumed it was a bug to do with the fact that I sometimes use apt-get and sometimes accept the Ubuntu GUI software update prompts.

If people are encountering it on this wide a scale, it must be a truly severe problem (that's also probably causing many people who haven't learned about apt-get to give up on software updates entirely!).

This is the primary reason I've recently replaced Ubuntu as the OS on my home server/NAS. I'd be happy to come back!

This is an annoying problem indeed, but there is an easy workaround. Here is a script I use whenever I run out of space on the /boot partition:


Ubuntu should just integrate Debian's /etc/kernel/postinst.d/apt-auto-removal script.
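For the curious, the core of what that hook does is simple: work out which kernel versions to keep (the running one and the newest installed, among others) and write APT::NeverAutoRemove patterns for them, so autoremove spares only those. A simplified sketch of the idea on made-up version strings (the real script also keeps the second-newest kernel and escapes regex metacharacters):

```shell
# Simplified sketch of the apt-auto-removal computation, on made-up
# version strings; output resembles /etc/apt/apt.conf.d/01autoremove-kernels.
RUNNING=4.4.0-66          # stand-in for the running kernel's version
latest=$(printf '%s\n' 4.4.0-21 4.4.0-66 4.4.0-70 | sort -V | tail -n 1)
printf '%s\n' "$RUNNING" "$latest" | sort -Vu | while read -r v; do
  echo "APT::NeverAutoRemove { \"^linux-image-$v\"; };"
done
```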


This more than anything. I believe it is this issue (at least for me): https://bugs.launchpad.net/ubuntu/+source/unattended-upgrade...

The old kernels should be purged automatically on desktops. That really is an annoying issue.

+1 I recently had this issue. I have Ubuntu on my machine (not dual boot) and I installed with all the default options. I hardly installed any heavy applications and it started saying it cannot install updates as /boot is full. Thanks for bringing this up.

I used to use Arch and seriously one of my favourite things to see was an update that used negative disk space. Apt has come a long way, but this DEFINITELY needs to be fixed.

I had this problem on a previous work laptop. Very annoying. I thought it was the local IT people who had installed it wrong.

apt autoremove will clean it up.

Only if you catch it in time, before you've completely run out of space. `apt autoremove` will crash if there is not enough space on the disk.

Which would be another worthwhile fix: some way to run `apt remove` or `apt autoremove` in a zero-free-space environment. It could detect the situation and pick some data (like /usr/src) to hold in memory temporarily while it works, then put it back.

I've never noticed autoremove removing old kernel images in /boot.

+1 ... plus a nice command to clean out unused kernels


Yes! Please!!


- FLAVOR: Ubuntu Desktop

- HEADLINE: More stable dock/undock and sleep/wake handling.


I've noticed that my system often hangs unrecoverably with a blank screen during dock/undock and sleep/wake events. I've learned, though, that I can reduce the likelihood of having problems by trying to minimize the number of state changes that the system has to handle at once. For example, if I'm leaving the house with the laptop, I'll first open the lid, wait 10 seconds to see if the display wants to turn on or not, undock it, wait 10 seconds for it to adjust, and only then put it to sleep. Same thing waking it up: one step at a time, with 10 second pauses in between. Seems to reduce my problems by about 90%. As a developer, this screams "race conditions" to me, but what do I know? If there's a bug filed for this already, I wouldn't know -- no idea what I'd search for.

I take the uptime game pretty seriously: having to reboot means that I lose a ton of context. Right now, I've got nine separate workspaces/desktops going, all with several browser, terminal, etc. windows. A reboot means I'll spend anywhere from 10 to 20 minutes installing updates and recovering all of that state. It's painful. Right now, my system has only been up for 9 days, which is weak sauce.

- ROLE/AFFILIATION: software developer in the federal government

It's kind of crazy how long this has been a problem, and across many different hardware configs. Sleep doesn't work on my desktop or on a Windows laptop with standard Intel everything.

As a side note, the same issue persists on Windows with a current-gen HP EliteBook and Office. Every time I undock the notebook, MS Outlook gets disconnected and simply refuses to reconnect. The only option is to close Outlook and open it again, waiting 30s between closing and restarting, otherwise some locks on the PST file are still in place and Outlook will hang on start forever. The only way to fix it then is a Windows reboot. Killing the QA department was the dumbest idea ever; now things are so buggy.

Just want to note I have similar problems on macOS, especially when using multiple monitors, e.g. one monitor will work and the other won't until I restart.

Yeah, I've had intermittent suspend/resume issues with nearly every laptop I've tried linux on.

My current xps 13 is the only one I've ever used where it works 100% reliably.

Interesting. I did a test run of the new xps 13 the other day and it had all sorts of issues with a stock install.

What sorts of issues? I've tried a multitude of distros on it and provided that it has a recent kernel (4.8 or newer, so if you tried 16.04 you'd want to make sure you're running the HWE stack) everything seems to work really well.

The only hardware related issue I've had has been static background noise from the headphone jack (which I also see on a clean install of windows), but I was able to get past that with the steps here:


I had this same problem. I switched back to nouveau drivers for my Nvidia card, instead of the proprietary drivers, and everything works perfectly. I also seem to be getting better overall performance in day-to-day desktop activities and battery life. I don't really do anything that needs 3d other than the desktop compositor.

When I was using the non-open-source drivers, Ctrl-Alt-F# to change virtual terminals would sometimes help. I would switch to a random virtual terminal and back, and sometimes it would be working again.

Sadly, the ctrl+alt+f# trick doesn't work for me (OP). I wish!

And btw (should have mentioned this): ThinkPad W530 running nVidia drivers.

nVidia soft hang issues are super annoying. Since mine is always plugged in, I just set my laptop to never suspend.

+1 Happens all the time. Can't tell specifically what the cause is. Most common seems to be the display hangs after unplugging HDMI while the laptop was in sleep mode.

Sometimes after a wake, I've found I could access my computer for a good while before the lock screen turned on.


My Ubuntu laptop has now become a stationary computer permanently stuck to the docking station, and I never put it to sleep because I'm scared every time I unplug or plug in an external device. Anything from the simplest USB keyboard to Ethernet, a VGA monitor, HDMI/DP/DVI monitors, a USB3 docking station, closing the laptop lid, waking from sleep... EVERYTHING has a big chance that the hot plug will fail, and you either have to reboot for the device to be detected or, in the worst case, the OS just freezes.

This is a must fix for ubuntu to ever be viable as a desktop OS.

I use tmux-resurrect for this exact reason. Between that and Chrome's session restore I end up in an OK state after a reboot.

+1 for fixing this issue, just trying to work around it in the meantime.

Yes please!! So painful. I also have dozens of applications open, all the time, which I lose when I dock/undock when in sleep...

So much this! _If_ it succeeds in waking up from standby, only 2 of my 3 monitors work, this means I end up rebooting anyways...

This is actually one of the main reasons I am thinking of switching to a Mac. It's really hard to live with all these 'black-screen-after-lock' moments. If that could work as it does in MacOS/Windows, I'd probably never switch.

I am experiencing this when I resume with a VPN connection on.

Even though it doesn't take me much time to recover state, it's really annoying to know that I have to figure out all these issues any time I start using a Linux laptop.

Every time I wake my laptop from sleep, it starts but doesn't turn on the screen. I have to close and reopen the lid; then it turns on the LCD.

This is one of the main reasons I don't use Linux on a laptop.

A huge +.

OK here goes..

- FLAVOR: Ubuntu Desktop

- HEADLINE: Drop Mir & collaborate with Wayland

- DESCRIPTION: I know this is a touchy subject and I'm not looking to self-righteously re-re-re-litigate everything, but... between Intel walking away, licensing concerns, Ubuntu variants not jumping onboard, and various community concerns, would you reconsider abandoning Mir and joining forces with Wayland? I understand you felt there were some technical shortcomings regarding how input devices were handled. Perhaps in today's climate those concerns can be better addressed by Wayland if you can provide the engineering leadership on those efforts?

- ROLE: Code Janitor

I was about to write this exact post. This is the main reason I use Arch Linux. Arch just follows upstream without worrying about holy wars.

There are dozens of ways Ubuntu can innovate/differentiate. Fighting a holy war over Mir vs. Wayland (when you came second and Wayland already had major support) is not a good use of your time. Same goes for Snap vs. Flatpak. And what about Juju, or Bazaar (though that's dead now, isn't it?). You did well accepting systemd over Upstart. You could do the same with Wayland (over Mir) and Flatpak (over Snap), and officially EOL Bazaar.

I'm not a Unity fan, but that seems like a genuine way for Ubuntu to differentiate and/or compete. Mir and Snap are not. They're just incompatible alternatives that divide the open source community. I know that alternatives can often be good, and inspire competition. But when it's clear you've lost, it's sometimes better to follow your Upstart/Systemd example and join the winning side.

> - HEADLINE: Drop Mir & collaborate with Wayland

That would more accurately be:

- HEADLINE: Port Unity 8 to be a Wayland compositor

This would be at least a multi-year effort. Consider how much time and effort went into porting Gnome Shell and KWin over to Wayland. It would be at least as much work do the same with Unity 8.

Finishing Mir is also a multi-year effort. So, where would the programmer-hours best be spent?

Exactly. Let's avoid the sunk-cost fallacy.

Sorry if I wasn't clear; the point was that it wouldn't be possible to do in time for 17.10, which is what the original post was asking about.

> Drop [Canonical-specific] & collaborate with [leading variant]

That would be great in general. Linux Mint is known as "Ubuntu minus Canonical" for a reason.

Except they invent their own shit all the time also. Sometimes to the detriment of existing products.

Linux Mint is also "Ubuntu without Security". That's right -- Linux Mint does not install Ubuntu's security updates, nor supply their own.

Stated another way: 100% of Linux Mint machines in the world are either:

(a) already pwned, or


(b) vulnerable to multiple CVEs and will be pwned

+1 to explore using Wayland over Mir. I think it's intuitive that everyone (distros, developers, users) would benefit from unifying the Linux graphics stack to the greatest extent possible. The display server decision should not just be based on what is best for Unity, but what is best for the Ubuntu ecosystem as a whole. Having multiple display servers means there will be a duplication of effort for developers of application toolkits, window managers, and, of course, the display servers themselves. The proper question in my mind is whether all the additional downstream and lateral effort from maintaining multiple display servers is a bigger opportunity cost to the overall Ubuntu experience than the additional development necessary to get Unity running on Wayland. It's worth revisiting the display server question because this is something that may have major consequences for quite some time.

Look at the main request on this page: hiDPI. That is what we need a better graphics stack for.

The reason we need to focus on Wayland is that NVIDIA and AMD need to focus on a specific graphics stack. That wasn't our choice. That was theirs. Thankfully, AMD is headed in the right direction with AMDGPU.

Unfortunately, NVIDIA shows no signs of sensibility here, so we must be wise and steer their efforts away from Xorg. The best way to do this is for Canonical to forget about Mir and push its support behind Wayland.

We need your help, Canonical! Let us fight stagnation together, lest we find ourselves using Xorg for another 10 years.

There's no technical justification. Mir is a purely political decision. Canonical wants to exert more control over the platform. It's a totally rational position, but as you note, it seems to be failing.

To be fair, I could absolutely understand their decision. After watching the people behind Gnome3 and Wayland pretty much run amok with their ideas and their attitude towards other, dependent projects, I can't blame Canonical for not having that much faith in a possibly unstable foundation. Especially with their plans back then of building a mobile product on it. So I really can't see how Canonical are the bad guys here. It was not so much about more control but against losing it somewhere down the road.

Yeah, I didn't say they're bad guys. I just think it's important that people recognize Mir was never about technical considerations. It was a land grab, and I agree, we may very well have been better off with Canonical in control of the platform.

> - ROLE: Code Janitor

That sounds like a fun job. You just clean code all day?

Oh god yes. I forgot that one.

I'm sure Mir is a perfectly good replacement for crufty old X.Org. But you know what? Upstart was a perfectly good replacement for crufty old sysvinit, and we all know how that one turned out. I predict this will turn out exactly the same: Ubuntu will go with Mir for a few years before eventually caving to community pressure and switching to Wayland, just like they ultimately gave up Upstart in favor of the community-favored systemd.

Flavor: Ubuntu Desktop

Headline: Good (or even acceptable) high-DPI & multi-monitor support


High-DPI support is really bad in Ubuntu right now, and multiple external monitors are poorly supported. Here are some of problems I experience regularly:

- Ubuntu won't remember screen configurations when unplugging and "replugging" external monitors, which means I have to reconfigure them again and again.

- Often Ubuntu will freeze / crash when unplugging external monitors or when powering the laptop up after putting it in sleep mode and unplugging the monitor cable while the laptop sleeps. The only safe way to unplug a monitor is to first manually disable it in the "Display" settings, which honestly is not acceptable.

- Ubuntu often does not even notice when monitors get unplugged, hence it keeps displaying apps on (now unplugged) monitors. When opening the "Display" settings it will usually recognize the mistake and remove the extra monitors from the config.

- High DPI in general is still poorly supported in apps and the performance is very bad compared to e.g. Windows, to the point that I'm not even able to play 4k videos.

- Some keyboard/mouse gestures don't work on secondary monitors (e.g. using the arrow keys to navigate through menus)

Role: CTO



By high-DPI I especially mean 4k displays (e.g. 3840 pixels wide), which are becoming more popular and which are almost completely unusable without proper DPI scaling.

Another problem with the "Display" settings dialogue is the weird behavior when dragging window icons around to arrange them: Often they will get stuck or outright refuse to move where I want them to be, such that I need to resort to some hacks (e.g. moving monitors around each other in circles) to get them where I want them to be. Also, when plugging in an external monitor often Ubuntu will not detect it correctly and display it as having a resolution of 800x600 pixels, refusing to adjust it or enable the monitor. The only way to fix this is to reboot the machine.

In general I want to thank all developers of Ubuntu, which -while not perfect- is still by far my preferred OS for any serious development work.

Thank you. Having used macOS for years now, where the transition to high-DPI displays was pretty much unnoticeable, I was shocked to install Ubuntu on a machine with a high-DPI display and not see everything working right. Mouse cursor too small. Fonts too small. Icons too small. Change this. Now text is OK. Tweak that. Now icons are all right, but it didn't affect Firefox for some reason. Change that other thing. Now the mouse cursor is the right size for a while, but the system changes it back every time I come back from sleep.

I still haven't gotten it all to work reliably. And I have no idea if it's Ubuntu, if it's the window manager, if it's the desktop environment, if it's X (or Mir, or whatever they are calling the window system these days) if it's individual applications, or a combination of some of those. It's crazy land.

+1 to all these points.

A couple of related issues that cause me frequent problems...

- In addition to detecting when monitors are removed, detecting when they are there from the start. If I have a monitor plugged in at boot, it is generally not detected. I instead have to wait for the login screen to appear before connecting the external monitor.

- When opening a new window, there seems to be little logic as to which monitor it will appear on, and where. Particularly annoying in combination with being limited to a single DPI setting across multiple monitors despite each needing different scaling (sudden, unexpectedly huge or tiny windows) - see https://news.ycombinator.com/item?id=14003160

Role: Developer

I run gnome on Ubuntu because of how bad it was in Unity, but that's also not perfect. On a two-monitor (both 4k) desktop where I'm not plugging in or unplugging screens, waking from sleep will commonly only bring up one screen, or the arrangement will have changed, or it'll get stuck in a mode where the screens go dark after 10 seconds of inactivity.

+1 that the UI scaling is nowhere near as good as OSX in either unity or gnome. I'm either stuck with normal sized fonts but oversized UI elements (button / text field height) or the reverse.

Definitely agree about the "Display" settings dialogue annoyance.

Also, Monitors with varying DPIs were so bad that I just bought another 4k screen rather than trying to make my 1080p one work alongside the 4k one.

My Debian GNOME dual-monitor setup also regularly darkens one monitor when plugging/unplugging screens. It took a while for me to understand that it only sets that monitor's brightness to zero, so adjusting the brightness up brings the monitor back to life.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Pick an official laptop for the release.

- ROLE: End-user, Sysadmin, Developer

I would love for Ubuntu to, with each release, pick a laptop vendor and a laptop and just Make It Work.

All the components. Out of the box. As near perfect as one can get it. So when I'm in the market for a new laptop, I can just buy that one. And I'm not talking about a pro gear like the XPS. Just simple, cheap consumer stuff.

What's wrong with using XPS? I picked one up for $1200 last time I bought one, that's not too high of a price.

You'd typically want the flagship device to be at least mid-tier to show off the best features and support for new mainstream technology, like the HiDPI stuff other people are talking about.

>What's wrong with using XPS? I picked one up for $1200 last time I bought one, that's not too high of a price.

If you ship it to Australia, it's over the AU$1000 GST threshold, so it picks up an extra 10% as it crosses the border.

If you ship it to Iceland it picks up ~30% in VAT and other taxes, and you have to pay tax on the shipping as well. If the Icelandic customs decide what you've bought was a "luxury good", it could pick up as much as 50% more tax in the process[1].

The USA is not the world. The base price tends to multiply when you ship it elsewhere.

[1] I don't think anyone understands how the Icelandic Customs applies tax, including customs themselves

You want Ubuntu to pick the best laptop for Australia and Iceland? Or, are you saying they should conduct a global survey, researching the models available in each country and their import tariffs?

>You want Ubuntu to pick the best laptop for Australia and Iceland?

What on earth /are/ you talking about? Laptops are laptops. With the exception of the international space station, they work the same everywhere.

The best laptop for Australia and Iceland is the same laptop that is the best laptop for the USA, and the XPS is pretty up there wherever you live.

What I'm saying is that there's a lot of value in picking something that /isn't/ the best if the logistics and legalities of shipping it globally are better.

So, you don't want Ubuntu to recommend a laptop anywhere? Or you want them to recommend the best valued laptop for each country?

Just trying to follow what you're saying... It seems like you described what's wrong with selecting the XPS, then here you describe why the XPS would be a good choice after all.

I think they want Canonical to recommend a laptop that's sufficiently low-cost to be accessible to a wide audience.


You've broken the site guidelines more than once by posting uncivilly. We ban accounts that do that, so please don't do that.



Too bad that the whole point of why people do this passive-aggressive pretend-stupid gambit (".. just trying to understand .." is one, and ".. genuinely curious .." is another), is that it's almost impossible to call people out on, without sounding at least slightly uncivil.

That's a good point, but the solution is worth trying harder for.

Oh jeez Dang, we've had this conversation before. Overzealous moderation kills websites. Funny how it's only you I ever see wandering into these situations.

Take my posting history into account and make a decision. If you wanna ban me, do it and I'll go find somewhere else that enjoys robust discussion. I've certainly delivered that here, if you'd care to go take a look.

However, please stop helicoptering in and tutting your finger at me. I've never seen you make meaningful contributory posts, so I don't respect you.

Your post got user flags. It's not just dang asking you to stop.

You don't need to include "You obviously can't read." nor "for the hard-of-comprehending among us" in your post.

Almost 100%: if you ship it to Brazil, the price literally doubles, including the shipping costs.

For most people, $1200 is a lot of money to spend on a laptop. I think for Canonical it might send the wrong message to their users that Ubuntu is trying to be the Apple of Linux rather than trying to make computing accessible to as many people as possible.

But for a lot of people $1200 is a fine amount to spend on a laptop. For those people who don't want to spend that much there's plenty of other, cheaper laptops they could buy and run Ubuntu on.

Haha. US$200 is the best my friends and I can do in my corner of the world. But of course, "Weeer all living in Amerika, Amerika, AaMERika! Is wunderbar!"

I would think with a team as large as Ubuntu, they already have their "favorite developer laptop," and it just might be the XPS. (Disclaimer: I honestly don't know what it is.)

But his suggestion has merit: in addition to the high-end dev laptop that their team will just fix because they use it, it would be good to _publicly_ announce a mid-level consumer laptop at the time of release. It doesn't have to be a sponsorship. They can put disclaimers all over the announcement. etc. etc.

I see how you came to that conclusion, but it is wrong:

We don't have a "company blessed developer laptop" on purpose - we're small (compared to other companies with similar reach), so that's the only way we can cover enough ground.

Every technical employee works from home and buys his/her own gear. People are expected to report (and, when relevant, fix; some teams do a better job at that than others, certainly) their issues with gear that is relevant to their own profile and market.

The company gives out a money bonus every once in a while so people can buy what they want and stay current.

Source: I'm a Canonical employee (not working on the desktop - but expected to at least report my desktop bugs on the particular brand of laptop that I use).

I do appreciate the reply!

I did indeed just mean clustering of laptop choices in the past, with no "company blessed" laptop to date.

I then went with the idea that the company could bless one in the future. Thanks for explaining how it really works!

He didn't say "company blessed". He said "favorite". I would be very surprised if there weren't some clustering in laptop choice.

It's a huge amount of money for the majority of people living on this planet. Time for you to check your SV privilege.

Here in .nl, the cheapest XPS is 1500 euros, or 1700 dollars.

$1200 is not cheap consumer stuff.

I'm about ready to upgrade my 4-year-old laptop, so this is relevant to my interests. I'm looking for an Ubuntu-friendly laptop, and this always causes anxiety. My current laptop was some version of HP Pavilion. It mostly works, but it was much more hassle than I wanted getting Ubuntu installed, and there are still a few minor glitches that I never got figured out.

Short of this, an official C(c)anonical web page listing popular laptops at different price-points with basic specs and a README-like install guide for each would be great.

Anyone have a url for a solid reliable well-maintained guide (official or unofficial) like this -- best laptops for Ubuntu? I'm sure I've bookmarked a couple over the years but I don't feel like I've found the one true source yet.

Ubuntu tends to have a list of hardware[1] that plays well with it. From my personal experience, I've had good luck with Thinkpads and Dell precisions.

[1] https://certification.ubuntu.com/desktop/

I've had good luck as well. But I would so much rather have a supported hardware platform rather than luck.

Possibly would be even better if they sold the hardware themselves. Just resell one or two laptops and fully support them for the duration of the Ubuntu release.

I can dream, can't I? :)

I've considered the Dell developer editions (the ones that ship with Ubuntu) to be a sort of official Ubuntu laptop (I got one last year and the experience has been pretty good). But I agree, I'd always welcome more Linux-friendly laptops :)

Both Dell and System76 sell laptops with Ubuntu pre-installed.

Are you looking into some sort of certification of the sort "Ubuntu Verified"?

My experience with Ubuntu-preinstalled Dell laptops is far from perfect.

At one point, Ubuntu updated the binary nvidia driver in the LTS release (the one that came preinstalled on that particular Dell model, I think 14.04). The new driver removed support for the chip in the laptop, making it unusable without some extensive fishing for old driver packages and freezing further updates.

I think this is an excellent idea. +1 (+1000000)

And to add: when they release a new version, they could also do a demo on a desktop and laptop of their choice, where they can show the new features.

This is a nice idea, actually. I'll have to see how to work this out with our partner ecosystem.

In the meantime, perhaps you'll find the list of hardware certified to work with Ubuntu useful: https://certification.ubuntu.com/

> pick a laptop vendor and a laptop and just Make It Work.

I think what you're also looking for is: for the LTS release, ensure it continues to work across updates.

Excellent idea!

FLAVOR: Ubuntu Desktop

- HEADLINE: Replace X11 with Mir or Wayland

- DESCRIPTION: X11 is old, slow, and full of security issues. Mir, even in alpha, is much more responsive and provides an important 21st-century feature set. Wayland is already used by a major distro. X11 is that cobweb that's gone uncleaned in our closet for too long.

- HEADLINE: Improve UI.

- DESCRIPTION: When I use Ubuntu, it's often easier to use the terminal than to learn the 10 different UIs to configure everything. This makes it impossible to convert certain people to Ubuntu, because they just don't have the time to learn all of the terminal spells I know. Ideally there'd be a single place that could detect the configs for most standard packages, plus a way to add hooks so your package shows up in that menu. I don't know if this exists, but if it does it's definitely not used.

- ROLE/AFFILIATION: "Undergraduate Research Associate", I program and do sysadmin stuff for a department at my college.

I'd much prefer Wayland over Mir. I know they're already set in their ways, but Wayland on Fedora 25 works beautifully and has a way more reasonable license. It'd also be nice to have Ubuntu's support in continuing Wayland adoption.

The "X11 is full of security issues" line is full of FUD, IME. They make true-but-irrelevant, misleading statements like "any program running in an X11 server can view/alter any other program in the same X11 server, and Wayland fixes this" - this is irrelevant because common security practice, IIRC, is to run nested X11 servers and give each program its own.

X11 also has a whole lot of commonly-used security/sandboxing extensions, but these are ignored in lieu of comparing vanilla X11 with vanilla Wayland, and pointing out that only the latter does security properly.

Meanwhile, Wayland forces monolithic design, in requiring the panel, hotkey daemon, WM, etc to be built into the compositor. Essentially, each Wayland compositor is its own DE (not its own WM, despite common misconception).

I want to see X11 die, but Wayland has some serious failures as an X11 replacement.
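For reference, the nested-server isolation mentioned above can be tried with Xephyr (shipped in the xserver-xephyr package); the display number and geometry here are arbitrary choices, and this assumes an X session is already running:

```shell
# Start a nested X server on display :1 inside the current session.
Xephyr :1 -screen 1024x768 &

# Run an untrusted program inside the nested server only; clients on :1
# cannot snoop on windows or input belonging to the outer display.
DISPLAY=:1 xterm
```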

> this is irrelevant because common security practice, IIRC, is to run nested X11 servers and give each program its own.

Really? I don't think I've ever seen such a setup.

Yeah, that's not common at all. Plus doing so breaks a lot of things, like copy/paste, IIRC.

> this is irrelevant because common security practice, IIRC, is to run nested X11 servers and give each program its own.

If true that is a valid reason to drop it in my opinion.

You are preaching to the choir.

They have been working on Mir for years now.

I would vote for the contrary. First finish Mir and make sure that when the change happens, everything works as expected.

+1 I have a 2 yr old MacBook Pro. In OS X, window manager performs seamlessly using the Intel inbuilt graphics. On Ubuntu 16.10, the window manager is slow to the point of taking half a second to update window borders whatever the app. Makes doing responsive web design frustrating to the point of wanting to switch to another OS. I am really hoping that 17.04/Mir improves on this.

Have you tried another WM before blaming X? I spent the 90s watching X perform well on hardware that cannot handle OS X.

I +1 this one. ubuntu should be a force to push this forward.

- FLAVOR: Desktop

- HEADLINE: Make trackpads great again! Bring on gestures by default.

- DESCRIPTION: The trackpad config situation is a mess. Pretty much every Ubuntu derivative has its own simplified (read: severely lacking) interface. What's worse is the gesture configuration. It's mostly done via some dude's one-off scripts found on some forum post from 2 years ago.

Give me a MacOS like experience on the trackpad (especially the 3/4 finger workspace switching) and I'd never look back on MacOS again.

+1 to this. You currently have a lot of devs switching to the Dell XPS 13/15 (myself included) from the MacBook Pro line, wanting to give Ubuntu a go and speccing that particular laptop out with it.

Coming from mac, the most jarring experience of moving over is the trackpad. We know that for most trackpads, you can configure them to have similar behaviour for clicking (two finger right click, one finger click, no dedicated buttons), but it is hidden in config files etc. The option to emulate this experience should be baked into Ubuntu and made easy to access.

Palm rejection is also another big point with these trackpads. It doesn't work very well out of the box.

And we know that the hardware is more than capable enough since the XPS touchpads conform to Microsoft's "precision trackpad" spec.

I can't emphasize how annoying bad palm rejection is. I can't have my cursor randomly jumping across the screen and selecting windows that I don't want to type into (always seems to be at the worst time).

Try libinput. I configured my XPS 15 with libinput and it's been a blessing in disguise.

>Give me a MacOS like experience on the trackpad (especially the 3/4 finger workspace switching) and I'd never look back on MacOS again.

This. This. This.

This exists in Gnome 3.22 and up

You can achieve a primitive workspace switching scenario via libinput, xdotool and similar bandaids in any distro. What they boil down to is... once a certain 3/4 finger gesture threshold is crossed, it switches to the next/prev workspace in an instant. I'm not talking about that. The instant switch flashing the whole screen makes my head turn. It does not feel natural.

What I'm talking about is the exact macOS behaviour: workspace switching occurs simultaneously with, and tracks, my wrist motion, cancelable in the midst of the transition. That feels natural and allows quick peeks.

ps. I guess you cannot have the MacOS way via the tools above because those tools should work hand in hand with the window manager, which they don't. Probably, this is a feature that would better be handled within the window manager itself.

yeah I get what you're saying for sure. Even Windows 10 has that baked in now!!!

The default for X is now the libinput driver. It's a miracle. My guess is that even if Canonical does no work in this area, the touchpad support will be much better.

I've been using it for over a year on MBP/Pixel2 and I've literally never once had a problem with palm detection.

Gesture support though is definitely still weak. Even GNOME that has 4-finger workspace switching has no configurability or hot corners...

Instead of putting too much effort into Ubuntu phones they could team up with some manufacturer and release an Apple-like external touchpad with 100% integration with the system.

Yes. I tried to use Ubuntu on a MacBook Air. I was missing the smoothness of scrolling from macOS and support for resting a thumb on the trackpad.

- FLAVOR: Desktop

- HEADLINE: More stable and polished desktop

- DESCRIPTION: This one is hard to pin down, but I'd like to see more general polish and stability in the Unity desktop. One example would be around multi-monitor support, it's pretty good, but a bit funky in some places.

For example, if I have a monitor plugged in and I let the laptop screen lock come on, I can sit there and watch while both displays cycle through an On -> Off -> On -> Off loop. I think when one display goes to sleep it sends a signal which wakes the machine back up, or something.

I'd also like to see more options for configuring multiple mice/trackpads/trackballs in the Settings app, and general improvements to quality-of-life issues which are very noticeable when transitioning from, say, macOS to Ubuntu.

One more polish issue: I'd like to see more attention paid to power-drain regressions in the OS. I had an issue recently where a process related to automatic updates was spinning in the background and consuming 100% of a CPU core, and cutting my battery time in half compared to what it should have been. I looked into it and found it was a known issue that wasn't fixed yet, but could be solved by deleting one of the default apps. If I were a less sophisticated user I would have just concluded that battery-life simply sucked on Ubuntu, and frankly I would have been right.

[EDIT: all these issues were encountered on a Thinkpad T460, which should really be one of the best supported machines in the world for this OS. If things are flaky under the best of circumstances, I dread to think what it's like on some weirdo Siemens laptop some user might have]

- ROLE/AFFILIATION: Software Developer

-FLAVOR: Ubuntu Desktop

-HEADLINE: Network manager that works

-DESCRIPTION: The single thing that would make Ubuntu seem 10x more polished than it is now is fixing the horrible state of the network manager.

The little wifi bar in the top right. Sometimes, randomly, after dropping a wifi connection, or going to sleep and waking it will:

1) Stop listing SSIDs except the one I've already configured and want to connect to. (But I know there's more)


2) Show the "wired connection" icon. Gray out the entire wifi section of the network manager dropdown menu. All while it is actually connected to some wifi and I can use the internet.

These issues are mostly fixed by a `systemctl restart network-manager`, but sometimes require a full restart.

I'm the kind of person that recommends Ubuntu to people: "Everything just works nowadays on Ubuntu." Then I get a call a week later and have to explain, "just type sudo systemctl restart network-manager into the terminal." They then give me the "What? That is so stupid."

ROLE/AFFILIATION: Student / Sysadmin / Machine (Deep) Learning Engineer / Member of a students' club that organizes an event on every Ubuntu release where we help fellow students dual-boot Ubuntu (or another Linux distro, but we recommend Ubuntu)

EDIT: formatting.

Agreed. There are new Network Manager bugs in just about every Ubuntu release.

At one point the NM task bar applet was simply gone.

Then it stopped managing wired connections breaking networking entirely for me.

Now the network no longer works after disconnecting from a VPN, which is an improvement compared to the previous situation where it often showed the VPN as active when it wasn't.

Good alternatives exist too, like the light and fast connman. I use it on a thinkpad running NixOS and it works beautifully.

A big +1 from me. Network Manager is so bad I wonder how it ever got adopted by the main distros.

And there's currently a big bug where you can't get internet when you reconnect to a VPN after disconnecting. https://bugs.launchpad.net/ubuntu/+source/network-manager/+b...

Yes. Connman (developed by Intel) is a much more modular design. Network Manager evolved from a Gnome applet.

Add the fact that openvpn via Network manager has been broken for ~3-4 years as well.

Yes. Network Manager is horrible.


+1 This!

I was writing the exact same thing. The apps are not integrated, fancontrol doesn't work on recent desktops (dunno whose fault this is), and compared to Windows it's hard to get a bird's-eye view of the system settings. A better package manager UI would help.

Better stability is really important! Recently there was an update to the Nvidia drivers for 16.04 LTS; this broke our GPU servers on Amazon and ruined desktop login for my grandfather, all because the updated Nvidia driver no longer supported the GPUs.

In Canonical's/Nvidia's defense, while they do remove support for older chips, they only do so in major updates. You, as the user, can avoid this by using the nvidia-<number> package, not the nvidia-current one.

I have a machine whose GPU is supported by nvidia-340 but by no later driver. Ubuntu only updates to the newer -340 packages; it does not skip to the highest numbered driver available.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Better Mouse Settings

- DESCRIPTION: Right now mouse acceleration is enabled by default, and for heavy mouse users this is really not usable. There is no way to change this behaviour in the mouse settings. The only way as a user to get a workable mouse configuration is with custom startup scripts, and it took me as an experienced Linux user and software engineer a long time to figure out exactly how (The recommended way to do this kept changing). Non-expert users cannot be expected to do the same. All it needs is a checkbox or possibly a slider in the Mouse & Touchpad settings to configure the acceleration speed.

- ROLE: Desktop User

I switched from a Macbook air to Ubuntu on a Thinkpad this fall, and mouse/scroll settings were by far the most frustrating part of the transition. I ended up with a script I run every time I boot up my machine:


It's nice that xinput exists for low-level tweaks, but it seemed necessary because the GUI is so lacking. I still don't think I completely understand what settings I'm changing, but everything feels normal now.

The trackpad scrolling still doesn't feel nearly as good as on a mac, but I've gotten used to it.

I think you should be able to put those settings into your X11 configuration so you don't have to re-configure every time you boot, unless xinput on Ubuntu is different from Arch: https://wiki.archlinux.org/index.php/Libinput#Common_options (Although debugging xorg config files is not a terribly fun exercise :/)
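As a sketch of what such a persistent config can look like with the libinput driver (option names are from the Arch wiki page above; the file path is one common choice, and requires the xserver-xorg-input-libinput package), e.g. in /etc/X11/xorg.conf.d/30-mouse.conf:

```
Section "InputClass"
  Identifier "flat mouse acceleration"
  MatchIsPointer "on"
  Driver "libinput"
  # "flat" disables pointer acceleration entirely.
  Option "AccelProfile" "flat"
  Option "AccelSpeed" "0"
EndSection
```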

I fully agree with the post above me. This is so annoying. Why is acceleration default and why isn't there a setting for it?

This is also something that may scare away casual users such as gamers, where acceleration is not acceptable.

Mouse scroll/wheel acceleration is also something that gives a very natural feeling on OS X but is missing in Linux. Together with that, pixel-based scrolling, not line-based.

On Windows I can change the scroll speed so it's one line per tick; on Ubuntu it's stuck at 3 lines per tick. I have a Logitech 'spinny wheel' mouse and this makes it very frustrating, as the screen zooms by and I lose precision.

I agree wholeheartedly!

The default mouse settings are unusable, and it's very hard to get them right, but with a lot of fiddling it's almost possible to get rid of the weird acceleration.

Sensible defaults for mouse: fixed scaling (downscaling) without smoothing.

Sensible defaults for pad: fixed scaling (or minimal acceleration) with some minimal smoothing.

For future reference for people suffering from this issue, the most reliable solution I found so far is to add the following lines to /usr/share/X11/xorg.conf.d/90-mouse.conf

  Section "InputClass"
    Identifier "mouse"
    MatchIsPointer "on"
    Option "AccelerationProfile" "-1"
    Option "AccelerationScheme" "none"
  EndSection

+1. It made me quit Ubuntu, along with another bug (Ethernet authentication).

- FLAVOR: Ubuntu Desktop

- ROLE: Java developer, now founder

I just saw your comment after posting a fairly similar one myself: https://news.ycombinator.com/item?id=14005808

Last time I was using Ubuntu consistently I was shocked by this deficiency +1

A thousand times this.

+1 was about to post the same thing


- FLAVOR: Ubuntu Desktop

- HEADLINE: 1st party hardware

- DESCRIPTION: I'd love to buy hardware from Canonical that will just work, just like I do with Apple. Dell comes close, but not close enough that I will recommend it to people. System76 build quality is something I hear people complain about, so I can't recommend them either.

- ROLE/AFFILIATION: Software Developer, Ubuntu Member and Ask Ubuntu moderator.

Yes definitely. See this forum [0] for a potential manufacturer who tries as hard as a small manufacturer can, by offering computers without an OS and being careful when selecting components, but they can in no way guarantee Linux compatibility or offer Ubuntu pre-installed because it would cost too much.

  [0]: http://forum.novatech.co.uk/t/linux-compatibility-what-are-the-best-current-laptops-to-choose-from-any-experience-im-thinking-novatech-for-a-new-laptop-i-wont-use-windows-anyway/621

Also see Linus' comments if it wasn't obvious enough that this is really important [0]

  [0]: https://youtu.be/MShbP3OpASA?t=24m8s

Super important, but when I posted this nobody had stated the obvious, so I did. :-)

Someone should make a Nintendo Switch-like PC that runs Linux; a full desktop-strength OS with a touchscreen UI when in mobile mode, that you can dock at home, or into a laptop-like chassis you can carry around (like Apple's Smart Keyboard), and get the full desktop UI.

The home dock would have more powerful, user-customizable and user-upgradeable hardware, and the OS should seamlessly handle the transition.

BQ put out an Ubuntu tablet that is similar to what you're talking about, though I've heard bad things. However, that is the goal of Ubuntu for mobile.


Which sadly no longer exists. :(

FLAVOR: Ubuntu Desktop

HEADLINE: Dump Mir!!!!!

DESCRIPTION: I know Canonical has put a lot of effort into Mir and at this stage it is probably "too big to fail". But for various reasons my bet is that it will fail. I think this is Canonical repeating Microsoft's Metro mistake. I have a $12K desktop and I don't want an OS optimized for phones !!! I will be able to avoid it but I would rather your engineering effort were better placed.

ROLE/AFFILIATION: Software Engineer / Data Scientist

They really ought to be using Wayland. There are technical reasons for Mir vs X11, but not really any for Mir vs. Wayland other than NIH.

I think most people are unaware of Mir vs Wayland and the issues that Ubuntu using Mir will cause. I only know about it because I was thinking about building a tiling window manager (because I'm weird like that).

I think there is a good chance that Mir will be technically inferior to Wayland and a better chance that Ubuntu UI designers will reinvent the user experience from a phone perspective.

Ok, I'll bite, how do you spend 12,000$ on a desktop? Just keep throwing videocards at it?

Pretty much. It's probably worth $6K at replacement cost today. It's for deep learning and I included what I spent on it, and I've refreshed the 4 GPUs once. The current config is probably still good for a couple of years. Now if I need more GPUs I'll just get a second box. The irony is that I have graphics power coming out of my ears yet I spend most of my time in vim and tmux.

Just out of curiosity - are the GPUs really the best bet for you at that point? I believe you can buy dedicated FPGA cards, some of which support OpenCL workloads even which might offer you better performance - though they might cost a shitpile too.

Not OP just honestly curious, of the $12k how much is dedicated to Canonical (donation/others)?

I don't like the direction they're going and I'm certain giving them more money won't change it.

It's in the ~$100Ks. I usually pay people directly to build things I think the world needs and then have them give it away.

I'm not here to brag and I'd like to stay anonymous so I'm going retire this HN account now.

- FLAVOR: Ubuntu Server

- HEADLINE: Built-in support for installing up-to-date packages

- DESCRIPTION: Currently, `apt install [package]` on LTS Ubuntu will install a package that is up to 24 months out of date (or more if you're not on the latest LTS).

Literally one month ago, using the latest version of Windows 10, I installed Ubuntu for Windows (which installed Ubuntu 14.04 LTS), and `apt-get install nodejs` installed Node 0.10 (from 2013! in 2017!).

I understand users want stability in the core OS, but there's no reason it should be made so difficult to install up-to-date software from elsewhere. `apt` is useless for installing things like `youtube-dl` because old versions of youtube-dl quickly stop being compatible with YouTube.

Ubuntu's current solution to this problem is PPAs, which are very non-ideal because they only work if someone maintains a PPA, but this involves:

1. googling for the software's PPA

2. finding the PPA

3. possibly trusting a third-party PPA maintainer

4. running at least three different commands, which you have to either memorize or re-google
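For what it's worth, those last steps usually reduce to three commands once you know the PPA name (the PPA used here is a placeholder for illustration, not a real archive):

```shell
# Hypothetical PPA name; substitute the one you found via Google.
sudo add-apt-repository ppa:someone/youtube-dl
sudo apt-get update
sudo apt-get install youtube-dl
```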

Basically all software's Ubuntu installation instructions are something like "curl this script and pipe to bash" or "build from source" or "install this other package manager, then use the other package manager to install our software", just because it's impossible to install the latest version using Ubuntu's built-in package manager out-of-the-box.

For instance, here's Redis: "Installing it using the package manager of your Linux distribution is somewhat discouraged as usually the available version is not the latest."

I want to be able to do something like `apt install-latest youtube-dl` to get a usable version of youtube-dl, and considering the number of workarounds for this issue I find online, I think a lot of other people have the same want.

- ROLE/AFFILIATION: owner of a top-2000 US website

The entire point of LTS is that things don't change. Security features are back ported, but you're on Node 0.10 because when 14.04 was released it included Node 0.10. You have an old version of youtube-dl because that's what was available at the time.

You use an LTS release when you want to do 'apt-get upgrade' and not change anything. No new features, no deprecated features, no changes other than back ported bug and security patches. No moving to a newer version of Apache which uses different modules by default, or a newer version of PHP which changes a critical default variable. It's always the same software, just fixed when it's broken.

If you're looking to keep your system up-to-date with the latest and greatest, then there are a few suggested solutions:

1. Install stuff yourself

2. Use a backports PPA or vendor packages

3. Use a more specific package manager (e.g. pip for python, cpan for perl, npm for node, etc)

4. Update your servers more than once every four years

If you want the latest version of software, you need to not use the LTS version, and you need to update to the newer releases as they're released, because the entire point of LTS is not to do the thing you're asking to do.

Also, as a follow-on: you'll find that the packages that draw community support also tend to have PPAs (personal package archives). These are very easy to add to Ubuntu these days and typically backport more recent versions of software to older LTS (and sometimes non-LTS) Ubuntu releases.

For example I use the nginx PPA on some of my older servers and it supports nginx 1.11 on Ubuntu 14.04, even though 14.04.4 LTS only ships 1.4.6.

apt-add-repository is what you're looking for.

Every time someone suggests this it makes me a bit nervous because you're advocating I install binaries compiled by some random person and deploy them on all my production servers. If there were official backports of some sort from the python people for example, then yeah, but otherwise, nope nope nope.

The parent example was talking about official PPAs only, official as in the original software company providing the PPA. In this case, the nginx PPA is provided by the nginx developers. Another example is Postgres.

But I get it, there are not many ppa like that.

Just in case, there's Linuxbrew (http://linuxbrew.sh/), Homebrew for Linux.

While you're correct, it should be noted that "security backports" are much less complete than they'd have you think.

Do NOT take the promises of enterprise distributors for granted, both because there are a lot of security problems that never get fixed, and because the expertise to sufficiently backport is often lacking (cf. the Debian SSL entropy crisis of some years back).

You are usually better off using a newer upstream with the security fix integrated than relying on a distro-pushed backport. For a demonstration, try installing redis from the official Ubuntu repos.

For one thing: Non-LTS versions of Ubuntu have the same problem, just replace "24 months" with "6 months".

For another thing: LTS Ubuntu is the recommended version of Ubuntu: https://www.ubuntu.com/download/desktop

New users going into Ubuntu have no clue that going to ubuntu.com and clicking "Download" will give them an incredibly out-of-date OS.

> The entire point of LTS is that things _don't_ change.

That's not the root problem.

The root problem is that people want non-broken software, and updates sometimes break things.

LTS is one workaround, by committing to make as few updates as possible (mostly only high-impact security patches), you limit the likelihood that updates will break things.

But the problem is, not having updates also breaks things. `youtube-dl` is my usual example. Others include that having an old version of Node will break software that expects newer versions of Node.

The solution I'm proposing is to let the user choose whether or not they want up-to-date software on a case-by-case basis. I did not expect this to be so controversial.

> 1. Install stuff yourself

This is not a suggested solution, this is the problem we are trying to find solutions for.

> 2. Use a backports PPA or vendor packages

> 3. Use a more specific package manager (e.g. pip for python, cpan for perl, npm for node, etc)

These are workarounds that are MUCH less usable than just using the distro's built-in package manager. It is true that I can do this (mostly, I just build from source), but what I am asking is a way to make this easier, because it is a common problem that everyone has, and because it's the entire problem that a package manager is supposed to solve in the first place.

> 4. Update your servers more than once every four years

> If you want the latest version of software, you need to not use the LTS version, and you need to update to the newer releases as they're released, because the entire point of LTS is not to do the thing you're asking to do.

Sometimes, you want some of your software to change and other software not to change. The job of a computer is to do what the user wants it to do, is it not?

Like, if I'm using LTS, it's presumably because of something like "wifi works on this exact combination of system software and I don't want to risk it breaking with a random update". So the default would be to install software from a known working system. But I should be able to explicitly opt into installing other up-to-date software; keep my browser up-to-date and all that.

That's why I'm proposing a new command, so you can type in something like `apt-get install-latest youtube-dl` and get the latest version. `apt-get install` would still work the same as before.
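For comparison, the closest mechanism that exists today is the backports pocket, which is opt-in per package via apt's -t flag; it only helps for packages someone has actually backported (youtube-dl may or may not be among them), and the release name is an assumption:

```shell
# Opt in to a newer version from the backports pocket, per package.
# "xenial-backports" assumes Ubuntu 16.04; adjust for your release.
sudo apt-get install -t xenial-backports youtube-dl
```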

Right, but the "new command" is "snap install ..." ;-)

Things are supposed to be stable if you're running an LTS, you savage! If you want everything bleeding edge, then use Arch Linux, not an Ubuntu LTS. Good god man.

Also I love the incredible specificity of your "ROLE/AFFILIATION". Top 2000, eh? Heh.

I feel like it's very common to want some things to be stable and other things to be bleeding edge. Like, why are my only two choices "everything out of date" or "everything bleeding edge"? Why not make software for actual humans to use? I remember that being Ubuntu's motto.

Like, I presume it's pretty common for an actual human not to want their wifi to be bricked by an update. I presume it's also pretty common for an actual human to want a version of `youtube-dl` that works. It seems user-hostile to force a user to choose one or the other.

And before you say "but PPAs", PPAs are a workaround that have various flaws (you need to use google and copy/paste code, instead of just using your package manager as a package manager) (and they're only available for certain repositories) (and many of them are unofficial). Ubuntu/Debian already maintain up-to-date repositories, why not just allow users to selectively install software from them?

(I wasn't sure what to put for ROLE/AFFILIATION, honestly; I saw other people being vaguely specific and I tried to copy them)

In what way is Ubuntu not for humans to use?

I don't know what youtube-dl even is. Why should your personal interest be the top priority of Ubuntu, and why is it not working evidence that Ubuntu isn't fit for humans?

There needs to be a rational middle ground.

Arch is bleeding edge and has no qualms pushing updates that break things and telling you to fix it. That's fine for workstations (and I've used Arch as my workstation OS for over 10 years), but it doesn't work well for servers.

IMO, LTS should move to a yearly release cycle with 2 years of support, not a biennial cycle with 5 years. This would fix a lot of problems, especially considering that distros can get frozen with versions of packages that are already pretty old when they go gold. Keeping support around for 5 years (which is basically a myth anyway) is just a big waste for everyone.

There is a rational middle ground -- Ubuntu's regular semi-annual releases.

If Ubuntu doesn't meet your needs, maybe try Centos + EPEL. You get a lot of stability with the option to install more recent versions of some packages.

Then again, you seem to be taking all this pretty personally, so my comments are probably wasted on you.

> and has no qualms pushing updates that break things and telling you to fix it.

I've been using Arch for 4 or 5 years and can only recall a couple instances where I needed to force-install a package to accommodate a backwards-incompatible change.

>There is a rational middle ground -- Ubuntu's regular semi-annual releases.

I'm aware of Ubuntu's normal releases. These aren't labeled enterprise-ready, so enterprises won't use them.

They also frequently contain experimental/unstable stuff, not because they need to, but because the LTS release cycle causes the release team to treat non-LTS releases like a beta.

>Then again, you seem to be taking all this pretty personally, so my comments are probably wasted on you.

I'm really not sure how you came away with the impression that I'm taking Ubuntu's release cycle as a personal affront. I'm not the top-2000 guy you initially replied to.

Rest assured, I do not believe that Canonical has selected its release process specifically to offend me. :)

>I've been using Arch for 4 or 5 years and can only recall a couple instances where I needed to force-install a package to accomodate a backwards-incompatible change.

Yeah, it's very rare to need to force install (has become more rare in recent years, but was still quite rare). When I say "break things", I don't mean you have to pass the force flag to the package manager.

I mean that configuration formats change, binary formats may change, a lot of system expectations may break that could take you offline for a while, even if you know how to fix them (for example, significant PostgreSQL updates require the data to be reimported to the new version).

Their transition to Python 3 as the default /usr/bin/python about 2 years before anyone else broke all kinds of stuff, for example. That's fine, you sign up for it when you use Arch. But that's not the kind of experience most people expect for a server distro.

2 years support :-) That would be really nice for us (Canonical). But we just announced ESM (Extended Security Maintenance) for Ubuntu 12.04, which is just about to EOL after 5 years of support. There are still over 10 million Ubuntu 12.04 machines that install security updates every day!

Use Debian stretch

This is the exact problem snaps are supposed to solve. Check out Snapcraft.io


My laptop is on Ubuntu 16.04. Via apt, I get libreoffice 5.1.6 from last october. Snap gives me 5.3.1, which is the newest one.

Snap is not there yet. Finding stuff is hard, because uApp Explorer [0] is not good enough yet. It suffers from trying to appeal to servers, desktop, mobile, and IoT (which probably just means Raspberry Pi). It is confusing and messy.

Edit: Since GP mentioned youtube-dl, specifically. A `snap find youtube` gives me three different variants:

  youtube-dl-casept    2017.03.26  casept

  youtube-dl-snap      daily       fireeye

  youtube-dl-bdmurray  2016.10.16  bdmurray

The first one looks recent. Maybe the second one is even better? As I said, it is confusing so far. We have to figure out how to do curation well.

[0] https://uappexplorer.com/apps?type=snappy

Would be interesting if there's a way to list more information on the snap, such as submission date / last updated etc.

I guess if you want a bleeding edge experimental server, you might want to try Arch Linux. But you will have to tinker... a lot. It all comes as a trade off. You get the benefit of rolling release and AUR, you lose the stability.

Here's a balanced comment from the Arch wiki regarding Arch not being meant for the server:


"You may have seen the comments or claims: Arch Linux was never intended as a server operating system! This is correct: there is no server installation disc available, per se, such as those you may find for other distributions. This is because Arch Linux comes as a minimal (but solid) base system, with very few desktop or server features pre-installed. This does not mean Arch Linux is a bad server system; quite the contrary. Arch's core installation is a secure and capable foundation. Since only a small number of features come pre-installed, this core installation can easily be used as a basis for a Linux server. All the popular server software (Apache, MySQL/MariaDB, PHP, Samba, and plenty more) are available in the official repository, and even more are available on the AUR. The wiki also contains much detailed documentation regarding how to get set up with this software."

Also, because you mention youtube-dl. Here's the latest package version in the Arch repos (only a few days old =)


Ah this is a good idea, and one we're very specifically trying to solve. The way you're going to do this in Ubuntu will be with the "snap" packaging manager!

You can already use "snap install ..." to install hundreds of latest/greatest software packages, which evolve independently of the underlying OS. We're investing even more heavily here, so please do stay tuned, and try out snaps!

Use a rolling release distribution (like Arch or the more conservative openSUSE Tumbleweed) if you want the latest and greatest packages. The point of LTS releases is minimal breakage during updates, coupled with two or more years of updates.

> (which installed Ubuntu 14.04 LTS), and `apt-get install nodejs` installed Node 0.10 (from 2013! in 2017!).

Not in 2017. In 2014. You're using Ubuntu 14, which means you get 2014 versions.

Even the latest versions of Ubuntu install Node 0.10. This is a known thing in the Node community which is why we use tj/n or NVM.

The "Node community" disagrees with the actual version numbers on the Ubuntu packages. The version numbers on the packages say that whilst Ubuntu 14.04 has nodejs version 0.10, both Ubuntu 16.04 and Ubuntu 16.10 have nodejs version 4.2.6.

* http://packages.ubuntu.com/trusty/nodejs

* http://packages.ubuntu.com/xenial/nodejs

* http://packages.ubuntu.com/yakkety/nodejs

* http://packages.ubuntu.com/zesty/nodejs

I very much agree. Apart from driver issues this is the area that creates the most friction for me. It feels weird having to fight the OS on something where the right thing to do seems so obvious.

- FLAVOR: Ubuntu Server

- HEADLINE: "Hardened System" preset install option

- DESCRIPTION: A checkbox in the installer which automatically applies a series of adjustments for a higher level of security right off the bat. Similar to the package presets but for security. So no one has to https://www.google.com/search?q=ubuntu+server+hardening



- FLAVOR: Ubuntu Server

- HEADLINE: Something to allow applying different versions of PHP to different nginx server blocks

- DESCRIPTION: Something like perlbrew, but for PHP, to allow installing multiple hosted systems whose PHP version requirements differ.



- FLAVOR: Ubuntu Desktop

- HEADLINE: Something to switch audio output from my Laptop's built-in speakers to HDMI when it's connected

- DESCRIPTION: Currently I have to run "pulseaudio -k" every time I turn on my HDMI flatscreen, because after I turn it off at night the audio switches to the built-in speakers, but it doesn't switch back when I turn the screen on again.
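One possible workaround to try (a sketch, not an official fix): PulseAudio ships a module called module-switch-on-connect, which moves playback streams to any sink that newly appears. Whether the HDMI sink re-registers as "new" when the screen powers on depends on the driver, so this may or may not help in this exact case:

```
# ~/.config/pulse/default.pa
.include /etc/pulse/default.pa

# Move playback streams to any sink that (re)appears
load-module module-switch-on-connect
```

After creating the file, run "pulseaudio -k" once so the daemon restarts with the new config.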


> Something to allow to apply different versions of php to different nginx server blocks

Since PHP 5.6, the upstream (Debian) PHP packages support side-by-side installs.

Install the versions you want, and then use a per-PHP-version Apache config file that you include in the vhosts that need it:

Enable proxy_fcgi:

    a2enmod proxy_fcgi

    AddHandler proxy:unix:/run/php/php5.6-fpm.sock|fcgi://localhost;php5.6 .php

    AddHandler proxy:unix:/run/php/php7.0-fpm.sock|fcgi://localhost;php7.0 .php

Edit: brain-shart. You want nginx. Similar concept - point to the correct FastCGI socket in each block.

So each block would be then:


    fastcgi_pass unix:/var/run/php/php5.6-fpm.sock;

    fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
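Spelled out a bit more (server names and document roots below are illustrative, not from the comment above), two nginx server blocks pinned to different PHP-FPM sockets might look like:

```nginx
# Hypothetical vhost kept on PHP 5.6
server {
    listen 80;
    server_name legacy.example.com;
    root /var/www/legacy;

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/var/run/php/php5.6-fpm.sock;
    }
}

# Hypothetical vhost on the same box using PHP 7.0
server {
    listen 80;
    server_name modern.example.com;
    root /var/www/modern;

    location ~ \.php$ {
        include snippets/fastcgi-php.conf;
        fastcgi_pass unix:/var/run/php/php7.0-fpm.sock;
    }
}
```

On Ubuntu's nginx packaging, snippets/fastcgi-php.conf carries the usual fastcgi_param boilerplate; each phpX.Y-fpm package runs its own pool on its own socket.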

- FLAVOR: Ubuntu Desktop

- HEADLINE: Proper virtual desktop / spaces for multiple monitors (i.e. independent, per-monitor spaces)

- DESCRIPTION: Right now it isn't possible to switch workspaces on two or more monitors independently. This is possible on Mac, and is a huge productivity boost. Coming home from work to use my personal Ubuntu machine always feels like a step backwards for this reason alone.

I want to be able to have one monitor for my IDE, and one monitor for terminal /vim, browser instances, music, etc. I like to keep different virtual desktops "scoped" to different things--eg. "documentation and code" vs "personal email". When I switch between these on one monitor, it also switches the space on the other monitor. They should be entirely independent of one another.

If I'm looking at something on my left monitor, but want to look at something different on my right monitor, why make me switch both of them away? The lack of ability to independently control the desktops on each monitor makes me super sad. :(

This is what XMonad does by default - I hate it, but you apparently don't, so you could switch window managers to get this, if you wanted.

What parent describes is what most (all?) tiling window managers do. However, XMonad's default is really weird. Instead of each desktop having their own set of workspaces, there is a single set of workspaces shared by all desktops. If you try to switch desktop A to the workspace currently shown by desktop B, then B will switch to the one currently shown by desktop A.

There is a module to get the more 'normal' behaviour of each desktop having its own set of workspaces, but it can be a bit difficult to set up if you are not familiar with Haskell.

See I think xmonad's behaviour makes intuitive sense and all the others are weird. If I want to see my Slack window on the monitor in front of my face I don't have to worry about remembering what monitor it was last on, I just call up workspace 3 and bam it's there.

Just wrote the same thing and would love to see this. It's the only thing that makes me consider moving to GNOME 3, as it treats additional monitors as their own workspaces. I really miss macOS's handling of this, as it's so seamlessly integrated.

A huge +1 to this.

My main system right now is macOS with 2 screens. 1 Main screen for all my development/browsing with 2 desktops, personal+work. Second screen for things that are always there no matter what desktop the main is on, such as IM/email.

I can absolutely understand why you would want this, but for me it'd be a huge net negative. I'd guess that points towards it being a configurable option somewhere.

- FLAVOR: Ubuntu Server

- HEADLINE: Python 3 as default

- DESCRIPTION: In lieu of a description, I'll just link to this: https://pythonclock.org

- ROLE/AFFILIATION: Developer, sys admin

Python 3 is installed by default, and Python 2 is no longer installed by default on Ubuntu Server since 16.04.

Both versions can be installed concurrently. Python 2 is available under /usr/bin/python2 if it is installed. Python 3 is available under /usr/bin/python3 if it is installed.

/usr/bin/python points to Python 2 (or nothing, if Python 2 is not installed). This is the upstream recommendation from PEP 394 (https://www.python.org/dev/peps/pep-0394/) and I don't see Ubuntu diverging from this unless upstream's recommendation changes.

Everything @rlpb just said here is spot on ;-) Definitive +1.

I agree, I usually update defaults so that python => python3. However, PEP 394 has some guidance on how this should be handled[1].

I imagine Ubuntu (Debian, et al) is following these guidelines. It would be cool to have a push for those who still depend not only on Python2 but on `/usr/bin/python` being py2 to update their app - or at least update their packaging :)

[1]: https://www.python.org/dev/peps/pep-0394/

Opposing view: This would make me stay on 16.04 LTS for a long long time.

Can you explain why? Assuming you could still `apt-get install python27` and `update-alternatives` to symlink that back to the default `python`, no?

I believe Python upstream recommends that "python" is Python 2 and "python3" is Python 3.x?[1] (Although that does not jibe with the official Python packages for Windows, which is both annoying and confusing - the PEP governs "unix like" systems, e.g. including the Linux subsystem for Windows, but excluding Python on Windows...):

[1] https://www.python.org/dev/peps/pep-0394/

The default Python on any system is the only one that is really well tested and works with all the non-trivial-to-compile packages. Making py3 the default is exactly about deprecating py2 support, so an apt-get-installed python27 would never have the wide range of apt-installed packages that it does now.

To be honest, new software should not be developed on the 2.x line, so if it's not battle-tested now, it never will be.

But that's my opinion, of course. We need to move the industry forward eventually, and 95% of useful plugins/modules have already been ported.

It's time for py3 as a first class citizen.

I don't get the logic behind your battle-tested argument.

It may now be true that Python 3 is finally the default for new projects, but that doesn't mean there will be a switch to Python 3 as the default environment. There are still a lot of base libs that haven't been ported to Python 3. Often nobody has an interest in porting them; some have been ported, but frequently as complete rewrites that are not backwards compatible.

It will take a few more years until Python 3 is the default env.

In 3 years, Python 2 will not be maintained anymore. It should really not be used for anything important in 18.04 LTS anymore, because that'll need to be supported for longer.

Ansible only works with python 2.x (with beta 3.x support).


On a server you are the snowflake with your Python 3 desire. Do you not use venv for applications?

Wouldn't this mess up a lot of current server installations?

You can't update a major version and expect things like this to not change

(What I mean is: sysadmins are aware of this and won't "just" upgrade to a new major version without considering such factors - at least they shouldn't do that ;) )

It already happened, so probably not.

Hard to decipher what you mean. Maybe you're saying that it's already the default, but since that is obviously wrong, you probably mean that Python 3 is now sort of the default. That might be true for new projects, but it has nothing to do with the default environment on servers. Or maybe you already messed up your server with a Python default-version change and mean that it can't happen to you anymore.

It's hard to decipher what you mean, too.

Ubuntu already switched from Python 2 to Python 3 and not many people feel "messed up" by this.

Python 3 is the only Python pre-installed on current versions of Ubuntu. All Ubuntu system utilities that use Python use Python 3. You have to install a separate package to get Python 2.

FLAVOR: Ubuntu Desktop

HEADLINE: Better security processes


I've been quite disappointed that there wasn't really any public reaction from Ubuntu to a variety of security issues affecting the Linux Desktop in general and Ubuntu in particular.






Seriously, right now an Ubuntu Desktop isn't a secure choice for users, especially if they have to expect targeted attacks.

Some things I'd propose:

* Dangerous automation features need to be either disabled by default or heavily audited. That includes things like tracker and apport.

* In general I wonder how much auditing happens before something enters Ubuntu. Some basic auditing that could also be automated, like testing packages with ASan, should be a default inclusion criterion for adding packages.

* Currently there are no bug bounties at all in the Linux distribution world. I get that this is a financial challenge, but at least in severe cases where the fault clearly lies within the distribution and not within an external project I'd consider bug bounties appropriate. (Just read Donncha's blog post linked above. He could've gotten $10,000 from a shady exploit dealer, and he got nothing, because he did the right thing.)

ROLE: I'm running the Fuzzing Project and I write for IT tech media about security issues.

Hi Hanno - Ubuntu Security Team member here. Thanks for the feedback!

I wanted to point out that we did have a public response to the four issues that you mentioned. We quickly fixed them! If I'm remembering correctly, we had updates available within 24 hours of the first two issues you mentioned. The second two were privately disclosed to us and we had updates available at the same time the issues became public (thanks again to Donncha O'Cearbhaill and Ilja Van Sprundel for those vulnerability reports!).





Note that the first two issues were in packages that don't receive official security support so we didn't publish Ubuntu Security Notices for them.

I think we did a good job of reactively fixing those issues. You seem to be asking for more of a proactive approach (audits, sandboxing, etc.) and that's a valid suggestion. We are making progress there but not specifically due to the issues you listed.

The security team does proactively review the code of packages that have an attack surface, just before they move into the "officially supported" state. Sometimes that involves fuzzing, depending on what the piece of software does. It is a technique that we're trying to use more often.

We're also heavily employing sandboxes by default in the world of snaps. As more debs turn into snaps, those packages will get the added benefit of strong isolation.

The first two examples relate to codecs that are not installed by default. Arguably they should not be available for installation at all, but that's the nature of packages in the "universe" repository.

The third example is a valid issue, and it got fixed. Apport is important for receiving feedback from crashes. It is not enabled by default if you use the final versions of the installation ISOs; it is enabled only in the dev versions of Ubuntu.

Bug bounties would be interesting. Should they be monetary, or something else (a nice t-shirt)? The issue with monetary bug bounties is that they only make sense for money-making software and services.

> not installed by default

They are installed if you click the box during install to "Install third-party software for graphics and Wi-Fi hardware, MP3 and other media"

"Other media" sadly does not include DVDs. As far as I know there is no officially blessed way to play a DVD on Ubuntu, and probably never will be.

- FLAVOR: Ubuntu Desktop

- HEADLINE: include f.lux or redshift as a default installed package.

- DESCRIPTION: By including f.lux/redshift, Ubuntu would be helping users get better sleep. I know it's very difficult to accommodate requests for default apps, but macOS and iOS have Night Shift, and Android has Night Mode.

Thanks !

I'm not sure about having it enabled by default, but I would absolutely love for it to be just installed by default, and with solid control panel integration to manage it.

Unfortunately redshift doesn't work with Wayland or Mir.

Gnome's Night Light works great with Wayland.

(This doesn't help on Unity/Mir of course... just an option)

AFAI(Nexus 5x user)K Android removed Night Mode.

They "improved" it. It is now only available on Pixel devices (because it needs driver support).

- FLAVOR: [Ubuntu Desktop]

- HEADLINE: Join Wayland

- DESCRIPTION: Instead of reinventing the wheel with Unity8/Mir, please join Wayland development and maybe join forces with Linux Mint and switch from Unity to Cinnamon or MATE, with Flatpak supports for desktop apps.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Drop Xorg

- DESCRIPTION: I don't care whether it's Wayland or MIR, just for the love of god commit to it and end the pile of shit Linux on the desktop has been chained to for over 20 years. I don't know what went wrong, that you went from "we'll do it this release" to not doing anything for 2 years, and it hurt the entire community.

Disregard the petty squabblers who most likely haven't read a line of either the Wayland or Mir source code. Just come up with a solution that's well engineered and wins backing from NVidia, Intel and AMD. (I assume you already had that, because if not, why announce Mir at all?)

Also, please don't forget about the desktop; you're definitely winning a lot of ground amongst developers. Container technology is making Ubuntu the preferred OS for many developers, something which neither macOS nor Windows has first-class support for. This while Apple is chasing off macOS developers with expensive and less powerful hardware. Microsoft is coming back with a vengeance, though: their new focus and work on things like a proper terminal and a better Linux subsystem make them an option again in the eyes of some of my colleagues who before never even considered running Windows for their development environments.

If you guys stay on point, you will conquer the developers market the way Apple did in the late 00's. Which I think was critical for Apple's mobile ecosystem as well.

> wins backing from NVidia, Intel and AMD.

That's the heart of the problem. Intel is fine with their free drivers, and AMDGPU will be a good compromise, but NVIDIA will have to do all the work to implement a driver specifically for Wayland. Since Wayland AFAIK requires KMS, NVIDIA is far, far behind.

It isn't Wayland/Mir holding Wayland/Mir back. It's the proprietary driver blobs.


- FLAVOR: Ubuntu Desktop

- HEADLINE: All updates reboot-free

- DESCRIPTION: Short of a major-version update, the software updater should never ask me "Please restart the computer to begin using your updated software" again.

I'm already using the "Canonical Livepatch Service" - but I still get asked to reboot much more often than I would like.


If it is a Linux kernel update, then it's either Livepatch or you reboot.

If it is a system library, then for the apps to use the newly installed library, they need to be restarted (sometimes logging out and logging in again should be enough).

If that library is the system "libc" or something similar, then it has to be a reboot (not a relogin). The only way I can think of for reboot-free updates is to save the full state of the system, then silently restart everything back to the previous state.

> If that library is the system "libc" library or something similar, then it has to be reboot (not relogin). The only way that I can think for reboot-free updates, is to save the full state of the system, then silently restart everything to the previous state.

Restarting running services and logging in again is fully sufficient to handle a libc upgrade, at the very least on Debian. "Just restart" is maybe easier advice to give, but overshoots the target a little bit.
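Knowing which processes still need that restart is what tools like checkrestart (in the debian-goodies package) and needrestart automate. The core trick is just a /proc scan for old libraries that were deleted by the upgrade but are still mapped; a simplified sketch of the idea, not the real tools' logic:

```shell
# Print PIDs of processes still mapping a shared library that has been
# deleted (i.e. replaced on disk by a package upgrade).
stale_lib_pids() {
    for maps in /proc/[0-9]*/maps; do
        if grep -q '\.so.*(deleted)' "$maps" 2>/dev/null; then
            # /proc/<pid>/maps -> <pid> is the third /-separated field
            echo "$maps" | cut -d/ -f3
        fi
    done
}

stale_lib_pids
```

Anything this prints is a candidate for a service restart (or relogin) rather than a full reboot.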

But, if you have to restart running services anyway, what is the problem with just rebooting anyway? I don't get this.

Ah yes, good catch. We are working on making the "System Restart Required" message more "Livepatch" aware. Good suggestion! We'll try to get an update out to 14.04/16.04 LTS and into 18.04 LTS.

This. Please, don't ask people to reboot the computer unless it's strictly needed.

GNU/Linux is not Windows 95.

> the software updater should never ask me "Please restart the computer to begin using your updated software" again.

Is it still a thing? I've been using Linux Mint (which is based on Ubuntu for updates) and I haven't been asked to reboot my computer in years!

On Ubuntu? Yes. For every update that pulls in a kernel, or systemd/D-Bus silliness.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Volume leveling across applications

- DESCRIPTION: I use headphones every day. I listen to music and podcasts while I work. I use YouTube videos and screencasts to learn new things. Sometimes I hop on a VOIP call through one service or the other. The one feature I miss most from Windows desktop life is the "volume normalization" checkbox in my sound settings. It protects me from opening a new Chrome tab and blasting noise into my ears at +30 dB. It protects me from that guy on the voice call who has his mic level WAY too high. It helps me hear the other guy who can't get his mic above a whisper. Most of all, I never have to fiddle with individual application volume levels. Linux desktops love to crib ideas from Apple, but for some reason they've all ignored this killer feature from 2006.

For reference, this feature is called "Use ambient noise reduction" on macOS and "Reduce Loud Sounds" on the 4th generation Apple TV:

https://support.apple.com/kb/PH18961 https://help.apple.com/appletv/#/atvba773c3c9

- FLAVOR: Ubuntu Desktop

- HEADLINE: Better HiDPI scaling

- DESCRIPTION: Real non-integer scaling on HiDPI screens. Consistent across different toolkits (GTK3/Qt/etc.).


- FLAVOR: Ubuntu Desktop

- HEADLINE: TLP installed by default

- DESCRIPTION: Most new users have no idea that TLP is needed for decent battery life on laptops. Should be installed and activated by default. GUI for advanced configuration would be a plus.

+1 both of these are very much needed by laptop users. Real non-integer scaling would be awesome.

TLP? This will fix my battery life issues?

Very likely.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Set vm.swappiness on install based on machine ram.

- DESCRIPTION: The difference in responsiveness can be remarkable if it's lowered on systems with more RAM. Most laptops and PCs these days have around 4 GB, but the ones with HDDs will be very slow on Ubuntu because the defaults for vm.swappiness, vm.dirty_ratio, vm.dirty_background_ratio, etc. were set for older machines. Adding this feature would make Ubuntu a better experience for most nontechnical people.

Could this be set at boot time instead of install time? That way if RAM is added or removed (or if a DIMM goes bad) it will update automatically.
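Boot time seems doable; the threshold logic is small enough for a shell script run from, say, a systemd oneshot unit. A sketch with made-up cutoffs (an illustration, not a proposed Ubuntu policy):

```shell
# Pick a vm.swappiness value from total RAM in KiB (hypothetical cutoffs).
choose_swappiness() {
    if [ "$1" -ge 8388608 ]; then
        echo 10        # >= 8 GiB: swap only under real memory pressure
    elif [ "$1" -ge 4194304 ]; then
        echo 30        # >= 4 GiB
    else
        echo 60        # smaller machines: current kernel default
    fi
}

# Read installed RAM at boot so added/removed DIMMs are picked up.
mem_kb=$(awk '/^MemTotal/ {print $2}' /proc/meminfo 2>/dev/null)
echo "vm.swappiness = $(choose_swappiness "${mem_kb:-4194304}")"
# A real boot unit would then apply it with: sysctl -w vm.swappiness=<value>
```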

+1, agreed!

FLAVOUR: Ubuntu Desktop

HEADLINE: Bluetooth that works

DESCRIPTION: I never managed to get my PC to play music over Bluetooth to a Bluetooth loudspeaker. (I'm using Xubuntu, playing mp3s with mpv.) I think it could be because the audio system seems messy: should I have jackd enabled? What is it? So maybe the headline should be to clean up the audio system, especially its routing.

You have two options to try:

    1. install the pulseaudio-module-bluetooth package, if you haven't
    2. follow the procedure below

BT in Ubuntu is in an embarrassing state, and I'm not exaggerating. After they've upgraded bluez to version 5, the workflow for connecting BT audio peripherals is:

    - connect the peripheral (takes a few attempts...)
    - disable the profile (set to off)
    - disconnect the peripheral
    - re-connect the peripheral (takes a few attempts again...)
    - set the profile to a2dp

This mad workflow has been consistent across multiple machines and audio peripherals.

Before the v5, the BT stack was still garbage, requiring changes (again, verified on multiple machines and peripherals) to obscure parameters in the bluez configuration.

So "please Ubuntu devs", try to produce a working BT stack, and stick with it.

I follow a nearly identical flow several times a day to connect bluetooth headphones to my laptop.

- Turn on headphones

- Laptop connects immediately and uses the HSP/HFP mode (works!)

- Trying to select A2DP appears to work, but no audio will play

- Disconnect the headphones using the Bluetooth manager

- Reconnect the headphones using the Bluetooth manager

- Select A2DP; this now works


Separately, many web videos will stop playing and refuse to play when the Bluetooth audio device is selected.


OK, well, this had to be released sooner or later ;-)

Here you can find a script which automates the connection to bluetooth headphones, executing (in ruby) the steps outlined above, using the bluez commandline tool (including: retry until the connection is established...):


As usual, you need to give it execution permissions, or pass it to the ruby interpreter. It's designed to be executed from a terminal; calling it from a GUI has undefined behavior.
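If you'd rather skip ruby, the retry part of the workflow is a few lines of plain shell around the bluez command-line tool (the MAC address below is a placeholder, and how many connect/disconnect cycles you need varies by device):

```shell
# Retry a command up to N times, pausing 1s between attempts.
retry() {
    n=$1; shift
    i=0
    until "$@"; do
        i=$((i + 1))
        [ "$i" -ge "$n" ] && return 1
        sleep 1
    done
}

# Usage sketch (placeholder MAC address):
#   retry 5 bluetoothctl connect 00:11:22:33:44:55
#   retry 5 bluetoothctl disconnect 00:11:22:33:44:55
#   retry 5 bluetoothctl connect 00:11:22:33:44:55
```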

There is also currently a bug where you can't connect to an Amazon Echo via Bluetooth [0]. I suspect it is a bluez or pulseaudio issue, as it happens across distros, but it does make me feel quite sad that it is this difficult.

Reminds me of how much of a pain printers could be and scanners can still be..

[0] https://askubuntu.com/questions/871630/cant-send-audio-to-am...

Thanks, I think I tried exactly this but will try again. Actually I'd like a command-line way, which should be more deterministic.

That's a super important one. Audio on Linux/Ubuntu is okay, but when you run into serious trouble you're screwed. Nobody can help you unless you pay someone; the other options are to get a new computer, install a different Ubuntu or distribution, or switch to Open Sound System - the only thing that has always worked through the years, though it has obvious drawbacks.

Hmm, I've done well enough (on Debian testing) by enabling dmix in alsa and sending sound through alsa by default and letting pulseaudio start on demand for the applications that insist on it: mainly, this setup lets mpd continue playing after my graphical session ends.

Bluetooth not working isn't necessarily Ubuntu's fault. Bluetooth is a complicated, broken, cluttered, crummy protocol.

> Bluetooth not working isn't necessarily Ubuntu's fault.

Yet, Bluetooth works seamlessly in MANY other contexts. I can connect my phone to bluetooth headsets and bluetooth-enabled car stereos. I've used PS3 and PS4 controllers for years with no connection issues. I've used bluetooth dongles, bluetooth keyboards and mice on Windows machines and it just works.

Bluetooth may be "complicated, cluttered, and crummy," but it is not broken. Ubuntu is the odd one out here.

According to this article:


the blame lies both with Ubuntu (which updated the bluez version and broke things that were previously working) and with the bluez programmer[s], who are described as "cowboy coders".

I personally believe both are accurate accusations.

I wonder if Ubuntu felt pressured for some reason[s] to upgrade to v5 (eg. support for v4 was going to cease), or if it was just "upgrading for the sake of upgrading". I suspect it's been the latter, since it's not that v4 had insufficient features or that the BT protocol changed in the meanwhile - not to mention that they did the exact same thing long ago when moving to v4.

Both Apple and Microsoft figured it out. Why shouldn't we expect the same from Canonical?



HEADLINE: Embrace the spirit of Open Source, not just comply with the letter of the law


Here's an extract from the Software Freedom Conservancy report on Canonical's licensing policy:

> Redistributors of Ubuntu have little choice but to become expert analysts of Canonical, Ltd.'s policy. They must identify on their own every place where the policy contradicts the GPL. If a dispute arises on a subtle issue, Canonical, Ltd. could take legal action, arguing that the redistributor's interpretation of GPL was incorrect. Even if the redistributor was correct that the GPL trumped some specific clause in Canonical, Ltd.'s policy, it may be costly to adjudicate the issue.


- FLAVOR: Desktop

- HEADLINE: improve VPN support

- DESCRIPTION: the WLAN UI supports some OpenVPN options, but not all, and fails silently when importing incompatible config files. This is very confusing for new desktop users.

- FLAVOR: Desktop

- HEADLINE: multi-column list view in nautilus

- DESCRIPTION: This view has been explicitly dropped (https://bugs.launchpad.net/ubuntu/+source/nautilus/+bug/7081...) but is very useful for quickly navigating large directories. Alternatively, replace Nautilus with a file manager that can do this (like Nemo). This is one area where the Windows file manager is still much better.

- FLAVOR: Desktop

- HEADLINE: polish file dialogs (multi-column-list view)

- DESCRIPTION: the default file-open and file-save dialogs lack many simple features that can save a lot of time. For example, in the file-open dialog there is no multi-column view (see above), you cannot rename files, you cannot create files/folders, you cannot access the normal context menu. All this requires separately opening a file manager, which also lacks a few productivity features (see above).

- FLAVOR: Desktop

- HEADLINE: polish hotkeys and general window handling on multi-monitor setups

- DESCRIPTION: I needed a bunch of compiz plugins to make this work in a halfway decent way in a 2-monitor setup, and I dread the day I will have to re-shuffle this for a 3-monitor setup etc. Make it easy to move a window 1) from one monitor to the other, 2) resize and move to one of the corners/sides, 3) maximize it. Also, applications in full-screen mode on one of the monitors confuse my compiz-based setup (for example, a full-screen Chrome window on one monitor will introduce numerous UI issues).

Still, it's a great system and very nice to use overall.

Thanks for gathering feedback. That's the first step ;-) Keep up the good work!

Edit: language

IMHO the biggest thing that would improve VPN support is properly reporting errors. NetworkManager seems to think it is a Windows application with the way it throws useless generic error messages at you.

Instead of "connection failed", how about "connection refused due to key size mismatch"? Even if it looks like technobabble to the end user it is something they can throw into Google to solve their problem. VPN connections are a nightmare to debug right now, and are so complicated that regular people frequently don't set them up correctly the first time.

- FLAVOR: Desktop

- HEADLINE: Easy Dock/Launcher Customization

- DESCRIPTION: The user should be able to 1) drag any executable to the dock to make a new launcher 2) Right click any launcher to be able to choose a dialog to customize command line arguments, initial working directory, and icon.

The user should not have to edit a desktop item file or install or know about Alacarte. Windows got this one right.

- ROLE/AFFILIATION: Software developer for chemists and biologists.
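For reference, the manual alternative being complained about is hand-writing a desktop entry and dropping it into ~/.local/share/applications; every name and path below is a made-up example:

```ini
# ~/.local/share/applications/mytool.desktop (hypothetical tool)
[Desktop Entry]
Type=Application
Name=My Tool
# Command-line arguments and working directory, set by hand today
Exec=/opt/mytool/bin/mytool --project /data/current
Path=/data/current
Icon=/opt/mytool/share/icon.png
Terminal=false
```

Drag-and-drop plus a right-click properties dialog would just generate this file behind the scenes.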

I still can't believe the effort this takes on Ubuntu. Windows has done this right for how many years?

Just so you know, you can launch an executable and its icon will pop up in the launcher. If you right-click the icon, you can select `Lock to Launcher` and the icon will persist after the application is closed. Still not ideal, but it may help you.

Ideally a user should be able to simply drag and drop an application or file directory to the launcher.

It would also be cool to have folders in the launcher for quick access to certain directories where hovering over the icon opens a miniature browser. Sort of like the Android launcher.

Flavor: Ubuntu desktop

Headline: Switch from Mir to Wayland


A disclaimer: I'm not using Ubuntu, but I'd like to see the switch from Mir to Wayland for Ubuntu, or even better - making Mir a Wayland compositor. That would benefit the Linux desktop as a whole, instead of creating another rift. The current direction Mir is taking is causing damage to the global Linux community.

To give context: Mir was started because some Ubuntu developers saw deficiencies in Wayland (which were later shown to be incorrect). Over time, Mir started borrowing stuff from Wayland compositors and input libraries anyway, and now simply mirrors most of what Wayland does.

TL;DR there is no valid reason for this rift, and it should really go away. This will make life easier for graphics drivers developers, GUI toolkits developers, SDL (and the like) developers, various developers of applications like screen recording and so on. And having this rift benefits no one.

- FLAVOR: Ubuntu Desktop

- HEADLINE: text antialiasing options

- DESCRIPTION: I'm not a Linux guy, but when I've tried it I'm always annoyed at how ugly text looks compared to macOS. It would be great if we could pick different text renderers or have a new one with an easy GUI for adjusting parameters.

The Windows method of having the user select what style of text they prefer from a set list of options would be a good starting point to take inspiration from.

It's quite intuitive, and the average user doesn't need to bother learning the details of how fonts are anti-aliased.

Yes, it's one thing Windows gets right.

Hasn't GNOME had something similar for years? A whole bunch of different text anti-aliasing examples, and you pick the one that looks right?

Yes, GNOME has this feature - but it doesn't make the fonts look good enough, unfortunately.

Ubuntu's solution for beautiful fonts uses non-free software I believe, and the results IMO are as good as or better than Windows or Mac.

When I switched from Ubuntu to mainline Debian, I started having to install Infinality to get beautiful fonts as good as or better than Ubuntu's.

It's a general pain point with desktop Linux, but an area where Ubuntu leads.

> Ubuntu's solution for beautiful fonts uses non-free software I believe

Nope, it doesn't.

That's good to hear. Any idea why Debian lags behind?

Yep, last year I compared some packages of interest (fontconfig, freetype, cairo, and a few others I can't remember now), and the only significant differences between the Debian and Ubuntu packages were:

1. Ubuntu packages were slightly more up-to-date (I compared packages in Debian 8 to Ubuntu 15.10 or 16.04, not sure which one). This is important for freetype because it keeps improving in every release.

2. Fontconfig is heavily configured in Ubuntu package. Not patched, just runtime configuration.

So there were no special patches on the Ubuntu side compared to Debian or the upstream sources. I must note that both Ubuntu's and Debian's freetype packages (which are almost identical, BTW) enable advanced hinting options (which must be configured at compile time). Some other popular distributions don't enable those options because of lawsuit fears, and this results in much poorer font rendering that you can't fix with runtime configuration.

As I said, this comparison may be slightly out of date now, and I plan to repeat it after stretch is released. I didn't keep tabs on recent Ubuntu releases, but on the Debian side fontconfig and freetype were barely maintained during the stretch cycle, so I guess Debian will still have slightly poorer font rendering compared to Ubuntu. You can still get similar rendering by copying over /etc/fontconfig from a comparable Ubuntu release, though.
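For anyone who wants to experiment without copying over a whole config tree: per-user fontconfig overrides can approximate Ubuntu's runtime configuration. A minimal sketch, saved as ~/.config/fontconfig/fonts.conf (the exact values Ubuntu ships differ per release, so treat these as illustrative):

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <!-- Prefer antialiasing with slight hinting and subpixel rendering,
       roughly what Ubuntu's default configuration selects. -->
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hinting" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    <edit name="rgba" mode="assign"><const>rgb</const></edit>
  </match>
</fontconfig>
```

Run `fc-cache -f` afterwards and restart the application to see the effect.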

I don't think so. It does have immediate feedback when you change something, but you have to either know what each option does, or just play around. And it's only available in gnome-tweak-tool, not in the standard control panels.

Yup. Unless the defaults have changed, GNOME does this.

- FLAVOR: Ubuntu Server

- HEADLINE: Please don't mess with python package management

- DESCRIPTION: Take a look at this bug:


This happened because Ubuntu decided to unbundle some packages that ship as part of the Python ecosystem. This is a major annoyance because it breaks default behavior that people have come to rely on everywhere else, and it confuses the hell out of people - just Google similar keywords and you'll find tons of questions and discussions around this and similar issues. Please don't mess with this stuff, or if you're going to break it, break it in a way that tells the user what the heck to do - it costs real hours and effort to debug and work around these things for production deployments.

- ROLE/AFFILIATION: Software / Data engineer

The same would go for RubyGems, but these are a Debian issue. Not sure how Ubuntu could untangle that.

Lobby to change policy or provide an exception? Now it's possible that I'm fundamentally misunderstanding Debian policy, but in this case it seems like it's not helping and is rather just hamstringing things by breaking user experience. This seems like a very legitimate exception case where bundling does make sense to ensure that core infrastructure works out of the box. Furthermore, if instead of bundling these packages the pip people had written code internally that does what these packages do, then there wouldn't be any discussion. It really seems like a semantic quibble over a minor point that's spurred by following the letter of the law rather than the spirit. But the effect is a horrible experience for everyone involved.

> but these are a Debian issue. Not sure how Ubuntu could untangle that.

Debian's Python maintainer is a Canonical employee thus Debian's and Ubuntu's Python packages are actually the same.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Automated night mode so I can sleep well after work

- DESCRIPTION: Reducing the amount of blue light at night is proven to help people fall asleep after using their computer in the evening. So at night, the desktop should automatically reduce the amount of blue light emitted by the screen by shifting the color balance.

- ROLE/AFFILIATION: Dev/Machine learner

Back when I was using Ubuntu, I could install an app called Redshift, which is basically the Linux equivalent for f.lux.

Redshift is great. I have a systemd user unit for it, and I actually prefer it to Flux on OS X.

Setting up Redshift (configure/build + daemon) is a bit of a pain; a built-in option would be awesome, similar to the one just launched in GNOME.

Why did you build from source and make your own daemon instead of using the Ubuntu package?

* http://packages.ubuntu.com/yakkety/redshift

There is f.lux for Ubuntu. Although maybe not for Unity? I use Gnome.

f.lux for Linux is an X11 app, so it does not care whether you use Unity or GNOME. You have to start it manually after you log in, or find out how to make it autostart. Not very user-friendly.

Being an X11 app also means it does not work with Wayland.
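For reference, the usual way to make an X11 app like this start automatically is an XDG autostart entry. A sketch, saved as ~/.config/autostart/flux.desktop (the Exec line assumes the Linux f.lux binary is called `xflux`, is on your PATH, and takes latitude/longitude flags - adjust for your install):

```
[Desktop Entry]
Type=Application
Name=f.lux
# Hypothetical invocation; replace the coordinates with your own.
Exec=xflux -l 48.1 -g 11.6
X-GNOME-Autostart-enabled=true
```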

I'm running it with Unity

It has been broken for a while; nobody maintains it AFAIK.

sudo apt-get install redshift redshift-gtk

This is a feature I'd really like as well; a blue-light filter is a must for me on any device with a screen. I've always had problems getting f.lux to behave correctly on Linux, so having one built into the OS (as an option, of course) would be great.

I used Redshift successfully for a time. It worked rather well, but I don't know its current state (e.g. does it support Wayland?).

Redshift still works quite well. Maybe it should be pre-installed and better 'advertised'. Many people still don't know that this is a very nice feature to have.
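For what it's worth, Redshift is also easy to configure declaratively once installed. A sketch of ~/.config/redshift.conf (all values here are illustrative, not recommendations):

```
[redshift]
; Day/night color temperatures in Kelvin
temp-day=6500
temp-night=3500
; Fade between temperatures instead of switching abruptly
transition=1
location-provider=manual

[manual]
; Approximate coordinates; replace with your own
lat=48.1
lon=11.6
```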

Windows has recently integrated this in the 'creators update' as a feature. Would be nice to see Ubuntu follow suit.

macOS just got it too in 10.12.4

I would love to have that feature, automated blue light filtering and or screen dimming :)

+1. Every other system has it now, so it makes sense to join the club.

FLAVOUR: Ubuntu Desktop

HEADLINE: Lightweight by default - don't follow the Windows/Mac crowd

DESCRIPTION: GNU/Linux - X'ish desktop environments in general and window managers in particular - used to have a certain freedom to do things their own way. Around 2000-2005 I was quite happy with FVWM, KDE3, etc. The window managers allowed me to do things that weren't possible on Windows or Mac (focus follows mouse, configurable behaviours, handling many windows with ease...). I wish Windows and macOS weren't treated as the ideal solutions, with GNU/Linux just being a bad copy of them. If that really is the best approach, then it's a better idea to actually use MS Windows or macOS - I've used the latter almost exclusively for 5 years. Just recently I started using Linux (Xubuntu) again, privately on an older computer and at work as well. (At work we don't have Macs.)

Please come up with your own ideas - hardly anyone except "computer experts" uses Linux on the desktop anyway. You could go from there. Also, looking at Xubuntu, it's a cool system. I really like it because it's fast and I can work with more than 5 windows comfortably. Unfortunately its Bluetooth configuration is worse - recently I had to log in to Cinnamon to make my Bluetooth mouse work again. The same goes for multi-monitor; it works okay. ;) That means: when I disconnect my laptop from the external screens, open the lid and go to a meeting, the display is black. I have to shut it down if I want to use it (power button or SysRq...).

So yeah, if Windows gets good enough (read: they finally get rid of all these freezes and things that just stop working) and they open-source even more stuff, why not use Windows? I must admit I'm no open-source prophet; my primary reason for switching to Linux around '98 was that Windows was mega buggy, slow, and not nice to use on average hardware once the installation was more than a few months old. IMHO true open-source people use Debian, Arch, or some unusual combination - like a friend of mine who runs Windows as his main OS with Emacs and Arch in a VM.

Then again, this is a time of great potential for Ubuntu Desktop, because devs are increasingly unhappy with macOS.

- FLAVOR: all?

- HEADLINE: Improve the experience of using 3rd-party apt sources

- DESCRIPTION: This suggestion is more apt-related, but Ubuntu could lead the improvements. Many software providers (Microsoft, Elastic, etc.) run their own apt repositories to deliver updates faster than the Ubuntu release cycle, which is great. However, configuring them usually requires Googling the instructions and at least 3 commands. For example, installing SQL Server for Linux requires the following commands before you can even run apt-get install (from their official documentation):

curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -
curl https://packages.microsoft.com/config/ubuntu/16.04/mssql-ser... | sudo tee /etc/apt/sources.list.d/mssql-server.list
sudo apt-get update

That is not user-friendly at all. It would be great if apt could help you out here, i.e. if I type "apt install mssql-server", it could detect that the package is not in the Ubuntu sources but is available from a trusted 3rd-party source, prompt me to add that source to my local apt sources, and then keep that source updated automatically.

Also, perhaps the Ubuntu sources have an older version while a newer one is available from a trusted 3rd party; apt could print an informational message and offer a command-line flag to add the source, i.e. "mssql-server 17.0.0 is available at the third-party source 'microsoft'. To install it, run 'apt install mssql-server -S microsoft'", which would add the microsoft source and install the package.

- ROLE: software engineer
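The proposed lookup could be as simple as a distro-maintained table mapping package names to vetted third-party repositories, consulted when a package is missing from the archive. A rough sketch (the package/URL pairing and function name are purely illustrative, not any existing apt feature):

```shell
#!/bin/sh
# Hypothetical: consult a table of trusted third-party repos when a
# package is missing from the Ubuntu archive, and tell the user.
suggest_repo() {
  case "$1" in
    mssql-server) echo "https://packages.microsoft.com/ubuntu/16.04/mssql-server" ;;
    *)            return 1 ;;
  esac
}

pkg=mssql-server
if repo=$(suggest_repo "$pkg"); then
  echo "'$pkg' is available from a trusted third-party source: $repo"
fi
```

The real work would of course be curating that table and verifying signing keys, not the lookup itself.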

I'm going to play devil's advocate here:

If running two pretty simple commands (`curl` a file into `apt-key`; `curl` a file to `tee`) before you can run `apt-get update` is "too hard" or "too complicated", maybe installing software on servers isn't for you.

I also want to say good job to Microsoft on this one. Too many orgs that run their own apt repos just have `curl ... | sudo bash` as their install instructions.

I would suggest Microsoft produce a "release" package to make future updates to the key/repo URL/names a bit simpler, but not doing curl|sh already puts them miles ahead of plenty of shops.

Doesn't this only work for PPAs that are in Launchpad? And doesn't Launchpad have extra limitations that third-party repos don't?

apt-add-repository is not installed by default.


HEADLINE: More stability


On systems with very little customization, I regularly get 1-2 crashes after login that ask me to report them again and again. Systems regularly fail to boot after a kernel upgrade when the proprietary Nvidia drivers are installed (the ones Ubuntu suggested to me), because things are not properly recompiled. The file manager crashes when connecting to a SAMBA share for the first time in a session.

I can fix this crap (although I'm getting tired of it), but for regular users, they go straight back to Windows. Stuff like that simply can't happen in a stable release or at least it needs to be fixed ASAP.

I like Ubuntu, but I think you are handling support for multiple releases poorly, and it might be better for everyone to switch to a rolling release, like Windows did. Users would get better support and updates, and your developers would have more time to improve the software instead of managing broken releases. As it is now, you are getting buried in bugs and there's no end to it.

- FLAVOR: Ubuntu Desktop, Ubuntu Server, Ubuntu Core

- HEADLINE: Build from source, minimize deltas from upstream, and quit poisoning the Debian ecosystem.

- DESCRIPTION: I have repeatedly hit issues with core packages and applications that are solved by simply doing:

apt-get build-dep package; apt-get source package; cd package*; fakeroot debian/rules binary

Sometimes the packages fail to build. This tells me that you do not have an automated build regression system, even though Debian has gone to great pains to make this easy to automate.

I have hit bugs in packages because there is a large stack of diffs that have been applied to the package (logrotate is one example), but never upstreamed. The logrotate diffs include a "security patch" that is not well thought out, does not actually close a real bug, and causes logrotate to silently fail, filling /var.

This would not happen if you actively upstreamed patches, and reverted changes that are not approved by upstream, or addressed in other ways by upstream developers.

These two systematic issues have caused me to move away from Ubuntu for server and desktop use.

Finally, I've heard stories about Ubuntu devs forcing through controversial votes in the Debian project, and noticed an uptick in user-hostile decisions by the Debian project (like the forced systemd migration).

As a major contributor to Debian, Ubuntu should do whatever it can to improve the health of the Debian community, and generally improve the code quality + stability of upstream debian projects (without just killing off stuff that Ubuntu has decided not to ship).

- ROLE/AFFILIATION: Engineer/Researcher. At work, we ship a hardware appliance based on Ubuntu. I've been using Ubuntu / Debian as my primary development environment for almost two decades, and am saddened by the level of bitrot I've encountered over the last 2-3 years.

- FLAVOR: Ubuntu Desktop

- HEADLINE: no new features please, just bugfixes and small adjustments

- DESCRIPTION: please spend at least one, maybe more, releases polishing existing features and fixing bugs. Ubuntu is about 90% of the way to being the standard Linux desktop, and the remaining 10% is NOT about adding new features but about making the existing ones work reliably and consistently. Yes, this is not as exciting as working on new features, but it is exactly what "professional" software development is about. It is pretty easy to get software 80% done, much harder to get to 90%, and the really great stuff happens above 95%. The best OS is the one that JUST WORKS and you don't even notice it. Same for the UI. So why not take a look at your bug tracker? :)

FLAVOR: Ubuntu Desktop

HEADLINE: Child friendly (ad blocker, content filter)

DESCRIPTION: For my son's first computer, I picked Lubuntu and spent days making it "internet ready". I installed DansGuardian + Privoxy, added uBlock Origin to Chrome, and pointed my home router at OpenDNS. It took a lot of searching online and trial & error, but it was worth it. From time to time I check which websites he visited and what got blocked (grepping logs) and adjust accordingly. One problem with this is that updates are blocked too, so I must disable the proxy manually every time I update.

Please consider making something like this available out-of-the-box. Something that can be enabled/disabled with a few clicks. Also, a simple way to review history and adjust settings. It would make Ubuntu an excellent choice out-of-the-box for all kids. Thank you for asking.
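On the update problem specifically: apt can be told to bypass the filtering proxy for the Ubuntu archives only, so updates keep working while the rest of the browsing stays filtered. A sketch, assuming the proxy listens on localhost:8118 (the Privoxy default), placed in e.g. /etc/apt/apt.conf.d/99proxy:

```
// Route apt through the filtering proxy by default...
Acquire::http::Proxy "http://localhost:8118";
// ...but go direct for the Ubuntu archives so updates aren't blocked.
Acquire::http::Proxy::archive.ubuntu.com "DIRECT";
Acquire::http::Proxy::security.ubuntu.com "DIRECT";
```

This only helps apt, of course; a built-in, clickable version of the whole setup would still be far better.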

elementary OS designed parental control, but in a bit of an opposite way as you've suggested. An administrator can create a new standard account, and restrict its access to the computer. For example, the standard user can log in between 5 PM and 10 PM, can have access to some of the applications restricted (as in, he can't run them) and he can be banned from accessing certain websites (blacklist-only options at the moment).

Not really what you're looking for, but built exactly in a way you want it: available out-of-the-box, tucked into system settings, there in case you need it. Could be of some inspiration to Ubuntu guys.

Elementary OS looks real nice!

Lubuntu, like all Ubuntu flavors, has standard users with limited access, and that is what my son has. But I am less worried about his access to the computer or its features than about him seeing ads or things that aren't appropriate for his age.

Flavor: Desktop

Headline: Polished and modern desktop/user experience.

I've been using Ubuntu full time for the past 4 years. Somehow it still feels like I am using somewhat old software, although Ubuntu has come a long way since the beginning. I wouldn't mind a release with no new technical improvements, dedicated solely to improving all the little details and polishing the overall user experience. Given that looks are one of the important factors by which an average user evaluates a desktop, I believe any effort on this front will help a lot in furthering Ubuntu adoption.

Role: Web developer and Digital marketer

What, specifically, do you mean by polished? Please give examples.

I get frustrated as hell by the inconsistent dialog boxes! When saving a file, the filename text is highlighted (i.e. selected), but when you start typing, the text goes to a search input box instead. Like... WTF?!?

Polished as in: window animations, tastefully done transparent windows by default on hardware that supports it, a snappy application menu, desktop and file-manager icons that conform to a grid, and toning down the black title bar with white text, which is too strong - to name a few.

In my opinion, one of the three releases leading up to an LTS should be tailored towards UI improvements - preferably the one that follows an LTS, because then there is a solid platform to build upon and enough time to iron out UI bugs before the next LTS.

The theme is also quite dark. Notifications are black, the background is dark: it makes me feel claustrophobic compared to macOS. Maybe, generally, hire more graphic designers.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Hide/Move/Replace the Unity Menubar

- DESCRIPTION: Please add an auto-hide option at minimum. Better would be to move the time/settings indicators to the "dock" when the "move the menus to the app windows" option is set, and then remove the menubar entirely.

- RATIONALE: The last time I used Ubuntu on a multi-monitor setup it was awful, wasting space on all displays. And having to click an app window to give it focus, then swing the cursor up to select a menu, then back to the app... I'm not sure why anyone would move the action (the menu) away from the context of the action (the focused window).

Otherwise, I really do like Unity, especially since it has useful global keybindings out of the box.

- ROLE/AFFILIATION: Developer, but use Ubuntu for my personal dev ThinkPad.

P.S. I just started listening to The Changelog and your interview was very insightful. Thanks! For those interested: https://changelog.com/podcast/207

This is a topic debated to death between Mac and Windows users. Here is the argument in favor of the Mac/Ubuntu solution: according to Fitts's law, the time it takes to hit a target grows as the target gets smaller and farther away. A menu bar at the top of the screen has effectively infinite height - you can't overshoot it - so it is a really large target. In practice that makes it much faster to hit a menu bar at the top of the screen than one inside a window.

What you don't want is an auto-hiding menu bar. When it auto-hides, you can't see the location of the target before you move your mouse. People who use this setup are used to throwing the mouse pointer up to the target quickly; you can't do that if the menu is hidden. Then it becomes a two-step process: first throwing the mouse up to the menu, then making a selection.

This is the reason we keep frequently accessed GUI elements at the corners and edges of the screen - these are the quickest places to hit. E.g. the Windows Start button is fast to hit for this reason. But to be fair, you use the application menu bar far more frequently than the Start menu.

You do have a point: you can't see the menu-bar items when they're hidden. However, I used macOS for work in the past, and full-screen mode (in which the menu bar does indeed auto-hide) is actually great. Even though you can't see the buttons, I quickly memorized the rough location of what I wanted to press, so it didn't slow me down much, even in heavily menu-reliant applications.

I feel like this would be even less of an issue on Ubuntu, since it has the menu-bar search feature, where you can "press" a button by typing its name rather than looking through the menus.

Thank you for the feedback, and thanks for listening (and linking here for others)!

Thanks for taking the time to read. Hopefully it doesn't sound like a petty request. Keep up the good work!

And having to click an app and window to give it focus, then swinging the cursor up to select a menu, then back to the app...

I think it was copied from the Mac.

FLAVOR: Ubuntu Desktop

HEADLINE: Officially supported i3 or equivalent

DESCRIPTION: i3 offers a vastly superior power-user experience compared to pretty much anything else on the market. If Ubuntu offered a properly configured/themed/integrated i3 desktop, I'd be happy to use it, because I've done enough pointless fine-tuning for one lifetime. I'd be fine with some other tiling window manager too, as long as it was at least as good as i3. I doubt this could be done properly with Unity, but I won't mind being surprised.

ROLE: Desktop Linux user since '96.

- FLAVOR: Ubuntu Desktop

- HEADLINE: optimize Ubuntu for people who suffer migraines/headaches and other health issues when working with displays.

- DESCRIPTION: There's a small niche of users who suffer badly when working with displays. There are all sorts of things to optimize (mostly killing various kinds of flicker and excessive brightness): no backlight flicker, by keeping the backlight at 100% and using a screen-filter app; 16-bit color depth (32-bit on some display types causes flicker); up-conversion of videos to the highest frame rate possible (if that could be done for web videos, it would be amazing!!!); various night modes and brightness controls; and maybe recommending screens and devices that help (selection is a huge issue).

By the way, if you manage to really help here, this user niche will be very loyal and will put up with a lot in other areas. Also, a well-optimized machine might be liked by regular users in subtle ways (less tiredness, etc.).

- ROLE: desktop user with migraine.

I would really like to see this as well. Linux in general is a system I love to use, but there's a ton of tedium involved in getting it set up just right so that it doesn't make my head throb. It isn't necessarily a setback for daily usage, but whenever a new version comes out, it soaks up a lot of time getting everything set up again.

+1 This would be a great feature for people like us.

FLAVOUR: Desktop

HEADLINE: Sort out the default colour scheme

I can't really comment on the more technical side, but the Ubuntu Grey/Purple/Orange colour palette is horrible - it makes the whole desktop feel claustrophobic. There's something icky about it.

Together with the 'quirky' Ubuntu font, which is hard to read at small sizes and not at all helped by Linux's mediocre font rendering, it makes for a fairly unpleasant experience.

Your designers should be looking at elementary OS for how a pleasurable desktop can be designed, even if it's a bit too close to Mavericks-era macOS.

(I know it's possible to change the theme, but none of them have the fit and finish that a first-party one would have)

ROLE: Graphic and UX designer (who wants to love Ubuntu but can't for superficially visual reasons)

Changing the default color scheme will impact many other apps. It will take ages to trickle through to all the apps you use, and it will not look better until it does. That's the kind of UI change that has real cost and disputable benefit. I like the current scheme, but even if you don't, it's not technically bad (contrast, etc.), and you can change it - at your own cost, instead of at the cost of a very large developer community. Personally, I use the Ubuntu fonts by choice outside of Ubuntu. Luv 'em.

For me the default colors on Ubuntu were never welcoming. Changing them would be great. Hopefully with Unity 8 they choose better, warmer colors.

I'm definitely not a designer, but I will say I love the Ubuntu fonts.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Work out-of-the-box on Chromebooks

- DESCRIPTION: It turns out you have two choices for a well-built ultralight notebook: a MacBook (£1250) or a Chromebook (£250). The Chromebook can run Ubuntu, and run it well. But right now it requires a specially optimised version of Ubuntu (GalliumOS) and faffing around with firmware versions. If Ubuntu were as easy to install on Chromebooks as it is on desktops or regular notebooks, that'd be a massive selling point for the OS.

Aside from having to install software for the backlit keyboard, I have no problems with Ubuntu 16.10 on my 2015 Toshiba Chromebook 2 (other than the part where nothing prevents the battery from draining to 0% and everything getting erased). Why do you need Gallium and specific firmware?

New (Bay Trail/Braswell) Chromebooks don't have legacy boot support: https://wiki.galliumos.org/Firmware

Xorg doesn't initialise the display hardware to a usable state when the machine is running current Chrome OS, requiring firmware rollback: https://github.com/GalliumOS/galliumos-distro/issues/320

GalliumOS has numerous optimisations for Chromebook hardware: https://wiki.galliumos.org/About_GalliumOS#Why_GalliumOS_as_...

Which Chromebook do you mean?

Mine's an HP 11in G5, FWIW.

FLAVOR: Ubuntu Desktop


DESCRIPTION: You have no idea how upset I am that the top comment is more "fancy, flashy" stuff instead of what Ubuntu really needs:

Stability. Better QA. Not having my family and friends see another "$x had an issue" dialog every time they boot into their accounts, and not being embarrassed that I recommended Ubuntu to them.

Seriously: I use Gentoo and my gf uses GNOME Ubuntu, and she has issues with the same services I don't have a single issue with. Forget about multitouch or external monitors; no one other than fanboys and enthusiasts uses those. Provide a stable experience first, then push the boundary.

ROLE/AFFILIATION: Computational scientist, but also a Linux enthusiast for personal use.

I would just like to share a different perspective on your point about multitouch — I have several friends who have tried to make Ubuntu their first foray into Linux on a modern "convertible"/ultrabook/whatever with a multitouch screen and run into issues with responsiveness, scaling, etc. Not using Ubuntu myself I don't know how much work is really left here, but multitouch screens should definitely be treated as a first-class citizen of the HID world.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Mouse Button Remapping

- DESCRIPTION: I'm a disabled user and "left-click" with my thumb. At the moment, there's no visual way to do that in Ubuntu's settings.

I have to run something like `xinput set-button-map "Evoluent VerticalMouse 4" 0 3 0 4 5 6 7 0 2 1 2` whenever I log in, or reconnect my mouse, or whenever the phase of the moon changes.

Please - all I want is a GUI where I can say "For this mouse hardware, use this button map."

Thanks :-)

> I have to run something like `xinput set-button-map "Evoluent VerticalMouse 4" 0 3 0 4 5 6 7 0 2 1 2` whenever I login

Can you not put this in your .bashrc or something?

shrug Maybe. I want to use my computer, not constantly fiddle with it.
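One persistent alternative to a login-time xinput call is an Xorg InputClass snippet, which the server re-applies whenever the device is plugged in. A sketch for /etc/X11/xorg.conf.d/99-vertical-mouse.conf - this assumes the input driver in use honours the ButtonMapping option (evdev does; check your driver's man page, and whether it accepts 0 as "disabled" the way xinput does):

```
Section "InputClass"
    Identifier     "Evoluent VerticalMouse 4 remap"
    MatchProduct   "Evoluent VerticalMouse 4"
    MatchIsPointer "yes"
    # Same button map as the xinput command quoted above, applied on
    # every (re)connect, so no per-login script is needed.
    Option "ButtonMapping" "0 3 0 4 5 6 7 0 2 1 2"
EndSection
```

Still no substitute for the requested GUI, but it survives reconnects and reboots.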

- FLAVOR: Ubuntu Server

- HEADLINE: Separate purge-old-kernels command from byobu package

- DESCRIPTION: I like byobu - it's extremely helpful - but I would prefer the purge-old-kernels script to be in a separate package. I like to run servers with the minimum number of packages installed, and I don't really need byobu since most of my maintenance is done via remote commands. /boot fills up quickly, and purge-old-kernels is a script I think is well written and perfect. I want it separated from byobu, please.
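For anyone unfamiliar, the core of what the script automates is a version-sort over the installed kernel packages, keeping the N newest. A toy sketch with hard-coded package names (the real script inspects dpkg, protects the running kernel, and drives apt-get):

```shell
#!/bin/sh
# Toy illustration: version-sort kernel image packages and print all
# but the 2 newest as purge candidates.
printf '%s\n' \
  linux-image-4.4.0-62-generic \
  linux-image-4.4.0-31-generic \
  linux-image-4.4.0-57-generic |
  sort -V | head -n -2
# → linux-image-4.4.0-31-generic
```

`sort -V` orders the dotted version strings numerically, which plain lexicographic sort gets wrong once version components reach two digits.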


This one would be useful.

There are several things to test, and things that may break; there are bound to be corner cases.

I would suggest getting purge-old-kernels into 17.10 in order to test it, and deciding later whether to put it in 18.04 LTS.

Yeah, so see above... This one is 100% my fault. I wrote Byobu as well as purge-old-kernels to solve a really annoying problem. And I jammed purge-old-kernels into Byobu because Byobu is installed everywhere I have Ubuntu, and I always want purge-old-kernels everywhere I have Ubuntu. In any case, you're right: this is a distro-level problem that needs to be solved properly.

- FLAVOR: Ubuntu Desktop (or any)

- HEADLINE: Installer should allow dual boot with encrypted disk

- DESCRIPTION: Currently it is impossible to use the Ubuntu installer to install Ubuntu on only part of a disk if you want the Ubuntu partitions to be encrypted. (If it's not impossible, it's hard enough to figure out that this advanced user couldn't, so it might as well be impossible for new users.)

Disk encryption is a requirement nowadays, and many users want to dual boot when they first install Ubuntu. So this prevents users from even trying Ubuntu.

- FLAVOR: Desktop

- HEADLINE: Bring back gaming support for AMD graphics cards.

- DESCRIPTION: Pipe dream, but: the ability to run games with an AMD graphics card, the way we could with 15.10. Google "Steam AMD Xenial" and you'll see how big of a mess this is.

As of a year ago, gaming on Linux was pretty viable with an AMD graphics card, using fglrx. However, because fglrx was deprecated, it was removed in 16.04, and the open-source drivers can't handle 3D games at all. Most 2D games are non-starters as well - literally: the graphics freeze before I even get to the opening screen and I have to REISUB. I'm running an R9 390, but this is widespread among basically all AMD cards.

AMDGPU is an option, but only for some cards, and only on 16.04 - it won't run on 16.10.

I could go more into the history and the compatibility, but suffice to say, the intersection of the different versions of {the kernel, mesa, opengl, fglrx, open-source drivers} on Ubuntu now means that I have no choice but to boot into Windows to run games.

Please file bug reports for your issues, and not just as blanket statements. Many people find the open-source drivers a viable option for gaming, especially now that OpenGL 4.5 is supported and a lot of performance optimization has happened. Your case sounds unfortunate, but it's certainly the exception rather than the rule.

It's true that the versions that ship with Ubuntu releases tend to be a bit behind, but you can also try the Padoka PPA.

> Please file bug reports for your issues, and not just as blanket statements

I'll admit, it's been a while, but my experience with filing tickets for graphics-related issues like these has not always been particularly positive. Debugging them and actually identifying the root cause is quite difficult, and I end up getting bounced back and forth between different bug reporting tools for different OSS projects that may or may not be ultimately the root of the bug, and each of which thinks that the other is the more likely cause.

I have some sympathy here because I know it's tough to identify, but it's a huge time investment on my part for very little apparent gain, especially since these issues are already reported.

Besides, as I said, these issues are pretty well-documented already. I don't think there's a lack of information about the issue; it's just not an easy one to solve, and there are a lot of different organizations that are responsible for various pieces.

The AMD engineers have decided to only support the open-source amdgpu driver and diverted all resources from fglrx to amdgpu.

Because of this, 16.04 does not have fglrx: the X11 server in 16.04 is no longer compatible with it.

Around this time (end 2016/early 2017) it was supposed for amdgpu to get parity in features with fglrx. I have not checked, are there issues missing from amdgpu? You would need to test, and preferably test with the AMDGPU PRO driver distribution from AMD (has the very latest support that may take a bit of time to make it to the upstream projects).

> I have not checked: are there features missing from amdgpu? You would need to test, preferably with the AMDGPU-PRO driver distribution from AMD (it has the very latest support, which may take a while to reach the upstream projects).

I don't know. AMDGPU requires an older version of the kernel, so I'd have to downgrade from 16.10 just to try it out.

> AMDGPU requires an older version of the kernel

AMDGPU-PRO requires an older kernel and Xorg. AMDGPU is open source and works great with Linux 4.10.x

For the R9 390, this[1] is a high-critical bug that has been outstanding for over 18 months. If it gets to 2 years old, I'm throwing it a birthday party.

[1] https://bugs.freedesktop.org/show_bug.cgi?id=91880

- FLAVOR: Ubuntu Desktop

- HEADLINE: Refresh (or replace) built-in themes

- DESCRIPTION: I'm well aware that many hard-core users don't care that much about the visual aesthetics of the user interface, but I think this makes up a lot of the impression first-time users have of Ubuntu. While solid and generally fine, the look of the built-in themes could use an overhaul, or replacement. One of the first things I do when setting up a new instance is downloading and installing third-party themes and icon sets. It's funny how some people are surprised "how good Linux can look", because many still have the impression of it being a hacky, patchy, hard-to-use nerd OS.

- ROLE/AFFILIATION: Software developer, web-related full-stack, Ubuntu user by choice (amongst MacOS evangelists)

- FLAVOR: Desktop

- HEADLINE: Application Menu search like MacOS

- DESCRIPTION: I usually use macOS but occasionally use Ubuntu, and I really miss the ability to look up functionality in my application by typing the name of a menu entry under Help. On macOS this drops down the relevant menu and shows the menu entry I am searching for. I use this a lot. Especially in complex applications this is very useful to have.

- ROLE/AFFILIATION: Software Developer

Ubuntu already has that. They call it HUD: https://wiki.ubuntu.com/Unity/HUD

Press Alt (as mentioned in other comment)

Press Alt?

- FLAVOR: server / all

- HEADLINE: remove sha1 PPA signatures

- DESCRIPTION: remove the warning "signature by key uses weak digest algorithm (sha1)" and ban sha1 for PPA signatures

- ROLE: user

This needs to be done slowly or you're going to piss a lot of people off with broken shit.

SHA1 is already broken shit.

Don't be unreasonable.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Low Latency Audio Server + Touch support for pro audio


Running pro audio apps under any Linux distro is still pretty much a pain, mostly because getting a low-latency audio server running takes lots of manual configuration, at the risk of breaking your system: installing jackd, running an RT kernel, and trying not to break the existing sound server (PulseAudio).

_Audio stack and drivers_

Google has announced that Android O (8.x) will ship with a completely new low-latency audio server, enabling pro audio apps on Android; all such apps have been iOS, macOS and Windows exclusives up until now.

Since Google has done it on Android, it should be doable on GNU/Linux?

Today more devs are porting pro audio apps to GNU/Linux: Bitwig Studio, Renoise and Harrison Mixbus have Linux-native versions, and REAPER has a beta Linux-native build.

However, running these DAWs at rock-solid low latency with an up-to-date audio interface is hard or impossible due to config issues and lack of driver support.

This would most likely require engaging audio interface manufacturers (Focusrite, PreSonus, RME, Avid, Roland, Tascam) in developing/porting their drivers to Linux. The Focusrite Scarlett range in particular is the best-selling enthusiast-level USB 2.0 audio interface in the world today, with PreSonus a close second. RME, Apogee, Avid, MOTU, etc. are high-end gear that will not appeal to enthusiasts, and RME already has rock-solid support under Linux.


Most current and future audio DAWs and apps are going down the multi-touch route (Bitwig, PreSonus Studio One, etc). Sanitizing the audio stack on Linux and enabling proper touch support would allow pro audio apps to run on Linux (most likely via WINE at first, as most pro VSTs are Windows- or Mac-only).

Considering all the privacy issues and crap ads that ship with Win10 (browsing pro audio forums will show you that most people are stuck on Win7 for their DAW computer, do not want to upgrade to Win10, and Win7 support ends really soon), and the absolute rip-off that Apple hardware is nowadays, Linux might become attractive to audio enthusiasts, and maybe pros in the long run?

- ROLE/AFFILIATION: Comp. Sci. Researcher, music enthusiast.

> Most current and future audio DAWs and apps are going down the multi-touch route (Bitwig, PreSonus Studio One, etc).

Multitouch on Ubuntu is not fantastic, but it does work. I can use Bitwig without trouble on my touchscreen laptop.

The only major problem I've run into is that I can't figure out how to control the multitouch gestures, and some of them conflict with multitouch gestures that the DAW needs (for instance, making a three-finger chord on the on-screen keyboard).

Exactly what I meant: proper 10-point touch and gesture support. Out of curiosity, what kind of audio interface are you using, and with what audio stack and latency? It can be even trickier to get USB audio interfaces working properly on laptops because of USB power throttling, which can be hard to configure under Linux.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Go back to collaborating with the GNOME project

- DESCRIPTION: The fragmentation in the Linux desktop is getting out of hand; both efforts (GNOME and Unity) are crippled by the lack of collaboration on the toolkits and applications. It was a marvel up until Ubuntu 10.10, which was the last Linux anyone would need. I just miss the good old days.

- ROLE/AFFILIATION: Ubuntu enthusiast since 6.06.


- FLAVOR: Desktop

- HEADLINE: Option to disable all animations and transparency effects in Unity

- DESCRIPTION: With a big (>=2560x1600) monitor and a not-so-high-end graphics card they are not smooth anymore anyway, and my PC freezes up randomly (but seldom) when switching between applications.


Ubuntu has exactly that since 16.04.1:

1. Open CompizConfig Settings Manager (if not installed, `sudo apt install compizconfig-settings-manager`)

2. Click the "Ubuntu Unity Plugin" plugin

3. At the bottom, "Enable Low Graphics Mode"

4. Restart Unity / reboot (possibly not even necessary; see for yourself :)

Agree it would be neat to surface the option in the regular System Settings panel, though.

Source: http://www.omgubuntu.co.uk/2016/11/enable-low-graphics-mode-...

For that matter, why isn't CompizConfig installed by default? So much stuff is hidden in there and it's impossible to discover without running across someone talking about it online. This is exactly the sort of thing that should be on the menu by default as "advanced compiz settings" or something like that.

On the other hand, discoverability is still a big issue with Unity in general.

> "why isn't CompizConfig installed by default?"

Because it's a loaded shotgun aimed right at beginner users' feet?

FLAVOR: Ubuntu Server

HEADLINE: Smaller Docker Images

DESCRIPTION: An official, skinnied down, Ubuntu image for docker and AWS AMIs would be nice. I have some clients that want to maintain some uniformity across host and guest, so they aren't interested in Alpine or Busybox images. But the Ubuntu image is ~200MB or so, where OpenSuse is about half that.

I understand Canonical doesn't build those images, but you would have the expertise to help them thin it out. Some wrapper around debootstrap or similar to make a thin server image?

ROLE: Help various clients with docker and AWS.

Just a heads up that AFAIK these Docker images are built on top of the Ubuntu Core ones (which are ~30MB), so this is likely the overhead of the multiple Docker layers that we all know about; see https://github.com/docker-library/repo-info/blob/master/repo... as it apparently explains where the core image comes from. Disclaimer: I am at @canonical.com but not directly involved with that, so I also +1'd it :-)

afaik, canonical does build those images. See here: https://cloud-images.ubuntu.com

Hmm. I assume some transformation happens before they end up as docker images or AMI images.

In any case, what I'm asking for is some conversation between Canonical and Docker, Amazon, etc., to see if there's something obvious either side can do to skinny these down. The Ubuntu image is for sure the most popular AMI, and I imagine one of the most popular Docker images. The collective bandwidth and time gained by optimizing the size would be significant. Currently, the Ubuntu images are significantly larger than other popular images.

Canonical is absolutely responsible for building those images. And yes, we do work with Amazon, Docker, et al. And yes, we're actively working on reducing image size.

That said, what's "minimal" to one is not "minimal" to another. We can certainly take stuff away, until you end up at Alpine or Busybox size. But then we've stripped away the essence of what's Ubuntu. So it's a very delicate dance!

I switched to alpine because Ubuntu docker containers were massive.

- FLAVOR: [Ubuntu Desktop]

- HEADLINE: An awesome hardware partnership

- DESCRIPTION: This is probably stretching the limits of everything being fair game. Nevertheless, I've always found Ubuntu support for MBPs to be below par and haven't been able to justify using it over OSX since switching hardware. Now that Apple seem to be losing the plot on the hardware side I'd really like to see Ubuntu running as a first class citizen on a high end laptop.

No plastic cases, no innovative features (I mean touch bars or dials not 4k monitors), just fast, quality kit with superb software support.

ROLE / AFFILIATION: Contract Java developer, long time Ubuntu user but not on a desktop for a few years now

- FLAVOR: Ubuntu Any (preferred this to be in upstream Debian)

- HEADLINE: binary diff updates for apt-get.

- DESCRIPTION: I have seen Fedora updates delivered as binary diffs. They are very small, use less bandwidth and space, and install faster.

This request isn't really for Ubuntu 17.10, though (I don't know if there is enough time for this). And I don't wish (actually, I'd hate) for this to be an Ubuntu-specific feature. I wish for it to be an upstream (Debian) feature.


For vanilla Debian you can use "debdelta". It is not integrated with apt-get and it misses a lot of packages, but it still helps a lot compared to downloading full packages. Ubuntu would need a separate delta server for it to be usable with their packages.
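
For intuition, the core idea of a binary delta can be sketched in a few lines of Python. This is a toy (`make_delta`/`apply_delta` are made-up names): real tools like debdelta, rsync, or bsdiff use far smarter matching, handle shifted content, and compress the result, but the principle of shipping only the changed byte ranges is the same:

```python
def make_delta(old: bytes, new: bytes):
    """Toy byte-range delta: record (offset, replacement) runs where
    `new` differs from `old`, plus the target size."""
    ops, run_start = [], None
    limit = max(len(old), len(new))
    for i in range(limit):
        same = i < len(old) and i < len(new) and old[i] == new[i]
        if not same and run_start is None:
            run_start = i            # a differing run begins here
        elif same and run_start is not None:
            ops.append((run_start, new[run_start:i]))
            run_start = None         # the run ended
    if run_start is not None:
        ops.append((run_start, new[run_start:]))
    return len(new), ops

def apply_delta(old: bytes, delta):
    """Rebuild the new file from the old file plus the delta."""
    size, ops = delta
    buf = bytearray(old[:size].ljust(size, b"\0"))
    for off, data in ops:
        buf[off:off + len(data)] = data
    return bytes(buf)
```

If only a few files in a 200 MB package changed, the list of ops is tiny compared to the full download, which is where the bandwidth savings come from.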

Flavor: Desktop

Headline: Surround Sound

Description: If a user has a media file or application that wants to play surround sound audio, 5.1 or higher, it should work properly and automatically. AC3, Dolby Digital, dts, etc. should all function properly with all different hardware configurations.

I'm aware that it is possible to make it work properly with some effort, but it is not elegant or automatic. The user should not have to do anything special. It should "just work".

For example, a user has a surround sound system connected to their computer's optical output. They play a media file or DVD that has a surround sound audio track. That audio track is selected. The surround sound should play properly with no further special configuration. The user should not have to know that pulse audio or whatever even exists.

- FLAVOR: Ubuntu Desktop / All

- HEADLINE: Disk Encryption that works without gotchas

- DESCRIPTION: Currently, there are options to do full disk encryption and to encrypt your home directory while installing. These options are fine, but:

* File name length limits (with encrypted home directories).

* You cannot encrypt your drive after the fact, so you need to reinstall your system if you find out that you need it encrypted.

- ROLE/AFFILIATION: (Optional, your job role and affiliation) Software dev / user

- FLAVOR: Ubuntu Server

- HEADLINE: First boot post-install hook

- DESCRIPTION: There is currently no clean way to have a script run only once post-install, first boot. There are hacks for making this work to a degree, including things like self-deleting init scripts. I would most prefer to see this hook officially supported in robust way.

- ROLE/AFFILIATION: Systems Engineer

The docs say the test is if /etc is empty. Most packages provide some kind of defaults in /etc when they're installed - wouldn't that mean this never triggers?

No, it will work even if stuff is installed in /etc. The actual check is `test -f /etc/machine-id`: https://github.com/systemd/systemd/blob/5978bdd05fed013d301f...
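
For anyone scripting around this, the condition is trivial to replicate. A sketch (`is_first_boot` is a made-up helper, not a systemd interface): systemd tests for `/etc/machine-id`, not "is /etc empty" as the docs suggest:

```python
import os

def is_first_boot(root="/"):
    """Replicate systemd's first-boot detection for the given root:
    it is a first boot iff <root>/etc/machine-id does not exist."""
    return not os.path.isfile(os.path.join(root, "etc", "machine-id"))
```

E.g. `is_first_boot("/")` returns False on any normally provisioned machine, since the machine-id file is written during the first boot.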

I'll file a bug report with systemd about the inaccurate documentation.

EDIT: Here it is: https://github.com/systemd/systemd/issues/5696

More immediately, the actual check is a test for one of the flag files that I mentioned at https://news.ycombinator.com/item?id=13473273 : /run/systemd/first-boot . This file is initially created/unlinked to correspond with the result of the check that you mention, but is also modified later on.

A prior bug report discussing the "misleading" doco in this very area was https://github.com/systemd/systemd/issues/5562 , which was closed as "not a bug".

Good to know, thanks.

cloud-init? cloud-init.org

FLAVOR: Ubuntu Server

HEADLINE: Include a PyPy3.5 package

DESCRIPTION: Ubuntu already has a package for PyPy compatible with CPython 2.7 in the official repositories. However, a CPython 3.5 compatible version was recently released[1]. PyPy is painful to compile on your own if you don’t have enough RAM. Therefore, an official package would be welcome.

[1] https://morepypy.blogspot.com/2017/03/pypy27-and-pypy35-v57-...

ROLE/AFFILIATION: Researcher at a university

- FLAVOR: [Ubuntu Desktop]

- HEADLINE: handle GPU driver update better

- DESCRIPTION: Updating the GPU driver can be a pain, especially after a kernel version upgrade. Common issues you would see include a black screen (incompatible kernel module) and the login screen stuck in a loop (a Unity or Compiz problem).

On notebooks this can be even worse, as some notebooks have two GPUs and Linux gets confused about which one to use.

I hope you can work with notebook hardware companies to fully test a notebook product with a discrete GPU. Given how popular deep learning is these days, developers really need a Linux notebook with GPU computing.

If anything, increased stability for general-purpose usage would be very nice. Increased hardware support would help too; drivers for some Wi-Fi cards especially need a lot of work.

I really love Linux desktops, but they have too many stability issues/crashes to completely switch from Windows to Ubuntu or any other linux distribution.

- FLAVOR: Desktop

- HEADLINE: Increased stability++

I shy away from using my Ubuntu laptop (a Dell XPS Developer Edition, you know, the one you'd expect to be doing this really well) because

a) More often than not, when starting up it gives me a "something went wrong, do you want to report it?" dialog. I've stopped bothering to report it or looking at what's happening because it happens so often, but I think it's X crashing at some point.

b) WiFi frequently fails to connect after hibernation, requiring a reboot.

c) There's also been some worrying threads on HN about lack of support for strong kernel power management on recent intel generations.

That WiFi issue is so annoying for me. Sometimes `sudo systemctl restart network-manager.service` fixes it without having to reboot; other times even that doesn't work.

Yes agreed. I have this issue too.

> WiFi frequently fails to connect after hibernation, requiring a reboot.

`sudo iwlist scan` fixes it without reboot in my case. But it is very irritating.

I'll concur with the others on wifi network support.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Improved wifi network support

- DESCRIPTION: Most of the time I don't have an issue, but occasionally (OK, once a month at the venue that hosts our Java developer group) I have network issues that I never had when this laptop was running Windows (7 or 10). Basically, at this one location I can't stay connected to the network for more than 5 minutes without a disconnect; none of the Windows/Mac users around me have an issue.


I've been using Ubuntu as my main OS for a decade now, and I agree. Especially the UI crashes much more often than I'd like, and I always get a "there has been an error" error message at startup. Please make things more stable.

Same thing here. It's clunky to log in and always see an error message.

- FLAVOR: Ubuntu Desktop.

- HEADLINE: Produce a working Bluetooth stack.

- DESCRIPTION: The [audio] Bluetooth stack is in an embarrassingly broken state, especially after the move to BlueZ 5. Based on my tests on multiple machines and devices, even simply connecting a BT headphone requires hacks to the BT stack. Historically, the [audio] BT stack has always been broken, regardless of the latest developments.

- ROLE/AFFILIATION: developer/sysadmin.

FLAVOR: Ubuntu Desktop

HEADLINE: Tiling window manager that just works

DESCRIPTION: Tiling WMs are great. However, most have regressions compared to Unity: e.g. needing to wrestle with systemd to get screen locking on suspend working, weird interaction issues between GNOME daemons, etc. It's easy enough to get a nicely functioning system with some googling, but it'd be great to have a tiling WM with no integration issues out of the box. Perhaps fork i3 and add what's needed to make it work seamlessly after install. Call it unity-tiling?

- FLAVOR: Ubuntu Desktop

- HEADLINE: Integration with Microsoft Active Directory

- DESCRIPTION: It would be nice if, in an enterprise environment, single sign-on (logging in with Kerberos) worked out of the box :). Samba shares in Nautilus are usually also slow (against a Windows server; between Linuxes it's OK) or have other login problems.

- ROLE/AFFILIATION: Software developer using Ubuntu in enterprise, which officially supports Windows.

+1, and one of the few things I miss from CentOS. It handles domains beautifully. You run one command to join the domain, and for the most part, everything else Just Works.

FLAVOR: Ubuntu desktop

HEADLINE-1: Support for Wayland clients in Unity.

DESCRIPTION-1: I don't think it will be beneficial for Unity to have a different window system protocol from the rest of Linux desktops (including non-Unity Ubuntu flavors). I don't want X11 to stick around as the compat layer that works with both Unity and everything else. Please make Mir into a Wayland compositor.

(I like the Unity UX. I'm not a Unity hater. Currently, I'm sticking to 16.04, because I don't have confidence in Ubuntu not breaking things by making Mir have its own protocol.)

HEADLINE-2: Autoremove old kernels before /boot fills up.

DESCRIPTION-2: The UX of having to manually remove kernels with an LVM/LUKS setup (using the default /boot size the installer chooses) is bad and makes Ubuntu with disk security unsuited for non-geek users.

ROLE: Browser engine developer but speaking as a user.

- FLAVOR: Desktop

- HEADLINE: An advanced mode for the file manager

- DESCRIPTION: I find that the default file manager is a bit dumb. There should be a mode to enable advanced features, like 'Connect to Server', where one can pick sftp, ftp, smb, nfs, vboxsf, etc. It's fine if it's hidden in a configuration modal, but 'advanced mode' should be an option.


Nautilus does support that: https://help.gnome.org/users/gnome-help/stable/nautilus-conn...

I'm not sure what version of Nautilus Ubuntu is on though.

There is that already. Click 'Other locations' in the sidebar and at the bottom of the window you have 'Enter server address...' textbox. Put your sftp:// or smb:// url there. Discoverable shares (smb) will be displayed in the window.

vboxsf is not a network redirector as it is in Windows; look for vboxsf mounts in the /media directory.

Related: at the moment when a folder mounted with s3fs (Amazon S3 file system) is opened it downloads all the contents which doesn't really make any sense.

Also doesn't Midnight Commander do what you're after?

- FLAVOR: Ubuntu Desktop

- HEADLINE: Improved remote desktop

- DESCRIPTION: Remote desktop solutions for desktop Linux really haven't changed a whole lot since I first started using them in the late 90s. It would be great to get something out of the box that was as responsive and feature-rich as, say, Windows's remote desktop feature. VNC is functional of course, but lacks a lot of the fluidity of other remote desktop solutions. Bonus features would include remote clipboard, sound, printers, and files.

As it stands, if I think I'm going to need to remote into a Linux desktop, I set up a Windows host and run Ubuntu in a VM. Then I use RDC/RDP to connect to the Windows host and run the VM in full screen. That's surprisingly more responsive than just running VNC in a native Ubuntu installation.


X2Go [1] and NoMachine (proprietary) [2] are the best bets for somewhat decent remote desktop experience in Linux.

[1] http://wiki.x2go.org/doku.php

[2] https://www.nomachine.com/

Thanks. I've tried them in the past and never could really get them to work. I'll give them another look.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Dismissable Notifications

- DESCRIPTION: I have been using Ubuntu since 10.04. One thing that makes me curse Ubuntu is when my notifications cannot be dismissed. I expected it at least when it moved to Unity, but that never happened. Although I have been living with it, this is something that catches me frequently.

- ROLE/AFFILIATION: Software engineer and maker

I actually prefer Ubuntu's way of doing it. The notifications never get in the way (since if you hover over them they become transparent, and clicking on them actually clicks underneath them). Whereas on macOS the notifications stay for ages and get in the way.

This is one of the worst things about Unity

My biggest wish is Ubuntu (and Debian) switching from systemd to any other init system. I know that won't happen, but I was asked, and that's the only thing I want, whether you like it or not.

also check https://www.devuan.org

We are very close to releasing Jessie as stable, backed by a vibrant community.


For those preferring an introductory video: https://www.youtube.com/watch?v=wMvyOGawNwo

Just saw the video and must say that it has really sparked my interest in Devuan. I heard of Devuan when the fork was first announced but never gave it much notice; now is the first time I've tried to understand what it really is. It seems like Devuan is much more organized and well thought out than I imagined. This is music to my ears:

Devuan will do its best to stay minimal and abide to the UNIX philosophy of "doing one thing and doing it well". It will foster diversity and freedom of choice among all its components and will perceive itself not as an end product, but as a process, a starting point for developers, a viable base for sysadmins and a stable tool for people who have enough experience with computers. Devuan will never compromise for more efficiency at the cost of the freedom of its users, rather than leave that and the responsibility for a secure setup to downstream developers.

I need to do much more research and of course testing, but Devuan could be the light at the end of the tunnel.

I'm poised to try out Void Linux after trying out FreeBSD (it was missing too many conveniences like Dropbox and Steam).

Could you convince me why I should try Devuan instead?

I need to do much more research before I can convince anyone, but as a start I like the philosophy of Devuan very much.

I have moved many many servers from Debian to FreeBSD after the announcement of systemd, and this has been great, but I must agree with you that on the desktop it can be a little inadequate.

Hasn't Devuan been "close" to a stable release for 2.5 years or so now?

The first thing I did after reading the Ask HN was Ctrl+F 'systemd'.

This is a big part of why I moved away from Ubuntu. I know this is a difficult situation - whenever Ubuntu tries to find a better solution it gets flak for dividing the community. But here it would have been great to have Ubuntu with an alternative to the disaster that is systemd.

I'm curious what init system would you prefer?

I would really like to see s6 in Ubuntu, but sysvinit + OpenRC would also be fine.

Why? You need to develop your answer.

I want software to be simple, well-engineered and lightweight. In my opinion systemd is the exact opposite of that. I think this page explains it well: http://suckless.org/sucks/systemd

In practice I have had many extremely frustrating problems with systemd including systems that suddenly became unbootable.

Late to the party, but better late than never, so -

- FLAVOR: Ubuntu Desktop

- HEADLINE: Support i3 as a fully integrated desktop

- DESCRIPTION: I've been using i3 for years now, just because I love the minimalism and the window tiling; I no longer see the purpose of overlapping windows. However, when I install and switch to i3, invariably something breaks in the underlying Unity/GNOME system: the special keys stop working, the control panel needs magic invocations to bring up all the icons, etc. I would love to have the base graphical system work flawlessly even if I switch from Unity to i3. For extra points, please make i3 installed by default.

- ROLE/AFFILIATION: I work for Time Out, the leading global magazine about going out!

BTW, thank you for all the hard work you and the team put in over the years!

Flavor: Ubuntu Phone[0]

Headline: Availability and Development

Description: I would love to see Ubuntu as a serious alternative to either iOS or Android in the mobile space. The availability of phones with Ubuntu pre-installed, as well as the devices[1] that support the image (for self-installation), is extremely limited. It's also not clear to me whether the project is still alive.

[0] https://developer.ubuntu.com/en/phone/ [1] https://developer.ubuntu.com/en/phone/devices/devices/

In the last month two friends went shopping for new phones and both considered the Ubuntu Phone. The models were sold out. I've heard similar cases in the recent past of models not being available. I hope they can fix these supply chain issues.

FLAVOR: Desktop

HEADLINE: If I try to "Quit" an app via the app bar more than once, please `kill -9` it (optionally with an are-you-sure dialog).

DESCRIPTION: Sometimes apps lock up. A forever-running query just destroys my SQLDeveloper, and I have to pull up a command line to kill it because the app's UI has locked up and right-click -> Quit doesn't do anything.

- FLAVOR: Desktop

- HEADLINE: Laptop Support

- DESCRIPTION: Support for various notebooks. Wireless and high resolution screens and battery life seem like pain points.

We have some biologists using Ubuntu on the desktop, and when they want to use a notebook, it's not easy to make that happen, so they end up on Macs.

- ROLE/AFFILIATION: Software developer for biologists

FLAVOR: Ubuntu Desktop

HEADLINE: Night mode by luminosity inversion

DESCRIPTION: Contrary to some other suggestions here, I am NOT talking about f.lux / redshift or similar blue light filters here. These are supposed to make you feel sleepy, but all I want is to remove blindingly bright lights in the middle of the night. Here is the pseudo-code for how it could work:

    Get some region on the screen (possibly the content of a window)
    convert all pixels in that region from RGB to HSL (not HSV/HSB)
    if average L value in the region > 0.5 {
        for all pixels {
            L = 1 - L
            re-render pixel
        }
    }

Similar color inversion modes that I know of:

  - a Kwin invert script, possibly assigned to meta+ctrl+i in KDE based distros

  - MS Windows color invert mode: win+"+", ctrl+alt+i
Note however that these are inferior, as they change color composition: they invert the RGB channels and don't do an HSL conversion.
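
The pseudo-code above maps directly onto Python's standard colorsys module; a minimal sketch (note that colorsys uses HLS ordering, and `invert_luminosity`/`night_mode` are made-up names for illustration):

```python
import colorsys

def invert_luminosity(pixels):
    """Invert the HSL lightness of (r, g, b) pixels (components in 0..1),
    keeping hue and saturation -- unlike an RGB channel inversion."""
    out = []
    for r, g, b in pixels:
        h, l, s = colorsys.rgb_to_hls(r, g, b)  # note: HLS ordering
        out.append(colorsys.hls_to_rgb(h, 1.0 - l, s))
    return out

def night_mode(pixels):
    """Apply the inversion only when the region is bright on average."""
    avg_l = sum(colorsys.rgb_to_hls(*p)[1] for p in pixels) / len(pixels)
    return invert_luminosity(pixels) if avg_l > 0.5 else pixels

# A bright region gets darkened: white becomes black, and a light red
# becomes a dark red with the same hue.
print(night_mode([(1.0, 1.0, 1.0), (1.0, 0.8, 0.8)]))
```

This is why the result looks better than an RGB invert: hue is untouched, so only brightness flips.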

I don't know if it does the "luminosity inversion" thing you mentioned, but I'm more than happy with xcalib(1). For extra comfort I apply redshift after inverting colors.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Wayland

- DESCRIPTION: Get with the rest of the community. Bite the bullet and get unity working on wayland.


- FLAVOR: Desktop

- HEADLINE: Shared Electron

- DESCRIPTION: With the rise of JavaScript applications running on top of Electron, it would be nice to have an Electron package to depend on (much like Android WebView). That way not every Electron app weighs over 50MB.

I think the Electron people need to do some work on this; right now they seem to release updates very frequently. Perhaps they need a long-term support version.

No one has been ragging on Unity.... They should be. Unity is still garbage. Ruined Ubuntu for me after about 10 years of usage... Still never returning because Unity is such a resource hog, so non-standard in its interface, and the fact that teaching someone Unity is a useless skill.

We used Ubuntu for years to teach people how to use a computer for the first time: we gave them old PC's with Ubuntu installed. Unity made this impossible. It was too slow for the old machines, too hard to figure out for the new users, and too unfriendly for experienced users to tolerate.

Hate hate hate hate Unity. Always will. Went to Mint because of it. Even installing another windowing system was a huge pain in the ass, because first, you had to install Unity and go to Synaptic and install an old Gnome. This took HOURS because Unity was so freaking SLOW!

I dunno, maybe you fixed these things, but Unity ended my relationship with Ubuntu after years of advocating for it to everyone I knew.

The only actionable complaint in your whole rant is that it is a resource hog. For reference, on my system all the processes with "unity" in the name take about 210 MB of RAM (with a dozen windows open), a negligible amount for a modern computer or even a phone.

It took me about 3 seconds to figure out: click the menu button and type what I want. That really is the largest user-facing feature, and it's a common one with other menu-based OS UIs.

Of the people I have put in front of it (about 5), all but one (my technophobic grandmother, who never understood any UI and still has a hard time with the very concept of files and folders) figured it out quickly and rarely ask me questions. The most common question is how to install some Windows program.

Your lack of concrete examples, and its conflict with my experience, leads me to believe you are exaggerating.

In my experience it is true that Unity struggles on older computers, or perhaps where the graphics driver situation isn't great. I've had a lot of installations where the dock takes 5-10 seconds to load. Agree with you about ease of use though, people seem to pick it up quite quickly. For the basic functionality of opening and switching applications, people seem to pick it up almost immediately. Not quite the same for using the dock, but even so.

On the other hand, I really like unity. It works just like my old Windows 7 setup, but with handy features like being able to search menu items using alt.

Technically Unity seems to be a bit of a mess, and I like Gnome 3's multi-desktop setup better, but I definitely would like to see them keep the basic design of unity.

Ubuntu dropped the ergonomic Gnome to whip us with their own custom, unpolished Unity.

Now they're doing it all over again with Mir, while Wayland could be a Linux standard. Sounds like IE all over again.

That gives me trust issues with Canonical. What's more, they now leverage trademark law to forbid people from using ubuntu-? packages.

Then they pulled the Amazon search in the menu, which showed Canonical's misunderstanding of privacy issues.

It doesn't prevent me from using Ubuntu, but while I give 1% of my company's revenue to open source ($800, to LE and Postgres), I don't consider Canonical a candidate in the OSS category.

Sounds like systemd vs upstart all over again.

Ubuntu Mate, mate. ;)

Flavor: Ubuntu Desktop

Headline: a more up-to-date apt-repository


I'm tired of having to add PPAs whenever I need fresh copies of software. I've never not needed LaTeX, Python, pip, Gradle, etc. For most of these apt-get works fine, but not for LaTeX or Gradle, so for now I have a bunch of scripts that I run, for instance https://github.com/leksak/settings/blob/master/install-tex.s...

I'd look to CoreOS for inspiration on how apt-get could be revamped

Yes! I am using macOS at work, and I was surprised to discover that I like Homebrew much more than Ubuntu's APT, because of how up-to-date Homebrew's packages are.

- FLAVOR: Ubuntu Desktop

- HEADLINE: independent work-spaces for each monitor with multiple monitors

- DESCRIPTION: On macOS when you maximize an application it creates its own "workspace", and each monitor handles these independently. With GNOME 3 each additional secondary monitor is its own workspace. These are both great but not ideal. It would be great if Unity could be more like the tiling window manager i3 and have independent workspaces assigned to specific monitors. Let's say you have a laptop with two workspaces (1, 2) and an external monitor with three (3, 4, 5): when focused on the laptop monitor, ctrl-alt-arrow would switch between 1 and 2 only, while the workspace on the external monitor stays where it is. When focused on the external monitor, you switch only between the workspaces on that monitor.

- ROLE: software/infrastructure engineer
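For reference, i3 implements exactly this kind of per-monitor workspace pinning with `workspace ... output ...` directives in its config. A minimal sketch (the output names eDP1/HDMI1 are examples; check yours with `xrandr`):

```
# ~/.config/i3/config (fragment)
# Pin workspaces 1-2 to the laptop panel and 3-5 to the external monitor;
# switching workspaces on one output leaves the other output alone.
workspace 1 output eDP1
workspace 2 output eDP1
workspace 3 output HDMI1
workspace 4 output HDMI1
workspace 5 output HDMI1
```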

- FLAVOR: Ubuntu Desktop

- HEADLINE: Make Ubuntu Make a first class citizen and bring accompanying documentation alongside this.

- DESCRIPTION: Ubuntu Make has undergone a couple of stages, including a rename. I would love to see a graphical tool that is either standalone or a plugin to the Software Center: an "Ubuntu Make" application with a nice little icon. It should come with basic tooling at first, but grow into a resource for finding documentation on how to build snap packages and deb packages, and generally do software development for Ubuntu, whether back-end or front-end. I've seen elementary OS' documentation and it is nice; I would love to see Ubuntu become a great way for people new to software development and Linux to really dive in. Ubuntu Make has more potential than it gets credit for. I would also love to see it resolve installation issues for other compilers and build tools where there are known issues and known solutions, or at least provide some process to aid in fixing issues that are not so trivial for newcomers (though that's just me thinking way ahead of time). I hope it gets serious attention at some point. I've had odd issues with the D compiler (DMD) because I'm missing a package or something has to be symlinked; things a newbie would spend hours searching for could be part of a simpler set of developer documentation somewhere.

- ROLE/AFFILIATION: Software Developer and daily Ubuntu User at work and at home.


- HEADLINE: OpenSSL v1.1.0

- DESCRIPTION: Do it! I really want ChaCha20 and Poly1305.

- ROLE: Server admin / desktop user

I feel that I'm totally out-of-sync with the rest of open source community. The only thing that I really want is a hardware company with a strong focus on open source, basically an Apple for open source.

I want a single website w/ a shop, docs and related resources where I can consume anything from a mobile device, laptop, chromecast-like devices or anything similar.

I've spent $3000 for my last laptop and the most important thing was compatibility with open source software.

FLAVOR: Ubuntu Desktop

HEADLINE: Better (more polished) HiDPI support (also for legacy apps)

DESCRIPTION: I am running 16.04, so I might be missing some of the latest fixes. But some applications (especially Qt applications like VLC) have issues with HiDPI monitors. Moving an app between a HiDPI and a non-HiDPI monitor requires a restart in order to get correct sizing.

Intel VTune has this problem as well. Even their 2017 edition appears to use Gtk2, which AFAICT makes it partially resistant to the usual workarounds for HiDPI scaling.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Add tablet mode similar to Windows 10

- DESCRIPTION: As far as I know, Ubuntu has no tablet mode, which makes it difficult to use with touch screen laptops like the Lenovo Yoga series


I'm adding +1 to this.

Got a Yoga 3-4 months ago, but still haven't managed to find time to play with its configuration properly. What I've figured out so far: the laptop sends a special "key" (as in, a special char is "pressed") when rotated past 180 degrees. This can serve as a trigger, but unfortunately that key differs depending on the Yoga model. I have no idea how to make screen rotation work. HiDPI options being integer-only means I still can't set scaling properly, and I have to resort to a "hack" (maximum resolution, then scaling all the interfaces with the Tweaks tool). The onboard package is a stability mess as a touchscreen keyboard.

I'm not forced to dual boot because of gaming. I'm forced to dual boot because I can't get the damn tablet experience for light browsing.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Make KDE again a first class citizen

- DESCRIPTION: Kubuntu used to be very similar to the Ubuntu distribution and now, because of the "fork", it's drifting. Doing an `apt-get install kde-desktop` on an Ubuntu installation is also very different in configuration, packages and behavior from a Kubuntu install, and it should be the same.

- ROLE/AFFILIATION: Research Scientist at a large multinational

- FLAVOR: [Ubuntu Desktop]

- HEADLINE: Improve Desktop Apps Ecosystem. Make it easy for Ubuntu App Developers to Make Money $$$

- DESCRIPTION: I recently moved from Mac OS X to an Ubuntu desktop machine for day to day development. All my comments are relative to Mac OS X (I apologize cause I'm still a Mac fan boy). The only thing I really miss is the massive number of high quality apps available to me on Mac OS X. I wish Ubuntu could support Mac Apps in some sort of Mac sandbox (ala Wine for OS X). I know this is a pipe dream cause of the complexity of it but putting it out there.

A more realistic request is that you create/encourage tool makers to create snaps. Snap packages must become compatible with Flatpak to have any chance of becoming ubiquitous. Fragmentation in Linux desktop apps will only continue hurting Linux adoption. I think the Ubuntu App directory feels too basic, with too few options. Encouraging developers with better tools and better discovery, and making it simple to port Mac/Windows apps to Ubuntu, is the only way Ubuntu can begin to gain market share. I love Ubuntu but I still go back to my MacBook Pro when I need to edit audio or have to log in to many sites, since I use 1Password and they have no Ubuntu app.

Ubuntu could work with the top 500 Mac app developers and help/advise them on how to easily port their apps to Ubuntu. I'd happily pay double the Mac App Store price to have them on Ubuntu, but there is no way for me to give them money. Get money to the developers and they'll come. This is what's missing from Ubuntu apps.

I apologize for the long rant. I would've written a shorter comment but I didn't have the time.

- ROLE/AFFILIATION: Developer at Startup

- FLAVOR: All flavors

- HEADLINE: General polish + "good" defaults for non-technical users.

- DESCRIPTION: For quite a few releases we've had lots of new features, but they all shipped with a LOT of bugs, some small, some big. I would really love it if, once in a while, the major focus were just polishing the defaults to make the experience hassle-free. Xubuntu shipped with a broken color scheme or non-working sound; Ubuntu GNOME almost always has some bugs that are a pain. I would love a release where all the desktop functionality just works and is polished without me tinkering with things.

- ROLE/AFFILIATION: Developer/Sysops

- FLAVOR: [Ubuntu Server, Ubuntu Cloud]

- HEADLINE: Follow standards and respond to bug requests.

- DESCRIPTION: The cloud team is responsible for making machines available to cloud users, including making Vagrant boxes. The problem is this team refuses to follow standards. For example, Vagrant boxes should have the main user named "vagrant", but Canonical's instead force the user to be named "ubuntu" -- and a ticket about this has been open for a year now[1]. There have also been network bugs[2] that have been ignored for almost as long.

This is a big deal for people who use vagrant for testing. We essentially can't use the Canonical provided boxes, and this issue having been ignored for so long is not confidence inspiring.

1. https://bugs.launchpad.net/cloud-images/+bug/1569237

2. https://github.com/mitchellh/vagrant/issues/7288
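In the meantime, a per-project workaround (just a sketch; it doesn't fix the underlying box) is to override the SSH username in the Vagrantfile:

```ruby
# Vagrantfile: tell Vagrant to log in as the box's "ubuntu" user
# instead of the conventional "vagrant" user.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"   # example box name
  config.ssh.username = "ubuntu"
end
```

That at least lets provisioning run, though anything that assumes the `vagrant` user (shared folders, third-party plugins) can still break.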


HEADLINE: Convenient snapshot & rollback by default

DESCRIPTION: Possibly implemented as snapper + lvm thin provisioning or btrfs. Other distros already have this, but their implementations are far from user-friendly.

Just curious, what is missing from the implementations in the other distros you tried? I switched from Ubuntu to openSUSE last year, and so far that's the only distribution where I've used snapper + lvm. I think an implementation like theirs is pretty user-friendly (just a personal opinion), or at least a good start: it gives the option to use snapper + btrfs at install time, plus the option to boot into a read-only snapshot from GRUB.

I haven't used SUSE, but I've used snapper with lvm thin provisioning on RHEL. It works, but it still needs manual fiddling with config files. There are a couple of other issues with lvm-thin on its own: the metadata size isn't chosen well by default, from what I remember, and can easily fill up; deletes also require manual trimming (can be done in a cron job); and lvm-raid isn't directly offered with thin provisioning in Anaconda, which would still do mdadm and run lvm-thin on top of it. Overall, the experience is not great.

I suppose I never had to do that because openSUSE handled the configuration for me. But I see your point. The experience can definitely be improved by the distributions. I had never used snapshots before moving to SUSE and now that I've actually used snapper to rollback a couple of times I don't think I'd want to have a linux install without that feature.

That's my no. 1 request too. Just two days ago I had to reinstall and waste a day because an update messed up my system.

My solution for now is to have TimeShift[0] make regular backups of system files.

[0] https://github.com/teejee2008/timeshift

ZFS is the drop-in FS for the root filesystem to allow this (btrfs et al. are not production-ready).

lvm thin provisioning lets you achieve this with any file system. So it doesn't have to be zfs or btrfs. It can be xfs or ext4 too.

It would make sense as soon as the default filesystem becomes btrfs or ZFS. As it is now, it would need to be a feature that only appears if it finds that you have the proper filesystem.

Not necessary. Look at lvm-thin + snapper.

FLAVOR: Ubuntu Desktop

HEADLINE: Make Adobe Photoshop and Lightroom happen.

DESCRIPTION: I want Ubuntu to have some strategic plan to get Photoshop and Lightroom fully working (and supported), as well as monitor-color-calibration software. We'd move my wife's photography business to Ubuntu in a heartbeat if this happened.

AFFILIATION: I provide support and guidance on computing issues for my wife's photography business.

+1 to this. That was the main reason I still have to use Windows and macOS.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Make Ubuntu not suck on laptops

- DESCRIPTION: What I want is for Ubuntu to partner with someone on the hardware side to provide a meaningful alternative to the macbook pro that does not suck. The OS is already fine enough if you could make it work very well with a decent laptop out of the box. I have tried Dell Sputnik...endless software pain. I have tried System76... crappy hardware. Make a Linux laptop experience that does not suck and rivals Apple for quality. That is what I want.

- ROLE/AFFILIATION: Director of large IT/Ops team in large scale SaaS environment

I know that this is very unlikely to happen, but I wish Ubuntu had rolling releases. For me it would be okay to have a new version every 10 years (for heavy migrations like UEFI, 64-bit, systemd). I had Ubuntu on most PCs at home, but switched most of them to Arch, as I was sick of the 6-monthly horror upgrades.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Rolling Releases

- DESCRIPTION: make a distribution which does not require any 'apt-get dist-upgrade' as 'apt-get upgrade' always brings it to the latest stable software version (like Arch and Gentoo)

- AFFILIATION: just a long time linux user

@dustinkirkland great idea to ask HN :-)


So, for the record, that's exactly the approach we're taking with Ubuntu Core. We're getting there! Thanks for the feedback.

- FLAVOR: Ubuntu Server

- HEADLINE: Default swap space doesn't make sense for servers with HUGE ram

- DESCRIPTION: Recently I tried to install Ubuntu on a server-class machine with a huge amount of RAM and disk storage spread across many SSDs. Apparently due to the size of the RAM, Ubuntu was attempting to set aside so much swap space that it took up most of the boot disk! It was very painful to change the default, and I would have switched to CentOS if not for LXD availability. (Note that I am a programmer, not an admin, and I was doing this as an experiment.)

I heard an anecdote at $work where they ordered servers with positively huge RAM (in the TiB range) for big-data applications, then wondered why the storage box filled up within a few days. Turns out some admin remembered advice from a 90s-era system setup manual that recommended setting swap size = 2 * RAM size.

Yes, indeed. I assure you, I've been fighting this particular battle since 2008 :-)

In any case, I hope you'll be pleased to learn that 17.04 will actually use swap files rather than swap partitions; swap files are far more easily adjusted after the fact.
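As an aside, the sizing rule itself also deserves a cap. A toy sketch of the kind of tapered heuristic that makes more sense than 2x RAM at this scale (the break points are my own illustration, not Ubuntu's actual installer policy):

```python
def recommended_swap_gib(ram_gib):
    """Suggest a swap size that does not track RAM linearly forever."""
    if ram_gib <= 2:
        return ram_gib * 2           # tiny machines: hibernation headroom
    if ram_gib <= 8:
        return ram_gib               # desktop range: swap == RAM
    if ram_gib <= 64:
        return max(4, ram_gib // 4)  # workstations: taper off
    return 8                         # huge-RAM servers: small fixed cushion

# The 90s "2x RAM" rule would demand 2 TiB of swap on a 1 TiB box:
print(recommended_swap_gib(1024))  # prints 8
```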

- FLAVOR: Ubuntu Desktop, Ubuntu Server

- HEADLINE: Windows subsystem

- DESCRIPTION: Windows 10 lets you install Ubuntu as a subsystem and use it without dual booting. In practice, we _need_ Windows tools (like WebDeploy) or GUI tools (like Photoshop) at work, but would much rather use Ubuntu in general. The compromise (Ubuntu subsystem) works, but the other way around would be much better. I'm fine with paying for Windows, and CLI tools only would still be a great start.

- ROLE/AFFILIATION: Software developer. I also introduced a lot of people to Linux over the years for home usage.

If I could run some of the DCC apps I have on Windows under Linux at full speed (they work only on Windows, predominantly Adobe), even if I have to have a Windows license and all that jazz, that would be the ultimate setup for me.

Seems like a duplicate of https://news.ycombinator.com/item?id=14004532 ?

This already exists: WINE

You need to convince Microsoft first.

FLAVOR: Ubuntu Desktop, Ubuntu Server, and Ubuntu Core

HEADLINE: traceroute

DESCRIPTION: Installing some version of traceroute by default may be desirable, because sometimes when you find yourself wanting traceroute, it's because you want to debug a problem that happens to prevent installing packages over the network.

If I try to run traceroute on a system with no traceroute package installed, I get a message telling me I can either install traceroute or inetutils-traceroute. It doesn't explain what the tradeoffs are. It doesn't explain why Ubuntu can't simply have one good traceroute program that does everything.

mtr can also be good, and while I usually run it in text mode, it does have an X11 version that may pull in more dependencies than some people might prefer. I've also on occasion found tcptraceroute useful, and of course sometimes a Paris traceroute is good to have. Installing more than one program that has traceroute functionality in the default installation might be appropriate.

FLAVOR: Ubuntu Desktop

HEADING: do not break things that work

DESCRIPTION: every time I update Ubuntu I cross my fingers hoping that things that work don't break, like Guake on multiple monitors, among other bugs. Ubuntu is very prone to regressions. Maybe more tests would be useful?

HEADING: the Unity menu UI is badly designed

DESCRIPTION: Apart from the app search feature, which works well, the app navigation is so ugly: giant icons. I have a 2K monitor and I see just 30 apps when I go to the app list! WTF! I have to scroll this giant-icon menu also because the app list isn't resizable or fullscreenable. Those giant icons drive me mad, no joke!

FLAVOR: Ubuntu Desktop (on laptops)

HEADING: Fix long-standing WiFi issues

DESCRIPTION: there are a lot of bugs related to WiFi on laptops. I had the Power Management: Off one: http://askubuntu.com/a/537375/53268 but there are many others. I've always experienced bad stuff

ROLE/AFFILIATION: Web developer, freelance

  FLAVOR: Ubuntu Desktop  
  HEADLINE: High quality Bluetooth sound by default
DESCRIPTION: Tried bluetooth sound in Ubuntu 16.04 for the first time yesterday and the sound was horrible! Apparently I need to do some configuration to get it working properly. Not needed on android. Soundblaster Jam headset.

  FLAVOR: Ubuntu Desktop
  HEADLINE: Improved battery performance

  FLAVOR: Ubuntu Desktop
  HEADLINE: More stable and polished desktop
DESCRIPTION: Yesterday a window frame in fullscreen got stuck. Meaning I had a cross in the top left corner no matter what I did.

  FLAVOR: Ubuntu ALL
  HEADLINE: Node.js package updated to latest Stable version

  FLAVOR: Ubuntu Desktop
  HEADLINE: CTRL + ALT + L no longer locks the screen, replaced with SUPER + L
DESCRIPTION: CTRL + ALT + L is "format code" in IntelliJ. SUPER + L locks the screen in Windows. I always have to modify this...

For updated nodejs packages, try NodeSource. https://github.com/nodesource/distributions/#debinstall

They have a shitty curl|sh installer script, but it should be possible to extract a regular deb line and gpg key out of it.
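If I remember the layout right, what that script drops on disk boils down to a sources.list entry plus their signing key (the version number and codename here are examples; adjust to your release):

```
# /etc/apt/sources.list.d/nodesource.list
deb https://deb.nodesource.com/node_7.x xenial main
deb-src https://deb.nodesource.com/node_7.x xenial main
```

plus importing their key from https://deb.nodesource.com/gpgkey/nodesource.gpg.key via apt-key before running apt-get update.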

- FLAVOR: Ubuntu Desktop

- HEADLINE: Don't force users to have taskbar on the left

- DESCRIPTION: Most users have the taskbar at the bottom. Putting it on the left by default is probably a bad idea, but making it impossible to move is most certainly an awful one.


FLAVOR: Ubuntu GNOME (but I also like Unity)

HEADLINE: Trackpad drivers that feel like Apple's

DESCRIPTION: I'm using libinput, but my Magic Trackpad is no fun at all - thumb rejection does not work, the acceleration curve seems to be different from macOS, and the whole OS lacks kinetic scrolling. fusuma works for gestures, and should be part of Ubuntu (GNOME/Unity) IMHO. Having to use a mouse = physical pain.

ROLE/AFFILIATION: Freelance developer, tepidly moving from iOS programming into JetBrains IDEs

- FLAVOR: Ubuntu Core

- HEADLINE: zfs setup in installer.

- DESCRIPTION: I would love to see an easy way to install the system with ZFS. The current way is to follow the zfsonlinux wiki, and let's put it this way: it is not easy for beginners...

Is that for Ubuntu Core or Ubuntu Desktop?

what is your use case?

FLAVOR: Ubuntu Desktop.

HEADLINE: Better support for proxy for those of us behind corporate firewalls.

DESCRIPTION: Passwords need to be kept in env variables, which can leak out. Every tool does it a little differently: curl, wget, Chrome, Firefox. I had to modify Python code for apt-get to pass the proxy.

An Acquire::http::Proxy in apt.conf wasn't sufficient?
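For anyone following along, that would look something like this (host, port and credentials are placeholders):

```
# /etc/apt/apt.conf.d/95proxy
Acquire::http::Proxy  "http://user:password@proxy.example.com:8080/";
Acquire::https::Proxy "http://user:password@proxy.example.com:8080/";
```

Though, to the parent's point, that still leaves the password sitting in a plain-text file, and every other tool needs its own equivalent.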

- FLAVOR: Desktop

1. HEADLINE: Allow users to setup a caching drive in the standard installation process

Currently, the setup process for creating a caching drive (I have a 16 GB SSD in addition to my HDD) is very convoluted, with lots of conflicting information about how to set up bcache. Even after finally getting it working, my computer still hangs occasionally when RAM is maxed out and the cache drive has to write to the HDD.

Honestly, the only thing I really care about are wifi drivers, and it isn't really your fault that the card makers are bad at that.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Unity Tiling Manager


Unity with native tiling-manager features that can organise windows automatically like XMonad, i3, Amethyst, etc., but without replacing Unity as the window manager.

I adore Amethyst automatic tiling in macOS, especially on a 34" ultrawide screen. I used to use Compiz Grid in Ubuntu to manually layout my windows but that was a chore. Then I tried X Tile which was limited, poor UX and poor support for multiple monitors.

XMonad, i3 and others mean replacing Unity altogether, which I do not want; I just would prefer built-in window organisation in Unity. Supporting XMonad's and Amethyst's shortcut keys would be nice for muscle memory.

- ROLE: Technical Architect / Consultant

- Flavor: Desktop

- Headline: Rolling mesa, drm & kernel updates

- Description: Mesa is moving at a rapid pace and improving a lot. Because versions are locked, you might find yourself 6-8 months behind current stable, and that's MASSIVE. That's why the padoka/oibaf PPAs are so popular, but only among the more tech-savvy users; the rest just look at the sad state of Linux gaming.

How about making an official (but opt-in) version of the padoka/oibaf PPAs, instead of going full-out rolling updates?

FLAVOR: Ubuntu Desktop

HEADLINE: Stability / UI Bug fixing / Apport UI

DESCRIPTION: Sorry, long rant :)

Have been admin at Uni for 30 Ubuntu workstations. All 16.04 so I don't know about 16.10 or 17.10 improvements but what's missing in Unity is polish.

- The "Ubuntu has experienced a problem" dialog needs rework and needs to move to the tray or be queued; the application's name should at least appear on the modal. I've seen situations with more than 50 of these modals layered on top of each other.

- There are already bugs filed on Launchpad for Unity; please consider them and work on making the experience smoother. Especially focus on making window management sane with apps that are not Qt/GTK, like emacs, xterms and such. There are bugs in the menu bar, window position is often broken -- lots of small stuff like that. The launcher tends to misbehave. I would really love it if Ubuntu just did a sabbatical year of fixing all the bugs in the Unity UI and thinking about good design.

- The menu bar is subtly broken for a lot of apps.

- Nautilus and gvfs should take a long look at some things dolphin and KDE are doing right and adopt some ideas.

- Also: stability, stability, stability. Nautilus eating 10 GB of memory due to a large folder, or its handling of large files, is all kinds of broken. This is stuff that happens daily for a lot of users, and investing some time to implement sane behavior should not be so hard. Basically, I wish the Ubuntu Desktop team would torture their UI and take notes on how it breaks: opening a 10 GB .tar.gz, having 10,000 files in a folder, over NFS, over sshfs. Stuff like this needs to work without hassle and provide feedback, not hang.

- The small stuff matters: polish. Often when something does not work, no UI feedback is provided. Torture your desktop, do stupid things and see how it breaks in strange ways. Fix that!

ROLE/AFFILIATION: Computer science student, Linux user, Admin for Ubuntu Desktops

Other than that: good job, I like Ubuntu and Unity. But being stable and rock solid would make it not only okay, it would make it great.

- FLAVOR: Ubuntu Desktop

- HEADLINE: NVIDIA-nouveau conflicts that result in black screens after login unless various fixes are applied manually.

- DESCRIPTION: No more nasty nouveau-NVIDIA driver conflicts that result in black screens after login -- see all these reports: https://www.google.ca/search?q=nvidia+ubuntu+black+screen&oq...

- ROLE/AFFILIATION: CEO, Exocortex.com / Clara.io / ThreeKit.com

Sadly, the best workaround is to uninstall the Ubuntu-blessed NVIDIA drivers and use the installer you download from nvidia.com. These usually even continue to work after you update the kernel. They are also far more recent.

Even so, this should be automatic. It is a horrible experience to install NVIDIA drivers on Ubuntu. I am not sure of the solution, but we shouldn't sacrifice usability in the name of "open source"-ness.

There should be a checkbox in the installer to install the proprietary NVIDIA drivers, so that we do not get a black screen.

Anything but a black screen.

FLAVOR: Ubuntu Desktop

HEADLINE: Terminal-Icon on LiveCD-Desktop

Please put a shortcut to a terminal emulator, somewhere visible, on the desktop of the Ubuntu LiveCD.

Whenever I have to use that disc in an emergency situation, I'm glad that there is an icon for Amazon (in case I forgot the URL of Amazon), but I'm always struggling to figure out how to get to a bash prompt.

ctrl-alt-t is my go-to hotkey!

- FLAVOR: Ubuntu Desktop

- HEADLINE: Fix hibernation with entire hd LUKS encryption

- DESCRIPTION: I know this is an issue on a grander scale, but as we all know, hibernation isn't possible when you have your whole disk encrypted. If this could be fixed that would be great; otherwise at least remove the hibernate option.

- ROLE/AFFILIATION: Senior Developer at Clevertech

"as we all know hibernation isn't possible when you have your whole disk encrypted" -- this should be stated during the installation procedure of ubuntu


HEADLINE: Allow safe sensible package fixes

DESCRIPTION: Sometimes the distribution version of a package is broken and the problem is marked WONTFIX because it involves a version bump, even in the case where it is not a library or the version bump is only there to fix a typo in a config file. This is extremely frustrating for end users when they learn that mplayer will never have GUI support in any version of Ubuntu 14 or there will never be manpages for zsh. If something is a bug and there is no reasonable chance that another package depends on the buggy behavior, allow the package to be fixed.

I don't follow Ubuntu packaging policies specifically but isn't this what the $RELEASE-backports suites are for?

Yes, that's exactly right.

- FLAVOR: Ubuntu Desktop


- HEADLINE: Superb Wine integration

- DESCRIPTION: Windows 10 will soon be able to run Ubuntu Xenial as a subsystem; I would like to see Ubuntu respond with superb Wine integration.

FLAVOR: Ubuntu Desktop

HEADLINE: Multi-Monitor Support with HiDPI

DESCRIPTION: I would like to be able to use multiple monitors with various DPI in Linux without pain and suffering. Please see Mac OS X for how to get this right — they did. I would like to stop worrying about which of my monitors are plugged in at boot, I'd like to be able to plug them in whenever I need to. I'd like to be able to smoothly move a window from one screen to another without the window becoming impossibly small or overly large.

ROLE/AFFILIATION: Software and Electronics Engineer trying to do his job(s) using Ubuntu.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Wayland Support

- DESCRIPTION: Mir is almost a bigger joke than GNU/Hurd and will never be complete; I hope Ubuntu makes Wayland the default.

FLAVOR: Ubuntu Desktop

- HEADLINE: easy way to remap keys

- DESCRIPTION: until now I had to write a script which runs on startup and maps my Print key to the secondary menu key -- and this gets lost after waking my laptop from sleep. I want a nice GUI without having to write a script.

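As a stopgap until a GUI exists, the Print-to-Menu remap can live in ~/.Xmodmap, which many display managers apply at login (the keycode below is a guess; Print is often 107 under evdev, verify with `xev`):

```
! ~/.Xmodmap (fragment)
! Keycodes differ between machines; press the key in `xev` to confirm.
keycode 107 = Menu
```

It can still get lost on suspend/resume with some drivers, which is the commenter's complaint; re-running `xmodmap ~/.Xmodmap` from a resume hook is the usual (ugly) workaround.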

- FLAVOR: Ubuntu Desktop

- HEADLINE: Wayland, WiFi support

- DESCRIPTION: numerous WiFi dongles still don't work or require unnecessary work

I had problems with integrated Wifi in laptops.

I try 'to sell' Ubuntu to my family, but the WiFi drivers are always a problem. Especially with HP laptops.

Please god let this be taken up! I'm so fed up of

     sudo service network restart

Hi Dustin!

- FLAVOR: Ubuntu Desktop

- HEADLINE: Ubuntu Subsystem for Windows :)

- DESCRIPTION: An integrated system (Wine is not user friendly imho) to launch Windows programs.

- ROLE/AFFILIATION: Linux (and Windows) user and developer. @vinnes

I may be wrong, but I don't think Canonical has the engineering resources to pull it off. Even if they did, the main issue is that Microsoft does not publish their API spec, which is why Wine and ReactOS devs bend over backwards to be compatible with Windows binaries. It also does not make sense from a business perspective, considering there is no demand for such a system outside of the very, very small number of hobbyists who run Windows games and software on Wine.

If these issues are anything to go by, there most certainly is a demand:

* https://github.com/Microsoft/BashOnWindows/issues/1494

* https://github.com/Microsoft/BashOnWindows/issues/1243

* https://github.com/Microsoft/BashOnWindows/issues/1516

Did you not realize what vinnes meant by "Ubuntu Subsystem for Windows"?

Also I want to add: if you want to use Windows programs, VirtualBox or VMware (+ lots of RAM) is probably the way to go. The Windows stuff doesn't work on macOS either (I even bought the CrossOver stuff ;)).

The computer is already running Windows, there's no need for Wine. Just the ability to launch a Windows program from within a Linux program would do.

Flavour: Ubuntu Desktop

Headline: Preconfigured settings per known device

Description: Allow users to publish pre-configurations on Ubuntu.com. Then let me review and apply the entire thing, or fragments of it, to my fresh Ubuntu install. I should be able to get an XPS M1330 install that just gives me the right stuff for my computer.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Bleeding edge drivers with autodetection / appropriate kernel tuning

- DESCRIPTION: How many years has it been that we've needed the correct power management / drivers enabled to properly use Quick Sync with discrete GPUs? For how long will we need to tune CPU behavior / peripheral power management ourselves to get decent power usage? An "I'm the system, I know what I need" one-button optimization would be really appreciated...

- ROLE/AFFILIATION: System analyst in a SB.

- FLAVOR: Ubuntu Phone

- HEADLINE: I want a snap-based Ubuntu Phone now

- DESCRIPTION: Being a click-based Ubuntu Phone supporter from the beginning, do I need to say more? Ubuntu show me some love!

- ROLE/AFFILIATION: beta-tester

- FLAVOR: Ubuntu Desktop

- HEADLINE: Better touchpad gestures out of the box

- DESCRIPTION: I recently got my first ultrabook. I used Windows on it for the first few weeks before installing Ubuntu. The touchpad gestures were very useful for certain activities such as minimizing/maximizing and switching between windows. It seems that Ubuntu has a very limited set of gestures, and after a couple months I still feel like my productivity is held back due to the relative difficulty of switching among windows.

- ROLE: Full-time student

- FLAVOR: Ubuntu Desktop

- HEADLINE: Remap Ctrl+Q to quit to something else

This is a UX mess, as it's too easy to mistype for another key (like W or 1) and end up closing the program we're currently in. This destructive action already has a «standard» shortcut (Alt+F4), which is much harder to mistype. Destructive actions shouldn't be this easy to trigger.


- FLAVOR: Ubuntu Desktop

- HEADLINE: Add Flux/Redshift natively

Now that iOS/macOS are adding a night-time light filter, this feature will become more and more common natively in OSs -- why not add it to Ubuntu now?

- FLAVOR: Ubuntu Desktop

- HEADLINE: GUI Everything (real control panel GUI)

- DESCRIPTION: The command line is good for giving command sequences, like "do this and do that and also more", and it works only if you already know the commands. The command line is really bad for configuring stuff, i.e. telling the computer how to do things. It is also the worst thing ever when it comes to exploring and discovering commands and configurations. Some people argue that the CLI is faster, but the saved time is not always worth the brain power, compared to the pleasure of getting stuff done "slower" but intuitively with a GUI. Also, the time spent learning a certain command rarely matches the time saved using it. It is much more difficult to screw things up using a GUI, because you can go back with a simple click, while the command to undo something rarely resembles the one that got you somewhere you didn't want to be.

A general rule for good software is "don't hide functionality". If you are putting a lot of important stuff behind a command line, you are hiding stuff, even if you can ask for a command list.

Since Ubuntu, from what I understand, wants to be an OS for a wider audience, I hope you will consider putting a lot of effort into improving the UI and UX of the OS, and a good, complete GUI is a great way to start.

My hope is that when a user searches "how to do X in ubuntu", they won't get just a list of commands, but also a step-by-step guide, just like happens on Windows.

ROLE: software developer, former UX/UI designer

Loving Ubuntu myself, I would like a system that doesn't require command-line work to fix things, so everyone could use and maintain it, not just experts (e.g. a "fix my computer" button that, in the worst case, would reinstall everything except the home folders).

CoreOS-like A/B system partitions would be a very useful addition to Ubuntu.

Or at least snapshotting and then booting to the last good snapshot.

Both solutions (A/B partitions and snapshots) would require a separate partition or subvolume for /home.
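The snapshot variant can already be approximated by hand today if the root filesystem is btrfs; a rough sketch, assuming Ubuntu's default @ subvolume layout and /dev/sda2 as an illustrative root device:

```shell
# Mount the top-level btrfs volume (subvolid=5) and take a read-only
# snapshot of the root subvolume before upgrading:
sudo mount -o subvolid=5 /dev/sda2 /mnt
sudo btrfs subvolume snapshot -r /mnt/@ /mnt/@pre-upgrade

# To roll back after a bad upgrade (from a live session):
sudo mv /mnt/@ /mnt/@broken
sudo btrfs subvolume snapshot /mnt/@pre-upgrade /mnt/@
```

The point of the request is that this dance should be automatic and wired into the boot menu, not something the user scripts themselves.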

If this is interesting to you, you'll want to keep track of what we're doing in Ubuntu Core!


This feature already exists, guys: https://docs.ubuntu.com/core/en/reference/gadget

That's not user friendly; e.g., it's hard for most parents or grandparents to understand.

Oo hum... I think you're going too far; it should come preinstalled on their laptop/desktop/whatever, either by an OEM manufacturer or by you. They shouldn't have to be involved in the process.

FLAVOR: Ubuntu Desktop

HEADLINE: Clean up repos and remove non-working / non-maintained / bad applications

DESCRIPTION: There are many old and/or bad applications in the official Ubuntu repos. Prune aggressively. Anything that hasn't been updated for several years could be flagged for human review. Anything that people use will get PPAs made for them in time. Anything that's dead doesn't deserve to be in universe or multiverse.

- FLAVOR: Ubuntu Desktop

- HEADLINE: Disable Bluetooth on startup

- DESCRIPTION: Bluetooth is turned on when Ubuntu starts, and people struggle to deactivate Bluetooth on system startup. For further reference, check this: http://askubuntu.com/questions/67758/how-can-i-deactivate-bl...
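For context, these are the kinds of workarounds people resort to today (as in the linked thread); the request is for a supported, persistent toggle instead:

```shell
# Option 1: prevent the Bluetooth service from starting at boot:
sudo systemctl disable bluetooth.service

# Option 2: keep the service but soft-block the radio at startup,
# e.g. from a small systemd unit or /etc/rc.local:
rfkill block bluetooth
```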

FLAVOR: Ubuntu Desktop and Ubuntu Server

HEADLINE: DANE for TLS in Firefox, wget, curl, etc

DESCRIPTION: Support TLS server verification using TLSA DNS records protected by DNSSEC as described at http://www.internetsociety.org/articles/dane-taking-tls-auth... and https://en.wikipedia.org/wiki/DNS-based_Authentication_of_Na... ; this should have a smaller attack surface than the current mess of X.509 certificate authorities that are trusted by web browsers. Doing this well may require better client side DNSSEC validation; my impression is that DNSSEC validation deployments in the real world today often tend to have only the recursive resolver doing DNSSEC validation, with a potentially insecure connection between the client and the recursive resolver. Firefox probably ought to check the entire DNSSEC signature chain itself.
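To illustrate the mechanism: the TLSA record for HTTPS on a host lives under a `_443._tcp.` prefix, and common deployments use usage 3 / selector 1 / matching type 1 (DANE-EE, SPKI, SHA-256). A sketch of checking this by hand, with example.com as a placeholder host:

```shell
# Fetch the TLSA record, asking for DNSSEC data:
dig +dnssec TLSA _443._tcp.example.com

# Compute the SHA-256 of the server's public key (SPKI, DER form)
# to compare against the record's certificate association data:
openssl s_client -connect example.com:443 -servername example.com </dev/null \
  | openssl x509 -pubkey -noout \
  | openssl pkey -pubin -outform DER \
  | sha256sum
```

The feature request is for the browser/tools to do this lookup and comparison themselves, with full DNSSEC chain validation on the client.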

- FLAVOR: [Ubuntu Desktop, Ubuntu Core]

- HEADLINE: Multi-Seat and Multi-Head Out of the Box

- DESCRIPTION: It would be a great way to cut costs if a single machine could support multiple workstations, like SoftXpand does for Windows 7, out of the box, without requiring an expert to configure. Though currently possible, it seems to require a lot of configuration.

In developing countries, e.g. India where I live and work, people might not show up to vote for or contribute to such features, but this would be a huge step towards making Linux available to many more children at school and home, and to more hands at work. For schools, a single computer could serve 4-8 children after installing some additional graphics cards. Being in the e-learning industry, I can see this giving a lot of momentum to computer literacy in schools.

This could also be a huge maintenance and energy saver in the workplace. Since almost all graphics cards nowadays are multi-head, simply installing an additional card could let a single computer serve up to four workstations.

- ROLE/AFFILIATION: IT Administrator of an expanding company
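For reference, systemd-logind already exposes seat primitives; the request is essentially for this to work out of the box. A sketch of today's manual flow (the sysfs device path is an illustrative example for a second graphics card):

```shell
# Inspect the existing seats and the devices attached to the default one:
loginctl list-seats
loginctl seat-status seat0

# Assign the second GPU to a new seat; logind persists the assignment:
sudo loginctl attach seat1 /sys/devices/pci0000:00/0000:00:02.0/drm/card1
```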

- FLAVOR: Ubuntu Desktop

- HEADLINE: Laptop hibernation to disk.

- DESCRIPTION: Options for what to do when you close your laptop lid: sleep, suspend, hibernate, shut down, stay on. Automatically hibernate when asleep/suspended and you reach critical power.
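Much of this is already configurable in systemd and UPower config files; the request amounts to surfacing these settings in the power GUI. A sketch of the relevant knobs:

```shell
# /etc/systemd/logind.conf -- lid behaviour on battery vs. AC:
#   HandleLidSwitch=hibernate              # or suspend, poweroff, ignore
#   HandleLidSwitchExternalPower=suspend

# /etc/UPower/UPower.conf -- what to do at critical battery level:
#   CriticalPowerAction=Hibernate
```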