It supports ARM, and a bunch of ARM-based netbooks are due out soon. They are cheaper and much more power-efficient than Intel's Atom etc. And Windows doesn't run on them (yet).
I'm not a big linux vs. windows or ARM vs. intel fanatic, but I'm excited at what seems to be a disruptive inflection point coming up in the next year or so...
I have an Eee 1000 which had a 40 GB SSD - there are two reasons that you can't get that today.
- SSD prices have gone up.
- You can't put the cheap version of Windows on a device with a large SSD, so in the interest of rationalising product lines you can only get small SSDs or hard disks.
If Windows can't be put on the device then at least the second pressure won't exist.
There isn't proper quoting on HN, but three popular conventions are:
1. italics - do this with asterisks ( * ). To work, they must be adjacent to the text being quoted, and end up looking like *this*. If not adjacent, you get * this *
2. Use ">" and indentation, like in email.
3. use quotes - like you did.
nb: there is a help page available for the above syntax: http://news.ycombinator.com/formatdoc
It used to be available as a link next to the textarea when you comment,
but I just checked and it appears to be gone.
You might want to check out the FAQ in general if you're new (read: I strongly suggest you check out the FAQ): http://ycombinator.com/newsfaq.html (there's a link to it at the bottom of the frontpage; I'm being extra helpful by repeating it here)
Presumably Microsoft won't allow it. To make Windows competitive they have a discount for OEMs to install Windows on machines with low spec, and presumably this SSD is too large to be able to take advantage of that.
Me too. So many netbooks are coming out now that don't even work properly with Linux, like the Samsung ones, and Ubuntu 9.04 was even broken with WPA wireless on the EeePC 901. Also, Dell only offers Linux on the very lowest model in their netbook range.
Sadly I've yet to see an ARM system with a Linux-friendly graphics system. They all seem to use PowerVR stuff, which is the same stuff that Intel used to stuff up their previously unblemished graphics support on Linux.
Some of the ARM Cortex-A8 systems look pretty sweet. The OMAP3 series comes with a fast-enough processor, a lot of IO controllers, H.264 hardware, crypto hardware, a DSP, and a graphics accelerator, all on a single die. All of it very power-efficient, and fairly cheap.
It's all well and good for Ubuntu to push further ahead, but they would do the push for Linux on the desktop a real big favor if they could solve one major issue: remove the ugliness in the default rendering of fonts.
Yes, I know it can be done via a bazillion font config options, and yes, it's very powerful and flexible, but let's get it right by default.
1) There aren't many free typefaces for daily use that are beautiful, so you have to buy (or commission) several and "open source" them.
2) Techniques for AA font rendering that maximize aesthetics and readability are protected by numerous software patents. Years ago there was a short-lived patch in Xorg that had a modern font-AA algorithm that held its own against the competition, but it turned out to step on several patents so it was axed out of fear of litigation. That fear is very real, as typography-related tech patents are one of the most aggressively guarded elements of tech-IP.
As a result, the Linux developers have been forced to re-invent the wheel from the ground up when it comes to typography in Linux distributions.
I'm fairly certain that this is a common misconception, one that was true at one point in time but no longer applies, but I'm not really expert enough (in fonts or Linux) to sound confident in that.
Basically, my take is that you need the fancy expensive typefaces and the patented hinting if you want type to look really good at small sizes and without anti-aliasing i.e. Windows before cleartype or Macs before OS X.
The vast majority of people don't want that, they want OS X style rendering, particularly if their monitor is 95dpi or up. Admittedly there are a vocal minority of geeks that grew up with the former type of font rendering, but even they seem to limit this preference to coding fonts and even they seem to be leaving this behind as dpis go up.
If you want OS X rendering on Linux you can have it today, without patented code or the need for meticulously hinted fonts (which makes them much easier to produce as open source).
It basically involves going into the advanced Gnome font controls and turning off all hinting. Voila! "Blurry" Mac OS X style font rendering. You can add sub-pixel rendering to taste.
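If you'd rather script it than click through the dialog, the same knobs are exposed through gconf on the GNOME 2 desktops Ubuntu ships. The key names below are from memory, so treat this as a sketch rather than gospel:

    gconftool-2 --type string --set /desktop/gnome/font_rendering/hinting "none"
    gconftool-2 --type string --set /desktop/gnome/font_rendering/antialiasing "rgba"   # or "grayscale" if subpixel looks fringed
    gconftool-2 --type string --set /desktop/gnome/font_rendering/rgba_order "rgb"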
You can also make it look like XP, which I personally think is a bit retrograde and ironically this does require well-hinted fonts (like Microsoft's) and patented code.
Yes, it's a part of Freetype, Ubuntu just doesn't enable it at compile-time.
In include/freetype/config/ftoption.h:
enable FT_CONFIG_OPTION_SUBPIXEL_RENDERING
enable TT_CONFIG_OPTION_BYTECODE_INTERPRETER
disable TT_CONFIG_OPTION_UNPATENTED_HINTING
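For anyone who wants to try that without waiting on a distro, here's a rough sketch of a local rebuild (the exact comment markers in ftoption.h vary between FreeType versions, so flip the three options by hand rather than trusting a script):

    sudo apt-get build-dep libfreetype6
    apt-get source libfreetype6 && cd freetype-*
    $EDITOR include/freetype/config/ftoption.h
    #   uncomment:   #define FT_CONFIG_OPTION_SUBPIXEL_RENDERING
    #   uncomment:   #define TT_CONFIG_OPTION_BYTECODE_INTERPRETER
    #   comment out: #define TT_CONFIG_OPTION_UNPATENTED_HINTING
    ./configure && make && sudo make install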
Even under the most anal-retentive freetard fealty to Intellectual Property Law, distribution of C source code cannot infringe upon patents, only distribution of compiled binaries. It's the difference between promulgating plans for the machine (the original point of patents) and capitalizing on said plans.
Gentoo is the only distro I know of that has it turned on by default.
Ubuntu's avowed mission is to bring Linux to the masses. For the masses, a computer still "looks" like Windows. Ubuntu, being a funded initiative unlike Arch (my choice), Gentoo, Crunchbang et al., needs to fund the development of typography on Linux.
Yes, it's possible to download msttcorefonts (with some creativity you could also get the Consolas set of fonts) and get ClearType- or OS X-style smoothing - but as long as it's not the default, I am afraid there is no pushing Linux onto the masses.
I think you overestimate how closely "the masses" pay attention to fonts. (Unless you're talking about "the masses" of programmers and designers, which actually is still a healthy chunk of Ubuntu's potential market at this point.) Yeah ok the hinting isn't so great and Bitstream Vera Sans is a little too wide but it's not THAT bad.
I should have been clearer - most programmers I know are happy to try out the various font customization techniques, and some of them even accept it - by "masses" I meant the large number of people who have used a computer (primarily Windows) before and are willing to give Ubuntu a try, but they do not want to, or don't know how to, fiddle with font customizations.
This group of people is the natural customer for Ubuntu, and I believe the vast majority of them go back to their previous OS simply because the default fonts for them are not Windows-like.
My point is Ubuntu would better serve its "natural customer" by focussing on making the OS look good for this group of people, i.e. look Windows-like by default, rather than on other newer features.
The masses still use Comic Sans for presentations. They don't really know or care what that serif/sans business is all about and they definitely don't care about the default font. As bad as it is, that's what I see all around.
But I'm not sure what problem with font customisation you're referring to. There's a dialog with four basic choices (from "blurry" to "crisp, subpixel") and another dialog for fine-tuning where you see examples for every single option. How can it be simpler?
I'd not heard about these subpixel patents, but I'd be happy to have them removed from Linux as I honestly don't think they add that much in Mac OS X style font presentation, as opposed to Windows Cleartype style.
I personally greatly prefer Windows style font rendering (presumably #2). Fortunately Ubuntu is flexible enough that I can configure it for Windows style font smoothing.
1) OS X. A tad fuzzy, but everything is rendered _accurately_. Consistent thickness and character tracking (spacing).
2) Windows. Everything is slammed into pixel boundaries, making it look clean. Verdana is optimized for this style of rendering, so it doesn't look terrible.. but it's definitely not accurate.
3) Ubuntu. Worst of both worlds. Fuzzy and inaccurate.
I am honestly having a hard time believing someone with decent eyesight can say Ubuntu (3) is fuzzy while OS X (1) is not.
Ubuntu obviously chooses sharpness over balance (H and E); however, OS X has virtually every lowercase character heavily anti-aliased. It's blurry to the point where it's almost two shades lighter than it's supposed to be, and there are individual characters (A, u, B, a) where it is almost a joke.
OS X has the glyphs clipped to an imaginary sub-screen-resolution grid. The same letter may in some instances take up one black pixel and in others take up two (and be greyish in both). That's what he meant by accuracy. On linux every identical glyph is rendered identically, no matter where it is positioned.
I absolutely admit that OS X has a somewhat fuzzy appearance, particularly at small point sizes. This gets exaggerated with the MS Core/Web fonts, which were designed for the screen (stems that line up with pixel boundaries, etc), and have hinting that is specifically designed to be compatible with MS ClearType.
Ubuntu has similar (but not exactly the same) levels of fuzziness, but the rendering is inaccurate. Character tracking/spacing is the biggest issue, but there are others.
If you look at traditional fonts that are designed for print or general purpose use (e.g. Helvetica, Frutiger, Univers, Myriad, etc.), the accuracy and quality of OS X rendering stands out considerably. There is simply no comparison.. and once you get used to it, the small amount of "fuzziness" that you observe in small point sizes just doesn't bother you anymore.
It's kerning, not tracking, and it's only off in a few minor places (for the same reason the midline is off in H and E). In those paragraphs I can find about the same mistakes in both renders around the kerning (both make pretty much identical errors).
I use OS X every day and am very used to it; I just prefer Ubuntu's rendering by quite a long way.
That's funny, I didn't know Verdana was optimized for that type of rendering.
I use Ubuntu configured to use Windows style hinting without sub-pixel smoothing, and I have absolutely every font forced to Verdana. I'm glad that there is accidentally a little bit of logic behind my choice.
Run, don't walk, to your nearest optometrist and get your eyes checked.
Ubuntu and OS X are not even remotely comparable from a typography standpoint. Using Freetype (the Linux/X11 lib for font rendering), tracking and stem thickness are inconsistent, ascenders and descenders get cut off, etc.
The Freetype team puts forth a valiant effort, but until they're allowed to use Adobe/Apple/etc.'s patented hinting algorithms, there's no comparison.
I'm using Ubuntu and my screen looks like the OSX screen in the picture. The default ubuntu install comes with only free fonts, and clearly HN has been designed to look best with non free. All that is required to make both look the same is to type:
Just tell Ubuntu/Gnome to use the DejaVu fonts by default. They are much nicer looking than the "standard" fonts. Also, `aptitude install ttf-mscorefonts-installer` goes a long way towards making the web look nicer, as does changing Firefox's default font selections.
And yes, I wish Ubuntu would at least make DejaVu the defaults for everything out of the box.
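For what it's worth, you don't have to wait for Ubuntu to change the defaults; fontconfig honours a per-user preference file. A minimal sketch saved as ~/.fonts.conf (standard fontconfig alias syntax, but double-check against your version's docs):

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <!-- prefer the DejaVu family whenever an app asks for a generic face -->
    <fontconfig>
      <alias><family>sans-serif</family><prefer><family>DejaVu Sans</family></prefer></alias>
      <alias><family>serif</family><prefer><family>DejaVu Serif</family></prefer></alias>
      <alias><family>monospace</family><prefer><family>DejaVu Sans Mono</family></prefer></alias>
    </fontconfig>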
They look like different fonts and different weights (heavier fonts, emboldened, are more readable at smaller sizes). Both look ugly: the OS X ones are too ragged, the Ubuntu ones too sharp. Is it the exact same font being rendered by the different engines?
Looking forward to upgrading from 9.04 to 9.10 ...in 3 to 6 months when all the bugs are shaken out and everything is compatible with the new version. =^)
I use a few packages from http://www.getdeb.net/ and have several PPAs https://launchpad.net/ubuntu/+ppas installed as well so there's some secondary lag there as those developers release new packages for 9.10. Which shouldn't be too long, really.
My main concern with this release is the ongoing, phantom ext4 (and sometimes 3?) corruption problem. Depending on which reports you read, it affects 64-bit installations specifically, which is what I use.
As much as the software, I want to ensure that there's plenty of community documentation and collective experience compiled before I switch. My System76 laptop here is my bread and butter; I don't want to find myself pathfinding on a new problem when I need to get things done, I want someone to have already charted a solution.
IIRC 64-bit isn't just about memory size. x86_64 is a chance to use things like SSE for floating point rather than x87. The older floating-point units are still on the processors, but in a deprecated fashion (the Windows/Linux ABIs don't use them). You can use SSE for floating point on a 32-bit system if you compile all of your apps yourself (à la Gentoo), but then the resulting binaries are not 'x86-compatible', because not every x86 processor is guaranteed to have SSE support.
As you can see, x86-64 isn't just about memory size.
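To make that concrete: if you do compile your own 32-bit binaries, the relevant GCC switches are roughly the following (standard GCC flags, but consider this a sketch; the 32-bit binary then requires a CPU with SSE2):

    # 32-bit build that uses SSE instead of x87 for floating point:
    gcc -m32 -mfpmath=sse -msse2 -O2 prog.c -o prog
    # 64-bit builds use SSE for floating point by default, so plain -m64 is enough:
    gcc -m64 -O2 prog.c -o prog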
This seemed a bit harsh, but (after consulting Wikipedia) I'm guessing you're trying to say that even after doubling to 16, it still sucks compared with "common RISC processors (which typically have 32–64 registers) or VLIW-like machines such as the IA-64 (which has 128 registers)."
Sorry. I didn't want to sound harsh. The x86 ISA has a register starvation problem. I am sure the phone I am writing this on has a far more elegant and modern processor than the PC I'll return to after I finish lunch.
The sooner we can ditch x86 (and that includes amd64), the better.
The problem with IA64 being that Intel designed/patented it so that they could bring in royalties if/when everyone went to 64-bit and used their architecture. I'm thanking amd64 for saving us from creating a new generation of processors where everyone was paying an Intel tax. It may have a superior design, but I don't want every processor in a desktop/laptop out there to be a revenue generation machine for a single company, even when the processor's manufacturer is different.
I was using 64-bit processors and OSs a couple years before Intel shipped IA64. Try to google SPARC and MIPS.
What AMD did was to extend the x86 ISA with a handful of extra registers and double the size of all of them. This allowed x86 to survive until these days, slowed down the migration towards 64-bit computing and confined much better architectures to specific niches.
You probably already know this, but almost all modern CPUs (within the last 5+ years; I could be wrong on some counts) are actually 64-bit, so I believe, though I might be wrong, that you are losing some optimizations.
Also, it's your RAM plus your video card's memory that counts against the limit, so while you might have, say, 4 GB that seems to show up on your computer, that figure includes your video card's memory.
The downside is that pointer sizes in 64-bit software are double, and this sometimes has issues with the processor's cache. Personally I haven't seen much difference in practice either way (except in cases where you have more RAM ... MySQL on 10 GB totally rocks :)).
I was asking because I chose to install the 32-bit version ... 64-bit packages are more unstable, being less popular.
Absolutely wrong. I wish people would stop spreading this nonsense. 64-bit is actually slower in practice, since addresses double in size. It also causes nearly double the memory consumption, since all libraries become twice as big, so when you upgrade to 64-bit, you should actually double the memory you wanted on 32-bit. There are only two reasons why you need 64-bit:
* You need to compute extremely large numbers. You'll get big performance boosts here.
* You need more than ~3.5 GB of RAM. In which case you actually have to buy AT LEAST 8 GB to see a positive impact. Even this is not a hard requirement, as there are workarounds in 32-bit. But 64-bit is an absolute must if an individual process needs that much RAM.
note: I see I'm voted down, and maybe I'm a little harsh in my comment, but can anyone add any other reasons to go to 64-bit? I've read a ton over the years and those are the only two I could find. Maybe I should add "because 64 is bigger than 32 and I can tell people my OS is 64-bit".
edit: Double is a great exaggeration, but not as much as the replies indicated. I moved from a 64-bit slicehost with 512M to a 32-bit linode with 320M and have LESS memory woes. The libraries really are almost twice as big. I speak from experience and not from theory.
Several reasons for me, because my desktop computer is mainly used to run scientific computing tasks.
1. It can provide a performance boost if you take advantage of the fact that a 64-bit calculation (two 32-bit calculations) is done in one clock cycle; you can actually achieve a 100% speed-up.
2. Using more than 4 GB of memory is not a big deal in scientific computing, and let's face it, more memory is better than reading from disk.
3. The argument about doubled memory consumption is plain wrong. Most programs still use int as the main type, which is only 32 bits long. Only pointers become 64 bits, and you don't keep that many pointers in memory (what's the point of keeping 4G of pointers in memory anyway?). Thus, the memory consumption is a little larger, but definitely manageable.
4. (This point may be wrong, please correct me) double precision is common for calculation, and it is exactly 64 bits long, which makes it more suitable for a 64-bit CPU.
To explain why you're being downvoted into oblivion:
ints remain the same size, but longs and pointers are double the size. 64-bit libraries may be bigger, but there's no reason they would double in size. In general, a 64-bit system uses more RAM, but not double.
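A quick way to see for yourself what actually doubles (assumes a multilib gcc so both the -m32 and -m64 builds work):

    echo '#include <stdio.h>
    int main(void) { printf("int=%zu long=%zu ptr=%zu\n", sizeof(int), sizeof(long), sizeof(void *)); return 0; }' > sizes.c
    gcc -m32 sizes.c -o sizes32 && ./sizes32    # int=4 long=4 ptr=4
    gcc -m64 sizes.c -o sizes64 && ./sizes64    # int=4 long=8 ptr=8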
One data point: IIRC, a benchmark I ran comparing memory consumption for Lua coroutines (co-operative threads) used roughly 1.3-1.4x as much memory on 64-bit as 32-bit.
I'm not sure if 1.25-1.5x would be accurate as a rule of thumb, but 2x is almost certainly too high. (It will depend on the data structures etc. involved.)
IIRC AMD64 does support x87, but Windows/Linux have chosen to 'standardize' on SSE. In this respect x86-64 is a 'version' of x86 where all processors are guaranteed to have SSE and not need x87.
I have a 4 GB system and 512 MB of video memory installed. However, I only upgraded from 2 GB after installation because the already-heavy Java apps I use (Eclipse, SQL Developer) now required twice the memory.
As the other commenter mentioned, there are the CPU optimizations to consider, but what decided my final transition (I've tried it out a few times over the past few years) was when System76 moved to shipping their systems with the 64-bit distro installed. I want to maximize their ability to support me, so when I did a fresh install on a new drive I opted for 64-bit as well.
Most of the problems of compatibility with 32-bit binaries have been trivialized or are routed around easily enough these days. Though 64-bit Linux was never as much hassle as x64 Windows. ;^)
Java is not a good benchmark. Sun's JVM won't even pack memory (if you have a class with 64 "boolean" variables, it takes header + 64*4 bytes, not header + 64/8 bytes, for some reason that is utterly unclear) unless you use arrays of primitives, and my experiments indicate that it just shifts from packing on a 32-bit boundary to packing on a 64-bit boundary on a 64-bit VM. That is to say, the Sun JVM does implement the naive "everything is twice as big" translation even though it needn't.
Of course the 32-bit JVM also has a 2.5 gig or so address space limit that makes no goddamn sense whatsoever, so if you want to process more than 2 gigs of data for any reason you have to either do it offline or you have to buy twice as much memory and do it on a 64 bit VM.
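Rough illustration of that limit (the exact ceiling varies by OS, JVM build, and how fragmented the 32-bit address space is):

    # On a 32-bit Sun JVM this typically fails with
    # "Could not reserve enough space for object heap":
    java -Xmx3g -version
    # The same flag on a 64-bit JVM (with enough RAM/swap) just works.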
But don't trust the upgrade if you use advanced features (as I found out the hard way) like Software Raid, LVM, and cryptfs. IT WILL BREAK YOUR SYSTEM.
No, but after upgrading via apt-get my system was unbootable. After pointing it to the proper drive in grub, it would attempt to boot but mysteriously hang. No one in #ubuntu or #ubuntu+1 could figure it out. I ended up having to boot from live cd, decrypt the filesystem, transfer it to a different box, and reinstall from scratch.
In the last version, they broke my Intel graphics driver, rendering my machine almost unusable except for the console. I spent many hours trying to fix it, and ended up switching to Arch out of frustration. I don't think I'm going back, but does anyone know if they've fixed it?
This was a really famous regression. It's not solely Ubuntu's fault, of course; they got bitten by shipping newer software. But yes -- they (Intel, really, not Canonical) fixed it.
They fixed that bug along w/ a few other annoying bugs (pulseaudio, laptop mic, etc). I've been running Ubuntu 9.10 for a month now on my laptop and have had almost zero problems. Only one resume fail in all that time.
Some googling shows that the VMWare Tools are not yet compatible with 9.10. Does anyone know how quickly VMWare usually gets around to updating tools after an Ubuntu update?
Actually, with Windows 7 and Snow Leopard out Ubuntu feels even further away from becoming mainstream. I want to like Linux on the desktop but looking at it objectively Win 7 and Snow Leopard are better.
I spend almost all of my time in firefox, emacs and a few other programs, which are the same on every platform. It really doesn't make much of a difference to me what desktop OS I use as long as it's cheap and it works with my hardware. I use Ubuntu on an HTPC because it's free and it (usually) works. I can't imagine what MS or Apple could do to improve on free + works.
To me Linux is a lot closer now than it was in the past, because you can pop an Ubuntu CD into a computer and 9 times out of 10 you have a working computer with sound and everything half an hour later. 7 years ago, when I first installed Linux, that would have been more like 1 time out of 10.
> I spend almost all of my time in firefox, emacs and a few other programs
Me too, and I have a Dell laptop in the mail that I'll be installing Ubuntu on and trying to use as my main machine.
But you know what isn't the same in all of these programs between OS X and Linux? Font rendering. I can get Linux to "OK" but OS X is always refreshing in comparison.
Font rendering is a big one for me. For some reason Ubuntu always looks.. ugly.
The color scheme, font rendering, icons, etc, etc. Are there no designers in the open source community willing to help out? Or do the project leads not give a crap about design?
I personally find the Ubuntu out-of-the-box font rendering with the DejaVu font family on my laptop LCD to be on par aesthetically with Windows and OS X font rendering, so consider at least that there is an element of taste involved. (Unless some bug is making your fonts exceptionally bad for some reason.)
Unfortunately, I am personally unable to duplicate Ubuntu's pretty fonts on Gentoo, no matter how hard I configure X and fonts/local.conf.
In Arch, the AUR has packages called fontconfig-ubuntu, cairo-ubuntu, and libxft-ubuntu; I found that when I compiled those, my font rendering was the same as Ubuntu's. Maybe Gentoo has the same packages, or you could grab the sources from the AUR and compile them manually?
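In case it helps, the AUR build dance is pretty short; something like this (the tarball name is a guess based on the package names above, so check the AUR pages for the real download links):

    # download the tarball for e.g. fontconfig-ubuntu from its AUR page, then:
    tar xzf fontconfig-ubuntu.tar.gz && cd fontconfig-ubuntu
    makepkg -si    # -s pulls build deps, -i installs the result with pacman
    # repeat for cairo-ubuntu and libxft-ubuntu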
Firefox, like OpenOffice.org, bundles its own renderer instead of using the OS's renderer. That's why Ubuntu fonts look good, and Firefox fonts look blurry.
True - but when we start comparing it to Windows (disable crapware, install Firefox, install anti-virus, install iTunes) vs Ubuntu (install restricted-extras), then an Ubuntu setup starts looking pretty good.
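For anyone following along, that really is a single package install; assuming the usual package name, it's:

    sudo apt-get install ubuntu-restricted-extras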
For me the fonts are better in Linux. The ClearType rendering that Vista turns on by default doesn't play well with existing monitors.
I have a 19 inch LCD display that's very decent, but I purchased it before Vista's release. The fonts on Vista have a red halo that is unbearable, and I'm not the only one with this problem (and trust me, I played with the ClearType settings, although they don't make that easy).
On a big monitor with a typical DPI, ClearType just adds visual noise.
In Gnome, the standard font anti-aliasing that doesn't use ClearType is great.
I genuinely believe (ignoring vendor support) that Ubuntu is by far the easiest-to-use operating system, ahead of Windows and OS X for the basic operations of installing and using new applications. I would rather use Gnome over Leopard's window manager any day.
Since I installed Ubuntu on my mom's computer, she hasn't called me for really any support. The computer boots up faster and isn't slowed down by virus software.
I don't think Win7 is better. To me, it is a slight improvement over Vista but still annoying and ugly.
OS X is the clear winner among UIs, and Ubuntu is nice enough for most people.
I couldn't agree more. The last time I ran Ubuntu as my primary OS, its desktop experience was a long way from Windows or OS X. Compiz was full of bugs that in my opinion significantly detracted from the overall desktop experience. I'd constantly run into problems where windows wouldn't be placed correctly on the screen or would have artifacts drawn in them. Some of the Compiz effects would fail to run altogether, with almost no indication of why. On top of that, I found the general desktop performance to be much slower than Windows and OS X. Specific examples of this would include: hiding/restoring/maximizing windows, scrolling through text and graphics, and moving windows. Add to that performance issues in Firefox (which have been well documented), incomplete drivers (sleeping, graphics drivers, etc.), and the obvious compatibility problems with running Linux. Ubuntu is still more frustration than it's worth for a desktop OS, imho.
You must have a fast machine... Try installing Windows 7 or Snow Leopard on a computer with 2 GB of RAM and a two-year-old processor... then the eye candy won't really add to the experience.
Windows 7 is running on my 1.6 GHz Sempron w/ 1 GB of RAM. I turned off the major eye candy (Aero) and it still looks great and runs fine.
That PC is the 'family' computer - so basically all it's used for is surfing, streaming music through iTunes and instant messaging.
It used to run Ubuntu but everyone complained about how ugly it is - and I wasn't going back to XP. At least with Win7 I'm in the clear anti-virus wise, no need for that... yet.
hmm... I guess some people don't like brown. I would be happy if ubuntu came with a non-brown theme that had the same overall quality as the default theme.
I have always preferred Kubuntu to Ubuntu for desktop installs, and that's one big reason. (Also, I've always preferred Konqueror as a file browser to whatever Ubuntu uses by default with Gnome.)
However, it's been a while since I've used it (I have 10.4 and XP machines for the most part), and I get the feeling that Gnome has benefited from Ubuntu far more than KDE has from Kubuntu, being sort of a bastard stepchild at best.
KDE's Konqueror has features that I still find myself wishing for on my Mac on a daily basis. (E.g., the visual view-by-size thing and built-in sshfs are my top two.)
I've not seen any compelling reason to upgrade my Macs from 10.4, so when they finally hit EOL I may seriously reevaluate Kubuntu as my main desktop, see if it's kept up reasonably well versus the mainline Gnome version.
Is it really a big jump from 9.04? The last few releases haven't changed a whole lot in terms of UI polish, which I always thought was really nice until I used OS X.
And to be fair to the original poster, the link is available now; the site was in the middle of updating.
I think it is. Pulseaudio, for example, has become sane and actually better than any other sound system I've used. It even streams to apple airports and other computers.
Just whiny, but why does Ubuntu (or most distros) not give any idea of minimum/average disk required for install in their system requirements area? I know, I can customize the install so as to reduce or increase space requirements, but even a suggestion that "a straight default install requires around 2gb of space for programs, and 1gb for swap" would be helpful.
Ubuntu is available for PC, 64-Bit and Mac architectures. The Alternate installation CDs require at least 256 MB of RAM (the standard installation CD requires 384MB of RAM). Install requires at least 3 GB of disk space.
Hmm... I guess I never thought of Sys Req as "features". I would expect it in the release notes (not present) or on the download page (not present). Placing it at the bottom of the "features" page is obfuscating, in my opinion, if it is placed nowhere else.
But good of you to find it; I would never have thought to look at the Features page. Again, whiny of me.
If you're a geek and need to ask, it's probably too large (if you worry about hard-disk space, you're probably trying to install it on something obscenely old. Ubuntu isn't for that.).
If you're not a geek, it'll fit (any consumer computer going over the counter for the past 6-7 years, probably more, has at least a 20 GB drive in it. Most likely you'll have 100+ GB).
Yeah, well, good point. My point was that a userfriendly distribution shouldn't present technical information that isn't absolutely vital. And it's really easy to find, by the way:
Disk space requirements aren't obscure technical information. Providing it wouldn't confuse anyone, especially not anyone adept enough to install an operating system. It should be provided.
You fail at UI. Any and all information always adds some cognitive load and confuses the user. Every time you think you should show the user something, you must figure out a very convincing reason why you should not just leave it out. There clearly isn't one for majority of users who are going to install ubuntu, and for the rest, googling for two easy words isn't going to be too much of trouble.
Disk space is required to install the OS. The alternative is letting the user download the .iso, burn it, then try to install only to have the installer complain about not having enough space. That is failing at UI. The additional cognitive load of reading a disk space requirement is negligible, especially compared to what someone who needed that information has to go through without it. You haven't even considered the possibility that a user might not think about how much space they have until they come across the requirement.
Not only are you wrong, but "You fail at UI" is a particularly abrasive way to be wrong.
I might well be wrong about whether it's worth it to show disk space or not. I have no problem admitting this, because I haven't done any studies about it.
> but "You fail at UI" is a particularly abrasive way to be wrong.
>The additional cognitive load of reading a disk space requirement is negligible
Apparently I was too abrasive, or not abrasive enough.
NO amount of cognitive load is negligible. Ever. When you design a dialog with two buttons that both have to be there, both of those buttons are creating a significant amount of cognitive load. Understanding this is largely why apple makes good UI's. You should never try to justify removing something from the UI -- you must instead always have very good reasons why NOT to remove something, and remove it by default if you can't find enough (not any).
As I said, I have not made any studies of how users install ubuntu or what they expect, but as it's supposed to be installed as the os on the computer, in place of whatever that was before, and as there is no commonly available commodity hardware that it won't install on, I can easily see how there is no longer enough justification to show disk space requirements.
Ok, let me revise my statement: the cognitive load of the requirement is only negligible in comparison to the frustration not including the requirement would cause.
Apple puts the disk space requirements in every single installer. Disk usage is an important piece of information. I'd agree with you if it were something like "Ubuntu requires a Pentium II or higher".
Hmmm... I guess I thought a userfriendly distro should be friendly to the user by providing me info I didn't know I needed to ask up front, and hiding things that are merely technically interesting or edge case. Disk space is probably the former. Asking me to "just google it" is probably not the response Ubuntu/Canonical or any other product would want.
BTW, since it crashed on install instead of catching that my partitions weren't large enough, that's a fail. But by merely providing the info on the DL page, they might have saved themselves some credibility loss as a "user friendly" distro.
But I do appreciate that you didn't throw a lmgtfy.com link in, which would have been really embarrassing for me.
Actually the graphics card support is better than Windows for me. My graphics card driver keeps crashing under Windows 7, but with Ubuntu 9.10 there is no problem (Radeon HD 5770).
Correction: Actually the graphics card support is better than Windows *for me*. One case study is not enough evidence to support your claim that overall their driver support is better.
But I'm very happy to hear this: I've struggled with my ATI graphics card for the last two major versions of Ubuntu (7.x I managed to get working after days of hacking around, but never got Compiz working; 8.x I got Compiz running, but the general performance was so sluggish it was unusable).
Just upgraded and everything is smooth so far. The only exception being VMware Workstation 6.5 - you need to upgrade to VMware Workstation 7.0 or it won't work.
for anyone who finds the Ubuntu mirrors to be too slow due to being hammered on the first day of release, we just posted a technique for quickly making your own private, temporary Ubuntu mirror on EC2: http://blog.jumpbox.com/2009/10/29/instant-ubuntu-mirror-usi...
Is editing the registry or hunting down drivers for a new version of Windows better? Or how about waking up to Snow Leopard having deleted your $HOME folder? Operating systems have their issues - it's inevitable...
> Is editing the registry or hunting down drivers for a new version of Windows better?
I can't remember the last time I edited the registry or had to install a driver manually.
Don't get me wrong, I run Ubuntu and I like it. The upgrade process on the whole is much better than even a few years ago. I just loathe upgrading since it can take weeks to iron out problems with the upgrade. Invariably there are issues with suspend/resume, video drivers, randr, gnome preferences borked, etc. Perhaps these problems will go away in the not too distant future now that more and more manufacturers are shipping Ubuntu pre-installed.
Here's the funny thing: I actually dislike Ubuntu (I run Debian at home and use OSX at work), and I loathe Windows. But of course, none of that matters (saying "I run Ubuntu and I like it" is a bit like saying "some of my best friends are <gay, black, Jewish, whatever>").
The point isn't whether you or the other respondent has recently edited the registry or hunted down a driver. The point was that all operating systems have their troubles.
The standard gripe about Linux distros is hardware related. Well, um, duh: if the hardware is closed source, there are going to be problems getting things to work just right. Hardware manufacturers cater to Windows (for obvious reasons) and Apple controls what hardware their os runs on. Ubuntu, Gentoo, Debian et al. are not in a position to do this. So, yeah, there are probably always going to be those kinds of troubles with Linux.
That said, you can't beat the price, the ability to get at the source code (if that matters to you) or the package manager (for Debian-based distros, at least).
> The point isn't whether you or the other respondent has recently edited the registry or hunted down a driver. The point was that all operating systems have their troubles.
No, the point is I haven't had those troubles on Vista. I haven't had these issues for a long time. I couldn't care less about open or closed source drivers. I just like it when things work. And I like it so much I'm even willing to pay money for it. That is the point.
> Is editing the registry or hunting down drivers for a new version of Windows better?
When was the last time that installing a new Windows involved registry hacks? I'll admit that I haven't used Windows 7 or Vista, but I don't remember hacking the registry on Win95/98/ME/2k/XP installs.
Maybe you shouldn't get so bent out of shape so quickly. It's not like he was bashing Linux and promoting Windows/OSX. If someone gripes about Windows do you immediately rush in to make comparisons to similar problems in OSX or Linux?
With the way that Ubuntu does releases though, it's little wonder that people are scared to update. It's been a revision or two since Ubuntu pushed Xorg over to using evdev for mouse handling/etc, and my Thinkpad's TrackPoint is still only recognized as a regular mouse (no scrolling) when previous Ubuntu revisions had readily setup scrolling automatically for me.
{edit} I'll add there are multiple blog posts, and a LaunchPad report, IIRC with the fix... which the Ubuntu devs have not seen fit to include in Ubuntu, so a fresh install means applying all of the fixes. {/edit}
With the update to Jaunty (9.04), I've had a few woes with my Atheros chipset:
1. ath5k was forced as the default out-of-the-box driver for Atheros chipsets over the previous madwifi (ath_hal) driver.
2. ath5k is flakey whenever I come home and connect to my wifi network... even when the laptop is 2 feet away from the access point. It will connect/disconnect from the network 2-5 times before deciding to stay connected.
3. If I change position in my room (with the access point in the middle), the same connect/disconnect 'settling' period still needs to happen.
4. /var/log/syslog is filled with spam from NetworkManager with messages along the lines of this (even when I see no visible disconnects in nm-applet):
    disconnect from network (mynetwork)
    connect to network (null)
    disconnect from network (null)
    connect to network (mynetwork)
5. There are unexplained slowdowns in the network with the ath5k driver. For the longest time I thought that my ssh woes (random 'hangs' that last 2-6 seconds) over my network were the server's fault, but since I've switched back to madwifi they've vanished.
6. The madwifi driver still doesn't play nice with suspend-to-ram (and maybe suspend-to-disk but I haven't tested it). It will refuse to work after waking back up (and a modprobe -r/modprobe doesn't help). Ubuntu could easily fix this... Just need to drop `SUSPEND_MODULES="$SUSPEND_MODULES ath_hal"` into a file in '/etc/pm/config.d'. This problem has existed for at least 2 or 3 revisions of Ubuntu (maybe more). The fix is on the web and on Launchpad.net, but I have yet to see the fix make its way into an official release.
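For reference, that whole fix is a one-liner (the file name under /etc/pm/config.d/ is arbitrary):

    echo 'SUSPEND_MODULES="$SUSPEND_MODULES ath_hal"' | sudo tee /etc/pm/config.d/madwifi-suspend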
If I, as someone who knows my way around Linux, can get frustrated with these issues, then how can 'bringing Linux to the masses' be anywhere near achievable? My laptop isn't exactly a brand-new laptop with highly unsupported hardware. I'm running a ThinkPad X41.
[ Oh, and did I mention that the headphone jack stops working after the second time I suspend/resume after a fresh boot? I'm perfectly willing to work with the Ubuntu devs on Launchpad, but all of my suggestions/information is apparently falling on deaf ears. https://bugs.launchpad.net/bugs/101986 ]
Not necessarily, but someone griping about needing to wait for Service Pack 1 of a Windows release before having a stable platform wouldn't have gotten the same response.
In all seriousness, "guides" for setting up Ubuntu are most likely unnecessary or superfluous; they'll just go into some random area of "customization" that 95% of users probably don't need or care about.
If you already have an older version of Ubuntu, just run the command I mentioned, or go to System->Update Manager to handle the upgrade, and get on with your day. If you don't have Ubuntu, just boot the live disc and install it, and go from there.
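I don't have the earlier comment in front of me, but the usual terminal route is the release-upgrade tool from update-manager-core, roughly:

    sudo do-release-upgrade    # or use the Update Manager GUI and click "Upgrade"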
> In all seriousness, "guides" for setting up Ubuntu are most likely unnecessary or superfluous; they'll just go into some random area of "customization" that 95% of users probably don't need or care about.
Guides that target a particular system are still useful (e.g. "How to install Ubuntu 9.10 on a Thinkpad x200") because they guide you through the stuff that doesn't work out of the box, even if the issues are minor. (Or hidden options to turn something that 'works OK' or 'sorta works' into something that works well.)
Slicehost has an excellent assortment of articles for Ubuntu (as well as plenty of other topics). It looks like they don't have any for Karmic yet though.