The amount of crap Windows users have to put up with is incredible (dendory.net)
408 points by dendory 1834 days ago | 322 comments



I enjoyed this rant. The bottom line is that this 'experience' isn't about you, it's about who you can be sold to. That is because "you" are too cheap to pay what "we" think should be paid for these products. It is like a hotel which sells you a room for $100 a night if you agree to let them leave a webcam on 24/7 and sell any useful 'snippets' it catches while you are there.

The bulk of the market doesn't buy "computers"; they buy "televisions." Think about that for a minute.

The bulk of the market are entertainment 'consumers' whose eyeballs you can sell access to for real money. Just like TV did before people got digital recorders and started skipping all the ads. Not so with these newfangled TVs: they don't care if you don't look at their advertising; they want to know what you did look at, and when, and after what, and then what did you do? Because all of that is much more valuable than putting up a tasty picture of a cheeseburger in front of you. No, they can phone ahead to your local market and tell them to stock up on cheeseburgers because you've been researching them all day and are now at the point where you want to make a purchase.

But the cool thing? It means that the current 'big' players are leaving the market for computers behind. You can tell that by the fact that the computer company no longer sells a product that a developer would care to use. And that means that there is room again at the bottom.

Time to start a 'developers' company that works very much on the same model that Sun Microsystems started on, hardware designed from the ground up to be developed on, open systems so that folks can easily work with it, and a team dedicated to making sure integration and support is there so that folks like you and I can say "Hey this audio doesn't work when you set the sampling rate to 40Khz" and they can fix it and release that fix.

But for that company to exist, you have to pay for the products you use, and to get to that point you have to not be able to get something 'good enough' by hacking and slashing something else into shape.


I think there is a company that does what you suggest. I will probably get downvoted for saying it, but: Apple. They put in the time to make a pretty darned good user experience for first impressions.

There's no junk, no spyware, nothing to remove. Occasionally you might get a software bundle, like Office, or Quicken, but literally just drop it in the trash and you are done with it. Rarely would you have to find an uninstaller or dig deep to remove something, and if you do, you generally brought it on yourself and should know what you are doing.

But you pay extra for this, which most people apparently don't want to do, as I always hear "I could have bought computer xyz for 25% cheaper" as they are finishing their 15 hours of cleanup.


Definitely NOT Apple. Despite their open source site, the most important bits of the OS are closed, and so developers can't dig into the OS to figure out if the bizarre behavior they're seeing is a bug in the OS or not. Worse, they can't submit patches for things they do find. Instead, they have to go submit a radar report, which can't be seen by anyone but themselves. It's so bad that it has become common behavior to copy/paste the report to openradar and then have other developers submit duplicate reports as a way to "upvote" a bug. That is most definitely NOT developer friendly.

And then we have Xcode, the program that has inspired a twitter account dedicated to its severe brokenness.

And let's not forget their terrible provisioning process that always breaks in continuous integration systems, and the asinine limit of 100 test devices. Oh, and their train wreck of a command line tool suite.

And then there's their absolutely bizarre behavior of replacing every internal function name with <redacted>, and disabling stdout on iOS 6.

And then we have the "eat all memory" pager system in OSX that can chew through 8 gigs like candy and bring your system to its knees should you ever try to run more than 5 programs + Xcode.

If there were an iOS development environment + simulator for Ubuntu, I'd wipe this laptop and switch over.


I'm really enjoying your zen-like calm. If I had to enumerate everything I disliked about their experience for developers I would be foaming at the mouth after three or four bullet points.

Everything you posted is basically an understatement of the truth. Like you forgot to mention that if you do bother going to the trouble of posting a radar, it's probably going to get ignored.


All this is true if we add "for Xcode-based development" as a qualifier to "Definitely NOT Apple".

I use an ageing (4yo) 13" MacBook as my primary development machine and it works fine. I'm nowhere near geeky enough (or whatever term you use) to even be thinking about fiddling with OS internals, and I've never hit an OSX bug that bothered me much.


And you upgrade the OS on your Macbook regularly? Isn't there a lot of software that drops support for older releases (10.5, 10.6)? Not to mention newer Apple-tested software running slowly on older Macs? I remember Safari and Mail.app just endlessly hanging and giving me the wheel of death on a 2008 iMac and Macbook Pro.


accidental downvote - sorry

Edit: Really? Someone downvoted my explanation?!!!


> Really? Someone downvoted my explanation?!!!

Yeah, but don't let it bother you. Your comment is appreciated, but it's off-topic and doesn't provide conversational value, so I'm going to downvote it. I'm just curating for the next user. :)


I see you never needed QuickTime on Windows? Their download page asks you for your email to "Keep me up to date with Apple news, software updates, and the latest information on products and services." Their privacy policy starts with:

"You may be asked to provide your personal information anytime you are in contact with Apple or an Apple affiliated company. Apple and its affiliates may share this personal information with each other and use it consistent with this Privacy Policy. They may also combine it with other information to provide and improve our products, services, content, and advertising."

I don't think they're different from other companies at all.


Little secret: if you click the download button below without entering your e-mail, it just works.

Apple is very discreet in their e-mail marketing. They send maybe half a dozen emails per year, mostly after product launches or before Christmas/school term/Black Friday/etc.


Sure, you can work around those problems in many ways. The point was the same as in the article - this is the default behaviour you get. This is also the behaviour a typical user will accept and have to live with.


I could never figure out how to turn off the marketing that I got from having an iTunes account. These days anything Apple sends me gets caught by my spam filter and I haven't logged into my iTunes account in years.


You can get Windows to just work also, with similar workarounds.


> I don't think they're different from other companies at all.

They aren't. Believing they are is the quintessential fallacy of Apple partisans.


And there was that one time the iTunes updater installed Safari on my computer. I'm in the habit of reading through the options on installers carefully looking for those sorts of tricks, but not updaters.


I am an Apple user, and the Apple experience has jumped off a cliff for developers. What you are talking about is the casual experience for people that don't know how to use a computer.

If you don't know how to use a computer at all, Apple is really awesome. If you are trying to make your own software or sell software (particularly for phones) Apple is awful.


>I am an Apple user, and the Apple experience has jumped off a cliff for developers.

I was an Apple user and small OS X developer, and I agree completely. Three years ago, I made a living developing shareware. Then Apple decided they wanted total control over the app ecosystem and turned the "put out a good product and wait" business model into an impossibility. VersionTracker's sale to CNET and subsequent mediocritization didn't help, of course.

Now I use Ubuntu and do web development. And maybe some day WebGL and related technologies will get to the point that I can go back to the tightly optimized graphics programming I really crave.


Yes, this restrictive control over the App Store is the most unfortunate thing they've developed. I would not be surprised if they tried to lock it down entirely by first changing the default security to "App Store only", and then removing the other options.

I guess great minds think alike, because I'll be jumping ship for web development on Linux in short order.


>I guess great minds think alike, because I'll be jumping ship for web development on Linux in short order.

That's great. It's really not as painful a transition as I feared it might be. It's actually pretty astonishing how far the various Linux desktop environments have come in terms of usability.


I've never tried to sell apps, but I don't understand your complaint. Are you saying that people were no longer interested in buying your apps because they weren't in the Mac App Store? I'd think that it would improve discoverability in general, not hurt it, not to mention make the purchase process a lot easier.


The main problem was that at around the same time that VT became useless, Apple removed the OS X downloads section from their website (in preparation for the Mac App Store). So within a very short time span, most of the resources that small developers had for promoting their work disappeared.

More recent abominations like Gatekeeper are just icing on the cake.

>I'd think that it would improve discoverability in general, not hurt it

Nope. Apple's app stores are abysmal at helping people discover your software. I don't know why they opted to ignore everything that software aggregators like VT and Mac Update had learned over the years, but they did.


>>More recent abominations like Gatekeeper are just icing on the cake.

I'm sorry, but I just don't understand this line of reasoning. How can security measures like this harm most users, especially given it's something that can easily be turned off?
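For what it's worth, turning it off is a single setting (the steps below describe the Mountain Lion-era behaviour and may differ on later releases):

    # GUI: System Preferences > Security & Privacy > General >
    #      "Allow applications downloaded from: Anywhere"
    # or, from a terminal:
    sudo spctl --master-disable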


Surely you can see that "easily turned off" is relative to a user's experience, and that ease of disabling is not the only issue. If you ask most users to turn off a security feature so that they can run your software, they're going to think you're doing something shady.

And that's pretty shitty. Now a lot of people will say "it's only $99/year; what's the big deal?"

Well, first off, that's $99/year, forever, or your apps stop working. That's quite a commitment for a small developer.

Secondly, $99 is steep if you're developing freeware. You should not have to pay a hundred bucks a year for the privilege of giving your software away without people thinking that you're trying to steal credit card numbers or something.

Finally, $99 is steep for a lot of young people. I started developing freeware and shareware when I was in high school. If I had had to pay $99/year to make my app presentable, it probably would have been a deal stopper.

Now, you might say "Sure, but you're an established shareware developer and none of these things apply to you." But they still affect every shareware developer. Ever notice that nobody seems to have been successful with shareware on Windows in many, many years? Everything on that platform has been "free trials" and sketchy adware. The difference is the ecosystem.

For many years, there was a thriving freeware and shareware ecosystem for the Mac, and so Mac users expected to be able to find freeware and shareware solutions to their problems. They expected to be able to find a handful of cheap or free programs that did what they wanted, and they'd be able to try out each one until they found one they liked. They expected this because it was true.

And free/shareware developers expected there to be an audience of Mac users looking for a free/shareware app that did what their app did. And they could be reasonably sure that if they fulfilled a real need, did it well, and kept the latest versions up on software aggregators, they'd be able to reach that audience.

But those three points above, along with the decimation of aggregators, and the introduction of the Mac App Store have broken that ecosystem. And let's be clear: Apple didn't merely sit by while the forest burned. They clear cut the damned thing and built their app store there. That thriving ecosystem had helped Apple sell computers, but it didn't profit them directly. So Apple set out to change users' expectations away from "find some cheap/free software out there written by a small developer" to "check the App Store", all so that they could squeeze 30% out of the process.

I do not think Gatekeeper is really about security. I think it is about turning the Mac App Store into the Only Way to release software. It's just a little squirt of agent orange to make sure that the forest doesn't grow back.


Fair enough. I agree the fee can be steep to some. As a user, a counterpoint:

I hate going to random websites and giving them my credit card and dealing with whatever license key they give me.

I hate trials. Cheaper apps I can buy more liberally are nicer.

I hate crappy software that spews stuff around my system. Even though some apps can never be subject to sandboxing, I think forcing the rest to clean up their interactions with the rest of the system is an advantage of the App Store. (But I am biased in this particular opinion.)

I like having all my updates in one place.

I think the App Store generally has the potential to provide much better discoverability than the combination of Google and some crappy websites. Even considering the fee, I think that if people get used to it, the store can provide a better version of the "find some cheap/free software written by a small developer" concept.

I like that my mom is much more likely to use the App Store than VersionTracker.

I think the Store is only the death of an ecosystem insofar as it replaces it with a slightly different, but mostly just improved ecosystem.


>I think the Store is only the death of an ecosystem insofar as it replaces it with a slightly different, but mostly just improved ecosystem.

But it's not an ecosystem now. It's a garden. And maybe it seems improved from a user's perspective, but it is completely fucked from a developer's perspective. If we can agree that Apple has made selling outside the App Store nonviable, and that dealing with the App Store is complete hell for developers, then I think we can agree that this is a bad situation for developers.


They haven't made it non-viable. And please don't come back with "Yet".


Because measures such as these are security theatre (aka fearmongering). They are designed not to protect users but to force developers into their walled garden.


I don't see any part of the agreement that says you cannot develop apps for another ecosystem.


Apple is still a polished, shiny version of the consumer-focused computer, and not, I think, the developer-focused computer that ChuckMcM is describing. The locked bootloader of the iDevices and the Mac app store are but two pieces of evidence of this.


Er...what?

The iDevices are completely orthogonal to whether an MBP is a "developer-focused" computer. I use mine for Android development (oh noes! locked bootloaders!), Web development, and Windows development about equally, and everything works pretty much flawlessly, with the tools I need kept close at hand. (Hell, I don't even use Quicksilver anymore, as Spotlight's gotten better and Quicksilver hasn't kept up to date.)

The Mac App Store is entirely optional and none of my developer tools except Xcode come through it. MonoDevelop and IntelliJ I get separately; pretty much everything else comes through Homebrew. Loads and loads of applications are sold outside of the Mac App Store, too, with absolutely no problems.


From ChuckMcM's original post: "But for that company to exist, you have to pay for the products you use, and to get to that point you have to not be able to get something 'good enough' by hacking and slashing something else into shape."

Based on this and other things he said, I think Macs count as the "something else" that can be hacked into shape, since Apple is not specifically targeting all of your listed use cases.

I'm also not the only one who's extrapolated the recent trend of Macs more closely resembling iDevices to imagine a possible future in which Macs become appliances, and the developer-focused computer ChuckMcM envisions can come into being.


I agree — if and when Apple locks down OSX so that the only source for software is the App Store, that's when ChuckMcM's company can take off. I think there are enough people who are technically minded (not even necessarily developers) and currently use OSX because it's a UNIX system that "just works" and can also run popular commercial software (MS Office and Adobe CS mostly). The day Apple does this, I and others will need to buy computers from somewhere else — I think there's room for a company to fill that role.


This is exactly why I use OS X; thanks for stating it so succinctly.


I like my MBP for this as well, but the writing is on the wall. The Mac App Store is entirely optional now but I don't have any reason to believe it will continue to be that way. The App Store on iOS is not optional and Apple sees some benefit over Android because of that (well they define it as qualities), installing from 'outside' the Sandbox has only been getting worse.

So how about a development environment for Android in the 'cloud'? How cool would that be, your files are in iCloud so you know they are safe, your tools are always current and compatible since you're just talking to them over a web API, and you can pay for a month's worth of access to the compilers and debuggers for cheap instead of wasting all that time to put together an open source environment or spending big bucks for the MSDN Library equivalent service. You can watch movies on your MBP while you are building the Android manifest and running regression tests on that thing in some cloud instance that spun up just for that one task and will go back to idle when it's done. Click a button and blam! it's submitted to the App Store.

You may not realize it, but that is the vision driving the move away from a computer that you hold and do things with, toward a 'viewer' into a service somewhere that is doing the lifting and "adding value" in other ways.

If you read the thread on the Lisp epiphany [1], then this is the same kind of thing. But rather than "data is code and code is data," the epiphany here is that if you have enough network bandwidth and it's available 24/7, then it becomes just another bus on the backplane, and where your "computer" is and where your "screen" and "keyboard" are becomes irrelevant. At the office we have folks who put their desktop computer in some room, out of the way, and using X have their 'screens and keyboard' on their actual desk. They do this because the desktop machine is modestly noisy (we have quiet cases but it's not silent), and really, on a gigabit LAN you can't tell that your machine isn't under your desk anyway.
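A minimal sketch of that kind of setup, assuming SSH with X11 forwarding enabled on the remote box (the hostname here is made up):

    # run a graphical program on the remote machine, with its window displayed
    # on the local desk; over a gigabit LAN it feels local
    ssh -X user@devbox firefox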

That is coming to the world faster than you expect, and because it will be insanely more convenient for most people, supporting the few people who want to do this 'locally' will fall by the wayside.

Step 1, you figure out how to present an interaction-rich UX.

Step 2, you figure out how to move most of the data into the network.

Step 3, you create tools that present the UX, process data in the network, and deliver the result.

Google Docs is crushing Microsoft Office; services like Box.net, Dropbox, and S3 are capturing more and more of the data.

Free startup idea: GitHub + APT. Add a button to a GitHub-like service that says "Build this for Ubuntu-x86-64-12.10 and give me the link to the package." Click the button, wait, then pass the link to a modified APT:

   apt-get install <URL>
All of the pieces for this exist; a couple of weekends of coding can put it out into the world. Now if the packages can be built in the cloud, why does your machine need to be a 'computer' again? Being able to build it and then 'add this feature with apt' is a pretty powerful thing.
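A rough sketch of what the client side of that flow might look like, assuming a hypothetical build service at builds.example.com and an APT patched to accept a package URL directly (both are made up for illustration):

    # ask the (hypothetical) service to build a repo for a given target
    curl -X POST "https://builds.example.com/build?repo=github.com/user/project&target=ubuntu-x86-64-12.10"
    # the service replies with a link to the finished .deb; hand it to the modified apt-get
    apt-get install https://builds.example.com/packages/project_1.0_amd64.deb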

Probably won't happen for a couple of years at least, but it will happen; too many people want it, and too few will demand local compilation to keep it supported.

[1] http://news.ycombinator.com/item?id=4765067 - "The Nature of Lisp"


> How cool would that be, your files are in iCloud so you know they are safe

My open-source stuff is on Github, sure, but none of my private code is. I run a Redmine+gitolite instance on a physical box in my house for my private and private collaborative projects (one on-site backup, taken daily and rolled over three days; one off-site backup, stored every week). My code stays local and on machines I control, and the same is the case for many other businesses. It's for that reason that I don't think what you're describing is likely to be so commonplace as to make local development "fall by the wayside" in the near future.
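A rough sketch of the kind of rotation I mean (paths are made up and the real setup differs in the details):

    #!/bin/sh
    # /etc/cron.daily/repo-backup -- keep three rolling on-site copies of the repos
    DAY=$(( $(date +%u) % 3 ))   # 0, 1, or 2, cycling with the weekday
    rsync -a /srv/git/ /mnt/backup/git-day$DAY/
    # a separate weekly cron job pushes the same tree to the off-site machine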

While I get where you're going, I admit I had to restrain an eye-roll at the idea that this will be "the thing" anytime soon if only due to inertia. A development environment for Android in the cloud would be "cool," until it breaks. I have a very overpowering need to control my toolchain and my environment and that doesn't mesh with what you describe. I find the situation you are describing to feel really fucking oppressive. Unsettlingly so.

(I don't see Apple locking down the Mac, by the way. I see Mac sales being cannibalized by iPads, but I do not see Apple, at least in the near future, killing its most attractive feature--flexibility. Both for developers and for consumers. I think Microsoft has a much better chance of offing the traditional desktop/laptop with WinRT, though I'm personally not a fan.)

I fully agree that something similar will eventually happen for a certain subset of development; there are certainly some developers to whom this no doubt sounds really awesome. But I think, and I don't say this to be either insulting or patronizing, that this may be a bit of projection on your part. This feels like a "Valley bubble" idea, a "wouldn't it be cool if" that ignores a lot of real-world use-cases. Even aside from the "ew, remote" factor for me personally, I think there are enough infrastructural hurdles to make this really difficult to do in the sort of timeframe you're suggesting.

But I am not the target market for such things anyway. I find "always-on" technology overwhelming. I find that my best development time is when I turn off the Internet and go sit outside and write code; I keep local copies of the Python manual, the Java APIs, and the MSDN library for that reason. (And my Mac is silent when I'm doing development work, FWIW.)

Perhaps I am wishcasting, but I don't think so. Here's hoping you're wrong. =)


"But I am not the target market for such things anyway."

I think we're more alike than you realize. My point is that there is much of what you and I do in our day to day development efforts that depends on the notion of "I own a computer, and I run these tools on it, and I get these results." And we can do that reasonably easily because even though our requirements are outside the mainstream (the target market as you put it) they are enabled because the piece work, the foundational stuff, is required for the target market, for now.

My prediction, fear, belief, what have you, is that the bulk of the 'investment' in terms of time and energy and thought power, will become more and more focussed on that market and worry less and less about what you and I are trying to accomplish.

Here is a milestone you can look for in order to measure progress toward that future: a GCC release which supports a system or processor where the only way to use it initially is through a remote API to some remote server, with source code on some network-accessible repository. The rationale will be:

"We didn't want to delay releasing it for all the ports to 'catch up', most people can use it like this, installable packages for local OS'es will be available in a few {days/weeks/months}."

That is when you know that it has started changing 'faster' than people are willing to wait for the ports. Then the folks doing the ports will start to drop off, because the number of people who use the port is dropping off, because all the new kids just use the remote API anyway, and you can do it from a command line as if it were running locally, so why complain? But that milestone will tell you: to get the feature you want right now, you have to use the remote version or wait for a port to come out. Here is a pointer to the source if you want to start porting it yourself.

Except the source will embed various network services which it can 'assume' are there (it's the remote version, after all), and you will have to code up an equivalent.

I've seen build systems like this; they are very powerful. I will buy you a beer and we can talk about the 'old days' when you could do development without having to be connected to the network. The kids will pity our backward ways. But the beer will be tasty as usual.


Apple never collects information on its users? They don't use that information for advertising? Is it not an asset to the company, possible to be sold if economically useful? Please remember that as a public company, they are required by law to maximize stock price.


> Please remember that as a public company, they are required by law to maximize stock price.

While I agree with the rest of your rant, this is simply untrue. The best summary I can find is here:

> Thanks to a legal doctrine called the business judgment rule, corporate directors who refrain from using corporate funds to line their own pockets remain legally free to pursue almost any other objective, including providing secure jobs to employees, quality products for consumers and research and tax revenues to benefit society. The idea that shareholders "own" corporations is another powerful but mistaken myth with no legal basis. Corporations are legal persons that own themselves. Stockholders own only a contract with the company, called a "share of stock," giving them limited rights under limited circumstances.

http://articles.latimes.com/2012/sep/02/opinion/la-oe-stout-...


An opinion article is not really a reliable source. The same law says:

>directors of a corporation ... are clothed with [the] presumption, which the law accords to them, of being [motivated] in their conduct by a bona fide regard for the interests of the corporation whose affairs the stockholders have committed to their charge

What that means is up to interpretation, but I thank you for encouraging me to read up on it. It's clearly not as clear-cut a case as I first thought.

(better link: http://skeptics.stackexchange.com/questions/8146/are-u-s-com...)


Corporations have a duty to act in the interest of their shareholders, but this is broadly defined. There is no requirement to maximize profit.


>There's no junk, no spyware, nothing to remove. Occasionally you might get a software bundle, like Office, or Quicken, but literally just drop it in the trash and you are done with it

Microsoft does this too, if you buy a PC from them or from a Microsoft Store.

http://signature.microsoft.com/


It's sad that this exists (or has to), but I'm glad it does. I recently bought a Vizio ultrabook, which is a "Windows Signature" laptop. No crapware, trialware, bloat, etc. It came with a stripped down version of Office, Microsoft Security Essentials and Skype (which I guess could count as bloat, but I don't care).

Vizio seems to get it. A sleek industrial design. Well priced. And Windows Signature. It's hard to believe that OEMs have trouble grasping this given that Apple has done it for years.


System76 is attempting to do this. I liked the Gazelle Professional. Considering they're not an ODM, the Gazelle was pretty decent for its specs and screen when it was released. There was no bloat aside from vanilla Ubuntu, except a driver to improve the screen brightness buttons.

Still though, you can always throw Ubuntu on a Thinkpad. A small company like System76 can't match the form factor and battery life of products from Lenovo, Apple, Samsung, etc.


I'll second System76; I've bought laptops from them for a couple of years now and never had a bad experience. They only throw in hardware that Ubuntu supports and there's absolutely no bloatware. There are still things I have to put up with, like Ubuntu's recent decision to use Unity, but GNOME 3 is a single command away.


Just to provide a counter-anecdote: I bought a Bonobo Pro from System 76 in summer 2011. After about a day of use, it failed to POST. I had to mail it back to a repair center in California. It took over a week to get the RMA information as Sys76's Customer Service has very limited hours and seems stretched pretty thin. I needed the laptop back ASAP, and System76 made me pay $80-some to get overnight shipping labels both ways so that there wouldn't be 1-2 week delays each way.

I got the system back, and it seemed to work, but now I was getting a lot of crashes. I tolerated it for a couple of months as it wasn't so severe as to make the system completely unusable, but I eventually got sick of dealing with it. memtest immediately reported some severely damaged RAM, so I emailed System76 and again, after a protracted paperwork/support process, they mailed me replacement parts to install myself (the only alternative to shipping the whole laptop back off to California; the disk is encrypted, but I still was not looking forward to the potential of a careless tech wiping it and another $80+ bill to make sure it got back to me within the month).

I installed the new RAM modules (and I'll say that it was quite easy to take apart, only exception being the keyboard ribbon, much better than my old MBP) and things were going better, but I still get full system hangs in 3D games. I am concerned that there is a hardware issue with the GPU. Haven't cared enough to tackle that one seriously yet, as I do most 3D stuff on my desktop, but it's really annoying.

If System76 was better with their support processes, these would not be such a big deal, but with their non-cooperation in getting shipped repairs performed and returned quickly, inability to contract out or reimburse for local repairs, long RMA processes and limited customer service availability (no weekends, hours something like 9a-4:30p Mountain), it's just not worth the hassle.

Next time I will get a powerful Dell most likely. I just wipe them immediately and put Arch on anyway, so no concerns about bloatware/spyware/whatever.


Looked up their web site (https://www.system76.com/home/) - pretty nice stuff. They are in the right space for this business model. However, they've gone only halfway there. They assemble a system out of off-the-shelf parts and provide Ubuntu for it.

If they were doing a 'Sun'-type model, they would spec the hardware, get the Ubuntu distro, and integrate it with some key features. They would create an ABI and a DDI [1] that third-party vendors could rely on to work for the next few years and, when it changed, to evolve in a compatible way, and then sell that to end users.

The point of having a managed ABI/DDI is that someone like an AutoCAD would be assured that if they ported AutoCAD to the system it would always work on all systems, and if it didn't then System76 would figure out why and fix it. System76 would have to sign up to create a stable set of interfaces that provided all of the required features, so that you wouldn't need "autoconf" or "configure"; you could just add "#ifdef SYSTEM76" in your code where needed and know it would just work. Now System76 can build that out of existing pieces; they can say, for example, that "OpenGL will be available, link library is -lgl and include path is <opengl/*.h>", and those kinds of "restrictions" enable someone to maintain a version of their product for a 'small user base' (which, rounded to the nearest million users, is 0). Those same restrictions are anathema to many in the community ("Why force me to use Unity? I love Gnome/KDE/XFCE...") but they require a federating API if someone wants to code to them and support that code.
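As a rough illustration of what coding against that kind of guarantee might look like (the SYSTEM76 define and the -lgl/include-path promise are the hypothetical ones from the example above, not a real platform):

    # the platform guarantees OpenGL headers under <opengl/*.h> and a -lgl link library,
    # so the build line can be fixed rather than discovered via autoconf/configure
    cc -DSYSTEM76 myapp.c -lgl -o myapp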

The counterargument is "community support has created stuff that runs on everything without restrictions, look at MySQL or VIM", and that is a good argument, but for 'boring' things it works poorly, and 'boring' usually means tools of interest to the kind of user who isn't interested in quarrying the rock to smelt the iron to make their own lathe.

The third argument is that it's all going into the cloud, and people running computers that they compile on and stuff will be like people who spin fiber into yarn so they can knit their own sweaters: a niche, and a small one at that. I can see the merits of that argument as well but hope it doesn't win the day.

[1] ABI - Application Binary Interface, DDI - Device Driver interface


I'll third that; my Darter Ultra is still running smooth as glass, although I did eventually swap out Ubuntu for plain Debian, due mostly to personal preference.


> ...hardware designed from the ground up to be developed on, open systems so that folks can easily work with it, and a team dedicated to making sure integration and support is there so that folks like you and I can say "Hey this audio doesn't work when you set the sampling rate to 40Khz" and they can fix it and release that fix...

Why would that not describe any major Linux distro?


"Why would that not describe any major Linux distro?"

Because Linux distros don't build or spec hardware. A hardware company can make a platform that is 100% supported by a software distribution, but it is currently impossible for a software distribution to be 100% compatible with a HW platform which won't release details of its implementation. One of the things I like about the OLPC XO-1 was that everything was documented. Very cool that.

Trendy example, look at the Raspberry Pi. Now look at the graphics blob, now back at the Raspberry Pi. Can't get there from here. So there is an opportunity.


OK sure, but change the statement to "...any major Linux distro on any major hardware system" and it seems on the money to me. The only place I think Linux is lagging is graphics cards.


In my experience (and I use Linux at home (Ubuntu 12.x) and CentOS at work), Linux is lagging on graphics cards, wireless support, USB peripherals, file system stability, disk I/O scheduling, user interface tools, network printing, and document preparation. But other than that, it's right up there with MacOS and Windows.

Hmm, that sounds a bit snarky. I wasn't going for snark; that is a list of things of which I run into at least one, and often more, every week. My latest was trying to get some sort of drawing tablet support out of Wacom for Linux. They point you here: http://sourceforge.net/apps/mediawiki/linuxwacom/index.php?t... What is wrong with that picture?


It's a chicken-and-egg problem. Hardware manufacturers won't provide documentation or drivers for Linux because there isn't that big a user base, and there's no big user base because most hardware won't work with Linux... although I have yet to find the first piece of hardware that didn't work with Ubuntu on bootup, but maybe it is just me, because I buy hardware that after some investigation (googling for 10 minutes) I know will work with Linux. It's really easy.


But in regards to hardware support, the only reason that's really an issue is because people want widespread compatibility. You don't need to start a whole new company to design entirely new hardware; just pick your components with a little care.


As someone who's spent a lot of time debugging its custom hardware, that's absolutely not true about the OLPC XO-1: http://news.ycombinator.com/item?id=719048


Because major Linux distros and the upstream developers like to ignore or mock use cases that deviate from their "brand identity" (witness https://igurublog.wordpress.com/2012/11/05/gnome-et-al-rotti... which recently saw the HN front page).


The difference is that if you do not like the desktop on most Linux distros, you can go into the package manager, install another desktop, and then log out and back into that one.
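For example, on Ubuntu (xubuntu-desktop is just one of several desktop metapackages you could pick):

    # install an alternative desktop environment; after logging out, pick it from
    # the session menu on the login screen
    sudo apt-get install xubuntu-desktop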


You are both right of course. But in being so right we can see the problem. If I am a third party software package and I try to install I have to know all the possible window systems you may, or may not, have running. And it gets worse for me if I only support one since there will be vocal anti-support for any version I pick.

I really disliked the Windows95 window system which I was thrust into when I left Sun for a startup. But over time I learned its quirks so that I could get stuff done in spite of it and eventually came to appreciate what the developers were going for when they shipped it.

But had there been any way to go back to something like the Sun desktop when I first encountered it, I would have in a heartbeat. The change interfered with my productivity.

Linux gives you that chance: you can stick with whatever window system you want, as long as you're willing to recompile from source if it stops getting maintained. And maintain all of the packages that go with it, and maintain all of the utilities that adjust it, and maintain all of those 'throwaway' apps that you use from time to time. It wears on one to do so.


> If I am a third party software package and I try to install I have to know all the possible window systems you may, or may not, have running. And it gets worse for me if I only support one since there will be vocal anti-support for any version I pick.

Most software packages do not need to know what window system you are using, and when they do, it is almost always for non-critical conveniences in OS integration.

>Linux gives you that chance: you can stick with whatever window system you want, as long as you're willing to recompile from source if it stops getting maintained. And maintain all of the packages that go with it, and maintain all of the utilities that adjust it, and maintain all of those 'throwaway' apps that you use from time to time. It wears on one to do so.

That rarely happens with popular software. The most common thing to happen is that your preferred distribution switches window systems, in which case the actual maintainer of the system will continue to maintain it. Or, in the case of GNOME, the old version will get forked and maintained by another group. The only time the problem you describe will happen is if the developers of the window system abandon it, and it is not a highly popular system. This is far less frequent than the OS maintainer deciding that the software is not the one true way.


I haven't quite seen your hypothetical with hotels yet, but the low-cost airline market in Europe is pretty much converging on selling you a dirt-cheap ticket and then trying to market you to a bunch of other revenue sources.


> Time to start a 'developers' company that works very much on the same model that Sun Microsystems started on, hardware designed from the ground up to be developed on, open systems so that folks can easily work with it, and a team dedicated to making sure integration and support is there so that folks like you and I can say "Hey this audio doesn't work when you set the sampling rate to 40Khz" and they can fix it and release that fix.

Um. Any company that does that is probably going to go out of business, just like Sun. You know who's a "developer's" company? Microsoft. And if you drink their Kool-Aid, your Visual Studio-developed desktop apps will run smoothly across all sorts of hardware with a single compiled exe. Microsoft convincingly showed that hardware doesn't matter many years ago; in fact, their focus on software over hardware is arguably why they won, and they treat their licensed software devs well regardless of what company they work for. There are many instances of "Hey Microsoft, this doesn't work" and Microsoft responding with a custom piece of software to fix the problem. Some developers are angry about Windows 8 because they perceive that Microsoft is caring about them less, and that's probably true, but Microsoft will learn again that it shouldn't upset its developers if it wants to retain its supremacy.


I've had a soft spot for Microsoft ever since C# became my primary programming language. The .NET documentation is incredible, and Visual Studio works flawlessly with their languages. I'm loving their Kool-aid.


>in fact their focus on software over hardware is arguably why they won

Nope, it was their business acumen and willingness to kill off competitors.


> Some developers are angry about Windows 8 because they perceive that Microsoft is caring about them less, and that's probably true

C'mon, Microsoft just handed them an app store for the entire ecosystem. None of their technology stack was designed to do that before; surely some stuff had to change.

Assuming they survive, that's pretty cool.


I have been wondering if the tablet market will help us on the way to this. Once 'normal' people stop buying desktops and laptops the market becomes much smaller but more focussed on power users.

This trend towards app stores may also be pushing in the same direction.


What you describe sounds like what Dell's "Project Sputnik" is aiming to do.

http://bartongeorge.net/2012/11/06/project-sputnik-profile-t...

Looks like they plan to launch something soon. Hope it turns out well.


> The bottom line is that this 'experience' isn't about you, it's about who you can be sold to

To be honest, I didn't jump from Windows to Linux because of this. I jumped because Windows ME was just that bad. However, this makes me increasingly happy I jumped with each new version of both Windows and MacOS.

> Time to start a 'developers' company that works very much on the same model that Sun Microsystems started on, hardware designed from the ground up to be developed on, open systems so that folks can easily work with it, and a team dedicated to making sure integration and support is there so that folks like you and I can say "Hey this audio doesn't work when you set the sampling rate to 40Khz" and they can fix it and release that fix.

Bits and pieces of this are already here. We need a Wozniak to find the best way to put them together and create the new pieces we need. Too bad that of The Two Steves, Jobs gets the accolades and Wozniak gets ignored.


>Too bad that of The Two Steves, Jobs gets the accolades and Wozniak gets ignored.

Oh please, not this again. But OK, I'll bite:

Wozniak was given plenty of recognition for his work at Apple.

But he quit in 87, which was 25 years ago. What has he done since? Not that much, really.

Steve Jobs, on the other hand, has brought tremendous contributions to a number of industries.

In the light of this, it makes complete sense that Steve Jobs is a household name, whereas Wozniak isn't (but Wozniak is definitely a "hacker household name", which is what he deserves to be).


It's even worse than that, as Woz effectively left Apple after his plane crash in 1981. He continued to officially be an employee, but didn't contribute a whole lot after that point. Quitting in 1987 was more of a formality than anything.

The last Apple computer Woz had a direct hand in designing was either the Apple II or the II+ (I can't quite figure out which), over three decades ago. Woz was a genius when it came to minimal and effective designs and it's certain that he played a key role in getting Apple off the ground, but he has had no real impact on the company or anything the company does for nearly as long as I've been alive.

Woz helped build Apple into a successful company, but the scale isn't even remotely the same. The Apple that Jobs subsequently built decades later now sells more hardware in a month (or so) than Woz's Apple sold during his entire tenure.


Thanks to both you and the parent for thoughtfully laying out the history. Saying the "two Steves" as if they are parallel doesn't make sense.


I wonder if this trope would ever come up if one of them were named Bob instead.


I'm sure Woz continued to be an influence on Jobs for quite some time though.

I guess if you have seen somebody build one of the world's first affordable computers out of some bits in a garage, it probably seems like anything is possible when you have a few billion dollars.


Woz designed Apple Desktop Bus, first used in the Apple IIGS and then in Macintoshes until the switch to USB.

But your point remains; Woz hasn't done anything at Apple since 1987.


Yet his biography is called "iWoz"; why have people let him ride the fame train on other people's work for so long?


Exactly what contributions did Steve Jobs make beyond opening up markets through design and marketing power?

That had a large impact on a number of industries, but your statement implied multiple contributions.


Bringing tablet computing, multi-touch, voice control, quality screens (IPS etc.), video calling on phones, small utility software, and ubiquitous (real) web access to the mainstream (before someone attacks: yes, all these existed for a long time, but they only became commonplace after Apple pushed them)? Pushing hardware design way past what anyone else was doing in the past half dozen years? Cornering the music industry with a digital distribution model that works?

I can't even begin to enumerate the things you see everyday that are influenced by Apple, besides their own products.


The singular man acting as CEO did all this, and more! He is smarter than everyone else ever.

Or he just used smart and predatory business practices, and somehow was the only major player to realize eye candy will win over configuration for everyone that wants just a "TV" into the internet.


Well, just 20 minutes ago I was using my "TV" and its nice Unix shell to develop a cross-platform multiplayer game. And you could say it's usability and experience over configuration, not eye candy (remember Aero Glass? KDE Plasma?), although nice-looking interfaces can help with both.

No, he didn't do all this himself, but he was captaining the ship. Has Sergey Brin changed the way people use the internet?


"Has Sergey Brin changed the way people use the internet?"

Let me Google that for you.


You could just as easily AltaVista or Yahoo that, because people used those search engines originally, and they existed way before Google did.

Google, however, turned out a better product and has therefore captured the lion's share of the market. But the Google search engine is pretty much a copy in functionality of the ones that came before.

Of course Google has brought other things to the internet (Maps comes to mind), but the search engine didn't change how people used the internet - we were searching long before Google.


The same exact accusations could be (and have been) leveled at Apple. The fact of the matter is that Apple is a very, very good marketing company; as for revolutionary technology and being innovators, not so much.


Apple's innovations are myriad and subtle, particularly on the software side. Almost all of these things have been copied by the various Linux environments:

- Sub-second reconnection to wifi when resuming from sleep, using past remembered networks and IPs

- Bonjour zero-configuration service discovery for printers, network speakers, etc.

- Exposé window management and the GPU-accelerated desktop in general, not to mention just-in-time-compiled GPU accelerated image and video manipulation with CoreImage / CoreVideo.

- Spotlight indexed desktop search that's actually usable for real world filetypes

- Quicklook instant previews, not to mention native PDF support not involving Adobe Reader

- A mail program that autoconfigures based on just the email address, just by trying the obvious options

etc

Add in the fact that Apple is still the only company to have multi-touch work properly on a desktop, that their hardware has been dominating geek conferences for years because of its travel-friendliness, and there is plenty of innovation to go around.


While I started this comment thread, I do think Apple has done a lot of good putting quality back into hardware in many regards. It really pains me that I'm going to have them to thank for getting us out of the low-PPI dark ages, but it is almost exclusively their market pressure for high-density displays that will finally end the last decade of pixelated nonsense.


still does more in a year than Woz has in his entire career


Opening up a market is a contribution (singular) to the community, but it is a stretch to call that being a "tremendous contributor". Walmart was one of the major players in pushing USB drives and 3G modems to the public. Does this mean that Walmart is a tremendous contributor to USB?

In what way has multi-touch been pushed way past what anyone has done in the past (say, compared to Sears et al.'s work from 1990)? Actually, for any of the technologies on that list, in what way were they pushed way past what anyone else was doing in the past? Cornering the music industry (legally) was an achievement indeed, but a tremendous contribution? My thought goes back to Walmart and any product that they were the first to successfully sell.


He brought design sense to the company, making -- in my opinion -- the first computers ever made that did not look like shit.


When disagreeing, it would help if the person who downvotes would say what the contributions are instead of silently downvoting.

Hacker News isn't a poll of "do you like Steve Jobs" with a simple yes/no, and I am honestly interested to hear what those contributions are.


I didn't downvote because I disagreed (or agreed), but because your question seemed very accusatory and asked in bad faith, so it does not contribute to the discussion. A reworded question that made it clear you were honestly inquiring still might not get answered, but I wouldn't downvote.


That's because the question was rhetorical and its purpose was to state an opinion. Specifically, the opinion that Steve Jobs did not have significant contributions to the computer industry.

So I think you downvoted an opinion.


Really? Seems like a legitimate question to me.


Yeah, well, his contributions have been discussed numerous times, followed by people saying he didn't really have much to do with those things. This part of HN is so repetitive.


It's useful to reiterate the discussion in case new facts or concepts can be brought forth. Mostly, I ask the question to confirm (or disprove) my own opinion. If a person states disproven facts, one should always allow the person to back his statements up.


Well known inventors aren't well known because of their great inventions. They're well known because there's a corporate PR department promoting the legend. Ever heard of Thomas Edison? He founded a company named General Electric. Heard of Alexander Graham Bell? AT&T.

The reason fewer people have heard of Wozniak is because he left the company 25 years ago. He doesn't get the same PR treatment from the Apple marketing fund.

Now, here are some bonus questions: You all know who invented the transistor at Bell Labs, but who invented the LASER? How about the microwave oven? FM radio?

Actually, I don't know why few people know who invented the microwave oven, since the inventor was a big-wig at Raytheon. I guess Raytheon doesn't have much of a PR department.


So bored of people still giving Woz the spotlight.

He hasn't done anything worthwhile but trundle around on a Segway since leaving Apple.


The pieces are all there already, but you can't pull it off with disassociated independent groups. Canonical comes close, as does Red Hat; either could do this if they started making their own hardware, or contracting out a HW 'spec.'

But you need a 'real' company to say to someone like an Atheros that you'll design in their wireless chip only if they will commit to driver support for it on your OS. That sort of thing doesn't happen between a company and an 'interested third party' sadly.

The other thing folks will have to come to realize is that you won't be able to combine your "TV" and your "computer" much longer. The demands of content producers are reaching a point where 'general purpose computing' on devices that can show their content is made increasingly difficult, and eventually it won't legally exist. Secure boot, HDCP, all of these things are manifestations of externally exerted control over the platform.

The good news is that computers will go back to being computers. It is going to feel a bit weird for a while though.


> To be honest, I didn't jump from Windows to Linux because of this. I jumped because Windows ME was just that bad. However, this makes me increasingly happy I jumped with each new version of both Windows and MacOS.

Yup, every time I use someone else's computer I'm astounded by the sorts of things they put up with. A simple example - after years of using AdBlock and FlashBlock, I simply don't have any tolerance for ads on the web. I don't have the ability to ignore them.


"Now that Norton was gone, I personally happen to like Comodo firewall. So I go to download it, but the installer nicely tells me that this doesn't work on Windows 8, and I need to download another file instead. However, this is their pro product, which basically means it's filled with crap. It's a good thing I noticed the tiny Customize Installer button because otherwise it would have: Changed my home page, subscribed me to something called GeekBuddy, enrolled me to their cloud program, change my DNS servers, and sent information about each scan it does to the company. The same was true with many of the applications I installed, like Adobe Reader trying to install McAfee, or QuickTime trying to sign me up for offers."

I hope Mark Shuttleworth might read and understand this paragraph before he continues to reduce the trust of Ubuntu users. The crucial difference between users of Windows/Apple on the one hand and GNU/Linux on the other is that Linux users have choices at the distribution level. A secondary advantage for GNU/Linux users is that once a distribution has been chosen, application software is packaged centrally with some degree of oversight so this kind of bait and switch by software vendors becomes harder.

The Ubuntunauts at Canonical seem to be heading in the direction of selecting hardware which they can ensure will 'just work'.


It's hard for Shuttleworth because he has burned millions and millions on Canonical and it is now clear that the company is going to fail. People simply will not pay for Linux desktop support and there really is no other model. So all this crap they are doing, it is their death throes, and they should be pitied and not mocked.


It is not clear that Canonical is going to fail, just that their support model for making desktop Linux profitable failed. We are seeing them change their profit model with the Amazon links in Unity, and their attempt to get into the Android/desktop Linux hybrid market. It remains to be seen how these play out.


"People simply will not pay for linux desktop support and there really is no other model"

RHEL needs no competitors at all, then?


RHEL is a server product; I guess I don't follow your comment. Do they even sell desktop support?

If you are suggesting Ubuntu should focus on the server market - well, they have put a lot into that too, but they cannot get the market share required to sustain their business from that. RHEL is far too entrenched.


Red Hat quit the desktop market years ago exactly because they found it to be unprofitable.


https://www.redhat.com/apps/store/desktop/

Red Hat seem to be selling a Desktop support package still, but minus the support! I assume this is a cost for the installer and updates.

I've donated the sterling equivalent of $49 to Ubuntu, and would do that on a yearly basis if it meant a cruft free system.


Canonical doesn't select hardware to work; they certify that hardware does work.


> Bits and pieces of this are already here. We need a Wozniak to find the best way to put them together and create the new pieces we need.

No Wozniak required. Aren't there already companies that sell Linux laptops that "just work?" Really, the big challenges are in organization, distribution, and marketing.

How about this? http://arstechnica.com/gadgets/2012/11/zareason-ultralap-430...


And also these guys https://www.system76.com/.

I have bought two laptops from ZaReason, with mixed results. They mostly "just work".


IMHO you also need a Jobs who wanders around (a) selling the thing to normal humans and (b) being a relentless advocate for the user experience, with the power to delay releases if something is broken.


> being a relentless advocate for the user experience

The dark side of this is why I hate MacOS: There are things in MacOS I consider broken that I cannot fix, because Apple is dedicated to One Apple Way. Great for the Mac Fan, lousy for someone who has their own workflow.

Apple only has room for one Jobs, one person to dictate how the experience is. Anyone else has to bow to Jobs or GTFO.

The solution is good defaults with configurability maintained as a first-class citizen. Ubuntu has this, mostly to the extent it keeps non-Unity window managers and desktop environments in the Ubuntu package repos. I can still use all of the Ubuntu stuff except the tiny amount that really does depend on Unity, which wouldn't make sense with my workflow anyway.


> ... lousy for someone who has their own workflow.

"You're holding it wrong," indeed.

Your comment makes me want to try Ubuntu again on my MacBook. What I liked about it last time was that everything worked--just like on OS X. I guess the hardware premium and hardware monoculture are helpful even in the open source community.


> the hardware premium and hardware monoculture

... the culture of hardware developers that isn't focused on 'It works on Windows', the culture of driver developers that aren't (apparently) seen as loss-centers by the hardware makers, and a number of other things that slip my mind, I'm sure.

The only thing better is something like Stallman's current laptop, which was built ground-up to be Free and Open. Given Stallman's track record with 'crazy' predictions like 'The Right To Read', I fully expect to eventually end up on something like that as my primary machine.


> you have to not be able to get something 'good enough'

So true, so true. My applause, sir.


I feel that this is one of the major reasons I find myself using tablets more often.

The UX on PCs has turned to crap. All programs want to run in the background, they all compete for my attention every 30 minutes, they all want to be updated all the time and they all want to install some stupid toolbar into my browser that you can't remove, or even if you do it will somehow come back. Installing new programs has turned into a 10-minute minigame of "spot the checkbox that screws up your computer deep in the Custom Install settings!". Or sometimes I have to re-read a question in a randomly intruding dialog box 5 times just to figure out if I should answer Yes or No to their vague question for them to just please leave my computer alone. It's extremely annoying. I also can't remember the number of times I've cleaned up my msconfig startup. The programs just magically insert themselves back somehow over time. It's infuriating and exhausting.

I feel protected on my iPad from this torture.


There are other perspectives, though. I've been on a Linux desktop almost exclusively for 19 years now. I continue to be amazed at the amount of spammy crap I have to put up with on my phone and tablet (running CM10 and stock 4.0.2 respectively, so it's not like I'm beholden to a manufacturer). Seems like every game wants to tell me about new features in the status bar every day. All the online services want to connect me to all the others. Every day or so some app has an "upgrade" available that inexplicably needs new permissions authorized. And frankly it seems to be getting worse.

And my desktop distro just goes on, doing what it's supposed to do, year after year.


> And my desktop distro just goes on, doing what it's supposed to do, year after year.

What distro are you using? Because not all of them are free of this sort of crap. For example, Canonical decided to pollute the universal search in Unity, starting with Ubuntu 12.10, by adding in results from Amazon.


First of all, this guy's biggest mistake was failing to format his drive and start with a clean Windows install. Installing Windows requires far less effort than cleansing a vendor's install of crapware. Users should not have to do this. I must fully agree that MS needs to start restricting what vendors like HP can install on the PCs they sell.

Second, I'm very curious to see how the Windows App store plays out. The #1 thing it needs to do to improve Windows as an OS is distribute free software. There is a lot of excellent free software available for Windows, like Chrome, Firefox, VLC, foobar2000, Texmaker, Notepad++, uTorrent, etc. Users have to go to different websites to download everything. This is such a pain that people have come up with installers, like Ninite, that aggregate free software into a single download. Ninite doesn't have everything I use, but it can easily shave hours off setting up a new Windows box!

One major advantage of an app store is that software distributed through it can be policed for malware and viruses. If MS could get users to use their store as much as possible, there is the potential to improve the security of the OS. The only way MS can do this is to build their store up as a trusted and comprehensive distribution center that is all most users need. MS should view it as a failure on their part whenever users are forced to go elsewhere to get software, even software that competes with Microsoft products or duplicates core functionality of the OS. That's where Apple's App Store failed! To do this, MS needs to devote resources to lowering the barrier to publication in their store. Don't get me wrong, I am dead-set against Windows moving towards an entirely walled-garden, iOS-style ecosystem. The ability to install software from outside the store should be preserved and not limited in any way. However, I would welcome a central distribution point for Windows software like what Linux has.

Debian's APT package management system is brilliant. Even 10 years ago it would have made today's Apple App store look sad and pathetic. It is both comprehensive and incredibly smart in how it makes software modular with clear dependencies that are managed automatically for the user. This is the dream that all application stores should aspire towards. Redmond and Cupertino, for the love of your users, please start your copiers.
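
For anyone who hasn't used it, here's a minimal sketch of the flow on a Debian/Ubuntu box (the package name is just an example):

  # see what a package would pull in before installing it
  apt-cache depends vlc
  # install it; apt resolves and fetches all the dependencies automatically
  sudo apt-get install vlc
  # remove it later, and clean up dependencies nothing else needs
  sudo apt-get remove vlc && sudo apt-get autoremove

One search, one command, no toolbars.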


As I mentioned elsewhere, doing a clean Windows install is difficult if the only means of reinstallation you have is the bundled "recovery DVD" which puts all of the shit right back on there.

As regards an APT for Windows, I don't see why that couldn't be added to Windows as a third-party thing: a nice GUI with search, a default catalog of quality (mainly OSS) software with no toolbars or BS, and something that also keeps everything up to date.

That way you could just install it on the computers of all your relatives who use Windows and tell them to download everything from either that or the MS app store.


Microsoft provides official (clean) ISOs of W7 flavours -

http://www.mydigitallife.info/download-windows-7-iso-officia...

So, basically: burn the respective .iso to a USB key, boot from it, install, enter the key from the sticker on the bottom of the laptop - done. Caveats are (a) a limited choice of languages and (b) a lack of some brand-specific drivers, but if you run the English version on a common laptop, it's very straightforward.


These won't work with OEM keys supplied by the manufacturer. You would need to acquire your own key (enterprise, retail, etc.).


That is incorrect, at least for most retail OEM keys anyway.

I noticed that starting with Vista, the distinction between OEM vs Non-OEM key seems to have been reduced. This makes life easier when there is no recovery partition or the hard-drive is hosed. Whereas with XP, you did have to use an OEM version for the key to work.

I have not had any issue validating Windows using the above versions of Win 7, as well as a few Vista CDs that I believe are retail, as long as there is an OEM sticker on the laptop. Every so often I have to call in and do the automated telephone activation, but they are valid CD keys, and I think that is probably tied to how often the key has been activated.

Having said that, at least with Vista, the disc the manufacturer gives you is often locked to a specific laptop/bios/board.


>Every so often I have to call in and do the automated telephone activation

reason number 1001 why I hate windows


It takes exactly 30 seconds and it is fully automated. Punch in a string of numbers, hear back the activation code. No personal information, no humans involved. I too was dreading the over-the-phone activation, but it was a remarkably nice experience.


Sure, they make it as easy as possible, but it's not optional. And I like to change the hardware in my desktop, which makes it a pain.


>No personal information //

So, not your phone number, or the match of the given code with your computer's IP and usage when you go online?


They worked with three different OEM keys I had just fine, all from different manufacturers.


(b) is the big catch that will trip anyone up. If you want the hardware to work as designed, you have to start with the manufacturer's install and work your way down.


Never had any problems with Lenovo, Sony, Acer or HP laptops. I'm sure there are some non-mainstream laptop manufacturers that don't maintain their support sites well, but that'd be more of an exception than the rule.


You can easily download the drivers from the manufacturer's website but I understand that even doing that may be too complex for some users.


keep in mind that some (most) manufacturers don't seem to give any craps about the users either, and provide shoddy bloatware drivers that are difficult to hunt down, take forever to install, often don't work with older or newer OS/hardware, etc.

and then there might be a specific patch or workaround that's been applied by the oem that keeps the computer from exploding, but they sold it anyway because you can blow yourself up as much as you like AFTER voiding the warranty.


I haven't used it yet, but windows 8 has a 'reset windows' option that will basically reset everything to a fresh install of windows 8 without having to do a reinstall.


Nice, does it allow OEMs to decide what to "reset" to though?


"doing a clean Windows install is difficult if the only means of reinstallation you have is the bundled "recovery DVD" which puts all of the shit right back on there."

That's not a clean install.

A clean install is buying a retail version of Windows and blasting it down on your HD.

Recovery DVDs recover what the hardware vendor put there in the first place.


So you have to buy Windows twice?


Unfortunately yes. The version that comes with your computer is relatively cheap, but you can't do much with it. The retail version gives you full flexibility, and you can move it to a new computer.

Or you can buy a computer that doesn't come with Windows in the first place.


That is incorrect. See eps's comment above. You can use your OEM key with a Windows 7 ISO that you can get from MS.


I think you can't buy a retail version of Windows anymore. Plus, when it was available previously, you had to pay for it.


Actually, you can buy an update to Windows 7 right now for $39.99:

http://www.microsoftstore.com/store/msstore/html/pbpage.Wind...

You can wipe a Windows 7 install and do a completely clean Windows 8 install.


A comment yesterday [1] led me to Chocolatey NuGet [2], a package manager for Windows built on PowerShell. The gallery [3] has almost everything I need to set up a dev environment.

[1] https://news.ycombinator.com/item?id=4763895

[2] https://github.com/chocolatey/chocolatey/wiki

[3] http://chocolatey.org/packages
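
For anyone curious, once Chocolatey itself is set up per its wiki, usage is roughly like this; the package IDs below are just ones I'd expect to find, so check the gallery first:

  cinst notepadplusplus   # cinst is shorthand for 'choco install'
  cinst git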


I was thinking of it more as a way of installing stuff for more casual users.

Now that I think about it, this seems like such an obvious idea that I'm surprised nobody ever did it.


Ninite does a lot of what you are talking about.


It allows you to install some software.


First of all? Really? No. You should not have to format your hard drive when you buy a new computer. Nerds aren't the only ones buying computers, and the experience the author describes isn't how you treat a customer. You should never have to tell a customer to do some inconvenient thing just to use their new computer without hassles.

Go tell your mom to format her hard drive and let me know what she says. I don't know about you but for most of us the answer would be "what?".


The guy's biggest mistake was buying an HP anything. HP is the new RIM. And they deserve the same outcome for the massive quantities of crapware they've forced consumers to endure. The worst.


I consider myself a power-user, one who has grown extremely tired of doing the same re-install routine.

Ninite seems like just what the doctor ordered, and I cannot believe I haven't heard of it until now.

I know what I'll be trying out when we get new dev-machines at work and need to go through yet another reinstall. Thanks a bunch :)


Pretty sure this is exactly what MS is going for. The RT versions of the OS only run on ARM and are more locked down, because a lot of Windows software isn't compiled for ARM and/or hasn't been adapted to the RT ("Metro") API set.

But for the standard "Pro" versions of Windows available at Best Buy for $65, you can run anything you want, including a lot of legacy apps all the way back to Windows XP.


I've been a fan of the clean install and it is generally a lot better, but you'll often need to install third party drivers which will then try to install crapware at the same time.


> was failing to format his drive and start with a clean Windows install

HAHAHAHAHA IT'S LIKE I'M ACTUALLY LIVING IN 1997


Not computer users, just Windows users.

I never understood the install wizards for Windows software; 99% of the time you just click "Next" until it stops asking for more "Next".

And now they all sneak garbage adware with the check boxes enabled by default because most people just click "next" anyway.

Also, the whole download-and-install of drivers/utilities etc. after installation is very time consuming, because the OS has almost nothing bundled, and what is bundled is MS-only, like Internet Explorer; there's nothing you can choose in that "Next" wizard.

Windows is like a granny in the OS world, she needs to retire.


This is nonsense fanboism. Modern versions of Windows are extremely similar to OS X from a usability perspective. OS X has plenty of counterintuitive warts. Computers of any flavor are pretty hard to use.

> I never understood the install wizards for windows software...

I never understood why half the time when I download Mac software and double click it, nothing seems to happen because it's a "disk image" (talk about an outdated concept), which means I have to know to navigate to the Finder, find the actual thing I want, and then remember to unmount the image when I'm done.


One thing I noticed when I first bought a Mac was that the driver for my Lexmark printer was about 30MB and integrated properly with the system printing options. On Windows, it was more like 200MB, had its own custom print dialogue which could be themed and had an online gallery for themes which you could download. You could download a theme that turned your printer dialogue into a potato. Not even kidding. On Mac, the printing dialogue was just the standard OS X printer dialogue.

Why is this? I don't know. Maybe they put less, er, effort into the OS X driver because of marketshare. Perhaps it was a different set of developers and the Mac developers had some sense. Maybe it's not even possible to create custom-potato-themed printer dialogues for OS X. All I know is that since moving to OS X I've seen far less crapware and whilst you're right that OS X has some UI warts too, I find that it (and the 3rd party software you install) bugs you a hell of a lot less.


Modern versions of Windows are extremely similar to OS X from a usability perspective, sure. But that's Windows as defined by Microsoft, and OS X as defined by Apple. The problem occurs because third parties try to shoehorn all kinds of crap into Windows applications and onto the Windows desktop. There's nothing wrong with Windows in theory, but in practice there sure are a lot of people trying to fuck up Windows. It's much less of a problem on other operating systems.


> there sure are a lot of people trying to fuck up Windows. It's much less of a problem on other operating systems.

Possibly because Windows has many, many, many more users than other operating systems do.

It's like the old urban legend about bank robber Willie Sutton (http://en.wikipedia.org/wiki/Willie_Sutton) answering a reporter's query about why he robbed banks with "because that's where the money is." Crapware vendors focus on Windows because that's where the users are.


One of the major reasons genetic diversity is important is that it makes it harder for a population to be wiped out by infection. The windows monoculture is inevitably a victim of itself.


It doesn't seem wiped out yet.


It's not advantageous for a parasite to immediately kill its host. By 'wiped out' I don't mean the literal death of Windows installs; I mean it in the sense that a pristine Windows install will, on average, slowly accumulate cruft until it's on the verge of being unusable. It's not a 1:1 analogy. In a strictly practical sense, an individual Windows install may not be rendered inoperable by mal/spy/crapware, but the quality of the user experience goes way down as a result of it.


This is hilarious. You call others fanboys and then compare the horrific inconvenience of disk images (which I don't understand either, but it's what, double-click and then drag, right?) to Windows install wizards, uninstallers, etc. Get real.


Get real? Why the hostility? Eli's comment is aimed at how unintuitive OS X's system is, not how inconvenient it is compared to Windows.


In my opinion it's not "installing" apps on the Mac that is "unintuitive"; it's grasping the fact that a lot of applications are self-contained. Setting up an application is (generally) as simple as dragging it to a folder on your hard drive, and that can be unintuitive to users who are accustomed to Microsoft installers and package managers.

By default, OS X (at least for DMGs) opens a window with the extracted content, and most include "setup" instructions ("drag FOO into the Applications folder"), so it's hard to miss.


> it can be unintuitive to users that are accustomed to Microsoft installers and package managers

I was afraid of that at first, but after I briefly explained it to my father 3 years ago, he has never had a problem since (except with Flash Player).


This is a reply to a post that starts with "This is nonsense fanboism". "Get real" seems quite mild to me.


>This is nonsense fanboism

I'm not a fanboy; I didn't mention Apple or anything else. I used Ubuntu for a long time and it doesn't have these kinds of problems.


We're not really talking about usability in general - the particular issue of default installs being crapped up is pretty specific to Windows PCs (and maybe Android?).


What's wrong with Android installers? A single file with a single button to press.


The problem isn't the process of installing. It's the damned trip line installs along the way. If you install Adobe Reader, it shouldn't install McAfee for you. If I install your software, you don't have the right to install browser plugins. In terms of installers, Windows has developed a culture of malware that really needs to go.


Adobe pulls this maneuver on OS X as well. Other Mac apps will try to sneak in some global software for cool effects; I've had Growl installed three or four times without my consent. Obviously the Windows installers were worse on average, but I think OS X would have gone that direction if not for sandboxing. If sandboxing hadn't turned out to be awful I would upgrade. The idea is really good for users, but their implementation is lacking.


Don't install insecure crapware like Adobe Reader. Chrome has an internal pdf reader. I've personally also got Evince installed.


Likewise. I've managed to learn through the misfortune of others. That said, the way we find these things out is through the misfortune of others or ourselves.


> This is nonsense fanboism. Modern versions...

You lost me at the first sentence.


Me too, I clearly missed the bit in the parent where givan openly gushed about OSX being better than Windows at installing software.


> And now they all sneak garbage adware with the check boxes enabled by default because most people just click "next" anyway.

While I agree that OEM computers come installed with a lot of crap, that's not a Windows problem, that's an OEM problem.

> Also, the whole download-and-install of drivers/utilities etc. after installation is very time consuming, because the OS has almost nothing bundled

The OS part of Windows is about 2 gigabytes (tried looking for source, but couldn't find one. Although this explains how you can fit a bootable emergency windows suite on a flash drive). The many other gigabytes are drivers. Plus, it's pretty disingenuous to blame Microsoft for Windows not supporting 3rd party devices, even though Windows checks for driver updates automatically.


> While I agree that OEM computers come installed with a lot of crap, that's not a Windows problem, that's an OEM problem.

Windows is as successful as it is because of Microsoft's cooperation with crappy OEMs who are only too happy to betray their customers for a quick buck.

You can't have Windows without the OEMs - show me the Microsoft PC you plan to buy. Microsoft does the best they can, working within their constraints, but at the end of the day they're throwing their product over the fence and letting someone else package it.

And you can't have good OEMs, because Microsoft's PC strategy (with WHQL, PnP, UEFI, ACPI, and just about every other hardware initiative they've participated in) has been to make it difficult for hardware manufacturers to innovate in creative, non-standard ways without Microsoft's prior consent. Hence the race to the bottom among PC manufacturers. Cheap PCs, yes, but there's not much room for innovation that hasn't been green-lighted in some way by Microsoft, and so the only PC manufacturers who do well are those who can survive on thin margins.

So yes, it is a Windows problem.


Erm, no: the DoJ antitrust ruling means that MS can't dictate what the OEMs can or can't install on their machines.


If you go back and carefully read my post, I said nothing about whether Microsoft could 'dictate' to the OEMs - only that Microsoft doesn't want to sell hardware PCs and that hardware PC OEMs are almost necessarily bad because the market for PCs rewards low price and hardware component stats, not the overall usability, quality, or unique innovation of the product.

Microsoft under Gates and Ballmer demonstrated very clearly that they couldn't be trusted to exert their influence over the hardware OEMs. They couldn't resist using their power to hammer small start-ups selling potentially threatening new technologies. So yes, they've lost this lever that might otherwise have been used to protect PC buyers.


I think every OS has its own crap burden. On Linux you have complicated, unnecessary things that geeks will defend to the death. On OS X I recently tried to install Octave. How I did it:

Step 1: Create an App Store account. Credit cards aren't so common here, because bank transfers are basically free and you automatically get a debit card from your bank that is universally accepted. If you try to create an account directly, you will fail if you don't have a credit card. You need to "purchase" a free app to be able to create an account without credit card information.

Step 2: Download Xcode and skim the licence agreement for 10 minutes so you don't sell your organs to Apple.

Step 3: Find the Command Line Tools installation hidden somewhere in the options of Xcode.

Step 4: Install MacPorts.

Step 5: Install Octave with MacPorts; this encompasses half a Gentoo installation. Things already present on the system, like the LLVM from Xcode, get rebuilt. This takes hours.


You'd have had a better experience had you:

1. Installed Command Line Tools for Xcode instead of Xcode (you don't need the IDE now for build tools, and yes this used to be painful)

2. Used Homebrew instead of MacPorts; it's more actively maintained and generally works better.
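
For what it's worth, a rough sketch of that route, assuming the Command Line Tools are already installed; the formula name is what I'd expect Homebrew to have, so check `brew search octave` first:

  # install Homebrew itself per the instructions on its homepage, then:
  brew update
  brew install octave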


>I never understood the install wizards for Windows software; 99% of the time you just click "Next" until it stops asking for more "Next".

Well, if you use the Windows Store to get applications, you can finally get rid of them.

>the OS has almost nothing bundled

Really? Windows 8, for instance, comes with much of what you'd need. It bundles a lot more than previous versions of Windows.

>nothing you can choose in that "next" wizard.

I'm not quite sure what this means.


>I'm not quite sure what this means.

I was referring to the fact that Microsoft does not bundle multiple browsers to choose from and forces everyone to have IE installed; unfortunately, most users don't understand what a browser is well enough to fix this themselves.

And this is the reason web developers had nightmares with IE6 in the past and some still have with IE7.


Microsoft shouldn't bundle multiple browsers. Now, I can see your point that they don't exactly provide others (except in the EU)... but I'm not sure this is really such a bad thing.


What operating system does bundle multiple browsers?


By that logic MS should also bundle multiple video and audio players, calculators, image editors, file explorers, email clients and who knows what else. And what's that if not bloatware?


> Not computer users, just the windows users.

Well, on a new Ubuntu installation the first thing I have to do now is:

  sudo apt-get remove unity-lens-shopping
But still way better than the crap windows users have to put up with.


yeah, I definitely prefer 'sudo apt-get install this-thing && sudo pray-to-god-this-works'


There's a pray-to-god part, true. Largely when it comes to hardware support, which may require some unpleasant steps. Or may not work at all, though that can be usually known beforehand.

But once you get the hardware to work, apt-get install really is a great way to get quality software. You list a bunch of stuff and it will get installed. Without trying to trick you into bundled crap, without charging you, with quite decent documentation included. If you run into trouble, there are people you can reach out to -- other users of your distro, the mailing list, even the authors. It's not perfect and may be time consuming, but it's head and shoulders above what regular users of other systems can get nowadays.

More than that: packages are maintained by people who use them themselves and will usually get updated in a timely manner, with minimal effort required on your part.


Actually, there is a certain class of software for which this works amazingly well. Installing Apache/PHP/MySQL on Linux with apt beats MacPorts or plain Windows so hard it's not even funny.
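
e.g. on Ubuntu/Debian the whole stack is roughly one line (exact package names shift a little between releases):

  sudo apt-get install apache2 mysql-server php5 php5-mysql libapache2-mod-php5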


The installing part is easy, I agree, but trying to uninstall any of them leaves a mess behind. Configuration and other misc files are left over, so if your program does not start because the conf is corrupted, uninstalling and reinstalling doesn't fix it. It seems that Ubuntu never lets the program know it is being uninstalled, leaving behind a string of custom-generated files.


Well, that's what purge rather than remove is for; does it not reliably wipe out configs?
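
For reference, a quick sketch of the difference, with postfix as an example package:

  sudo apt-get remove postfix   # removes the binaries, keeps packaged config files
  sudo apt-get purge postfix    # removes the binaries and the packaged config files
  dpkg -l | grep '^rc'          # list packages removed but with config files still present

The catch is that purge only covers files dpkg knows about, so logs and locally generated state can still be left behind.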


I did try it on some occasions, but it too didn't do the job. It couldn't recognize the custom-generated log and config files, so I was stuck tracing those files and removing them manually. If I remember correctly this was for Postfix/Dovecot installations.


Well, the last time I used Windows was sometime between 2006 and 2007. Since then I've never had to pray; everything has worked just fine and smoothly up to now.


After almost 20 years of running Linux, FreeBSD (and even NextStep!) on x86 bare metal, I've for the time being gone back to Windows.

It's a fast dual-head box. One screen is all Windows (and I'm typing this from IE 10), and the other is xmonad and emacs on Arch Linux on VirtualBox.

I'm also an Apple user (and AAPL shareholder), owning two (2010 and 2011 model) MBPs, and it is my opinion that Windows 8 is gorgeous and very, very stable. Microsoft has finally simplified the desktop experience, with sane security defaults, and in my opinion is very close to, if not on par with, Mountain Lion on ease of use and attention to detail.

Finally, the system's performance as a desktop machine is at least 25% better than Unity/GNOME 3/KDE, and the overall fit and finish is years ahead of any leading open source DE. It was the desktop Linux experience that encouraged me to look elsewhere.

I'd encourage everyone to keep an open mind wrt MSFT. They're on the right track. This is not the same company I grew up hating.


Although I haven't tried Windows 8 yet, I had the same experience using Windows 7.

It's interesting that only 7 years ago you probably wouldn't have been able to sit me in front of a Windows machine without resistance or at least some complaining. Since then I have turned out to be a frequent defender of Microsoft in my surroundings, and I'm considering ditching my Apple ecosystem for Windows 8 (esp. because of Surface and Windows Phone 8).

But the real problem, as others have stated, is OEM software and 3rd-party apps. It takes effort to maintain your system (although not much, at least in my opinion, but you have to pay attention to, for example, what gets installed along the way). I usually have none of the problems my peers or friends have (with Windows), and if you look at their machines, you'll see why. Almost every time I visit my father, there is a new (worthless) toolbar installed.


Sorry, but the skeptic in me says this comment reeks of advertising. I don't know if it was your intention or not, but it feels like you try really hard to establish "geek cred" and then make some grandiose claims about Windows 8. It seems like you were comparing Windows 8 to a really non-standard Linux environment, and then out of nowhere you claim performance is at least 25% better than three DEs which weren't on your dual-head box. Odd.

Extraordinary claims need extraordinary proof (some benchmarks, maybe), you are not providing any of that.


I think the Linux environment is probably more common than you think, among devs anyway.


Your setup is close to mine, which is Win 7 running VirtualBox and Linux Mint. I've not used a Mac. I like developing with Linux. A friend recommends ditching my double-OS setup in favor of Apple. My question to you is, why don't you use only Apple / Mountain Lion? Why bother with Win 8?


If you like to combine the Win desktop environment with the Linux dev environment, other than VirtualBox you could also check out MinGW if you haven't yet.


Obvious astroturfing is obvious. How many of you guys did Microsoft hire this time around? Guess they learned a lesson with the Vista failure.


I had a good laugh at people recommending Ubuntu. I recently installed the very latest Ubuntu on a VM, started the launcher, typed "terminal" ... and looked in utter disbelief as it started showing me random movies and crap from Amazon.

If that is your idea of a better operating system, try again.


Maybe Fedora would work better for you. Or Debian. Or one of the numerous Ubuntu spins which do not have ads.

The point is that there will always be a Linux distribution which focuses on the user, because a lot of them are made by the community.


I'm not an Ubuntu defender, but honestly, aren't a couple of Amazon ads better (or let's call it less bad) than all the crapware the OP talks about?

We are talking one command versus hours of fighting against a whole system.


Yes, certainly. I dimly remember HP as being a particularly cruddy install.

The Amazon search in Unity home lens (where you go to launch programs &c) is attempting to be a dynamic search so that results narrow as you type. Network latency means it sort of 'jumps' in groups of letters. Sometimes results corresponding to part of the words you type appear. This can produce 'interesting' results.

http://ubuntuforums.org/showthread.php?t=2063868


I think I'd rather stay with crapware.

Companies, I don't care about your offers. When I want to buy something, I'll Google myself a better deal.

Damn, it's just so wrong...


I have a good laugh at people who dismiss an entire OS for a problem solved with 10 seconds and 20 characters of effort.


Go read the article again. Now consider how inane what you just said is.

The whole point of the article is that each of these issues can be solved, but that the combination of all of them eventually makes the platform bad.

Your "just turn it off" statement couldn't be more off-base for this discussion if you tried.


Give me a break. `sudo apt-get remove unity-lens-shopping` or as the other replier comments, a single check-switch will disable this. It clearly is NOTHING compared to the daily bullshit in Windows.

What a self righteous jerk. The article is about the regular bullshit that Windows users go through. A single choice made to bolster revenue of an open source project is vastly different from a system that simply hasn't been innovated on for the past 20 years and just sucks in a lot of places.

If you can spend 20 seconds of effort and fix breaking issues like installation standardization, or (un)installation cleanup, or package management, or the new security system that manages to suck harder than GateKeeper... then you can come back and talk down to me.

edit: even just drivers. Do you know how frustrating it is to be given a laptop as a gift from Microsoft, only to find that the drivers are difficult to locate, impossible to install without several admin-level command line statements, a reboot and several scary warnings, and that even after installing them, several of them were just disastrously bad? Fedora, everything works out of the box. Ubuntu, everything works out of the box.


The difference is solely one of scale.


Or by clicking a switch in system settings.

Explanation for those outside the Ubuntu bubble: 12.10 has had an Amazon search integrated into the main Unity lens, and this is enabled by default. Early in the release of 12.10 a graphical 'kill switch' was added to system settings. Before that, you had to uninstall a package manually from the command line.
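
If I remember the schema right, that graphical switch just flips a gsettings key, so the command-line equivalent should be something like this (key name from memory, so double-check it):

  gsettings set com.canonical.Unity.Lenses remote-content-search none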

As I commented up the screen, I hope Canonical don't alienate their users with (any) more of this sort of thing as switching between GNU/Linux distributions is relatively easy.


You didn't get the point of this comment and article at all. The question is not the quantity of things; it's that a hardware vendor (or here, a distribution maintainer) would think so little of their customers that they're ready to just shove advertisements and annoyances in their faces by default.

And there's no magic switch to get them back to a point where they recognize that users matter.


No. On a daily basis, I don't run into walls in Linux. In Windows, I do regularly. The issues discussed in this article stem from poor design in Windows and legacy shit that hasn't been updated. Just insert my other reply here, it's still perfectly applicable. If you can fix these issues at all yourself then they're not issues.


I remember reading that this is the reason[2] Steve Jobs wouldn't license the Mac OS X operating system to other OEM vendors. As I understand it, a good deal of this crap is caused by misaligned incentives. Software companies pay OEMs to preload computers with their buggy, invasive, resource-hogging software. Thanks to the Microsoft Windows operating system having all sorts of nooks and crannies to hide this sort of stuff in, like msconfig, the "PC", with which Windows is synonymous, is seen as a painful platform full of crashes and arcane bullshit.

Of course, the cure is presented in the form of locked-down, proprietary platforms which shelter users from software companies' self-inflicted wounds. As time goes on, these platforms will slowly come to resemble their former counterparts, but without the relief of being able to uninstall the crapware.

I think that priority one for people interested in the future of computing is to fix the experience for the 99%.

EDIT: The answer isn't necessarily other open systems like Linux or Haiku; OEM manufacturers can screw up the default installs on those systems just as well.

EDIT[2]: Part of the reason anyway.


> the reason Steve Jobs wouldn't license the Mac OS X operating system to other OEM vendors

The reason Jobs didn't license out OS X is because the last time Apple did that it almost killed the company. One of the first things Jobs did on returning to Apple was end those agreements.


Do you have a source for that?

My recollection was that Apple was in dire straits at the time anyhow, but that the clone makers were substantially increasing MacOS shipments at a time when Apple was threatened with irrelevancy.

I thought Jobs killed the clones because he wanted absolute control, which he saw as necessary to pursue his goals for a high-end, seamless experience.


The explanation I have always heard is that the margin on the software license (for OS 7) was not high enough to make up for the loss in hardware profits. The problem was exacerbated because the clone makers were targeting the high-margin top-end of the Apple line. (In other words, pretty much what the parent comment said.)

E.g., this blog entry (skip down to "Amelio") --

http://blog.adamnash.com/2008/04/16/reminder-why-apple-kille...

Here's Jobs on the subject, indicating it was unwillingness of the clone makers to accept higher license fees:

http://www.youtube.com/watch?v=maIgu_7oLm0

Of course, just because Jobs is saying it, you can't know if this is the whole story.


Good thing he didn't. One of the reasons I enjoy working on OS X is the "mintness" I got accustomed to from using Linux.


No matter how hard I try I can't understand this comment. Are you implying that Linux is a Mac OS X clone?


I think what he's trying to say is that on a Mac you get a fresh OS every time. It's never preloaded with crap. If you don't like the Mac you'll find a way to tell me I'm wrong, but there really isn't any third-party crapware on the Mac. You don't get security alerts every 2 seconds, and uninstalls really do just uninstall apps. You know exactly what to expect when you buy a Mac, and the experience stays roughly the same throughout the lifetime of the machine. You can get crapware on a Mac but it's pretty rare.

So basically you get a machine in mint condition. And yeah, it is like Linux in that when you install the OS it's the OS and nothing more. You can talk about freedom and locked-down platforms all day long and I'll even agree, but that's neither here nor there. Point is, the Mac isn't screaming for your attention, doesn't come preloaded with shit, and generally doesn't fuck with you, in the same way Linux doesn't do those things. There are exceptions to every rule and god knows you have to cover them all here on HN, but generally speaking that's the way it is.


Yep pretty much that and some. I find not only the install and out of the box experience very good on both systems, but also usage.

As a long time Linux and GNOME user I got accustomed to an arguably better experience in managing software and the user experience :-)


Perhaps the answers are:

1. 1st-party OS installations in which the OS creators sell you the hardware with only the OS installed (Apple). Or possibly 3rd parties can license your OS only if they don't install any additional software (sort of like Nexus phones). Or obviously you could install Linux distros yourself... which is not appealing to most people.

2. Monitored software installations either in the form of (A) community managed software repositories as done with Linux distros and the BSDs, and/or (B) "app stores" with restrictive policies and sandboxed apps.


MS can't dictate such licensing terms because of the stupid antitrust ruling.


The cure could also be MS limiting what OEMs can do to the pre-installed OS. Of course, if OEMs switch to free OSes, the copyleft would make it impossible for the OS maintainer to make any such demand. Another solution would be consumer pressure not to crap-load PCs. This is one of the main reasons Macs did so well, but there is no reason an OEM can't build brand recognition for non-crap-loaded PCs.


The DoJ antitrust ruling means MS can't dictate what OEMs can or can't install on their PCs. Just another way the stupid ruling has hurt consumers more than it helped them.


I bought a Vizio thin and light partially because it came with what Vizio and Microsoft dubbed "Windows Signature." Which is essentially marketing speak for "Windows. Just Windows. No crapware preinstalled."

It was almost weird to be able to take a PC out of the box and be able to immediately start using it.

Microsoft apparently worked pretty closely with Vizio on this, which suggests that Microsoft knows that the experience on most Windows PCs sucks.


They do (I have friends on the shell team @ MS). It's just really hard for them to hit the right balance between compatibility, which is _super_ important to their bread-and-butter enterprise customers, and providing a clean, crap-free experience to consumers.

For better or worse, the "consumer" user of Windows -- a person at home, without an IT support desk, who purchased his computer himself at a store -- captured a lot of attention inside Microsoft for this release.

I'm not at all surprised to see more Apple-like behavior from them, e.g. (1) restricting what Windows RT can run to the software available in the windows app store, (2) completely breaking app compatibility for older (WinForms) apps on WinRT, (3) getting way more draconian about hardware requirements for the Phone (specification of button counts/sizes, etc.), (4) having a "Signature" edition of windows.

MS has been fighting this battle for decades -- most blue screens were caused by crappy hardware drivers; they knew this but couldn't fix the problem without hugely breaking compatibility.


This is one of the reasons I like running Linux, although this is really mainly a feature of having a smaller userbase and being mainly OSS so it is less of a target for this BS.

The only alternative for the mainstream seems to be the sandboxed type approach of iOS/WinRT. Oh, some developers abuse browser toolbars? No browser toolbars period! Background processes can make your system slower? No background processes period! Software from random websites can be viruses? Buy all of your software from us!


Yeah, as a Linux user this article was just a list of reasons not to go back to Windows.

It does seem ironic that the product you pay for is far more anti-consumer than the free (in both senses) alternative. But I don't think that fact can be denied.


It is inherently anti-consumer by being non-free software. Honestly, the FSF did call this crap, for all those who call them insane. They are insane, but they are sensibly insane.


Just wait until OEMs start shipping their own distributions...


Note this is an aside on USEFUL background apps, not spyware/crapware, since it appears he disabled the update mechanisms in the last two paragraphs:

Most of those background service apps (Google Update, Apple Push, Adobe Update) exist because a tiny background app is necessary to perform certain functions and check for updates in the background, which is a good thing!

For example, 'iTunesAgent.exe' on Windows was to simply detect if you had plugged an iPod in and auto pop-up iTunes. For usability, it makes sense; because otherwise you would plugin an iPod and get nothing or a confusing Windows 'dunno what to do with this drive' screen. And because of Windows' architecture (for better or worse), it requires a background app to do this.

The same goes for many of the updaters. I would much rather be sure Flash and Java are updated than go running without an important security update.

I still have yet to understand the "no background apps, good or bad" manifesto. Good background apps barely use any CPU time, they barely use any memory (and Windows reports too much memory use in the first place due to shared memory), and they perform useful functions.


In cases of useful background apps (and there are few), they should be run from the Windows scheduler. They don't need to always be running.

Why do we need 30 individual software update checkers running anyway? Linux solved this problem many years ago.
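
i.e. one mechanism covering everything installed from the repositories, roughly:

  # one command (or one scheduled job) updates the OS and every installed application
  sudo apt-get update && sudo apt-get upgrade
  # or let the distro handle it automatically
  sudo apt-get install unattended-upgrades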


Actually it is a bad thing since system updates should be a centralized feature and not something that should need to be reinvented (badly) by every other program you install.


There's a difference between "background app that quietly does its job and stays out of the way" and "holy fuck, what the hell is using all twelve cores and 8 gigs of RAM and making Google take 30 seconds to load!?" Running your software on my hardware is a privilege; don't abuse it, and I won't delete your oh-so-important crapware. That includes not nagging me to update and/or register.


A few added things:

- To those suggesting using Linux: I use Windows because I need to. I run programs that require Windows, and don't really want to start messing with Wine. Also, from seeing the recent Ubuntu stories here on HN, it really does seem like the people behind Ubuntu, and GNOME in particular, are guilty of the same thing with their "brand".

- While Apple probably doesn't do this on MacOS, they sure do try to get your email to sign you up for "offers" when you download QuickTime.


It would be interesting to see the list (or at least types) of programs you use that require Windows. I've got a CAD program that I've been using forever that does (TurboCAD) and I'm more facile in Corel Draw than Inkscape (which is horribly, horribly under-performing on the same hardware booted into Ubuntu 12.10).


Does Corel Draw use some kind of graphics hardware acceleration (e.g. Direct2D)? I find that GTK software completely ignores hardware accelerated rendering most of the time; Cairo (which renders most of the GNOME UI now) is pretty much completely a software renderer, and the GNOME 3 desktop only uses OpenGL for window compositing, not most drawing, as far as I'm aware.


Coincidentally perhaps, both Corel and TurboCAD use accelerated graphics for rendering.


For many this list has just one item:

- Adobe Photoshop

If you rely on it for work, then you can't just switch to Linux. GIMP is closer to Paint than to PS.


People say GIMP has a different/worse UI, but roughly similar capabilities. Do you have a citation or basis for claiming GIMP is so much weaker?


I would argue that having a bad UI is a big enough problem.

I last used GIMP in 2010 so a few of these might be available now. Anyway here's my list:

- No support for raw images.

- No 16 Bit and 32 Bit color mode.

- No adjustment layers.

- No support for Photoshop plugins.

- Lack of LAB and CMYK color modes.

- No 3D support.

- No video support.

- No integration with other Adobe tools (e.g. copying paths directly from Illustrator).

- Lack of usable transformation tools.

Also even though GIMP supports psd files, once you exceed a few hundred megabytes, it gets really slow compared to PS on the same machine.

The only thing that's better in GIMP is the lasso tool. I hate it in PS, but it's actually usable in GIMP.

If you factor out price (which you can and should if this is the tool that makes you money) GIMP doesn't even come close.


You are contending that Photoshop requires Windows? You sure you want to go with that?


Yeah, I do. (Though it runs on OS X too.) What's wrong with that?


Apple only use your email for sending Apple software and hardware news.

They don't sell you out to 3rd party advertisers like Microsoft has sold their Metro and Xbox UIs.


Agreed that the base installs are rubbish.

But do a clean install of Windows 7 or 8 + ninite.com. Windows generally has the drivers you need out of the box, or at least the NIC/wireless driver, so Windows Update can find the rest for you automatically. I can't remember the last time I had to manually get a driver for something, with the single exception of a USB-to-serial adapter I need for work.


In addition to ninite, pcdecrapifier[1] is also handy, even for cleaning machines that aren't brand new. Doesn't look like it supports windows 8 yet though.

[1] http://www.pcdecrapifier.com/


Windows 8 has a factory reset button... this would be a great option, except I expect that on retail systems it would reinstall the crapware. Or maybe not?


A lot of laptops are different. The Lenovo T520 I'm using right now has an ethernet chip that the OEM version of Windows 7 couldn't figure out, and I had to track down what the chip was before I could get on the internet.

So it isn't all roses in the Windows world with drivers. At least there was a driver, though.


And I fear this kind of crappy PC experience is a significant factor in why many people move to primarily using tablets.


The problem with mobile devices is that when they are prefilled with OEM crap (mine came with Motoblur, Blockbuster, and several other terrible media-streaming apps), there's absolutely nothing you can do about it. Windows may have a terrible default experience, but at least it's fixable.


OK, admittedly the fact that I stick to Google Nexus and Apple brand kind of makes me oblivious to the existence of all that crapware. But as far as I can tell, some people actually like what HTC and Samsung are doing with their Android.


The customizations that people like on Android phones are usually the skins - Sense on HTC phones and TouchWiz on Samsung's. Many fewer people use/like the Verizon-specific app store or V-cast or whatever their proprietary video service is called these days, which are more directly equivalent to the crapware referred to in the article.


Every time I try to launch the turn-by-turn GPS on my Verizon Droid2, I get prompted to use vanilla Navigator or some Verizon crapware. Checking the box to make one the default does nothing. This is absurd.


Go on then, tell me how I can uninstall Facebook and Google Goggles from my Nexus One?

(Slightly tongue-in-cheek - my Nexus One is nothing like as bad as my Galaxy Tab for crapware that comes pre-installed.)


You have heard of rooting and using something like TitaniumBackup to force remove apps, right? And of course, there's always the option of going with the clean versions direct from Google.


With my particular phone (Droid2 Global) you can't root if you've upgraded to 2.3. Nor can you downgrade to 2.2 and then root. Basically I'm just stuck on castrated Gingerbread for the life of the phone.


I'm sorry. That's lame. I recently (finally) got my wife's phone rooted, and it was a pain. The whole experience has me considering writing a scathing critique of the whole "just root it" culture of Android, which is, I have to admit, very similar to the responses here of "just wipe and do a clean install". It wouldn't be so bad, except that there is so much unintentional disinformation, so many broken links, and so much general nastiness in trying to prevent people from messing with their own hardware. I've had easier times developing embedded software with sh!t BSPs! I blame not just the manufacturers (Samsung, I will never buy an Android device from you; between rooting my wife's Samsung phone and the Tab from work with its non-standard port and refusal to charge off anything but the supplied charger, just no), or even the carriers; the community really needs to get its sh!t together.
