MS has some very talented programmers. They're not very common, but they exist. The problem is that the entire company is completely and totally focused on developing an absurd number of new features and products, giving them completely unrealistic deadlines, and then shipping software on those deadlines no matter how half-assed or buggy it is.
The idea is that everything is serviceable over the internet now, so they can just "fix it later", except they never do. This perpetuates a duct-tape culture that refuses to actually fix problems and instead rewards teams that find ways to work around them. The talented programmers are stuck working on code that, at best, has to deal with multiple badly designed frameworks from other teams, or at worst work on code that is simply scrapped. New features are prioritized over all but the most system-critical bugs, and teams are never given any time to actually focus on improving their code. The only improvements that can happen must be snuck in while implementing new features.
As far as M$ is concerned, all code is shit, and the only thing that matters is if it works well enough to be shown at a demo and shipped. Needless to say, I don't work there anymore.
It's not just Microsoft, but this "update culture" is pervasive in software today. New features make for exciting marketing but in the end what most users probably want more is stability. All the stories of forced updates and accompanying reboots (with not even a chance to save work?) remind me of the old criticisms of Windows being so unstable it needs to be rebooted frequently, and how uptime was a highly-valued result of good system design.
I personally think near-daily updating to fix bugs that should've been discovered and fixed before actually shipping product is absurd. It's been argued that this is because systems are more complex now, but maybe we don't actually need all that complexity...
Most users say they want stability, but when it comes to actually paying, they will unfortunately only pay for features, especially shiny new ones that demo well. This is the challenge software companies face.
I pay twice the price of the hardware to be able to use Mac OS X, which is stable. Except, since 10.10, it kernel-panics once a week. I'd pay 100-200€ per year for a good OS.
It's a very, very long shot, but could it be that you have VirtualBox 4 installed?
With 10.10 and 10.11 I had the same issue, caused by the kext from VirtualBox. Removing it and installing version 5 solved everything, and I haven't had a kernel panic since.
If it's not VirtualBox, I'd look at the other kexts you might have installed.
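A quick way to check is to list the loaded third-party kernel extensions (kextstat is the stock OS X tool; the grep just filters out Apple's own):

    kextstat | grep -v com.apple

If anything from org.virtualbox shows up and it's a version 4 install, that would be my prime suspect.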
End users have (mostly) never paid for operating systems. I use Windows mostly because that's what the applications I use require. There's nothing about Windows itself that is compelling and frankly, Windows has been good enough for a long, long time.
I really wish Microsoft had stayed the course and maintained separate desktop and mobile operating systems. Windows CE had problems, but they could have fixed that without messing with Windows.
Yeah. Everyone needs to justify their existence. While quite a lot of people thought Win7 was great and that Microsoft should just stop there, maybe with a few minor improvements (e.g. fix CMD.EXE), they decided to completely change it in response to the tablet "threat".
I have a cheap Win8 tablet. It's weird. Quite slick in a lot of ways, but you can go back to the old UI and see where all the bodges are.
I develop daily on Windows and Visual Studio. The latest versions of both are so horribly unstable and buggy that our team's productivity is being seriously damaged. What the hell happened to Microsoft's quality control? In Windows 7 and earlier versions of Visual Studio you didn't feel like you were walking on thin ice all the time.
Mac is a much more viable choice today than in the past, and I think the Mac's slowly but steadily increasing market share has something to do with the perceived reliability and stability of the platform.
I'm only basing this off of personal experience. I have Windows at work and Mac at home.
I'm sorry, but if we're talking about buggy features, OS X is not exactly something you want to point a finger at as a good example of an OS that implements its features in a stable way.
Thankfully, OS X is cheap/free. I use OS X, Windows, and Arch literally every day of my life, both at work and home.
To be quite honest, I blame consumers for all this nonsense. No one wants to actually pay for software. I can go into this in a more nuanced fashion, but I seriously think software across the board is a good bit cheaper than it needs to be, from OSes to games and everything in between.
>they are able to charge a higher price due to having OS X as part of the product, and they do.
That's a suspect claim. I am not saying the cost isn't accounted for, but claiming that they charge a premium simply because they have OS X is dubious at best. I don't consider OS X a selling point in any regard and I'm not sure even Apple could. Of course this is largely my opinion. I would find it kind of funny if they really considered it a selling point.
I always thought that having OSX sorta was the selling point. If I'm only comparing computers just based on the parts inside them and their capability then there's little to no reason for me to buy an Apple computer because other than some sorta neat things like the hybrid drives or some of the Thunderbolt integration, they're generally much more expensive than building or buying another computer.
But for people who value specific software that is OSX-only or some of the features only found in OSX (like CoreAudio for music stuff) or the general interface polish and perceived stability, it doesn't matter if the Apple option costs $2000 and the other brand/build with similar core components costs $1000. If it doesn't run the platform you want/need then it doesn't matter that it's more bang for the buck.
Basically I default to Windows because for the things I do on a computer, it's got the most flexibility and options for hardware and software. Linux is missing too many applications I need/want and OSX only runs on machines that cost a good deal more while offering nothing I really need enough to justify the cost. But when I've needed or wanted to run an OSX application in the past (Final Cut Pro in my case several years ago) it just wasn't a question. The stuff I wanted to run and the hardware I needed to hook up required OSX so that's the only reason I shelled out for an Apple computer. If I could have done it without OSX I'd have saved the dough and put it into something else.
I can't help but sound like an Apple marketing drone here, but hardware-wise, quite a few of the selling points of a MBP for me have been the form factor and how "it all fits together." Personally, there still isn't a comparable Windows-esque laptop from any other manufacturer that doesn't make me want to rip my hair out when I'm using it. I enjoy the weight and weight distribution of the laptop, the keyboard is okay, the trackpad should be the target of corporate espionage, etc. And it certainly looks nice.
Admittedly, a good selling point for OS X is how well it handles the hardware -- the battery life is consistent and lasts long enough for my needs. I definitely use OS X more often when I'm traveling, and will relegate my development to that environment just to make sure the battery lasts.
The software ecosystem is definitely a selling point for some. I am not involved in multimedia production but I certainly can appreciate paying a premium amount to get the tools you want/need. Although whether this aspect can be fundamentally attributed to the OS or just a product of other things (audience, marketing, corporate deals, etc) is another conversation.
Well, without OSX a MacBook Pro is just another Intel laptop. I've seen them with Linux installed, and running Windows via Boot Camp. But OSX is certainly part of the appeal.
Most of the people who own Macs got them because of OS X. It helps that the hardware is quite good, all in all, and for that you can also charge a premium. But if Apple stopped developing OS X and just sold MacBooks with Windows preinstalled, I think their sales would decline drastically; if they ported OS X to regular PCs and sold it separately, their computer business would probably take a large hit, too.
The number of people buying Macs who don't run OS X at all is non-zero, but essentially right next to zero. So yes, it's bought as a whole: people want the product experience, the integration of hardware and software. That's why they buy it. If they really didn't care about OS X, or really didn't care (a lot) about industrial design, they'd buy something else. And on industrial design, Dell's XPS 13 is neck and neck with the Air (a bit better in some spots, not as good in others).
> I would find it kind of funny if they really considered it a selling point.
Good point, but I'm trying to stay completely agnostic of what Apple thinks: it's not Apple's cost that determines the price of a macbook (aside from setting a lower bound), it's what people are willing to pay.
I assume that people are willing to pay significantly more for macbooks in this universe than in the parallel one where they have no OS, at least as much more as it costs Apple to develop it in the first place.
You are right, though, that I can hardly "claim" it :)
I'm always a bit shocked when I see people express this. I work in an office with roughly 100 Macs and 100 Windows machines. The Macs are FAR more stable and less buggy. I've had this experience three different times at various companies. Yet online I see people saying that OS X is just as buggy, but they never point out which part is.
Well, for example, with the most recent updates I had problems with wifi stability and Thunderbolt peripherals not working properly. They've been fixed now, but it's still kind of surprising that they crop up, given the limited range of hardware the OS has to target.
Are the macs running the latest version of OSX? From what I've read and experienced, OSX declined in quality when Apple stopped charging for OSX and started using it to drive hardware upgrades. These days OSX works fine if you have the latest, greatest, most expensive Mac hardware but if you try to upgrade an existing machine, that's when you get errors.
I used to love the Mac and OSX, but the whole freezing-up-while-spinning-a-colored-ball thing has got me too pissed off to use a Mac any more. I switched to Linux a year ago and only use Windows for playing games. As for the increasing market share, I think that's more about the perceived coolness factor of a Mac. But man, I love the MBP keyboard! I can't wait to try a Happy Hacker 2.
My mom (55 yo) always tries to pull me into her computer-buying arguments with my dad. She recently called me and stated, "Macs never break compared to Windows, so I should get a Mac." I use Linux almost exclusively these days, so I can't really comment, but anecdotally speaking, Apple seems to be known for its reliability (at a cost) to at least some people.
To add to your point, Valve and JetBrains both seem to be suffering from exactly this. Do you think it is Agile gone too far? Minimum viable product being too broadly scoped? Modern programming culture / lack of resources?
Yes, I can see that happening in many other companies too. I think this urge to put something new out quickly (innovation?), regardless of whether it's stable, is what they think will win them the market. And sure, it can.
This might be yet another enterprise trap, set up by the fact that the people deciding what software to buy are different from the people paying for the software, and different again from the people using it.
The problem is that Windows isn't a purely enterprise product. Although it gets little money from the end user market, it needs dominance in that market for strategic reasons. So MS will ensure it dominates the end user market, whatever it takes.
Yup, it is a way to add more 'perceived value' than there actually is, lowering the investment required to create something that _seems_ worthy enough for people to buy.
From a customer perspective, it looks like an evil spiral that more and more companies tend to fall into.
I really agree with this. But I have to say it's not just companies. Look at the fast pace of open source software, web frameworks, etc.
So much focus on "getting there quickly." A few keystrokes and a marketing page, and everyone's using "Foo.js" v0.0.1.
And then you're spending time as a developer dealing with deprecations because APIs changed.
But as painful as that is, I believe we need external feedback to deliver better software. Trying a thing, seeing how it's used, and improving it is a good way to go.
It is just going to result in bad times in the short term.
Bah, you think that's bad? Currently there's a nasty bug in iOS 9.2 where any webpage with an overflow:hidden style on the body tag causes the viewport to be way too large, and consequently the page zooms out far too much.
Apple can't fix this until the next OS update, even though it's a problem in Safari. So a pretty bad bug just has to sit there until then.
You can see this, incidentally, if you browse to LibreOffice's OpenGrok.
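For anyone who wants to reproduce it, a minimal page along these lines (a sketch from memory, not an exact test case) should show the broken viewport sizing on a 9.2 device:

    <!DOCTYPE html>
    <html>
    <head>
      <meta name="viewport" content="width=device-width, initial-scale=1">
      <style>
        /* the offending rule: on iOS 9.2 Safari it inflates the viewport */
        body { overflow: hidden; }
      </style>
    </head>
    <body>
      <p>On an affected device this page renders zoomed way out.</p>
    </body>
    </html>

The usual stopgap seems to be moving the overflow rule off body and onto a wrapper div until Apple ships the fix.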
I only started referring to MS as M$ after I had been working there for a year. I joined them with cautious optimism, because the new CEO had finally gotten rid of that atrocious stack ranking system.
That optimism did not last long. I only stayed there as long as I did because of how much they were paying me. Referring to them as "M$" is disturbingly appropriate given that all they do is throw huge amounts of money at problems until they go away.
Have you ever seen how much Microsoft will pay you? The sort of people drawn by that kind of money often end up as very good programmers, so I'm not that surprised that people in this camp end up working at Microsoft and then quit because they don't like working there.
I think it's largely a dogfooding problem. Visual Studio is a superb piece of software, and you can tell that its designers are its primary users. Microsoft's consumer apps are awful.
That's fair (re: 64-bit) and I'll amend: VS2015 should be able to handle a solution with (say) 1,000 projects without crashing. If that can be done in 32 bits, hurrah.
It's not that uncommon to find VS solution files consisting of 100+ projects, especially in finance. The most I've worked with was 132 projects in a single solution (roughly a third were test projects). It was slow and sluggish at times, but workable nevertheless...
Wow, I'm impressed! What do you work on where 1,000 projects in a solution is a thing? With just one developer per five projects you've got a team of 200 developers. Please give us a few details.
I'm working on a legacy desktop app (~25 years old), currently around 600 projects, with maybe one developer per 7 projects. The common workflow is to create custom solutions around your working set of projects, but that's not ideal for whole-codebase analysis/debugging.
Does that mean that there's a lot of duplicate projects? It seems like there's an opportunity for consolidation. I've seen a few solutions where a developer created a Project for every class library and then only ever referenced them from 1 other Project. I was able to roll up a 30+ project solution into 5.
We had ~150 projects across about 1M LOC and 4 developers because they were being used like folders - they weren't really separate codebases, just a means of organising the code. We cut that down for a drastic improvement in speed.
Most "serious" users are now uninstalling ReSharper. Visual Studio 2015 adds most of ReSharper's key perks and JetBrains has done little or nothing to resolve ReSharper's laggy/hoggy/unstable code (and has said they aren't going to re-write it for Roslyn).
Honestly back in VS 2010 I'd call ReSharper a core tool. In VS 2015, eww.
I've never liked ReSharper. I've felt that the "advice" it offers is often overrated and commonly wrong for C# (not unexpected, given that JetBrains primarily focuses on Java outside of ReSharper; ReSharper feels like a Java developer's approach to C#). These days, what's baked into VS and the Roslyn compiler platform blows ReSharper out of the water.
Just about everyone I've worked with that had daily complaints about how slow and laggy they felt Visual Studio was didn't seem to understand that ReSharper was 99% of the slow-down, excessive memory use, and UX lag...
I never liked it either, but I've worked in places where it was like the eleventh commandment and you'd better not say anything against it! :) I'm glad to hear it's losing popularity.
Not moving to 64-bit has been a pragmatic decision [1]. As for crashing, I used it for over 10 years at my previous company and it never crashed once. At my current company it only crashes when I am using the StyleVision plug-in, which is really, really buggy.
That pragmatic decision was made 7 years ago. A lot has changed since then and I don't think the arguments hold up. What developer machine doesn't have at least 16 GB of RAM these days?
Thanks for posting that (although I didn't see any benchmarks).
My big problem with the first article was that in my experience, the extra registers available in 64-bit make all the difference in the world. In the comments for the article you posted, he says that isn't true with VS. Surprising, but I'll take his word for it.
Still, it bothers me that he keeps saying 4 GB of code and data should be enough. VS becomes pretty unusable for me around 1.5-2 GB. If it worked reliably up to the 4 GB limit, I'd be happier.
The limit is usually address space fragmentation: at some point the allocator can't find a contiguous piece of address space large enough for an allocation. Depending on the allocation patterns this may happen around 3 GiB, but for some applications even around 1.7 GiB. VS gets unbearably slow around 2 GiB, though, most likely because the native allocator has to piece together all the free fragments again, while on the managed side the GC is probably having fun trying to compact memory.
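You can demonstrate the effect with a contrived C# sketch (assumptions: a 32-bit build, and the default large object heap behavior where freed blocks leave holes rather than being compacted):

    using System;
    using System.Collections.Generic;

    class FragmentationDemo
    {
        static void Main()
        {
            // Fill the 32-bit address space with 64 MiB blocks until it's exhausted.
            var chunks = new List<byte[]>();
            try { while (true) chunks.Add(new byte[64 * 1024 * 1024]); }
            catch (OutOfMemoryException) { }

            // Free every other block, punching 64 MiB holes all over the address space.
            for (int i = 0; i < chunks.Count; i += 2) chunks[i] = null;
            GC.Collect();

            // Plenty of memory is free in total now, but nothing can hand out
            // 128 MiB of *contiguous* addresses, so this still fails.
            try { Console.WriteLine(new byte[128 * 1024 * 1024].Length); }
            catch (OutOfMemoryException) { Console.WriteLine("no contiguous 128 MiB left"); }
        }
    }

Roughly half the address space is technically free at the end, yet the single large allocation fails. That's fragmentation in a nutshell, and it's why a process can die of "out of memory" well below the 4 GB ceiling.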
Did you even read the article? The arguments still hold up. It's clearly the case that it's not worth the effort to move VS to 64-bit.
- "First, from a performance perspective the pointers get larger, so data structures get larger, and the processor cache stays the same size. That basically results in a raw speed hit..."
- "The cost of a full port [due to the amount of code involved, not the quality of the code] of that much native code is going to be quite high and of course all known extensions would break and we’d basically have to create a 64 bit ecosystem pretty much like you do for drivers. Ouch."
- "A 64 bit address space for the process isn’t going to help you with page faults except in maybe indirect ways, and it will definitely hurt you in direct ways because your data is bigger...32 bit processes accrue all these benefits just as surely as 64 bit ones."
OK, then let's compare VS to Xcode, which doesn't have even half the features of VS and crashes twice as often. If not that, which comparable product would you pick?
Running tons and tons of VS instances while debugging broken, memory-hogging programs, poorly hand-modifying the XML of weird project types for third-party plugins, etc. Those are things MS really can't do anything about.
I would have agreed that it's a good product right up to Visual Studio 2013. But Visual Studio 2015 is slow, buggy and unstable. They really have screwed something up, it has been months, and the update pack they released didn't make any tangible improvements. Meanwhile the "extensions and updates" area just continues to push Microsoft products I don't need, and every now and then I am not allowed into the IDE because my license has "gone stale". Everyone in my team has daily crashes, bugs and slowdowns and it is hurting us. How can Microsoft not have found any of these issues during all their testing?
I've been using various versions of Visual Studio since 1999. I can count on one hand the number of times it has crashed, usually coincident with me doing something unwise.
Most things in VS run in-process (that was the case circa mid-2000s, at least). It only takes one bad component to bring the whole thing down, and there are many, many, many components.
IME, Visual Studio crashes are almost always due to misbehaving plugins. I'm in it all day every day, and once or twice a week it will develop some quirk that requires a restart to fix, but I can't remember the last time it actually crashed. The only plugins I run are Resharper, Xamarin, and PTVS.
Funny, though, that Visual Studio insists on defaulting to uppercase menus. That must be a decision forced on them by a PHB somewhere, which just tells you that nobody is immune to organizational rot.
Oh, I positively lurve the Microsoft design team's justification for all caps:
When we shared the RC design preview with you, we expected the uppercase menu would generate mixed feedback and emotions. We had seen similar reactions from early adopters and from our own internal users prior to posting about it. [1]
So then they said they had been "thinking about it" (why a lot of thinking was necessary is beyond me), and that "using uppercase for the menus was not an arbitrary decision" because they needed "to keep Visual Studio consistent with the direction of other Microsoft user experiences".
Then they tried to say that "some of you" won't like the change and that these people have "been very direct in expressing your opinions on this subject".
In other words: they were told it was an absolutely awful decision from the very start by beta testers and their own test team, but ignored it because, well, "consistency". Then they released it and got an almost total user revolt, but they couldn't back out because, well, "consistency". They do now let you change the letter case with a registry tweak, after all: Microsoft know better than their end users, even when those end users are screaming at them to fix a fairly fundamental UX error.
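For reference, the opt-out (as I remember it for VS 2013; the hive is 11.0 for VS 2012) is a single registry value:

    reg add HKCU\Software\Microsoft\VisualStudio\12.0\General /v SuppressUppercaseConversion /t REG_DWORD /d 1

Restart the IDE and the menus go back to mixed case.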
There is a very interesting book, "I Sing the Body Electronic" by Fred Moody, who had the opportunity to observe a team at Microsoft very intimately over the course of a project. Back then, apparently, part of the problem was that good managers were very rare, and as the company grew and grew, they put people into management positions they were not really qualified for (or comfortable with).
Deadlines appear to have been a problem back then as well.
I have wondered how much of that has changed since then. From your description, not much... :(
I also previously worked for Microsoft (Azure Active Directory). We took code quality very seriously. All new features required at least 80% code coverage from tests before commit (not a particularly high bar, but a pragmatic one). New features also required a test plan + review covering stress/perf, unit, integration, & system tests. Many features required threat models which were reviewed with the security team.
Progress/bugs were reviewed weekly at ship room. Live site issues were reviewed the next morning (involving the relevant on-call engineers).
So as with any large company which isn't a monoculture, things vary from team to team.
So, Microsoft has forgotten the concept of technical debt? That's hilarious. Also sad. Wasn't it Joel Spolsky (a former Microsoft employee) who actually coined the term "technical debt"?
Ward Cunningham coined the term "technical debt", according to Wikipedia. He worked for Microsoft for a short period, 2003-2005, but that was long after the term was invented.