Stack Overflow Developer Survey Results 2017 (stackoverflow.com)
342 points by kenrick95 on Mar 22, 2017 | hide | past | favorite | 163 comments



Shows you the effects of the echo chamber you are in: I was very surprised that Scala was only at 3.6%, beaten by Go at 4.3% and Swift at 6.5% in the programming language section.

The self-selection of people I follow on Twitter, the meetups and conferences I attend, the choice of companies and colleagues I have worked with, etc., probably just reinforces my blinkered echo chamber. This is probably true for many Haskell, Ruby, and Go people as well.

Off the top of my head, I would have guessed 20% Scala usage, but deep down I do know of the floods of JavaScript misuse across the world and the legions of Java developers in every country and location, having been one myself once. And I guess my current location of London, which I feel uses Scala more than most places, and mixing a lot with startups, does affect my expectations of what everyone else uses elsewhere.

[http://stackoverflow.com/insights/survey/2017/#technology-pr...]


Visual Studio is the most used IDE and somehow I don't know a single person who uses Visual Studio.

http://stackoverflow.com/insights/survey/2017/#technology-mo...


Not only that, but it seems to be the most used IDE in the web development category. I had to make sure they didn't mean Visual Studio Code, which appeared a few positions further down though.


We all tend to know our circles, but ignore most of what is outside of them. When I was a C# developer, I could tell you of many companies using .NET in my city, but couldn't name a single company using Java, even though there were a lot of them (probably more than companies using C#). If I had to guess the usage of newer languages like Rust, Go, etc., I'd probably underestimate it tremendously, just because I'm not part of that ecosystem (not looking at job postings, not participating in conferences, not knowing the popular open-source frameworks).


To be fair, if you're in the USA, you're missing out on two of the largest audiences in the world: India and China. I imagine that's similarly why iPhone developers are more prominent in the USA/UK than elsewhere, where Android developers dominate.


HN is the only place I've ever heard/seen anyone mention Scala, ever. I don't know anything about it, not even what the syntax looks like or if it's interpreted.


I think there is severe bias in their results, simply because of the content on Stack Overflow and the people who visit it.

For example, the data showing that a lot of developers only have 1-3 years of programming experience. It would naturally follow that a majority of active users on a Q&A site would not have much experience, right?

The same can be said for the data about which technologies are represented: JavaScript was at the top of "Most Popular Technologies". There is a lot of churn in JS frameworks... which probably leads to a lot of questions from inexperienced programmers... who would visit the site more frequently.

There definitely are some good results in there - I would just take anything regarding popularity with a grain of salt. I also suspect the population of people taking the survey doesn't fully represent all software developers equally. People who are in a stable role using "unpopular" or proprietary technology have no use for Stack Overflow.


I agree. Anecdotal, but I used to visit SO all the time during my first years as a programmer, compared to now, when I only need it occasionally. It was also a long time ago that I posted an actual question.


> It was also a long time ago I posted an actual question.

This doesn't necessarily mean it's about your experience, that is, as more and more questions are asked and answered, it is more likely that an answer already exists, so you don't need to ask a new question.


I think the survey makes a very fundamental mistake in lumping a lot (the majority) of developers under the umbrella 'web developer'. I know they also had subdivisions (frontend, backend, full stack), but in the entire survey they are represented as one category.

Those are fundamentally different roles. Someone working exclusively on server-side applications has a totally different profile than someone working exclusively with web frontend. Concepts, technologies, tools, everything differs, and so probably does the personal profile of those developers.


Most Loved Language: Rust

Most Wanted Language: Python

It's good to see these two languages win out in these categories! Good things lie ahead (hopefully).


Wikipedia updated with the 2017 results: "Rust won first place for 'most loved programming language' in the Stack Overflow Developer Survey in 2016 and 2017."

Love the site.


These two lists made a lot more sense to me than the "most popular technologies" category, which I think should be worded "most common languages for questions" or something along those lines.

But like others have said, I need to double check that I'm not living in a bubble while the rest of the technical world openly and not-ironically loves javascript.


I'm surprised to see "platforms" as Win: 41%, Desktop Linux: 33%, Mac: 18.4%. Presumably this question was about the machine you use to develop software on.

The reason I'm surprised has nothing to do with any advocacy. It's just that whenever I look around at a non-Apple dev conference in the US (Web, for example), it looks as though fewer than 1/5 of the machines are NON-Macs, while the answer to the question above is the inverse: fewer than 1/5 of devs use Macs. (Though multiple answers were allowed this still says that fewer than 1/5 use Macs.)

So, I'm wondering: Are StackOverflow and US dev conf attendees significantly different groups? (Ex: SO hobbyists very different from conference pros?) Or do devs usually install Desktop Linux on Mac hardware, and I don't realize that half of the Mac laptops I see are running Linux? Or do people use PC desktop hardware for development but use Mac laptops for portable use (ex: attending a conference, but they would still qualify as Mac users)? Or are all of the non-US conferences solid walls of PC hardware with so few Macs that they overwhelm the Mac usage in the US? Or have things changed significantly in the past few years? (I haven't been to a dev conference for a while.)

Or what? Presumably all of the above to some extent, but am I missing something big?


Developers who attend conferences in the USA are a very small and biased subset of total developers worldwide. For the vast majority of developers, attending a conference in the USA would cost a few to several months' worth of salary, so it's a no-brainer. Not to mention the bullish US border controls that make people (especially non-white people) think twice before traveling. I haven't seen many developers using Macs outside of the USA and a few Western European countries.


> Developers who attend conferences in the USA are a very small and biased subset of total developers worldwide.

It's even a very small and biased subset of the total developers in the United States. The majority (I would say the vast majority) do not attend conferences. Sometimes because their workplace won't fund it, but often simply because they find no value in it. I personally gave up going to conferences years ago, partly because workplaces treated it almost like they were giving me a vacation that I should feel grateful for, but it certainly didn't feel like a vacation. And afterwards I would try to itemize the value I received... and come up with nothing (talking to the vendors' developers face to face was, if anything, less productive than simply dropping a note on a discussion board).

So that subset often comes from more moneyed employers, or are the "see and be seen" sort, or they really get value out of it. For the former two sets a Mac is a pretty likely device.


FWIW, in this survey I would've answered "Linux Desktop", but at a conference you'd see me on a MacBook. Not everyone uses the thing they bring to a conference as their workstation. :)


The big / national tech conferences - yes. They are expensive. There are many local or regional ones that are much less expensive.

I'm heading to a regional one in August that is registered as a nonprofit and costs less than a week's salary to attend (and I work in the public sector in the Midwest, so don't think this is a big left-coast salary).

That said, even with a strong Microsoft bent to many of the session technologies, Macs are still several times more frequent than non-Macs.


Plane tickets, travel, hotel, and catering alone can easily add up to a few thousand dollars.


For the regional one I go to (That Conference), it's about an hour each way. $10 in gas per day or so? Since it's close enough for a long commute, I don't have to worry about a hotel (or plane tickets).

Since the conference isn't trying to pull in big bucks (it's break-even - a 501(c) nonprofit, even), attendee costs for 2017 are $425 (food included - and they don't skimp on food). Even if you're going to add on the hotel the conference is located in, it's only $175. Toss in a plane ticket and it is probably still less expensive than the national ones for the "get in the door" ticket.

There are lots of regional or local conferences that are not absurd in pricing.

For another example of an upcoming local one: https://stirtrek.com which is a one-day, $99 event. Its draw is likewise local.

Yeah, NoFluff is $1000 to walk in the door for two days. I'm very glad that there are others that don't require draining the training budget to attend.


Outside the US, very few people use Macs compared to you. Here in Western Europe, even in a CS course, an engineering class at a top university, or a hackathon at a big startup incubator, I see maybe 1 MacBook for every 30 standard notebooks.

The reason is simple: they are very overpriced for our salaries (and I am in a first-world country; I think their costs are prohibitive in places like India, China, etc.), and, as a personal reason, as a web dev I can basically do everything I need on a $250 old T430 with Fedora, so I can't justify spending roughly 10 times more just for a shinier piece of hardware which is not even easy to upgrade.


As you say, that depends on income, which is based on location and whom you mix with. And perhaps some cultural bias/patterns, but that is just my guessing.

Macs have dominated in the areas I have worked in for the past 6 years. I would say in the companies, hands-on meetups, and conferences I have been to in London and across Scandinavia: 75% use MacBooks, then perhaps 15% Dell XPS with Ubuntu, and 10% a mix of Lenovo, MS Surface, etc.

Though I think this split will change with the recoil from the new Touch Bar Macs.

(Again this is just from my smaller demographic of location and people.)


London is special because there are a lot of Apple stores here (I count 7). That's more than many European countries have.

The calculus of owning a Mac changes drastically if you live near an Apple store. If my Mac has a problem, I can book an appointment at the Apple store, a real person will try to fix it, and I can have it back ASAP. The customer service alone is worth half the price.


> London is special because there are a lot of Apple stores here (I count 7). That's more than many European countries have.

Probably because London has more people than most European countries.


Whaa ? https://en.wikipedia.org/wiki/List_of_European_countries_by_...

Yep, it's almost true. London: 8.5M. If you count all 50 "countries" (some are a joke, though), it ranks between 21st and 22nd.

Though in my city (Paris, 11M I think), which is similarly sized, there are only 2 Apple stores.


If you go by metropolitan area, London (13M) and Paris (12M) would be 13th and 14th on the country population list.

Ref: [https://en.wikipedia.org/wiki/List_of_metropolitan_areas_by_...]


The point isn't that though. Very similar cities have a very different number of Apple stores.

Paris and London are 500km apart, a couple of hours away from each other by train or plane, both capital cities, and both have ~10^7 people.


> "countries" (some are a joke though)

Which? Vatican City is fairly extraordinary, granted, but I think 'a joke' is unfair...


The situation is entirely the other way round here in Singapore (for the younger generation).

I used to run Linux on a laptop, but eventually I gave up and switched to a MacBook so that I don't have to deal with drivers and low-level config issues on a daily basis.

In my university, 60%-70% of CS students use a Macbook. They are not working or attending conferences yet, so the change is going to take some time.


Singapore is 3rd in the world for GDP per capita; I bet buying a MacBook is not seen as "oh, I have to invest almost two months of my pay in it" there.

Just for comparison, with what I would spend on a MacBook here, I could pay for 3 years of a bachelor's degree plus half a year of a master's degree (obviously I am referring just to university tuition costs, not rent or other general expenses) at a top-3, internationally recognized university in my country.


"deal with drivers and low-level config issues on a daily basis"

What in God's name are you talking about? Are you using a distro from 1995?


I ran Ubuntu 12.04 on a Lenovo laptop for a year or so.

The graphics and wifi drivers broke down every now and then, and sometimes they broke the OS completely, so I had to reinstall it.

I needed to fix multiple config issues and compatibility issues when I wanted to install Linux alternatives to Windows applications.

Plus, I need to do video editing and Android apps, so I still have to constantly switch back to Windows with dual-boot.

Mac just seems to be a perfect balance, giving you all the standard terminal stuff plus everything just runs out of the box.


I ran a debian-based distro on a Lenovo thinkpad in graduate school from 2008 to about 2011. I had basically the same experience. Everything worked 95% of the time, but that other 5% felt entirely random and pretty debilitating.

Now I use a mix of Win, OS X, and Ubuntu depending on what I am trying to accomplish. I make heavy use of Linux servers, but have no inclination of using one as a primary end-user desktop environment again.


You should try it. The changes that have happened in the last 5 years are extremely big. I mean, all my notebooks (some Dells and a Lenovo T430) work better with any Linux distro compared to Win10, and I even get 30-40% more hours of battery on them.


Which distros have you used on those notebooks? I'll likely be switching back to a Linux laptop from a Mac, and I'm trying to figure out which distro I want to use.


Same experience here.

When people complain about Linux my first question is if they had given it a serious try in the last 2 years.


I love linux, and use it as my daily machine for work and home. But whenever I mention to others in the community that we really need to clean up this crap, and make our systems reliable instead of flashy, they treat me like I'm insane. It's amazing to me that we still have sound, wifi, and display issues that need to be solved with shitty hacks in 2017.


Dude, every time I bring up my difficulty installing Ubuntu on a relatively new MacBook Pro (Retina 13-inch, early 2015), people respond like this. And yet [1] I was never able to solve this issue with video drivers (or something; I never figured it out) causing incessant crashing. I mean, look at the Ask Ubuntu post, I tried so hard to get it working. This is consistent with every Ubuntu install I've ever done - there's always some shit that just makes me wonder whether it's worth it.

[1] http://askubuntu.com/questions/838855/crashing-freezing-hang...


Apple pretty much dominates the startup scene in the NL as far as I can tell.


I would say the startup scene represents a very small portion of professional programmers.


10% according to the survey


of 64,000 developers.


> whenever I look around at a non-Apple dev conference

I think you're seeing selection bias here. The devs that attend conferences are a very, very small sliver of the total population, and the price of attending a conference (not only the ticket, but travel and PTO necessary to attend) also overlaps heavily with the more "well-off" devs, which probably have a higher percentage of Mac usage.

FWIW: I've had a personal Macbook (for two years in college) and a work Macbook (when necessary), and in both cases I only ran linux on them.


> It's just that whenever I look around at a non-Apple dev conference

Most devs don't attend dev conferences.


StackOverflow is more tilted towards "dark matter developers"[1] than many other places - the unseen, the unheralded, those that are toiling away at LOB apps inside of companies. Often they're using PHP or Java or C#, or god help them, Visual Basic, still. Those are not necessarily the kind of people that are going to be seen at dev conferences, but there is a huge number of them, and they use StackOverflow hard.

[1] https://www.hanselman.com/blog/DarkMatterDevelopersTheUnseen...


Heh. Two of my clients are still using VB Apps I wrote for them in VB6. I'm not ashamed. They work, and they've worked well for 12+ years, with only the most minor of updates. Not every business follows the 'innovate fast and break things' mold. Thankfully.


Exactly. We should be proud of creating software that works and works great, not of the languages we develop in. The one main lesson, if there is one, that I've learnt from 13 years of development is: keep the dev stack boring!


Whenever a kid comes around with the cool idea "let's rewrite this in X", they need to explain to me the business value of the proposal, as in, how it will change the monetary outcome of the project.


Isn't the implicit argument for any refactor/rewrite essentially increased velocity?

"This change will allow us to do X in Y days when normally it would take some multiple of Y."

It sounds like what you want is a more salient list of pain points in the current development environment and exactly how the proposed new thing would address them. Additionally, you'd want to hear the new set of pain points that the new thing will impose (cue the quote about some devs knowing the value of everything and the cost of nothing).

You may also, for their benefit, direct the next kid with the cool idea towards a link that's commonly thrown around here: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence

I'm going to pull the more direct section "What this means on Wikipedia":

> If you're considering nominating something for deletion, or changing a policy, because it doesn't appear to have any use or purpose, research its history first.

> You may find out why it was created, and perhaps understand that it still serves a purpose.

> Or if you do feel the issue it addressed is no longer valid, frame your argument for deletion in a way that acknowledges that.


It's the implicit rationalization. New software engineers in particular suffer from "Ohh Shiny" when choosing their toolset more than just about any other group I've worked with. To the detriment of the profession, IMO.

(Nice link. I was unaware of the name for that heuristic.)


> StackOverflow is more tilted towards "dark matter developers"[1] than many other places - the unseen, the unheralded, those that are toiling away at LOB apps inside of companies. Often they're using PHP or Java or C#, or god help them, Visual Basic, still. Those are not necessarily the kind of people that are going to be seen at dev conferences, but there is a huge number of them

I think that is a really important thing to bear in mind when reading Hacker News, or pretty much any online venue. People who contribute or even post comments are not the majority. SO rates me in the top 20% of Ruby developers, because I answered a few questions on the site.


Have you considered maybe you are living inside a bubble?

After all, you consider C# an uncool language.


> After all, you consider C# an uncool language.

That couldn't be further from the truth... I love slinging C#, it's far and away my favorite. It doesn't have that HN hot-shit cachet that Go or Rust or Scala or Lisp-likes or ML-variants have, however.


I'm in the camp of "I do all my serious work on a PC, but I take a MacBook to conferences because it's light and has good battery life".


Yes, you are missing something big: the majority (by far) of development work is done on Windows or some Linux platform these days and has been for a long while. In fact, outside the Bay Area I could count on the fingers of one hand the number of developers I ever saw using a Mac. The valley and certain developer "scenes" have a huge overrepresentation of Mac users.


There was a while a few years back when an MBP running Windows 7 was one of the better* laptop choices out there.

The last two teams I've worked with have all had Macs, even though at least half the team were writing .NET in Windows.

Then there's my main personal laptop, which couldn't get the latest macOS, so I just wiped it and installed Linux. I wouldn't do that to my daily work machine though, too many little things just don't work quite correctly, but all in all it installed pretty easily and runs less sluggishly than OS X did.

* Better as in good hardware, consumer appeal, and a relatively performant and stable machine for getting work done. Not necessarily best value or gaming, etc.


Macs are good travel laptops.

My windows and linux installs never reliably go to sleep on their respective laptops when I close them -- and I cannot be bothered to figure out whether it's a hardware problem, or a software problem... I'll just bring the mac which does sleep reliably.

It also has pretty good battery life compared to my other machines.


I always get the impression that SO is populated mainly by Enterprise devs using Windows.

Edit for spelling.


The founders were very much Windows guys. Their podcast, which essentially developed SO in public, was very Windows-focused. Although it's not the case for me personally, I suspect their audience skewed to Windows; these were the early users of the site, and they helped drive its direction.

Although SO now has a much wider audience, the more engaged, longer-term members are more likely to fill out this survey.


Plenty of people might use a Linux Desktop as their primary development machine, but have a Mac laptop when they travel.

Also, how many of those people with Macs had a Linux distro installed on them?


Maybe the conferences you are going to are tech startup conferences. The majority of developers are working in big banks and big consulting companies. They use PCs, Java or .NET, and Visual Studio.


Nice to see Linux Desktop so high. Are we entering the era of the Linux desktop?


This is surely the year of Linux on desktop :)

More seriously, once you get the hang of it, an Ubuntu machine is just ridiculously more productive than Win and OS X.

I'm surprised that not more devs are using it.


Perhaps if you're a dev.


Tl;dr India.


It is weird being a "desktop application" and mobile developer and knowing that 70% of developers do something that I am completely unfamiliar with.

Web development to me feels like the Boneyard from the Lion King. This shadow land that I am too terrified to enter into, and I have no idea why I feel this way. It also bugs me like it might be a huge shortcoming at some point in my life that I have no grasp of it.


As someone who has to work in both for work, I understand your sentiment. I much prefer your world.

I think the difference is that in Desktop/Mobile, whatever crap has stuck to the wall for years - you must use because you don't have a choice. And usually this crap is fairly well documented because it is your only choice.

In the Web World, there is a bunch of crap thrown at the wall (every day) and you have no idea what will stick and what will slide off each year.

Maybe a poor metaphor, but just kind of how I feel on some days.


This is the best metaphor I've seen in a while.

For example, NSDateFormatter is fugly but well documented [0]. Also, the MVC pattern in iOS is from macOS, which is from NeXT (I hear).

OTOH I'm building my first React + Redux app and the best practices and patterns are still emerging. I feel like next year most of what I just learned won't matter.

[0]: http://nsdateformatter.com


I made the switch a few years ago from working primarily on Desktop apps to primarily Web apps, and beforehand I had this exact feeling.

In my experience, web apps are way, way easier than Desktop apps in almost every way, aside from maybe if you have to figure out scaling really big (which you probably don't most of the time). You can probably learn everything you really need to know about web apps to clear that spooky feeling just by working at a new job for a month or two. It's not a huge shortcoming at all in reality, because at the end of the day, software is software. You can write software, so you'll be fine and may already be better than people calling themselves "web developers."

Also, I don't know if this applies to you, but in my case part of the reason I had this spooky feeling was because the people who I knew who called themselves "web developers" before I actually started doing "web development" were not very skilled, and were inclined to portray it as Magic to other people, so that they could feel the satisfaction of being a sort of Arcane Web Magician. Plus, it was magic to them, because they didn't really understand it. If you have people like this around you, I suggest you ignore them and, if you need support, find people who are able to explain details clearly to you without presenting it as magic, or at least point you towards resources that can. Once I was able to do that, it made a big difference in my confidence in my own abilities and helped me improve as a professional tremendously.


With me it's the other way around.

I can do almost anything I need with Web tech.

Desktop or even native mobile development seems to be like going back to the last century.

I'm doing a native mobile project next month, and looking at the stuff an iOS dev needs to get going is horrible.


> Desktop or even native mobile development seems to be like going back to the last century.

The funny thing is that, looking from the opposite perspective, I feel the same. Web development looks like going back to the beginnings of programming and reinventing everything again in the browser. You know, like when you read about 'tree shaking' being cutting-edge technology in JavaScript land, then look at the definition and realize it's just dead code elimination, which has been done by linkers & compilers for ages.
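
(To make the analogy concrete, here is a rough sketch of what a bundler's tree shaking does. The module and function names are made up purely for illustration.)

    // math.ts: a hypothetical module with two exports
    export function usedHelper(x: number): number {
      return x * 2;
    }

    export function neverCalled(x: number): number {
      return x / 2; // dead code as far as the app below is concerned
    }

    // app.ts: only one of the exports is actually imported
    import { usedHelper } from "./math";
    console.log(usedHelper(21));

    // A bundler with tree shaking drops `neverCalled` from the output
    // bundle, which is the same dead code elimination a linker has been
    // doing for decades with unreferenced functions.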


You're right, these optimisations are missing, and all the fancy autocomplete of the IDEs for typed languages that are used on the desktop is missing, too.

I don't know, maybe it just seems better to me, because I'm doing Web for so long now.

When I see desktop/mobile with Qt, Java or C#, I see huge IDEs, runtime environments and SDKs and with iOS even vendor lock-in, simulators and whatnot.

When I see Web with JavaScript, I see tiny editors, browsers that are already installed on every machine and bring all the SDKs and runtime environments, fast iterations, a good package system and general simplicity.


As a web dev these days I feel like we have it the easiest:

If you want to do mobile, try making it a PWA. If it needs to be in the app store (which these days I often see portrayed as more of a necessary evil than something desirable) go hybrid. If hybrid is too slow, try React Native or NativeScript. All of these can build on your existing knowledge of web technologies.

If you want a desktop app, you probably don't actually want a desktop app anyway, so just build a decent web app. If you need something that runs outside the browser, use Electron. Or React Windows if you only target the UWP.

If you want to build something on the server-side, there's of course Node.js. If you want something more tightly integrated with the database try ArangoDB. Again, same language you already know.

There are still niches where you need the raw performance of specialised programs written in other languages than JS, of course, but they are growing smaller. It's basic economics: if you can hire two full-stack JS devs instead of one iOS, Android, Windows and back-end dev each, you can cover the same technologies with half the salaries. This is especially true if the specialists end up doing mostly busy work because you don't actually need their specialised expertise all the time.

Few companies can afford to have all these dedicated specialists on the team but they may still want to cover multiple markets (e.g. an app for desktop and Android). In this case going for the web is the obvious choice for the smaller budget. The alternative would be hiring multiple contractors or outsourcing the project to a company that has the resources to provide all of the needed specialists all the time.


Sure.

There are many ways to leverage your Web knowledge, you don't have to start from zero.

But it's still a chore to do mobile development.

I want to build an iOS app, because 80% of the potential customers are iPhone users, but in the future I want to reuse as much of the code as possible on Android. So it's Hybrid or React Native.

So I thought, well, Expo sounds nice, and gives me a Web-like dev experience without the need to set up a macOS system.

Then I checked if they support AirPlay and HealthKit and the answer was "detach".

So now I'm back to "regular" React Native and have to set up a Mac to get started.

Well, at least it seems running a dev VM for macOS isn't that big of a deal anymore :\


Oh, absolutely. For some reason Apple seems to have filled Microsoft's role as the big bad legacy guy with vendor lock-in.

Is it even possible to run macOS in a VM on non-Apple hardware with a non-macOS host yet? Microsoft provides free VM images for every OS and browser version you might want to test against, Google provides the Android devkit that works cross-platform. But if you want to target macOS or iOS it seems you're still stuck buying Apple hardware for $$$ to do anything. And don't even get me started on the lackluster support for PWAs -- at least the fallback for those is a web app that still works in the browser (provided you have Apple hardware to test it on).

EDIT: Also Microsoft has moved React Windows into its official GitHub org and advertises it as an official way to provide extensions to Office.


It is a problem, yes, but I read a few blog posts yesterday that explained how to get macOS Sierra running on VirtualBox on Windows. Well, it's only 16€, so I will try it next week.

It seems like MS thinks their only way to win something in the "mobile war" is pushing PWAs.

And maybe they're right. Most people use the iTunes or Play Stores. They need to get devs off those platforms and show them that their tools are the way to go if you don't want to pay taxes to Apple or Google.


Some observations.

Wouldn't have thought of built-in help as a popular (47.1%) way of teaching yourself.

Oracle usage only 16.5%, but I guess it makes sense considering the web developer proportion.

CoffeeScript as the third most dreaded language, behind only two instances of Visual Basic. But after reading the definition of "dreaded", it makes more sense.

Sharepoint as the most dreaded platform, ha ha, no surprise there.

Clojure as the top paying tech worldwide, wow. But missing entirely from the list in US, UK, Germany, France sections, so where are all the Clojure devs? In general that's... comforting, if only there were any Clojure shops in my country (okay, admittedly I've heard about one startup using it).


> Wouldn't have thought of built-in help as a popular (47.1%) way of teaching yourself.

The built-in help for e.g. Java, C# or Delphi is amazing. It starts with autocomplete, then a tooltip with a summary, and full details including examples are just an F1 away.
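
(Not the exact ecosystems mentioned above, but the mechanism is easy to show with a doc comment. A minimal TypeScript/JSDoc sketch with a made-up function; the summary, @param and @example text below is what the editor surfaces in autocomplete, in the hover tooltip, and behind F1-style help.)

    /**
     * Parses an ISO 8601 date string such as "2017-03-22".
     *
     * @param text - the date string to parse
     * @returns the parsed Date, or null if the string is malformed
     * @example
     *   parseIsoDate("2017-03-22");
     */
    function parseIsoDate(text: string): Date | null {
      const parsed = new Date(text);
      return isNaN(parsed.getTime()) ? null : parsed;
    }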


Re: Clojure - something seems off with Clojure not being in any of the sub-regions at all.

I also noticed that Java is not in the world-wide list and yet is in all the sub-regions.

Given that the JVM is such a big part of the Clojure ecosystem, I want to dig into the data once they make it available. I'm totally willing to believe that developers with Clojure skills are in a high-salary bucket, but I'm a little suspicious of what they are showing here.

BTW, for some years I chased specific languages as the cause of the high earnings. I believe this is much less true than I originally assumed . . .


> Clojure as the top paying tech worldwide, wow. But missing entirely from the list in US, UK, Germany, France sections, so where are all the Clojure devs?

Probably because it's reported mostly by US developers, but the pay is not high enough to crack into the US top-paying list. The lowest position on the US list is $75,000.

Also, the world distribution might have affected Java's position. While it is pretty high on the US/UK/France and Germany lists, the worldwide salaries were depressed due to outsourcing.


> Oracle usage only 16.5%, but I guess it makes sense considering the web developer proportion.

At the end of the day, Oracle is expensive. Web or not.


Maybe that goes part of the way towards explaining the higher salaries, if all the jobs are in the US and jobs there usually pay more.


I can't find the actual questions, so if anyone can reference them it may help to understand the answers more.

The "platform" part is at best ambiguous and at worst white noise.

It covers mobile & desktop OSes, "cloud" providers - both IaaS like AWS/Azure & SaaS like Salesforce - hardware platforms like Raspberry Pi and Arduino, buzzwords like "Serverless", and mainframe.

Apparently all of those things represent a "platform" that is in some way comparable, but Linux, BSD and Windows servers do not.

The "most popular languages" shows Javascript as the most popular with Sysadmin/DevOps.

I'm going to just assume the people answering subscribe to the idea that "DevOps === Developers Doing Ops" (aka NoOps). I'm pretty confident actual Sysadmins don't use/prefer Javascript more than anything else.


> I'm pretty confident actual Sysadmins don't use/prefer Javascript more than anything else.

I think that question was multiple choice. My guess is that most sysadmins use their favorite language and a bit of JavaScript when they have to, and since they all have different favorite languages, JavaScript ends up 'winning'.


> most sysadmins use their favourite language and a bit of JavaScript

I think more likely is that every nodejs developer considers themselves "devops".


I found it fascinating that Smalltalk was #2 as most loved language in spite of its minimal use. Many of the ideas developed for Smalltalk are now commonplace, so I'm guessing that it's the careful integration of those ideas that makes it special.

It is my favorite language, btw. :)


> I found it fascinating that Smalltalk was #2 as most loved language in spite of its minimal use.

It probably is because of the minimal use (and I believe #1, Rust, is also there because it's still not that common). Their "loved" metric is number of people who are using a language and would not like to switch to a different one. I suppose that the only people who use it are people who love it so much that they can ignore issues caused by lack of popularity (small number of libraries and tools, problems with OS support, etc.).


Probably because it (still) pays so well? [1]

[1]: http://stackoverflow.com/insights/survey/2017/#work-salary-a...


I saw that too and wondered if it wasn't an artifact of history - when I think of Smalltalk I think of Peter Norvig and people of his stature. Maybe the high Smalltalk developer salary means it's probably used by older and very experienced developers.


  We will publish additional analysis related to respondents’ disability status in the coming weeks.
As a student with a severely limiting physical disability, I'm looking forward to this. I'm really interested to see what roles those in my position are in.


Would love to read about your experiences, if you are willing to share :)


Is gender and ethnic background of developers relevant?


Some people consider it to be. It's something people are at least curious about. There is a claim that "not enough X" go into software, where X is women or minorities or both, for various hypothesized reasons that seem very difficult to test for.

One that's particularly interesting to me and at least has some semi-plausible story is that the way personal computers were marketed in the US ended up influencing the perception of the computer as something male, even though prior to 1980 women made up a larger share of working programmers than they do today.

What I would find interesting from this survey is whether the gender breakdown between specializations changes when nationality is an additional factor. Because in my own limited experience, there are plenty of women who are competent developers, but I've only known one born in the US (n=8, but I work at a small company and this is at least a double-digit percentage of the engineering staff)


Of course it is. Homogeneity is dangerous for an ecosystem and even if it weren't, not all apps, websites, products of code are consumed by straight white guys living on the US coasts. A mix of perspectives is a healthy thing.


I like to think that since white guys are generally the most universally hated group by other race/gender combos, white guys just don't really care who you are as long as you get the job done. That's why they have the lowest interest in diversity.


I think it's probably something else:

To us white guys, diversity means someone like us not getting a job, or possibly us not getting the job. All the other groups rated it much higher because it means someone in their group getting the job, including possibly themselves.

It's an inherently selfish question, even though it's not presented as such.


That's blatantly racist.


It's really not. Care to explain why?


> "We estimate that 16.8 million of these people are professional developers and university-level students."

I wonder if it is relatively safe to infer from this that there are ~16.8 million developers in the world? Are there developers who never visit stackoverflow? I'd expect that number to be higher, even though 16.8M is a lot.

Also if you take all "years since learning to code" up to "9-10", they add up to 50.6%, which means that the developer population doubles every ~10 years. Though I have heard elsewhere (I think it was a talk by "uncle Bob") that the developer population doubles every 5 years, not 10.
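
(Rough sketch of that inference, assuming steady exponential growth and that nobody leaves the field: if the population doubles every T years, the share of today's developers who started within the last w years is 1 - 2^(-w/T), so T can be backed out from the survey's 50.6% figure.)

    // share(w) = 1 - 2^(-w/T)  =>  T = -w / log2(1 - share)
    function doublingTimeYears(newcomerShare: number, windowYears: number): number {
      return -windowYears / Math.log2(1 - newcomerShare);
    }

    console.log(doublingTimeYears(0.506, 10)); // ~9.8 years

Under the same assumptions, a 5-year doubling time would require roughly 75% of respondents to report 10 or fewer years of experience (1 - 2^(-10/5) = 0.75).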

Edit: Ha, interestingly his data was also based on stackoverflow: http://blog.cleancoder.com/uncle-bob/2014/06/20/MyLawn.html ...now does this mean the rate of growth is slowing?


Note that this data excludes the vast majority of Chinese software engineers (they represent just 0.4666% of the total survey respondents).


> Are there developers who never visit stackoverflow?

Would non-English speakers visit?


> which means that the developer population doubles every ~10 years

Or that people stop being developers with time. Or older people use stack overflow less.


Most loved platform:

    ...
    iOS (62%)
    Android (61.6%)
    ...
Most wanted platform:

    Android (20.6%)
    ...
    iOS (13.2%)
Android and iOS neck and neck for currently used, but Android far ahead as a future platform?


Some things that stand out:

- The "web" is not a platform?

- France pays $10k less than Germany? US pays twice that of Europe?

- People who work for a small company usually have many roles, so asking people to pick their role in a single-choice way seems not right.


> - France pays $10k less than Germany? US pays twice that of Europe?

Yes, France has higher taxation and politically leans left, even compared to the rest of Europe. That's balanced out by strong labor laws.


This doesn't account for taxation: in France, on 60K€ you would pay 10,550€ (17.6%) and in Germany 14,000€ (23.21%), and there are also local taxes.


About the "ship it (optimize later)" point: the interpretation seems to be completely opposite to the data. It seems most people disagree with it.

It's also the most reasonable position from experience. Everybody has experienced stuff that was supposed to be optimized later but then never gets revisited until it's rewritten completely 20 years later (again with the idea to just ship it, of course).


Surprisingly, a full 2.6% of Stack Overflow visitors identify as non-male/female, of which 0.5% as transgender. Both numbers are much higher than those for the general population.

It would be interesting to see more information about this, looking forward to the follow-up article: "We will publish additional analysis related to respondents’ gender identities in the coming weeks."


Based on my experience both in developer communities and outside of them, I don't find that surprising. While it is evidently true that there are disproportionately more men in tech, it also seems to be true that there are more queer/trans/non-conforming people in tech.

Also I wouldn't be surprised if the numbers are actually higher: assuming the choices are exclusive, the way they are worded it's not clear how transgender is intended to be represented in the results. Not everyone who is transgender sees their "trans" status as a distinct gender identity (i.e. the emphasis in "trans-woman" or "trans-man" doesn't have to fall on the "trans" aspect). I think a clearer distinction would have been a separate question "my gender identity matches the gender I was assigned at birth" (instead of distinguishing between male, female and "trans") but that might have overcomplicated things.


Yes, most female developers I know are trans women.


That doesn't necessarily correlate to real-life identities. I never post personal information if given the choice not to, and that includes picking "other" wherever it's offered.


The "Years of coding experience" statistics are interesting in a number of ways IMHO. Is there an interpretation of why it doesn't go down linearly, but has up-and-down micro-cycles? Is the bump at "more than 20 years" of coding experience just an artifact of not assessing answers with over 20 years of experience at finer granularity?


> Is the bump at "more than 20 years" of coding experience just an artifact of not assessing answers with over 20 years of experience at finer granularity?

I would think so. If someone started a programming job right out of school (~22 years old), and is still programming today, that would be any developer who is ~43 or older. I would say at least 1/4-1/3 of my former and current colleagues would fall into that age bracket. Not all of them started programming right out of school, but many of them did for sure.


I wonder why age was not included?


Suggestion: Add automated machine info collection to the Developer Survey

To improve the amount of data available in future surveys without adding to the manual collection effort, provide a facility to automatically collect workstation/environment info in a parsable format for consumption by the survey: https://meta.stackoverflow.com/q/345859/361842


The "languages over time" section [1] is overrepresenting Javascript. It shows separate charts for JS and node.js, when the latter is not a language but a runtime environment.

I would love to see the chart for Go instead of Javascript twice.

[1] http://stackoverflow.com/insights/survey/2017/#technology-la...


How is it possible that

"When we asked respondents what they valued most when considering a new job, 53.3% said remote options were a top priority. A majority of developers, 63.9%, reported working remotely at least one day a month, and 11.1% say they’re full-time remote or almost all the time."

And yet only a few percent of companies hire remote?


The most likely explanation is that the distribution sampled doesn't match the distribution from which you drew "only a few percent of companies hires remote".

But it's also possible that people have different opinions on what "almost all the time" means.


People have discussed in the past that SO's user base is heavily biased towards web/mobile/database (including ETL) development. Are there any competitors to this survey that would be more holistic/diverse in its view of the industry?


Students' Expected Salaries was the most surprising section for me. As a soon-to-be grad in the US, those numbers seemed surprisingly low. Perhaps the numbers are a lot lower outside the US? I don't even think I could find a dev position that pays $30,000.


You do realize that there are developers in poor/developing countries, right? $500/month is considered a decent salary for junior devs where I'm from.


Where are you in the US? Cost of living plays a major role in salary pressure in both directions, especially on the lower rungs of the ladder. Adjusted for (not much, but measurable) inflation my first job paid $32k. Entry-level dev jobs in my area are in the $40-50k range depending on the technology and company.


In Western Europe a senior dev makes less than 100k. A regular dev makes about 60k. In Eastern Europe a regular dev makes about 30-40k.


Interesting how "Knowledge of algorithms and data structures" is the third most important thing developers think should be prioritized when recruiting (3.77/5), following communication skills and track record.


It would have been nice to see more of the long tail of Years Since Learning to Code and Years Coding Professionally. Wonder why they chose to not ask for the distribution after 20 years?


Loved the gif question :)


I can't wait for somebody to send me a screenshot of the salaries and ask me why are we paying $150K when StackOverflow says we can find people for $58K.


Who are the elected officials that are using Stack Overflow? I really wish they had put more info up about that.


Glad to see desktop development is still kicking (28.9%) but remote jobs are very rare...


Even North Koreans use Stack Overflow!


Smalltalk <3



I use spaces exclusively, but only because everyone else does. Tabs are clearly superior, since their width can be adjusted to look however you like. For instance, I prefer 4 spaces of indentation in Javascript. Another developer here prefers 2. With tabs, we could both be happy.


Oh, those memories... That's exactly what I thought was a convincing argument when I was taking my first steps in development. Everyone can make the code look as they want! But wait... Why would everyone want the code to have different visual indentation? I fail to see the benefit in that. The so-called "personal preference" is a hoax. After having given up on tabs I used to think 4 spaces were so amazingly cool and 2 spaces were weird. And then standard came along and finally brought consistency. Of course, they couldn't get it right, and abandoned semicolons, which is why there is also semistandard. Ask Douglas Crockford why you should have semicolons in your code. There is a good reason for that, not based on anyone's "personal preference".
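
(For anyone who hasn't been bitten yet, the canonical example behind that advice is automatic semicolon insertion turning a line break after `return` into `return;`:)

    function broken() {
      return
        { answer: 42 }; // never executed: ASI inserts a semicolon after `return`
    }

    function fixed() {
      return { answer: 42 }; // keep the value on the same line as `return`
    }

    console.log(broken()); // undefined
    console.log(fixed()); // { answer: 42 }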

The tabs vs 4 spaces vs 2 spaces question is not the whole story, oh no. It is merely the beginning. There are many other life-threatening problems to solve. Where to put the brackets? Multiple var statements or just one? Space between function name and brackets or no? The list just goes on. You keep thinking about what is better. There is no clear answer; one choice is no better than the other, but you have to make up your mind. You have to make the decision and stick to it, because code should be consistently formatted. And when you finally decide, it becomes part of you, part of your identity. "I am the guy who has just one var statement! I am the guy who does not bother with a space between the function name and the bracket!" Yeah, you're such a cool, original rebel, you are. And if someone tries to take it away from you, you get really angry. It is a bit like religion. On some level we know it does not really make sense, but all the same we believe in it because it defines who we are.

Douglas is right, you know. We devs are not as rational and logic-driven as we think we are. We are very irrational and emotion-driven, maybe even more so than the people around us. 2 spaces in the way of your happiness. Bah!


Yeah, this is the argument that I make as well, and I don't understand why it doesn't just settle the debate. With tabs, everyone can have their code look however they want it to look, even the weirdos who want it to be 8 spaces.


The problematic ones are the 19.3% who prefer both...



Tabs for indentation, spaces for alignment. That way devs can set whatever tab length they prefer in their editors without messing up alignment.
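
(Concretely, something like the sketch below. The tab and space characters are drawn as "»" and "·" only so you can see them, and clamp is just a throwaway example.)

    function clamp(value: number, lo: number, hi: number): number {
    »return Math.min(hi,
    »················Math.max(lo, value));
    }
    // Both body lines start with the same single tab (the indentation
    // level); only spaces pad out to the alignment column, so Math.max
    // stays lined up under `hi` at any tab-width setting.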


I choose between tabs and spaces depending on existing code base, but prefer neither (or both). When writing code from scratch, it depends on language and text editor used.


It's a safe question to estimate the troll numbers.


On the contrary, those of us who prefer both are the smart ones[0]

[0] https://www.emacswiki.org/emacs/SmartTabs


That's too complicated for the unwashed masses


Tabs are far superior. :)


This is the correct answer.

The bad developer is identified by the rhythmic noise of arrow keys.


... because they aren't using an editor coughvimcough that correctly positions their cursor automatically or lets them jump to the beginning of a line with a single keystroke?


How are arrow keys related to the tab/space flamewar? Isn't everybody (whose job requires typing) using moving between words as the most basic means of navigation? (w, M-f, Ctrl+arrow, etc.)


Yes, strange, I rarely encounter codebases with tabs. I wonder if some of them mean they press the tab key and the editor adds spaces.


> strange, I rarely encounter codebases with tabs

I prefer tabs, but I never use them.

It's like social networks, it doesn't matter what you prefer, it matters what your friends prefer. If 57% prefer spaces, well okay I suppose we're using spaces then.


I think the problem with tabs is that you inevitably end up using both tabs and spaces. This is still OK, as long as tabs are used for indentation and spaces for alignment. But if you don't have good editor support for it, you'll end up with some lines indented using tabs, and some indented using spaces.

So while in theory tabs are superior, in practice they seem to end up being more trouble than they're worth.


I've been using tabs in Python for 14 years and I haven't needed to align anything in 13. Join me, it's a wonderful, readable world out there.

Indent-level alignment is far less readable than just indenting one extra level for a new context. As for inline alignment, I found it super counterproductive due to the diff noise every time you update the members.
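
(A sketch of the diff-noise point with a hypothetical config object: in the aligned version, adding one longer key later forces the value column to move, so the diff rewrites lines whose values never changed; in the unaligned version the same addition is a one-line diff.)

    // Aligned: adding a longer key later means re-padding every line.
    const alignedConfig = {
      host:    "db.local",
      port:    5432,
      retries: 3,
    };

    // Unaligned: the same addition touches only the new line.
    const plainConfig = {
      host: "db.local",
      port: 5432,
      retries: 3,
    };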


Just learned that:

Most Popular Languages by Occupation

  For Sysadmin / DevOps no 1 is JavaScript
  For Data Scientist / Engineer no 1 is JavaScript


This is a little bit misleading. This was a multiple-choice question, so it's understandable that most people have used JavaScript in some way within their projects. This doesn't mean that JavaScript is the most used language in those categories by time spent using it.


That was my takeaway too. If you ever had to change a spelling error in the front-end of a devopsy-style tool, then chances are you wrote some javascript, and bam: top of the list.


My instant takeaway: Does China have their own version of stack overflow or is it due to the firewall? That many more developers in India vs. China or is English just more commonly spoken in India?


English is much more widely spoken in India. It used to be a British colony, and English is an official language.


The survey has a particular flaw: it lists ".NET Core" as a technology, but it's vaporware (i.e. not a real thing). It should be corrected to ".NET".


In what way is .NET Core "vaporware"? It's a very real thing that is documented and downloadable from here:

https://www.microsoft.com/net/core


For those who cannot grasp the point:

1. .NET Core is in its 4th year of development

2. Breaking changes from version to version

3. The absence of a release culture. The approach of the .NET Core team is to bodge something together with scotch tape, then call it a release. Yep, they are even lazy enough to eliminate words like "beta" or "preview" from their "releases".

4. Astronautic APIs. They are minimal to the point of being useless

5. Does not provide a reliable way of doing things. "You can attach any configuration provider you want"... Yeah, the truth is nobody wants any, everybody wants the one that works as a basic requirement.

6. Documentation is scarce and is totally useless. Yep, I know how that method is called, thank you. What I really need is the description of what it does, how it does it, and code samples. But what you get now is a large page of HTML per class full of some obvious statements like "Method void GetState() - Gets the state.". They call it documentation. I call it BS.

I can continue. But the point is that this is enough to kill the thing off. .NET Core is already dead, but most people can't see it from the hyped bubble walls.


Much of this is debatable, and none of it adds up to the platform being dead, let alone 'vaporware'. It exists. It is in use. In production systems.


Why is it not a real thing? I know people using it in production.


Javascript and Java reign supreme. No trendy wanky languages like the ones mentioned on here all the time will ever be up there. Companies just pretend they "use go" to attract the trendy people here, when in reality "use go" means "there was a project once that we used go for" and 99% of their code is javascript and java.


Not sure if trolling or... but fwiw at Stack Overflow's SRE team we use Go a _lot_. We just open-sourced our latest tool written in Go, DNSControl (https://github.com/StackExchange/dnscontrol/), and we have dozens of internal tools and software written in Go.

Not to mention other open-source tools like Grafana, InfluxDB, or Cockroachdb.

I agree with you that Java and JavaScript will reign supreme for years to come. Just pointing out that not all of us think that's a good thing, and there is plenty of variety out there, in the real world.


Sure, and the you from 15 years ago would have written that about VBScript vs Python/Ruby/whatever.



