The self-selection of people I follow on Twitter, the meetups and conferences I attend, the choice of companies and colleagues I have worked with, etc, probably just reinforce my blinkered echo chamber. This is probably true for many Haskell, Ruby, and Go people as well, among others.
For example, take the data showing that a lot of developers have only 1-3 years of programming experience. It would naturally follow that a majority of active users on a Q&A site would not have much experience, right?
There definitely are some good results in there - I would just take anything regarding popularity with a grain of salt. I also suspect the population of people taking the survey doesn't fully represent all software developers equally. People who are in a stable role using "unpopular" or proprietary technology have no use for Stack Overflow.
This doesn't necessarily mean it's about your experience; that is, as more and more questions are asked and answered, it is more likely that an answer already exists, so you don't need to ask a new question.
Those are fundamentally different roles. Someone working exclusively on server-side applications has a totally different profile than someone working exclusively with web frontend. Concepts, technologies, tools, everything differs, and so probably does the personal profile of those developers.
Most Wanted Language: Python
It's good to see these two languages win out in these categories! Good things lie ahead (hopefully).
Love the site.
The reason I'm surprised has nothing to do with any advocacy. It's just that whenever I look around at a non-Apple dev conference in the US (Web, for example), it looks as though fewer than 1/5 of the machines are NON-Macs, while the answer to the question above is the inverse: fewer than 1/5 of devs use Macs. (Though multiple answers were allowed, this still says that fewer than 1/5 use Macs.)
So, I'm wondering: Are StackOverflow and US dev conf attendees significantly different groups? (Ex: SO hobbyists very different from conference pros?) Or do devs usually install Desktop Linux on Mac hardware, and I don't realize that half of the Mac laptops I see are running Linux? Or do people use PC desktop hardware for development but use Mac laptops for portable use (ex: attending a conference, but they would still qualify as Mac users)? Or are all of the non-US conferences solid walls of PC hardware with so few Macs that they overwhelm the Mac usage in the US? Or have things changed significantly in the past few years? (I haven't been to a dev conference for a while.)
Or what? Presumably all of the above to some extent, but am I missing something big?
It's even a very small and biased subset of the total developers in the United States. The majority (I would say vast majority) do not attend conferences. Sometimes because their workplace won't fund it, but often simply because they find no value in it. I personally gave up going to conferences years ago, partly because workplaces treated it almost like they were giving me a vacation that I should feel grateful for, but it certainly didn't feel like a vacation. And afterwards I would try to itemize the value I received... only to come up with nothing (talking to the vendors' developers face to face was, if anything, less productive than simply dropping a note on a discussion board).
So that subset often comes from more moneyed employers, is the "see and be seen" sort, or really gets value out of it. For the former two sets a Mac is a pretty likely device.
I'm heading to a regional one in August that is registered as a nonprofit and costs less than a week's salary to attend (and I work in the public sector in the Midwest, so don't think this is a big left-coast salary).
That said, even with a strong Microsoft bent to many of the session technologies, Macs are still several times more common than non-Macs.
Since the conference isn't trying to pull in big bucks (it's break-even - a 501(c) nonprofit, even), attendee costs for 2017 are $425 (food included - and they don't skimp on food). Even if you add on the hotel the conference is located in, it's only $175. Toss in a plane ticket and it is probably still less expensive than the national ones for the "get in the door" ticket.
There are lots of regional or local conferences that are not absurd in pricing.
For another example of an upcoming local one: https://stirtrek.com, a one-day, $99 event. Its draw is likewise local.
Yeah, NoFluff is $1000 to walk in the door for two days. I'm very glad that there are others priced so you don't need to drain the training budget to go.
The reason is simple: they are very overpriced for our salaries (and I am in a first-world country; I imagine their costs are prohibitive in places like India/China/etc), and - personal reason - as a webdev I can do basically everything I need on a $250 old T430 with Fedora, so I can't justify spending something like 10 times more just for a shinier piece of hardware that isn't even easy to upgrade.
Macs have dominated in the areas I have worked in for the past 6 years. I would say in the companies, hands-on meetups, and conferences I've been to in London and across Scandinavia: 75% use MacBooks, then perhaps 15% Dell XPS with Ubuntu, and 10% a mix of Lenovo, MS Surface, etc.
Though I think this split will change with the backlash against the new Touch Bar Macs.
(Again this is just from my smaller demographic of location and people.)
The calculus of owning a Mac changes drastically if you live near an Apple Store. If my Mac has a problem, I can book an appointment at the Apple Store, a real person will try to fix it, and I can have it back ASAP. The customer service is half of what you're paying for.
Probably because London has more people than most European countries.
Yep, it's almost true. London has 8.5M people. If you count all 50 European "countries" (some are a joke though), it would rank between 21st and 22nd.
Though in my city (Paris, ~11M I think), which is similarly sized, there are only 2 Apple Stores.
Paris and London are 500 km apart, a couple of hours away from each other by train or plane, both capital cities, and both have ~10^7 people.
Which? Vatican City is fairly extraordinary, granted, but I think 'a joke' is unfair...
I used to run Linux on a laptop, but eventually I gave up and switched to a MacBook so that I don't have to deal with drivers and low-level config issues on a daily basis.
In my university, 60%-70% of CS students use a Macbook. They are not working or attending conferences yet, so the change is going to take some time.
Just for comparison: what I would spend on a MacBook here would pay for 3 years of a Bachelor's degree plus half a year of a Master's degree (obviously I am referring just to university tuition costs, not rent or other general expenses) at a top-3 university in my country with international recognition.
What in God's name are you talking about? Are you using a distro from 1995?
The graphics and WiFi drivers broke every now and then, and sometimes that broke the OS completely, so I had to reinstall it.
I needed to fix multiple config and compatibility issues whenever I wanted to install Linux alternatives to Windows applications.
Plus I need to do video editing and Android apps so I still have to constantly switch back to Windows with dual-boot.
Mac just seems to be the perfect balance: you get all the standard terminal stuff, plus everything just runs out of the box.
Now I use a mix of Win, OS X, and Ubuntu depending on what I am trying to accomplish. I make heavy use of Linux servers, but have no inclination of using one as a primary end-user desktop environment again.
When people complain about Linux, my first question is whether they've given it a serious try in the last 2 years.
I think you're seeing selection bias here. The devs that attend conferences are a very, very small sliver of the total population, and the price of attending a conference (not only the ticket, but travel and PTO necessary to attend) also overlaps heavily with the more "well-off" devs, which probably have a higher percentage of Mac usage.
FWIW: I've had a personal MacBook (for two years in college) and a work MacBook (when necessary), and in both cases I only ran Linux on them.
Most devs don't attend dev conferences.
"This change will allow us to do X in Y days when normally it would take some multiple of Y."
It sounds like what you want is a more salient list of pain points in the current development environment and exactly how the proposed new thing would address them. Additionally, you'd want to hear the new set of pain points that the new thing will impose (cue the quote about some devs knowing the value of everything and the cost of nothing).
You may also, for their benefit, direct the next kid with the cool idea towards a link that's commonly thrown around here: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence
I'm going to pull the more direct section "What this means on Wikipedia":
> If you're considering nominating something for deletion, or changing a policy, because it doesn't appear to have any use or purpose, research its history first.
> You may find out why it was created, and perhaps understand that it still serves a purpose.
> Or if you do feel the issue it addressed is no longer valid, frame your argument for deletion in a way that acknowledges that.
(Nice link. I was unaware of the name for that heuristic.)
I think that is a really important thing to bear in mind when reading Hacker News, or pretty much any online venue. People who contribute or even post comments are not the majority. SO rates me in the top 20% of Ruby developers, because I answered a few questions on the site.
After all, you consider C# an uncool language.
That couldn't be further from the truth... I love slinging C#, it's far and away my favorite. It doesn't have that HN hot-shit cachet that Go or Rust or Scala or Lisp-likes or ML-variants have, however.
The last two teams I've worked with have all had Macs, even though at least half the team were writing .NET in Windows.
Then there's my main personal laptop, which couldn't get the latest Mac OS, so I just wiped it and installed Linux. I wouldn't do that to my daily work machine though - too many little things just don't work quite correctly - but all in all it installed pretty easily and runs less sluggishly than OS X did.
* Better as in good hardware, consumer appeal, and a relatively performant and stable machine for getting work done. Not necessarily best value or gaming, etc.
My windows and linux installs never reliably go to sleep on their respective laptops when I close them -- and I cannot be bothered to figure out whether it's a hardware problem, or a software problem... I'll just bring the mac which does sleep reliably.
It also has pretty good battery life compared to my other machines.
Edit for spelling.
Although SO now has a much wider audience, the more engaged, longer-term members are more likely to fill out this survey.
Also, how many of those people with Macs had a Linux distro installed on them?
More seriously, once you get the hang of it, an Ubuntu machine is just ridiculously more productive than Windows and OS X.
I'm surprised that more devs aren't using it.
Web development to me feels like the Boneyard from the Lion King. This shadow land that I am too terrified to enter into, and I have no idea why I feel this way. It also bugs me like it might be a huge shortcoming at some point in my life that I have no grasp of it.
I think the difference is that in Desktop/Mobile, whatever crap has stuck to the wall for years is what you must use, because you don't have a choice. And usually this crap is fairly well documented, because it is your only choice.
In the Web World, there is a bunch of crap thrown at the wall (every day) and you have no idea what will stick and what will slide off each year.
Maybe a poor metaphor, but just kind of how I feel on some days.
For example, NSDateFormatter is fugly but well documented. Also, the MVC pattern in iOS comes from macOS, which comes from NeXT (I hear).
OTOH I'm building my first React + Redux app and the best practices and patterns are still emerging. I feel like next year most of what I just learned won't matter.
In my experience, web apps are way, way easier than Desktop apps in almost every way, aside from maybe if you have to figure out scaling really big (which you probably don't most of the time). You can probably learn everything you really need to know about web apps to clear that spooky feeling just by working at a new job for a month or two. It's not a huge shortcoming at all in reality, because at the end of the day, software is software. You can write software, so you'll be fine and may already be better than people calling themselves "web developers."
Also, I don't know if this applies to you, but in my case part of the reason I had this spooky feeling was because the people who I knew who called themselves "web developers" before I actually started doing "web development" were not very skilled, and were inclined to portray it as Magic to other people, so that they could feel the satisfaction of being a sort of Arcane Web Magician. Plus, it was magic to them, because they didn't really understand it. If you have people like this around you, I suggest you ignore them and, if you need support, find people who are able to explain details clearly to you without presenting it as magic, or at least point you towards resources that can. Once I was able to do that, it made a big difference in my confidence in my own abilities and helped me improve as a professional tremendously.
I can do almost anything I need with Web tech.
Desktop or even native mobile development seems to be like going back to the last century.
I'm doing a native mobile project next month, and looking at the stuff an iOS dev needs to get going is horrible.
I don't know, maybe it just seems better to me because I've been doing Web for so long now.
When I see desktop/mobile with Qt, Java or C#, I see huge IDEs, runtime environments and SDKs and with iOS even vendor lock-in, simulators and whatnot.
If you want to do mobile, try making it a PWA. If it needs to be in the app store (which these days I often see portrayed as more of a necessary evil than something desirable) go hybrid. If hybrid is too slow, try React Native or NativeScript. All of these can build on your existing knowledge of web technologies.
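For what it's worth, the entry cost of the PWA route is tiny. A minimal sketch, assuming an existing plain web app (the sw.js file name is a placeholder, and you'd still add a web app manifest for installability):

```typescript
// Registers a service worker so the existing web app can work offline and be
// installed to the home screen - the rest of the app stays exactly as it is.
if ("serviceWorker" in navigator) {
  window.addEventListener("load", () => {
    navigator.serviceWorker
      .register("/sw.js")
      .then((reg) => console.log("Service worker registered for", reg.scope))
      .catch((err) => console.error("Service worker registration failed:", err));
  });
}
```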
If you want a desktop app, you probably don't actually want a desktop app anyway, so just build a decent web app. If you need something that runs outside the browser, use Electron. Or React Windows if you only target the UWP.
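And the Electron route really is just "wrap the same bundle in a window"; a rough sketch, assuming your existing build outputs dist/index.html (file names are placeholders):

```typescript
// main.ts - hypothetical Electron entry point: the "desktop app" is a Chromium
// window loading the same files you would otherwise serve in a browser.
import { app, BrowserWindow } from "electron";

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 1024, height: 768 });
  win.loadFile("dist/index.html"); // reuse the existing web build
});

app.on("window-all-closed", () => app.quit());
```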
If you want to build something on the server-side, there's of course Node.js. If you want something more tightly integrated with the database try ArangoDB. Again, same language you already know.
There are still niches where you need the raw performance of specialised programs written in other languages than JS, of course, but they are growing smaller. It's basic economics: if you can hire two full-stack JS devs instead of one iOS, Android, Windows and back-end dev each, you can cover the same technologies with half the salaries. This is especially true if the specialists end up doing mostly busy work because you don't actually need their specialised expertise all the time.
Few companies can afford to have all these dedicated specialists on the team but they may still want to cover multiple markets (e.g. an app for desktop and Android). In this case going for the web is the obvious choice for the smaller budget. The alternative would be hiring multiple contractors or outsourcing the project to a company that has the resources to provide all of the needed specialists all the time.
There are many ways to leverage your Web knowledge; you don't have to start from zero.
But it's still a chore to do mobile development.
I want to build an iOS app, because 80% of the potential customers are iPhone users, but in the future I want to reuse as much of the code as possible on Android. So it's hybrid or React Native.
So I thought, well, Expo sounds nice and gives me a Web-like dev experience without the need to set up a Mac OS system.
Then I checked if they support AirPlay and HealthKit and the answer was "detach".
So now I'm back to "regular" React Native and have to set up a Mac to get started.
Well, at least it seems running a dev VM for Mac OS isn't that big of a deal anymore :\
Is it even possible to run macOS in a VM on non-Apple hardware with a non-macOS host yet? Microsoft provides free VM images for every OS and browser version you might want to test against, Google provides the Android devkit that works cross-platform. But if you want to target macOS or iOS it seems you're still stuck buying Apple hardware for $$$ to do anything. And don't even get me started on the lackluster support for PWAs -- at least the fallback for those is a web app that still works in the browser (provided you have Apple hardware to test it on).
EDIT: Also Microsoft has moved React Windows into its official GitHub org and advertises it as an official way to provide extensions to Office.
It seems like MS thinks their only way to win something in the "mobile war" is pushing PWAs.
And maybe they're right. Most people use the iTunes or Play Stores. They need to get devs off these platforms and show them that their tools are the way to go if you don't want to pay taxes to Apple or Google.
Wouldn't have thought of built-in help as a popular (47.1%) way of teaching yourself.
Oracle usage only 16.5%, but I guess it makes sense considering the web developer proportion.
CoffeeScript as the third most dreaded language, behind only two instances of Visual Basic. But after reading the definition of "dreaded" it makes more sense.
Sharepoint as the most dreaded platform, ha ha, no surprise there.
Clojure as the top paying tech worldwide, wow. But missing entirely from the list in US, UK, Germany, France sections, so where are all the Clojure devs? In general that's... comforting, if only there were any Clojure shops in my country (okay, admittedly I've heard about one startup using it).
The built-in help for e.g. Java, C#, or Delphi is amazing. It starts with autocomplete, then a tooltip with a summary, and full details including examples are just an F1 away.
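The same kind of built-in help exists in the web stack too, for what it's worth: the autocomplete tooltip is fed by the doc comment. A made-up TypeScript/JSDoc example (Java and C# do the same with Javadoc and XML doc comments):

```typescript
/**
 * Formats an amount given in cents as a localized currency string.
 * Editors surface this summary in the autocomplete tooltip and show the full
 * comment, parameters and example on hover or via F1-style help.
 *
 * @param cents  Amount in cents, e.g. 1999
 * @param locale BCP 47 locale tag, e.g. "en-US"
 * @example formatPrice(1999, "en-US") // "$19.99"
 */
function formatPrice(cents: number, locale: string): string {
  return new Intl.NumberFormat(locale, { style: "currency", currency: "USD" }).format(cents / 100);
}
```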
I also noticed that Java is not in the world-wide list and yet is in all the sub-regions.
Given that the JVM is such a big part of the Clojure ecosystem, I want to dig into the data once they make it available. I'm totally willing to believe that developers with Clojure skills are in a high-salary bucket, but I'm a little suspicious of what they are showing here.
BTW, for some years I chased specific languages as the cause of the high earnings. I believe this is much less true than I originally assumed...
Probably because it's reported mostly by US developers, but the pay is not high enough to crack into the US top-paying list. The lowest position in the US list is $75,000.
Also, the worldwide distribution might have affected Java's position. While it is pretty high on the US/UK/France/Germany lists, the worldwide salaries were depressed due to outsourcing.
At the end of the day, Oracle is expensive. Web or not.
The "platform" part is at best ambiguous and at worst white noise.
It covers mobile & desktop OS's, "cloud" providers - both IaaS like AWS/Azure & SaaS like Salesforce - hardware platforms like Raspberry Pi and Arduino, buzzwords like "Serverless" and mainframe.
Apparently all of those things represent a "platform" that is in some way comparable, but Linux, BSD and Windows servers do not.
I think more likely is that every nodejs developer considers themselves "devops".
It is my favorite language, btw. :)
It probably is because of the minimal use (and I believe #1, Rust, is also there because it's still not that common). Their "loved" metric is number of people who are using a language and would not like to switch to a different one. I suppose that the only people who use it are people who love it so much that they can ignore issues caused by lack of popularity (small number of libraries and tools, problems with OS support, etc.).
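In other words (numbers made up), the metric is conditioned on current users, so a tiny but devoted community can top the chart while a huge mainstream one ranks lower:

```typescript
// "Loved" is roughly: of the people currently using a language, how many want
// to keep using it. Popularity and "loved" can therefore diverge wildly.
const niche = { currentUsers: 500, wantToKeepUsing: 460 };
const mainstream = { currentUsers: 50_000, wantToKeepUsing: 30_000 };

const loved = (lang: { currentUsers: number; wantToKeepUsing: number }) =>
  lang.wantToKeepUsing / lang.currentUsers;

console.log(loved(niche));      // 0.92 -> very "loved" despite few users
console.log(loved(mainstream)); // 0.60 -> less "loved" despite 100x the users
```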
We will publish additional analysis related to respondents’ disability status in the coming weeks.
One that's particularly interesting to me and at least has some semi-plausible story is that the way personal computers were marketed in the US ended up influencing the perception of the computer as something male, even though prior to 1980 women made up a larger share of working programmers than they do today.
What I would find interesting from this survey is whether the gender breakdown between specializations changes when nationality is an additional factor. Because in my own limited experience, there are plenty of women who are competent developers, but I've only known one born in the US (n=8, but I work at a small company and this is at least a double-digit percentage of the engineering staff).
To us white guys, diversity means someone like us not getting a job, or possibly us not getting the job. All the other groups rated it much higher because it means someone in their group getting the job, including possibly themselves.
It's an inherently selfish question, even though it's not presented as such.
I wonder if it is relatively safe to infer from this that there are ~16.8 million developers in the world? Are there developers who never visit stackoverflow? I'd expect that number to be higher, even though 16.8M is a lot.
Also if you take all "years since learning to code" up to "9-10", they add up to 50.6%, which means that the developer population doubles every ~10 years. Though I have heard elsewhere (I think it was a talk by "uncle Bob") that the developer population doubles every 5 years, not 10.
Edit: Ha, interestingly his data was also based on stackoverflow: http://blog.cleancoder.com/uncle-bob/2014/06/20/MyLawn.html ...now does this mean the rate of growth is slowing?
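For the curious, the back-of-the-envelope arithmetic behind both doubling claims, assuming nobody leaves the field and every cohort answers the survey at the same rate (big assumptions, which other comments here question):

```typescript
// If 50.6% of respondents learned to code within the last 10 years, then the
// population 10 years ago was ~49.4% of today's - roughly a doubling per decade.
const startedInLast10Years = 0.506;
const populationTenYearsAgo = 1 - startedInLast10Years;          // ~0.494
const growthPerDecade = 1 / populationTenYearsAgo;               // ~2.02x
const annualGrowth = Math.pow(growthPerDecade, 1 / 10) - 1;      // ~7.3% per year

// Uncle Bob's "doubles every 5 years" would instead imply:
const annualGrowthEvery5Years = Math.pow(2, 1 / 5) - 1;          // ~14.9% per year
```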
Would non-English speakers visit?
Or that people stop being developers with time. Or older people use Stack Overflow less.
- The "web" is not a platform?
- France pays $10k less than Germany? US pays twice that of Europe?
- People who work for a small company usually have many roles, so asking people to pick their role in a single-choice way doesn't seem right.
Yes, France has higher taxation and politically leans left, even compared to the rest of Europe. That's offset by strong labor laws.
It's also the most reasonable from experience. Everybody has experienced stuff that was supposed to be optimized later but then never gets revisited until it's rewritten completely 20 years later (again with the idea to just ship it, of course).
It would be interesting to see more information about this, looking forward to the follow-up article: "We will publish additional analysis related to respondents’ gender identities in the coming weeks."
Also I wouldn't be surprised if the numbers are actually higher: assuming the choices are exclusive, the way they are worded it's not clear how transgender is intended to be represented in the results. Not everyone who is transgender sees their "trans" status as a distinct gender identity (i.e. the emphasis in "trans-woman" or "trans-man" doesn't have to fall on the "trans" aspect). I think a clearer distinction would have been a separate question "my gender identity matches the gender I was assigned at birth" (instead of distinguishing between male, female and "trans") but that might have overcomplicated things.
I would think so. If someone started a programming job right out of school (~22 years old) and is still programming today, that would be any developer who is ~43 or older. I would say at least 1/4-1/3 of my former and current colleagues fall into that age bracket. Not all of them started programming right out of school, but many of them did for sure.
A suggestion to improve the amount of data available in future surveys without adding to the manual collection effort: provide a facility to automatically collect workstation/environment info in a parsable format for consumption by the survey: https://meta.stackoverflow.com/q/345859/361842
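Something as small as this would already give the survey a machine-readable baseline; a hypothetical Node sketch (field names are made up, not an actual SO tool):

```typescript
// Tiny script a respondent could run and paste the JSON output into the survey.
import * as os from "os";

const workstationInfo = {
  platform: os.platform(),                        // "linux", "darwin", "win32", ...
  release: os.release(),
  arch: os.arch(),
  cpuCount: os.cpus().length,
  memoryGiB: Math.round(os.totalmem() / 2 ** 30),
  nodeVersion: process.version,
};

console.log(JSON.stringify(workstationInfo, null, 2));
```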
"When we asked respondents what they valued most when considering a new job, 53.3% said remote options were a top priority. A majority of developers, 63.9%, reported working remotely at least one day a month, and 11.1% say they’re full-time remote or almost all the time."
And only a few percent of companies hire remote?
But it's also possible that people have different opinions on what "almost all the time" means.
The tabs vs 4 spaces vs 2 spaces debate is not the whole story, oh no. It is merely the beginning. There are many other life-threatening problems to solve. Where to put the brackets? Multiple var statements or just one? A space between the function name and the brackets or not? The list just goes on. You keep thinking about what is better. There is no clear answer; one choice is no better than the other, but you have to make up your mind. You have to make the decision and stick to it, because code should be consistently formatted. And when you finally decide, it becomes part of you, part of your identity. "I am the guy who has just one var statement! I am the guy who does not bother with a space between the function name and the bracket!" Yeah, you're such a cool, original rebel, you are. And if someone tries to take it away from you, you get really angry. It is a bit like religion. On some level we know it does not really make sense, but all the same we believe in it because it defines who we are.
Douglas is right, you know. We devs are not as rational and logic-driven as we think we are. We are very irrational and emotion-driven, maybe even more so than the people around us. 2 spaces standing in the way of your happiness. Bah!
The bad developer is identified by the rhythmic noise of arrow keys.
I prefer tabs, but I never use them.
It's like social networks, it doesn't matter what you prefer, it matters what your friends prefer. If 57% prefer spaces, well okay I suppose we're using spaces then.
So while in theory tabs are superior, in practice they seem to end up being more trouble than they're worth.
Indent-level alignment is far less readable than just indenting one extra level for the new context. As for inline alignment, I found it's super counterproductive due to diff noise every time you update the members.
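To make the contrast concrete, a small made-up example (fetchUserProfile is hypothetical):

```typescript
// Stub so the snippet stands alone.
declare function fetchUserProfile(
  id: string,
  opts: { includePosts: boolean; includeFriends: boolean }
): unknown;
const userId = "42";

// Alignment-based continuation: wrapped lines line up with the opening paren,
// so renaming the function or adding a member re-indents every line below it
// and shows up as diff noise.
const aligned = fetchUserProfile(userId,
                                 { includePosts: true,
                                   includeFriends: false });

// One extra indent for the new context: a rename or an added member only
// touches the lines that actually change.
const indented = fetchUserProfile(userId, {
  includePosts: true,
  includeFriends: false,
});
```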
Most Popular Languages by Occupation
1. .NET Core is in its 4th year of development.
2. Breaking changes from version to version
3. The absence of a release culture. The .NET Core team's approach is to bodge something together with scotch tape, then call it a release. Yep, they are even lazy enough to drop words like "beta" or "preview" from their "releases".
4. Astronautic APIs. They are so minimal as to be useless.
5. It does not provide a reliable way of doing things. "You can attach any configuration provider you want"... Yeah, the truth is nobody wants any; everybody wants the one that works, as a basic requirement.
6. Documentation is scarce and totally useless. Yep, I know what that method is called, thank you. What I really need is a description of what it does, how it does it, and code samples. But what you get now is a large HTML page per class, full of obvious statements like "Method void GetState() - Gets the state.". They call it documentation. I call it BS.
I could go on. But the point is that this is enough to kill the thing off. .NET Core is already dead, but most people can't see it from inside the hype bubble.
Not to mention other open-source tools like Grafana, InfluxDB, or CockroachDB.