Also, if they are starting out, I would recommend a language that enforces better OO (Object Oriented) practices than Python, like Java (don’t hate me but it’s true).
I can't say I agree here. I suspect you knew this would trigger some debate.
First of all, the definition of OO is not well established. What defines OO? Is it encapsulation? Inheritance? Message passing? Jonathan Rees discusses the issue in more depth here: http://paulgraham.com/reesoo.html . Whether OO, by any accepted definition, is a Good Thing is also debatable, and beyond the scope of this comment.
More importantly, I think imposing that "good OO practice" on a brand new programmer is a distraction from learning the basics of programming. None of the boilerplate Java requires for a "hello world" program actually has anything to do with printing "hello world" to stdout. The beginner has a great deal of information to absorb, and the meaning of "public static void main()" is an extraneous distraction from what's fundamentally required to make a program work, and arguably not helpful for making programs that are well-structured and easy to understand. If "hello world" requires much more than "print('hello world')", it's not a good language for learning the basics.
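To make that concrete, here is a minimal sketch of the comparison (not taken from any particular tutorial): the full Java program next to the Python one-liner it amounts to.

    // The Python equivalent is the single line: print('hello world')
    // In Java, everything except the println call is ceremony.
    public class HelloWorld {                     // a class is required even with no state to model
        public static void main(String[] args) {  // the "public static void main()" mentioned above
            System.out.println("hello world");    // the only line that actually prints
        }
    }

Every token outside the println is something the beginner must accept on faith before anything runs at all.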
If you must know, the reason behind that statement is that where I work we use Jython, Java, and Python extensively. When I first got there I was rapid-prototyping applications for the end user, based on some data that had been structured in Java. While this was available through an API, I had no inner understanding of the objects I was passing around in Python, which you can get away with very easily in Python. That's why I think it's a bad habit for beginners to get into. Understood that the clutter is a bit extraneous, but at the same time it's necessary, at least until we all move towards simpler languages like Python. :)
At my workplace I need to code in Java. After programming in Python for 2 years I am simply hating the experience. Nothing to do with OO, but the language is a lot more verbose and I personally find it a bit irksome.
That clutter is only necessary in cluttered languages like Java; it is only necessary to learn it if you're using one. It's perfectly safe to put it off until you need to.
Passing around objects without having to understand their innards is part of what's generally thought to be good about most definitions of OOP. Being able to understand them is part of learning the language, of course.
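A minimal sketch of that point, with an invented Account class (the names are hypothetical, not from anyone's codebase mentioned above): the caller at the bottom works entirely through the public methods and never sees the private representation.

    // Hypothetical example: callers use deposit() and balance() without
    // knowing (or caring) that the balance is stored as cents in a long.
    public class Account {
        private long balanceCents;   // the "innards": invisible to callers

        public Account(long openingCents) {
            this.balanceCents = openingCents;
        }

        public void deposit(long cents) {
            if (cents < 0) throw new IllegalArgumentException("negative deposit");
            balanceCents += cents;
        }

        public long balance() {
            return balanceCents;
        }

        public static void main(String[] args) {
            Account a = new Account(1000);
            a.deposit(250);                  // no knowledge of the representation needed
            System.out.println(a.balance()); // prints 1250
        }
    }

Swapping the long out for some other representation later wouldn't break any caller, which is the payoff being described.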
So ignore his arguments and opinions and take a look at that graph. That's one of the more informative pieces of "what language should I learn" dialogue I've ever seen.
Take a look at the github and stackoverflow communities. If you think they're doing cool stuff, then you can probably feel safe tackling any language in the upper right corner of that graph. It's a safe bet.
The angle above the regression line suggests whether a language is more talk (SO) or walk (GitHub). The clusters are sensible based on my domain knowledge of the languages: useful workhorses, weird, and developed.
I love the informative display of quantitative information. This is especially true when everything rhetorically boils down to opinions.
I don't think distance from the line is a talk/walk dichotomy. Rather, I think it's about the respective limitations of SO and github as metrics of popularity. SO's core audience came from people who followed Jeff and Joel, who are both pretty Windows-centric folks. Meanwhile, github's basis - git - came from the very creator of Linux! It seems to me that above the line means more in the MS camp, and below the line more in the open source / early adopter group.
The standouts above the line, things like Delphi, F#, and C#, are all primarily Windows-oriented. The things to the south-east of the broad thrust are things like CoffeeScript, Ruby, Lua, D, Go, etc. - a lot of stuff that's either fairly cutting edge, or played with by people who like to think they're on the cutting edge.
Write more! And make more graphs if you think of other cool ways to consider the usefulness/impact/community/power of languages (beyond the analyzed-to-death language shootout stuff).
I'd recommend taking the scrolling "who I am" box and making it fixed relative to the page or to the screen. Its movement makes reading really annoying, and it isn't convenient or anything.
Fortunately in Chrome the reader can inspect the node and delete it.
I think the big secret about these recommendations is that it doesn't really matter _that_ much. Lots of people use different development environments, languages, and they get stuff done. I can scoff/smirk at Java ("Java. Heh.") but then my favorite web application is written in Java. My favorite text editor is written in a disgusting, horrible language. As for the PHP/Ruby/Python/Perl decision, it turns out that all of them work. Even PHP works. It turns out that the vast majority of time is _not_ saved by being able to write anonymous functions.
While I didn't drop out of my software engineering class, I found learning to program a real chore. That was in 1998-2000.
Since then, I've had a little dabble with HTML and CSS, and that's about it. On one or two occasions, I had a colleague try to shove Delphi down my throat, and I didn't take to it too well.
Up until a month ago, I never wanted to touch or look at code, but I had to get involved with a little bit of VBA for an important project at work, and then I also had a little dabble with JavaScript. I actually enjoyed working with them both, and I'm finally able to start reading code, and oddly am getting very interested in coding now.
I turned 30 about 6 months ago, so it's strange to be learning at an age where most are usually seasoned veterans, but still I'm having fun.
But as an entrant into this area, I've been most intrigued by JavaScript, both for client-side AND server-side (Node.js), and that graph makes me feel much better about opting for JavaScript as my language of choice for the short-to-medium term.
Sorry for the mini autobiography, but I felt I had to share.
I'm not really sure that looking only at github is fair. You'll find many more .Net projects on codeplex, and more of everything (though not necessarily in the same distribution) at bitbucket and sourceforge.
Counting repositories inflates results for Github. Consider a project that consists of a server, a client, and a library. In Subversion, that would quite likely be one repository, containing server, client, and lib subdirectories. People just interested in the client can check out the client directory from Subversion as if it were its own repository. (That's the one thing I really miss from Subversion since I switched my department at work to Git.)
In Git, there's a good chance it would be three repositories. One for the server, one for the client, and one for the library.