Take parametric polymorphism as a random example; i.e. what's called generics in Java, templates in C++, etc. You might know how to use it in a particular language and realize it provides some benefit. But if you truly understand the underlying concept, then you can quickly recognize it and/or know to look for it in any language you pick up, whether it's Java, OCaml, Visual Prolog, etc. Note that those three languages belong to totally separate paradigms (object-oriented, functional, & logic), and yet all of them still make use of the same underlying idea to achieve a particular goal (in this case, a form of type-safe expressivity). If you have a good understanding of $random_fundamental_concept then all you would need is a couple minutes to learn its syntax in your new language of choice and be on your way. I chose that example because it was the first that popped into my head, but in reality it goes for just about anything.
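For a concrete taste of the idea, here's a minimal sketch in Python's `typing` module (the `Stack` class is just an illustration I made up, not from any of the languages mentioned). The same type-parameterized container would be `class Stack<T>` in Java or simply an `'a list` in OCaml:

```python
# Parametric polymorphism sketch: one container definition, reusable
# at any element type, with the type checker tracking T throughout.
from typing import Generic, List, TypeVar

T = TypeVar("T")

class Stack(Generic[T]):
    def __init__(self) -> None:
        self._items: List[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

ints: Stack[int] = Stack()      # a Stack of ints...
ints.push(42)
words: Stack[str] = Stack()     # ...and a Stack of strings, same code
words.push("hello")
```

Once you've internalized the concept, the Java or OCaml spelling of the same thing is just syntax.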
In short, with a good understanding of the fundamentals, you can make the time it takes to learn something new that will invariably be based on them significantly shorter.
Also, "learning and cultivating rock solid CS fundamentals" is a given - we're talking about what more to do.
I signed up to learn functional programming and have learnt a lot.
The biggest bottleneck I see that prevents people from being able to do this is their code-reading skill.
With practice you can read and understand codebases much, much more quickly than most people (who are so intimidated they never even try). Once you can do this, you can very quickly pick up (1) the rich details about how the system works, (2) the ability to extend the system and fix bugs in it, (3) understanding of what is idiomatic and what is not.
This! Could you please share your insights as to how to improve your code-reading skill? Besides the obvious, "read more code". For example, how to select which sources of code to read from, any tips/tricks you've learned in your process, etc?
This not only lets you read more code, but also lets you pick up knowledge in areas that are often taken for granted (no need to think about X; there's a library/plugin/framework for that).
Every piece of software has bugs and limitations. When you encounter one, go figure out why it is the way it is, and maybe even fix it.
If all your dependencies are so perfect that this never comes up (hah!), you can always pick a project you use and like and look at their open issues. Begin trying to solve one of them.
_You don't really need to abandon ship every time something shiny comes along._
Really, you don't. Use the tools that help you get the job done quickly and efficiently. I jumped ship from .NET (asp.net mvc) to Ruby on Rails because of three very distinct features:
1. Cheaper hosting.
2. A better baked-in ORM.
3. Ruby made me more productive.
It wasn't because some random guy in Germany wrote a blog post and called it the Next Coming of Christ - it was because I tried it out for a small project, and loved how it made my life easier. Less time in front of the computer, and more time with my family.
Just make sure you jump on the next big thing for the right reasons.
This is true in the same sense that quicksort is faster than insertion sort - that is, when we ignore constant factors and are only interested in asymptotic performance.
Very few companies are able to ignore this constant and hire someone who - however good a programmer he is - will be less than proficient for half a year and then still under-performing in some cases for two more years. Companies would be stupid to ignore this constant, especially given that in three to four years the programmer in question will leave for a better job.
For the vast majority of jobs it really matters what software stack you know, and it's not a bad thing for companies to check for that when hiring.
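To make the quicksort/insertion-sort analogy concrete, here's a sketch (in Python, purely for illustration): insertion sort is O(n²), yet its constant factor is so small that production sorts commonly fall back to it for small subarrays.

```python
# Insertion sort: asymptotically worse than quicksort's O(n log n),
# but with a tiny constant factor - which is why many real-world
# quicksort implementations switch to it below a cutoff size.
def insertion_sort(items):
    out = list(items)
    for i in range(1, len(out)):
        key, j = out[i], i - 1
        while j >= 0 and out[j] > key:   # shift larger items right
            out[j + 1] = out[j]
            j -= 1
        out[j + 1] = key
    return out

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # -> [1, 2, 3, 4, 5, 6]
```

The hiring analogy: asymptotic superiority only pays off if you can afford to wait past the constant.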
It's also worth noting that learning a language is something entirely different from learning to use a language. That's why it's so important to actually build something when learning - because then you are forced to use the various tools that come with the language, its entire stack.
A hell of a lot less time than it takes to master architecture.
Would you expect to hire either if they didn't have mastery of those tools?
Back when I worked at a 3D animation studio they hired a guy who had literally never used a 3D animation or modeling package before. He had however spent 20+ years doing concept and character design, sculpting, stop motion animation and model building. It took him about 8 weeks to become proficient enough at Maya to start turning out character models that outshone most of the others (admittedly at a much slower pace). He could also just look at other people's models and instantly spot subtle changes that totally transformed the models.
Basically, if you're hiring an architect or designer based on their mastery of software, you're probably doing it wrong.
AutoCAD takes more than a few years to master; your architecture degree is only about 5 years plus an apprenticeship. Yes, you will master AutoCAD before architecture, but you will likely not have time to master any other competing CAD package that doesn't greatly resemble it (hence the extreme lock-in), if you also want time to master architecture.
Think of this as a bare minimum. You don't want to hire a young visual designer who doesn't have Adobe CS proficiency. Of course, they should be good designers too. But you might make an exception for someone senior (experienced) who predates Adobe and does everything the old-fashioned way; their skills are still very useful.
I've never been involved in any hiring 3D artists.
I hate managers who think like you; they think developers are basically interchangeable, without specialties that took substantial time to acquire. As if our investments were always in the six-month range!?! How about 5 or 10 years of deep experience?
I hate managers (and other programmers, but managers are more problematic) who think that languages are a specialty of programmers rather than a tool. "C#" isn't a programming specialty any more than a particular model of chef's knife is a culinary specialty.
Web Application development might be a specialty. OS development might be a specialty. Java is a tool.
Knowing Scala is not just scala + scala.lang; rather, it involves knowing and mastering a bunch of libraries, as well as some heavy frameworks (Lift or Play?).
Anyone (manager or programmer) who thinks of languages in isolation of stacks and platforms is probably an academic or incoherent.
Languages are not tools, they are merely starting points to something much more complex and involved. "Can I learn Haskell in 6 months?" Sure! "Can I master whatever library/stack I'm learning Haskell for?" It depends very much on what that is and my past experience, and the answer is probably not!
I was able to learn Smalltalk in a day. It has stupidly simple syntax - one of the simplest in existence! - and there are objects everywhere - and we all just know objects, right?
As I noted above, knowing a language and knowing how to use a language are two different things. To really master a language's usage you have to learn and internalize a whole bunch of things, of which the language itself is the smallest bit.
Having worked with ASP.NET MVC for years now, I've found the way Express on Node.js handles form validation to feel very cumbersome. Technically, it doesn't even handle it, and one has to rely on third-party modules or one's own wits. After struggling with that one bit for a few days I dropped Node.js and moved on. I enjoyed making ViewModels with DataAnnotations in .NET so much that the two top validation-related modules I looked at seemed weak in comparison.
Now I'm trying to learn Play! using Scala. But the IDE support isn't as good as Visual Studio's, especially in the view templates. I've only tried the Scala plugin for Eclipse. The Play! plugin for IntelliJ IDEA only works in the Ultimate version, which sounds expensive, so I didn't bother looking into it.
I just installed Ubuntu on my computer and really want to make the switch from Windows+C#+Visual Studio to Linux+Scala+Play, but I feel all those years getting comfortable in .NET will really hold me back. Every time I hit a wall I'll be thinking back to how much easier this would be if I were using .NET or Windows; I feel this will be the biggest problem I face.
I jumped from .NET to Rails 3 years ago and found that the ecosystem being mature and large helped me get over all the typical humps of losing IDE support (I code in vim full time now) and switching to Unix. Had I jumped in 2006 or 2007 like some of my colleagues did, this would not have been the case.
I definitely think that these days Node and Play are on the rise, but based on your post I think you should consider something a little easier to slide into for someone more used to a mature ecosystem. Once you get into the Unix-based way of programming, it won't feel like such a painful leap to jump to Scala or Node.
Ruby and Rails will always have a special place in my heart, as that's where I first truly learned how to program after being beaten into submission by Java in university. I think the point I started hating programming was when we had to build a Swing app using Notepad. Then, over a year after graduation, I found Hackety Hack and _why's poignant guide, and everything clicked: I knew all I wanted to do was code.
I just installed Ubuntu on my PC last night and this time I'm trying to learn it right. No more copy-pasting stuff into the terminal like the last 15 times in the past 8 years I've attempted to learn Linux. I make sure to read the man pages of every command before I run it so I know what it's doing. I must have extracted archives tons of times, but today is the first time I understand what the arguments -xvf mean in tar -xvf FILE :)
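For reference, those flags are x (extract), v (verbose: print each file), and f (read from the named file, which is why FILE has to follow it). The same extraction can be done from Python's standard library, which is a nice way to see what tar is actually doing (this sketch builds its own throwaway archive to work on):

```python
# Build a tiny archive and extract it - the stdlib equivalent of
# `tar -cf demo.tar hello.txt` followed by `tar -xvf demo.tar`.
import pathlib
import tarfile
import tempfile

workdir = pathlib.Path(tempfile.mkdtemp())
(workdir / "hello.txt").write_text("hi\n")

archive = workdir / "demo.tar"
with tarfile.open(str(archive), "w") as tar:   # like `tar -cf`
    tar.add(str(workdir / "hello.txt"), arcname="hello.txt")

dest = workdir / "out"
with tarfile.open(str(archive), "r") as tar:   # like `tar -xvf`
    for member in tar.getmembers():            # the `v` part: list members
        print(member.name)
    tar.extractall(str(dest))                  # the `x` part: extract
```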
My current methodology has me moving ever so slowly. For instance, after downloading Eclipse, I knew better than to leave it in my downloads directory. A Google search told me to put it in /opt. But that just left me wondering what the heck /opt is for. Some more Googling led me to a document entitled "Filesystem Hierarchy Standard", which explains what all the directories in the root are for. Trying my hardest not to think about how much quicker things would be moving if I just gave in and carried on in Windows >.<
I still say you should consider just buying a cheap Mac, a lot of the things you're finding challenging aren't as big a deal. The Mac makes Unix A LOT more approachable :)
-People seem to enjoy programming with it
-Makes some things very easy (concurrency)
-Standardized formatting makes everyone else's code easier to read, and vice versa.
Watching the evolution of programming languages and programming jobs for the last two decades I still have no idea how to predict this; I can only believe in my intuition and bet.
I love this advice. I think that I want to tackle building a Node application as my next "weekend project." I dabbled in the very beginning, but today, Node is very mature and the barrier to entry is extremely low.
I thought at first this was headed in the other direction - e.g. candidates who could not stop talking about how great their new language is and how their old language is "a dinosaur". That is the tendency I have - to inflate the advantages of languages that interest me most because of their novelty to me.
If you are an expert in OOP, try to code for a few months in a functional language.
- Theorems-for-free languages: Haskell, Scala, OCaml
- Lisps: CL, Scheme/Racket, Clojure
- Logic programming/unification: Prolog, Mercury
- Stack-based/concatenative: Joy, Forth, Factor
- O-O paradigm: D, Eiffel, Smalltalk, Scala (again)
- Others I keep hearing about: Go, Rust, Kotlin, Mozart/Oz
I came to this conclusion after time spent fighting a futile battle to become good at Python. Learning the syntax was easy; there's nothing in Python that was difficult to understand. However, I ran into the following issues trying to truly master Python or any other language I don't use at work:
1) I already have years of experience in Java. And it's not a dead field. There's Java 1.8 coming out, new libraries, frameworks etc. And there's the added benefit that I can convince my workplace to adopt these things since they use Java. That way I can efficiently use my work time to make money and learn new technologies at the same time.
2) Zero synergy from other developers. A workplace is also a community of developers. My Java and JS skills stay sharp and increase because I can constantly talk and discuss with other developers. I learn best practices, things to avoid, trends, etc. I get none of this with Python unless again I spend personal time to do it.
3) I made some small projects with Python. But a moment came when I realized it doesn't matter; I'll never even come close to how good I am at Java and JS. It was when I read some random post somewhere about how the requests library is so much better than urllib/urllib2. And indeed it is. But if I had not seen that random post, how long would I have stayed ignorant? For someone who uses Python at work, such knowledge spreads quickly. Also, just staying up to date on what's happening in the Python world can help gain such knowledge. I was already spending time keeping up with the Java and JS worlds; I just didn't have time to keep up with Python too.
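To illustrate the gap the commenter means, here's a sketch (the endpoint URL is made up and the request is never actually sent): posting JSON with urllib takes several manual steps that requests collapses into one call.

```python
# With urllib (stdlib), you assemble the request by hand:
import json
import urllib.request

payload = json.dumps({"query": "python"}).encode("utf-8")
req = urllib.request.Request(
    "https://api.example.com/search",        # hypothetical endpoint
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)
# resp = urllib.request.urlopen(req)         # not executed here

# With requests (third-party), the same call is roughly one line:
# resp = requests.post("https://api.example.com/search",
#                      json={"query": "python"})
```

It's exactly the kind of ergonomic difference you only discover through the community around a language.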
I really liked Python a lot. It's a clean and practical language. But node.js fits much better with my skills and although I like the Python language better than JS, I'm a lot better at JS and will continue to be so since I use it at work.
I guess the TLDR of this is that yes you can try to learn and become good at a language and its community of libraries and frameworks in your personal time. But it's so much more efficient to just become better at the languages and frameworks you use at work and then use that experience to become good at new technologies for that area.
Perhaps the final breaking point that led me to abandon Python in my personal time was when I realized I could've spent all that time building something on the side. I could've used my existing Java and JS skills to make an app or website or anything. And I could've done it quickly, and with the confidence that comes from truly mastering development tools. Instead I wasted it gaining a useless amateur understanding of Python.
I'd say that your time would be much better spent learning language concepts and learning to implement them yourself in some sane environment, like OMeta, for example. This way, you could 1) do the incremental changes you want (instead of throwing away one language wholesale and learning another from scratch), 2) recycle the execution environment you have to use (e.g., at work, instead of being forced to switch), 3) broaden your horizons in a fundamental way (learn about type systems, implementation techniques etc.).
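As a tiny taste of what "implement language concepts yourself" can look like (a toy sketch in Python, far simpler than anything OMeta does): a tree-walking evaluator for arithmetic expressions with variables.

```python
# A minimal interpreter: expressions are plain tuples like
# ("*", ("+", "x", 2), 3), numbers evaluate to themselves, and
# strings are variable lookups in an environment dict.
import operator

OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def evaluate(node, env):
    if isinstance(node, (int, float)):           # literal
        return node
    if isinstance(node, str):                    # variable reference
        return env[node]
    op, left, right = node                       # compound expression
    return OPS[op](evaluate(left, env), evaluate(right, env))

# (x + 2) * 3 with x = 4
print(evaluate(("*", ("+", "x", 2), 3), {"x": 4}))  # -> 18
```

Adding a new concept (say, a `let` form, or lazy evaluation) is then a small incremental change to `evaluate`, which is exactly the kind of experimentation being suggested.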
"But it's so much more efficient to just become better at the languages and frameworks you use at work and then use that experience to become good at new technologies for that area."