Ask HN: As a Full stack developer how do you keep up with all the technologies
37 points by kiraken on Feb 7, 2015 | hide | past | favorite | 56 comments
I started learning web development 6 years ago when I was 13 or something. It started as something to do just for fun and ended up as a passion that pays for college. When I started learning I focused all my attention on HTML and CSS since they were relatively easy, and as time passed by I grew accustomed to them and didn't feel the desire to learn anything else. But once I started freelancing I had to learn a couple of things to keep myself in business. The thing is, with every new project I take I have to learn something new: WordPress development, Joomla development, Angular, responsive design, JSON... There are like a thousand things to learn, and each time I take 2 days to learn what I need to know and finish the project, but then after 2-3 weeks or after a new project I can't remember a thing about that technology! How do you guys do it?



"Never memorize what you can look up in books."

When you do it long enough, you get a really good sense of the structure of programs. Then it's just a matter of fitting the best technologies of the time into the 'architecture'.

The full stack is just data storage, data transfer, processing, user interface, system structure, tools, and processes. When you do it long enough, you get a good, general understanding of all of these pieces.

When I take on a new project, I do the following:

1. learn about all the 'current' technologies

2. write simple apps (full stack) in each of them

3. decide on which are best (sometimes I make mistakes, often I use 'old' stuff)

4. cram a bit

5. write a somewhat more complex app

6. learn the rest, as I go


This seems like a good way to go! Thanks for sharing


I don't. I understand the principles of the stuff I use and can draw upon past experience to analogize to new stuff. Like, despite what a lot of the Node community will tell you, a Reactor model on N threads (where their N = 1) is not new. I saw it in Java, with Netty, most of a decade ago and you see it today, albeit abstracted from you a little, in Play. At a gig last year, I was able to build out a plan of attack for scaling out a Ruby web server infrastructure pretty easily (despite not knowing Ruby or Rails) when I learned that EventMachine was yet another similar system. At some point, the principles stick, and the stuff on the surface just changes. And that--whatever. You can pick that stuff up and set it aside whenever.
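
To make that concrete: the reactor shape boils down to something like the sketch below (minimal Python with the standard selectors module, not any of the frameworks named above; treat it as an illustration of the pattern, not production code). One thread, one loop, callbacks fired as sockets become ready.

    import selectors
    import socket

    sel = selectors.DefaultSelector()

    def accept(server_sock):
        # New connection: register it for read events with its own callback.
        conn, _addr = server_sock.accept()
        conn.setblocking(False)
        sel.register(conn, selectors.EVENT_READ, echo)

    def echo(conn):
        # Ready to read: echo the data back, or clean up on disconnect.
        data = conn.recv(1024)
        if data:
            conn.sendall(data)
        else:
            sel.unregister(conn)
            conn.close()

    server = socket.socket()
    server.bind(("localhost", 9000))
    server.listen()
    server.setblocking(False)
    sel.register(server, selectors.EVENT_READ, accept)

    # The reactor: a single thread multiplexing every connection.
    while True:
        for key, _events in sel.select():
            key.data(key.fileobj)

Netty, EventMachine, and Node all dress this loop up differently, but it's the same loop.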

Wordpress? Just another CMS.

Joomla? Just another CMS.

JSON? Just another interchange format.

That said, I'm convinced at this point that, barring the very occasional unicorn, measuring yourself against other people who claim to be "full stack developers" is really just a way to make yourself feel bad. "Full stack developers" are stupefyingly rare, despite every startup you see on here trying to say otherwise. It's the new "rockstar" or "ninja". I used to call myself one, but I was a liar. My "stack" skills include "debugging MySQL with gdb and strace" (this has happened, and it was terrible), but doesn't include "JavaScript flavor of the week" because I just have no interest in it. Other folks are more than happy to keep up on the cutting edge of JavaScript, but their conception of a "stack" ends at "well, I tried to configure nginx and it didn't work" (or even higher than that, at "Rails did X but I don't feel comfortable diving in and figuring out why").

Just do what you can, and don't beat yourself up over what you can't.


Thanks man. Building on previous knowledge seems like good advice. I was mostly closed up in my work, and by nature I don't like joining forums and looking up what other devs do, so when I started freelancing regularly I found it necessary to keep up with the world. I guess I wanted to be one of the good developers when I started learning, and somewhere along the way I started overestimating them, thinking they could do anything and learn everything, while they were just regular people struggling like I was to keep up.


No problem. =) Everybody fakes it. Eventually you make it.


What I do is try to be in the Early Majority group instead of being an Early Adopter [0]. There are "a thousand things to learn", but how many of them will still be popular in 2 years? You can start learning them after 1 or 2 more years and still be in the early majority group, saving yourself the time spent learning a technology that is overhyped and dies in 6 months.

[0] http://upload.wikimedia.org/wikipedia/en/4/45/DiffusionOfInn...


This actually makes sense, but trust me, clients always want, and are sometimes set on, a specific thing just because a friend of theirs used it and was very pleased with the results.


If you don't actually believe the tech will be useful for you outside of that one job, the solution is simple: charge them for the time you spend getting up to speed.


Most people don't keep up. They're on the bleeding edge in some areas and way behind in others.


I used to feel bad that there were so many areas I was clueless in. Then [about a year or two ago] John Carmack tweeted something along the lines of how he knew no SQL and nearly nothing about databases (it's just not something he ever had to deal with), and I realized that no matter how good or experienced you are, there will always be some areas you know little or nothing about.


Definitely--everyone has their blind spots. Having the best native code developer I know, a guy who's worked at nVidia and Apple and who has forgotten more about GPUs than I will ever know, ask me how to make a web app was an eye-opener.


Have you considered specializing? Pick a set of a few tools you enjoy working with and only work with them. Sure you need to keep your eyes on the horizon for better tools, as you wouldn't want to be a "CGI specialist" these days.

What I'm getting at is that MOST of the time, even if a particular job lists off some particular tech, the ultimate customer is generally interested in a good working solution and doesn't care that much about the tech used.


I did specialize for some time. I learned how to recreate some of the JavaScript animations and web elements in pure CSS or Sass, but no one is interested in CSS solutions; they just want simple jQuery scripts, because it's the new big thing. So I decided to adapt to new trends, since freelancing provides most of my income and college tuition.


The vast majority of things in webdev land work similarly to something else. The trick is to learn the underlying pattern and then just remember the parts that make a specific implementation of the idea different. The exception tends to be platform extension (the wordpress/joomla stuff you mention), where every platform has its own patterns and thought process, which I just write off as project overhead.


I suppose you can still focus on a few technologies when looking for new projects and freelancing. At the beginning it will be quite hard, since you won't like the idea of letting a project go just because it requires some new framework you don't know, but it's a price to be paid in order to gain more expertise and experience with your skillset (proper financial planning might help in this transition). If you keep learning and relearning things for each 2-week project you find, you'll end up without much depth in any of the technologies you have used. The question then is: which technologies/frameworks/languages should you focus on?


But most developers I know are proficient in multiple frameworks and languages; that's why I'm trying to be at least good at them. I suppose the only way to do so is to create a repo on GitHub for each new thing I learn and make sure to put a new project in each repo every week.


I'm not saying you can only know a couple of frameworks, but it's hard to be an expert in Python/Django, Ruby/Rails, Node.js and Java. You can do the same for frontend stuff. For example: I think it's a better option to be a Django and Angular.js expert, good at Node.js and Backbone.js, and kind of ignore projects related to Java, Rails, Ember, React, etc. You can pick anything you want; it's just an example. You can still work on side projects with the trendy stuff and migrate depending on your interests and the market, but you shouldn't transition into new things every month. Just my 2 cents, best of luck. :)


Are they proficient, or are they good at faking it until they become proficient?

I have a lot of tricks in my toolbox to seem ready and productive, while I'm scrambling behind the scenes to actually become ready and productive. (Difference is, I'm willing to admit it, because I am confident in the value I create.)


I'm curious to know how you admit it. I recently completed a website redesign for a relative in Brazil. He wanted to use the existing MySQL db and some PHP functions, and expand on both. I had some experience with HTML, very little with CSS and JavaScript, and had never built a website using a text editor alone for the code. I'd just completed the HTML/CSS, JavaScript, and jQuery courses on Codecademy and was excited to apply what I'd learned, but I made it very clear that I wasn't an expert at frontend code or design, had no PHP or database experience, and would need to learn them to complete the project.

He showed me various websites and features he liked, and I recreated them with HTML, CSS, and jQuery designing the new site to his satisfaction in about 5 days. Then, I powered through a couple tutorials on PHP and MySQL for a couple days. Most of the old PHP that he wanted to salvage was filled with deprecated code, so I rewrote it. The data he wanted to store and use for the new site required entirely new db tables. So the new site didn't reuse any of the old code. It took me a month to complete the site and get it online. A few days later, the owner of a quite-a-bit larger company that partners with my relative's company told him that he really liked the new site, and asked for my info. My relative told me that I was far too modest about my ability before I took on his project; that I wasn't an amateur at all.

I know that I was, and still consider myself, very much an amateur, but I know I can learn new technologies pretty quickly (and enjoy doing it). I take pride in doing a job well, and really believe I'll provide a lot of value to the projects I work on. I just can't sell myself without being completely honest about areas where I know I have lots left to learn. How can I be completely honest without seeming incompetent next to a 'professional' (and great salesman) who only knows how to plug content into a CMS template? How do you express your confidence in the value you create when you're aware you will have to learn a lot to create that value?


"I don't know how to do that. But I know how to do X and Y, and can be productive using that by Z date. Is that an acceptable starting time for your project?"

That said, I don't consult or contract anymore, I work FT instead. As a FT employee, seeking a fit requires a greater investment of time on both sides, and a slight misalignment can prove valuable to both parties.


You jump around jobs and see where it gets you. Eventually, you realize how to pick and choose your tech.

I got really burned out on the front end, with Ember.js, Angular.js, and now React. I'm just going to wait this out until ES6 comes around and see where that goes.

I think in general I've moved toward the back end, where all the big data and data science is going to be. And now I'm going back to school to become a data science practitioner.

The trick is to have enough experience and an understanding of the overall concepts. I've done a variety of MVC in different languages and was thrust into doing frontend and all that jazz. Those experiences help you solve problems more easily and hone your skill set in finding solutions.

I've seen people talking about how the full stack developer is a unicorn. I'm full stack: I'm experienced in many technologies, master of none but a very small subset. I am constantly surprised at how I am way more qualified than some of those frontend programmers or architects out there.

The deal is to just continue gaining a variety of experience, and you'll hone your problem-solving skills. You might also be underselling yourself. There are very few real engineers out there and tons of pretenders. I've worked with two "front end" developers and an "acting CTO" who specialized in architecture. They were anything but the positions they claimed to be, but they sure do have purrty LinkedIn profiles with tons of people writing good things about them.


> I've seen people talking about how the full stack developer is a unicorn. I'm full stack: I'm experienced in many technologies

Can you debug your runtime (Ruby, Python, the JVM, whatever) in gdb? I can, and I still don't consider myself "full-stack"--both because I leave JavaScript to JavaScript people, but also because if you asked me to debug a kernel I'd look at you and just shrug.

The stack goes down past your application.


Full stack to me means "Given a design and a brief, can turn it into a reality with no external help".

So for me that's HTML, CSS (Less/Sass), JS (React, Angular, jQ), on the frontend and then a choice between Node, PHP, Ruby, or Python plus DBMS on the backend depending on the requirements / my mood.

The only bits that really extend beyond that are interfacing with external libraries such as ffmpeg, imagemagick etc.

I've not seen any instances of full-stack implying heavy dev-ops experience, for example. And that's in 12 years of pissing my life away being a "full-stack" developer.

Heck, right now I'm employed as a "front-end web developer" and spent last week writing an API-only Node app and configuring TC/Azure deployments and NginX.

Anyway, got a bit off track here... my original point was that "full stack" is pretty well defined and has been for a long time.


If you think running gdb when something breaks is "devops", we may simply not have compatible vocabularies. I consider it a fairly foundational part of making a product that works.


I wasn't saying gdb was devops, I meant stuff to do with administering server clusters for high traffic sites or otherwise-complex setups.

The line generally gets drawn at "Does it run on one server?", in my experience.

I suspect that will change if Docker pans out like it wants to, but perhaps that's my naivety showing through.


True, but then really no one could be considered a "full stack" developer. Personally, I consider "full stack" to be just a blanket statement to mean people who have no problems wearing different hats and can figure things out to get the job done. Unless they've been pigeon-holed their entire careers due to enterprise specialization or are just that closed-minded to learning new things, I think most good developers end up becoming "full stack" after enough time.


I think "full-stack" means, when translated out of marketspeak, "you won't get any support or help from anyone else, so you'd better be self-sufficient at all things."

But I am a cynic.


Full stack development doesn't mean that you work on all the things in the world, just all the things in your stack. You would work on the HTML/Javascript/CSS, the middle layer (PHP, Python, Scala or other), and the database. If you find a way to competently work on all those things, BOOM, you're a full stack developer.

The secret to doing full stack development well is figuring out how to keep the number of things you need to know at any given time low. A product that is using Ember, React, Angular, PHP, Python, Scala, Postgres, Redis and MongoDB is going to be hell on the developers. The developers (and the product) would likely benefit from some serious triage work. Get rid of half your middle languages and half your databases, because they are redundant and doing so will make your life easier. Great engineering depends very much on deciding what you can get rid of.

How this relates to freelance work, I couldn't tell ya. By hopping from project to project it seems like you're going to be exposed to a mind numbing number of technologies. Sounds like fun actually, but it's a different animal from full-stack development. Most of the successful freelancers I've met focus on a particular technology. E.g. Mongo contracts out experts in their database, some people specialize in search technology, others do HBase. Nobody does all of those things at the same time.


Agree with a lot of the opinions already stated in this thread, but didn't see this one so I will add it:

I find it's pretty important to know what kind of thing it is (what it does, the 1000ft view, the basics of the area it's attacking) and then the differences between it and the (generally guaranteed to exist) alternatives and other products that do the exact same thing.

Long Example:

Databases - You should know what a database is. How have people done databases in the last 50 years? 40? 30? 20? 10? Now? (While that seems like a lot of research, it's generally not - as much as some things change, a lot has not changed for a long time in CS.) What are the big theories that propel most of the solutions? (For databases you might find stuff like mmap, b-trees, indexing, hashing.)

Then, there is the question of what is the difference between Postgres & MySQL? MSSQL & Oracle? RDBMS & NoSQL? RethinkDB/Mongo & Neo4j?

I generally find that knowing stuff in those frames is more than enough. Personally, this approach works, even if I am embarrassed when I have to look up things like "how to list all the tables in MySQL" on Google; I'm starting to think that's not such a big deal. Yes, I don't know every Oracle/Microsoft/MySQL/Postgres command, but I think it would be more ridiculous to take the time to try hard to memorize stuff like that and then completely miss out on the benefits of NoSQL databases, for example.
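
For what it's worth, the "list all the tables" example is a nice illustration of knowing the frame rather than the command: every relational database exposes some catalog you can query, and only the spelling differs (SHOW TABLES in MySQL, \dt or information_schema.tables in Postgres). A tiny Python sketch against SQLite's catalog, just to show the shape:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, body TEXT)")

    # SQLite keeps its catalog in sqlite_master; MySQL has SHOW TABLES and
    # Postgres has information_schema.tables. Same idea, different spelling.
    rows = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
    ).fetchall()
    print([name for (name,) in rows])  # ['posts', 'users']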

Also -- do tons of side projects.


At the end of the day, a lot of the details end up on the mental cutting room floor; there's no getting out of it. If you can't specialize, get used to a bit of Swiss cheese brain, hopefully retaining the 10,000 foot overview so you can pick the technology back up reasonably fast next time.

The only things that have really helped:

Key takeaways go into a Deck in Anki (spaced repetition).

Lots of notes and/or screenshots in Google Docs for easy searches. (still haven't embraced Evernote)

Bookmarks in Firefox with tags.

Teach someone what you just learned.

One frustration in particular is the time spent wading through minutiae instead of creating something with impact. But sometimes that's part of what we get paid for, navigating / remedying the pain points. Anyone can (eventually) slog through most development technologies, but adding understanding and context and utility to it, that's where the challenge lies.

Heinlein said something to the effect of "specialization is for insects," but increasingly the bulk of the world seems to be leaning that way. There's nothing wrong with specialization, but choose wisely. Check out Google Trends on a few technologies over the past 10 years and see their rise and fall.


Turn your news feed into push mode instead of pull mode :)

HN is the only site I visit for tech news. The rest of the news I get through newsletters. I follow about 20ish, which I sift through each week to stay updated.

JavaScript Weekly, HTML5 Weekly, Node Weekly, etc. have me covered and leave me a lot less stressed about staying updated. Basically, just Google "[Topic] weekly" or "[Topic] newsletter" and subscribe to a few. If they don't deliver, just unsubscribe.


You misunderstood my question. What I meant is: how do you learn most of the new technologies and not forget them after a while?


Oh, I'm sorry. My bad. I don't really get what you mean - have you just copy-pasted or actually used stuff by reading docs?

Firstly, you can't possibly learn everything. I once was in a spot where I tried everything, desperately trying to learn. This was probably the worst part in my learning curve - not being able to tell what's important and what's not.

So, I'd recommend you to pick something and stick with it for a while. This way, you give yourself time to do some focused learning and keep up with specific best practices. Last year I focused on Angular and Node. This year it's React and RxJS.

Tl;dr: choose an area and focus on it for a while until it sticks. Put the other techs aside until you've learned it.


Learn and understand the concepts; you can Google the specifics (syntax, APIs, etc.). Specialize in the stack that casts the widest net given present and future trends (cloud, SaaS, PaaS, mobile, web apps, IoT). For example, if you are already using some front-end MVC framework, keep going and become a pro; most likely it will help you solve 99% of requests.


I don't, for the most part. The trick that I've found is to understand the underlying theory and don't sweat the individual syntaxes or design patterns. The new platforms of today have a lot in common with the platforms of yesterday. The state of the art languages of today are combining theory and technology which has existed for decades. Tomorrow's Joomla will be doing things in a way very similar to yesterday's Wordpress.

For example, if you understand event driven workflows, you'll be able to pick up react/flux, desktop UI drivers, and video game addons. If you understand continuations (and the various ways to abstract away continuations), you can grok most of Javascript (and by extension, Node.js). If you understand pointers, you'll be able to follow C (though I'd also recommend learning some Assembly as well to really understand C). Pattern matching will get you Haskell and Erlang. And if you can use Google well, everything else is a search away.
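
A tiny illustration of the continuation point, in Python rather than JavaScript (the names here are made up for the example): the same computation written in direct style and in continuation-passing style, which is essentially what Node's callback convention is, with promises and async/await as ways of hiding it.

    # Direct style: the result comes back to the caller.
    def add(a, b):
        return a + b

    print(add(2, 3))  # 5

    # Continuation-passing style: instead of returning, pass the result to a
    # callback, i.e. "what to do next".
    def add_cps(a, b, k):
        k(a + b)

    def then_square(k):
        # Builds a continuation that squares its input, then hands off to k.
        return lambda x: k(x * x)

    add_cps(2, 3, then_square(print))  # 25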

Now it is important to note that theory will never make you an immediate expert at every language and platform. However, it does let you bootstrap yourself into a language or framework quickly enough that outsiders will hardly be able to tell the difference.

I've found that once I understand the flow of a program, and ideally the theory driving that flow as well, the syntax required to express the changes you want to make to the flow is merely a few Google searches away.

Another thing to note: if you bounce around like this, you will never get true mastery over a particular language or framework. This is OK for more than 80% of the work you do, but you will end up picking up more information about one language over another - or one framework over another - simply to complete your work. Having this depth of knowledge is not even remotely a bad thing, so long as you keep sight of the bigger picture: don't let the quirks of one language or framework constrain your thinking for all languages.


> Another thing to note, if you bounce around like this, you will never get true mastery over a particular language or framework.

I think this is a bit unfair. I find myself rapidly getting to the point of expertise where other people start asking me questions, just because I didn't have to spend time going from "dumb newbie" to "intermediate"--but being able to start at "intermediate" because of the built-up knowledge of everything in my back pocket. Part of it's also that I recognize that technical mastery is mostly just being able to Google faster and better than the next guy, but still.

We're short a letter in English for what I really mean, but there are I-shaped people and T-shaped people--what about U-shaped people? Building on multiple sources of knowledge to join them at the top? =)


That's a fair assessment (and it mirrors my experience as well), but in the context of my comment, I consider mastery to be a level of knowledge and experience exponentially above the expertise which you refer to.

Mastery of a language, to me, is an innate understanding not only of the language, but of its behavior under most circumstances. Mastery is being able to look at a core dump, identify the cause as GCC exploiting undefined behavior to optimize code, and know which keywords or compiler arguments will fix it.

Mastery of a theory is not only understanding its benefits and complexity, but its shortcomings and knowing how to compensate for them. Mastery of sorting algorithms is knowing when a bubble sort is more performant than a binary sort, and knowing how to determine when to use one over the other.
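
To make that sorting example concrete, here's a small sketch (Python, comparing an adaptive bubble sort against a plain non-adaptive quadratic sort as my stand-in for the comparison): bubble sort with an early exit finishes in a single pass over already-sorted input, while the non-adaptive sort grinds through every comparison regardless. Recognizing when your input looks like that is the kind of judgment I mean.

    import timeit

    def bubble_sort(a):
        # Adaptive: stops after a pass with no swaps, so it's O(n) on sorted input.
        a = list(a)
        n = len(a)
        while True:
            swapped = False
            for i in range(n - 1):
                if a[i] > a[i + 1]:
                    a[i], a[i + 1] = a[i + 1], a[i]
                    swapped = True
            if not swapped:
                return a
            n -= 1

    def selection_sort(a):
        # Not adaptive: always ~n^2/2 comparisons, sorted input or not.
        a = list(a)
        for i in range(len(a)):
            j = min(range(i, len(a)), key=a.__getitem__)
            a[i], a[j] = a[j], a[i]
        return a

    already_sorted = list(range(2000))
    print(timeit.timeit(lambda: bubble_sort(already_sorted), number=20))
    print(timeit.timeit(lambda: selection_sort(already_sorted), number=20))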

Mastery is the act of pushing up against the edge of all human understanding of a particular topic.


Most people do end up specializing, but since you feel that you can't at the moment, the best approach is to understand the underlying concepts. One thing I've found helpful is to keep example applications and work you've done locally on your computer. I keep mine organized by project type, i.e. wordpress, joomla, angular. It's helpful to start building generic utility libraries that are usable across projects. The key point is to be able to reuse techniques, code patterns and hopefully code across multiple projects. I've also seen friends blog what they've learned so that they can quickly relearn what they need.


It is okay to forget framework-specific syntax/methods. The most important thing is that you get to understand its underlying concepts.

Re-learning a framework/tool is usually pretty easy.


Interesting point, it is indeed easier to relearn something


Don't learn everything. Pick something that makes you happy/money, and be very good at it, instead of trying to learn everything and learning it poorly. Heck, people who refused to learn anything but COBOL make more money than many of us (http://www.glassdoor.com/Salaries/cobol-programmer-salary-SR...)


If you don't want to learn new things don't take jobs that require something you don't know. Take jobs that require the skills you already have.


Note that this strategy wears down. If you started this strategy 8 years ago, you'd be making websites with cgi, perl, and asp and desktop apps with Visual Basic and Access.

If you're going to be a developer for more than a handful of years, you're going to need to be prepared to retrain yourself on new things every few years.


Very good point. In my short career so far as a developer (4 years as a hobby, 6 years professionally currently) I've found learning to be a constant part of the job. If you don't enjoy learning at all, it's going to be difficult to be a professional developer. The best case scenario is taking a job where the tech is unlikely to change too much and learning can be minimised (e.g. Java dev at a large finance company).


I don't know how JS developers deal with the explosion of frameworks. The other day I was reading about React and someone was mentioning om, morearty, and omniscient, and I had no idea. I was frustrated with reading HN at that point. Lots of learning and forgetting? What a waste. This is where one has to rely on some authority to pick the winners. I dunno. Ignore almost everything and just pick a framework?


That could work for someone with a day job where they use that framework, but a freelancer like myself needs to be flexible and keep up with at least most of the technologies... which is getting to be too much to handle.


You're still young, so I'm guessing you don't like saying no, perhaps because you need the money. But the longer you're doing this, the easier it will get to say, "well, actually, no, I only do X" and in fact get paid more than a generalist, and be in higher demand than a generalist, because people like paying for experts. That's not an answer to your immediate predicament, but it does pay to think where you want to be a couple of years from now and how to start moving in that direction.


The Thoughtworks radar [1] has done a pretty good job in the past of predicting which technologies would catch on and which ones wouldn't.

[1] http://www.thoughtworks.com/radar/


I think being a full stack developer makes sense if you're an employee, working with the same platform/framework for at least a year. If you have to constantly switch from NHibernate to Entity Framework, Angular to ExtJS, ASP.NET MVC to WebForms, C# to PHP, it'll be hard to retain knowledge and skill.


Don't worry about not remembering things. Your brain is very good at keeping track of the things you actually use frequently; it happens automatically.


After you've had to learn to use SPICE, MATLAB, VHDL, all the weird stuff of physics (quantum mechanics, statistical physics, relativity...), English and German... new computer technologies are laughably neither complex nor disruptive.

Most of the new techs are frauds, like Angular. As in science, a new tech should be something that simplifies the world with a simplified view: a map.

A map should always be simpler than the stuff you want to describe. Angular, for instance, is kind of the string theory of JS development: the formalism is complex, it creates lock-in for incompetent devs, but it brings nothing new.

But still, I can do Angular fairly well, because I like to rant, and it is more fun to rub your disdain in people's faces when you outsmart them.

Computer languages, when you've had to learn thousands of words and complex grammars to be understood in foreign languages, are freaking easy; Python is around 50 core words, Perl 300...

Most of what computer science is proud of is a scam. No critical thinking and no building up of intuition makes people look like headless chickens yelling «new tech, new tech» and bumping into each other.

Me, on the other hand, I can guess where the flow of circuitry will bring the data in memory, whether there will be locality, whether the L1 and L2 caches will be used. I can feel the instability of the sequencer with a page fault or OOP missing.

Why?

Because a computer is still a dumb automaton, and that is part of my training: I learnt how to build microchips.

But electronics also teaches you how to face complexity.

You don't see the world bottom up or top down. You are like a go player. You embrace the world's complexity by building up two pictures of complexity, one for the bottom, the other for the top, and you try to make them converge in the middle.

High-level abstractions like CSS and files (yes, a file with open/read/write/seek/tell/close/ioctl is an abstraction; it does not exist at the silicon level) require you to be careful. But still, they are never new. They are built up on other layers of abstraction (geometry, for CSS).

Some abstractions are insanely more complex than the world they describe (CSS). Happily for me, I learnt Tk/Tcl, which gave me an understanding of packing geometry (the so-called box model).

Angular is shit, but I learnt XMLHttpRequest a long time ago, so I know how to circumvent this horror.

I can quote a lot of examples where focusing on the basics, on the core of knowledge (what the GIL does, how to bind to a library with Python, how the PCI bus works), finally makes you outsmart the competition of devs throwing themselves at new technology in the hope that they can cut corners.

There is no laziness possible. Computer programming requires a lot of knowledge, so rather focus on learning the basics: what an OS is (especially POSIX); network programming; algorithms; CPU architecture; a tinge of ASM; bus specifications; memory allocation; physics (only physicists understand why distributed systems MUST NOT use timestamps or any global clock, and why «acausal» events might occur); math: linear algebra, probability and especially geometry (yep, even for CSS); measure theory; signal processing and the theory of error detection.

When you have the basics, no framework or new technology is either intimidating or out of reach.

Most of the so-called new techs are frauds.

If a map is not simpler than, and as accurate as, the world it tries to describe, then throw it away. That's how I deal with new techs: by discarding most of them.


Even though the OP talks about learning new technology, his meaning is actually very different from yours. You're talking about understanding things. The OP is talking about getting up to speed with something and using it productively.

Learning a programming language is easy. Learning how to use it productively includes everything from architecture to testing to style to knowledge of all of the popular libraries and so forth... understanding any of this is not the problem, using it productively and keeping up to date is. And knowing CPU architecture and physics isn't going to help you with that.


I think Julie1 has a point.

As soon as I started watching Computer Science courses that articulately explained the core concepts of a computer, I could start looking at languages I was unfamiliar with and, at a base level, understand what they were trying to do.

Elon Musk says it best...

"“I think it’s important to reason from first principles rather than by analogy. The normal way we conduct our lives is we reason by analogy. [With analogy] we are doing this because it’s like something else that was done, or it is like what other people are doing. [With first principles] you boil things down to the most fundamental truths…and then reason up from there.”


I get the idea, but I'm afraid to say that it's completely unrelated; you can't compare learning a spoken language to learning a development or framework syntax, because sometimes it's easier and sometimes harder. I speak and write in French, English and German, so I'm talking from experience. And nothing is simple or easy, because there is always a new layer of depth behind the language that you could get into, and the further you go in, the bigger the possibilities get. And my question was never how to learn, but how not to forget.


That is the point.

You should assess critically why you can't remember.

It means either you lack knowledge in underlying abstractions OR the map is wrong.

So, you have two orthogonal things to do:

- reinforce your memory by echoing other concepts (analogy); or - don't bother remembering inconsistent technologies.

I love the map idea: when learning a new technology try to see it as a map.

Blindly follow the analogy. If it is easy to remember and it works, it is good technology.

Otherwise, it may mean the technology is shitty or you lack other intermediate abstractions.

Knowing whether it is you or the framework that sucks is a daily challenge, but I can give you a hint: most modern techs are neither modern nor technology; most of them are pure marketing hype around broken tools.


> As a Full stack developer how do you keep up with all the technologies

<rant>

Warning: Strong, controversial, personal opinions ahead. YMMV.

I'm not a "Full stack developer", believe that that is not a good goal, and, thus, don't want to try to be one.

Instead, I'm just trying to make money with, yes, a career.

In a restaurant analogy, I want to be good as a chef and open and run a financially successful restaurant. Maybe I want to open a pizza shop, an Italian red sauce place, a high end northern Italian place with $100 Barolo wines, a French bistro, a French provincial place, a French nouvelle cuisine place, a French classic haute cuisine place with $1000+ bottles of Romanee Conti, an American steak house, e.g., 18 ounce dry aged Porterhouse steaks, a Memphis chopped pork shoulder and rib BBQ place, an American diner, an American fried chicken shack, an American hot dog and fries shack, an English fish and chips shop, an American NE seafood bar, a family Mexican place, or something from Korea, China, Viet Nam, Japan, ....

I want some one of those and certainly not all of them. I need to be a good chef for the one I want, not a full stack chef for all of them. Being a full stack chef for all of them is unnecessary and hopeless and, thus, foolish.

Then in computing, I want to know what I need to know for my business; learning everything about computing or just software on Windows, Linux, mobile, embedded, etc. is unnecessary and hopeless and, thus, foolish. No thanks.

So, in particular, I'm not trying to circulate my resume as a full stack developer, able to walk into just any old software for any old project, hit the ground running, and fully understand everything being used right away, like being able to walk into just any kitchen of any restaurant and cook anything on the menu right away, and to be a do-it-all employee of someone else who is the founder and CEO of the business and himself knows much less.

If such a CEO wants a full stack developer, then he is asking for too much, really for something next to impossible, as the question of the OP correctly implies.

Also, "full stack developer" comes close to being an insult to everyone who doesn't know everything. But since the goal of knowing everything about software is essentially impossible -- you won't find chaired professors of computer science trying to do that -- the insult is really on the CEO asking for the impossible.

Really, I've decided that I should be a founder, yes, very much a technical founder, so far a solo founder, and not an employee. Then as a founder, my real interest is in my business, not the full stack.

Actually, I want to minimize what I learn, not maximize it. Or, in the restaurant analogy, assuming I want to open an Italian red sauce place, I want to know well everything I need for that restaurant, and likely nothing about Memphis BBQ, Japanese sushi, Chinese dim sum, etc. And even in that Italian restaurant, I may be the back of the house guy and leave the front of the house to a partner or employee. And I will subcontract linen service, kitchen design and construction, dining room decoration, etc. Similarly for my business based on software.

Net: to heck with the full stack; instead, concentrate on the business and there learn what you need to know, maybe learn that quite well, but don't try to learn things you don't need, and certainly don't try to learn everything.

Really for my business, the crucial parts are (1) the problem I'm trying to solve, a problem faced by essentially every user of the Web in the world, (2) some crucial core, original math I derived to be the secret sauce for an especially powerful, valuable solution to the problem, (3) some initial data, (4) publicity, (5) server farm and network management and security, and (6) the corresponding software which, given (1) -- (5), is supposed to be just routine.

In more detail, I decided to build on Windows instead of Linux, so I am using Visual Basic .NET, .NET Framework 4.0, ASP.NET (Active Server Pages or some such, object library for the server software to build Web pages), ADO.NET (Active Data Objects, object library for using SQL Server), IIS (Internet Information Server, for the lower level parts of the Web site), SQL Server, TCP/IP, and system management and security. That's about it. Already the documentation I have is 5000+ Web pages and about a cubic foot of books -- not trivial.

E.g., for my Web site session state server, I set aside Redis and just wrote a little code using TCP/IP, class instance de/serialization, and two instances of a collection class -- easier than just understanding Redis. For communications between separate server side programs, I am using just some simple TCP/IP and class instance de/serialization -- it's simple and is working fine and was much less work to write than just reading Microsoft's documentation of some of their software to make such communications easy.
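
The idea is simple enough to sketch. In Python terms (not my actual VB.NET, and with a two-command wire format invented just for illustration), a minimal in-memory session state server over TCP with pickled values looks like this: a client opens a connection, sends one pickled ("get"/"set", key, value) tuple, and reads back one pickled reply.

    import pickle
    import socketserver

    SESSIONS = {}  # session_id -> value: the in-memory "state server" store

    class SessionHandler(socketserver.StreamRequestHandler):
        # One pickled (command, key, value) tuple per connection, one pickled reply back.
        def handle(self):
            command, key, value = pickle.load(self.rfile)
            if command == "set":
                SESSIONS[key] = value
                reply = True
            else:  # "get"
                reply = SESSIONS.get(key)
            pickle.dump(reply, self.wfile)

    if __name__ == "__main__":
        with socketserver.TCPServer(("localhost", 9001), SessionHandler) as server:
            server.serve_forever()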

So, no JavaScript, JSON, Java, Python, C Python, Iron Python, Ruby, Rails, C (actually I wrote some C but set it aside for some very high quality open source code that did much more), C++, Objective C, Haskell, Go, Lisp, integrated development environment (I just type into my favorite text editor, that has a good macro language), code repository, formal development and test environment, model, view, controller Web site framework, AJAX, etc. Why? So far, don't need them. So, when I really need some JavaScript, I will learn some and use it.

So, to heck with the full stack and, instead, on with my business.

If my business works, then I will have to hire, but I will never try to hire any full stack people. Instead I will hire for ability to (1) write clearly, especially software documentation, (2) communicate clearly with others, especially in a team, (3) read, understand, and apply technical material, (4) have good new ideas and make them real, (5) work hard, (6) be honest, (7) know at least the basics of computer usage. For the rest, I will train.

YMMV.

</rant>



