The downfalls... 1) stuff can take a LOT longer to develop, 2) sometimes I feel like I'm stuck implementing middleware when I should be developing features, 3) there is more room for bugs to sneak in, 4) scaling could be tougher when/if the need arises
The positives... 1) for once I feel like I actually know the full flow of execution, and this at times makes certain solutions much easier, 2) I don't feel that I'm on the treadmill of hopping from learning one framework to another, 3) it grows my skills, 4) I feel more like I'm chiseling something from one stone rather than tying together rocks
I'm not entirely sure I'd recommend what I'm doing - in some ways it's really stupid, but I personally get gratification from it and oddly feel more productive than when I'm fusing together multiple frameworks.
Alan Kay, "The Power Of The Context", p.8, https://docs.google.com/viewer?url=http://www.vpri.org/pdf/m...
EDIT: Google search (mostly) provides a "Quick View" link for PDFs - there are also add-ons for Chrome/FF that add a right-click context-menu option to open a file in Google's viewer, useful for checking out lecture notes/slides, which are often a list of PDF links.
1) The libraries are guaranteed to be decoupled.
2) The libraries are better maintained individually.
3) It's much easier (sometimes painless) to replace them if they don't suit your needs.
4) ... probably many more.
The hardest part of bootstrapping a new application is the boilerplate. Let someone else take care of managing session state, generating SQL queries, URL routing, DI containers, buffered logging, templating, etc. If you haven't found something that suits your needs, sure, write it yourself.
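Even the smallest item on that list balloons quickly when you hand-roll it. A minimal sketch of a do-it-yourself URL router (purely illustrative names, not any real framework's API) shows how much machinery hides behind "just routing":

```python
# A toy hand-rolled URL router of the kind the comment suggests delegating
# to a library. Real routers also handle methods, converters, reverse
# lookup, trailing slashes, etc. - which is the point.
import re

routes = []

def route(pattern):
    """Register a handler for paths matching the given regex."""
    def register(fn):
        routes.append((re.compile(f"^{pattern}$"), fn))
        return fn
    return register

@route(r"/users/(\d+)")
def show_user(user_id):
    return f"user {user_id}"

def dispatch(path):
    """Find the first matching route and call its handler."""
    for pattern, fn in routes:
        m = pattern.match(path)
        if m:
            return fn(*m.groups())
    return "404"
```

And this still does none of the things a maintained library gives you for free.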
I wouldn't say what you're doing is stupid. Do what makes you successful and happy. Just don't be stubborn ;)
One of the directions LedgerSMB (http://www.ledgersmb.org) is going is from a framework which was originally written by one guy to something which outsources as much as possible to CPAN modules. This poses some issues, but in general it works pretty well. We probably have as much framework code as we used to but the code does a lot more. We decided not to go with other frameworks because of the dependency issues that sometimes arise.
Also, I will personally vouch for #3. We started off using Config::Std for processing INI files, but it isn't well supported in distros or well maintained, so we switched to Config::IniFiles and it was a pretty painless effort.
The bigger thing, though, is you don't have to do it all yourself. :-)
In Rails land at least, a lot of relatively inexperienced developers tend to have trouble when they can't google for the thing they want to implement or find it in the docs. I find that if I open up the source in roughly the area in which I need to implement something, a hook point is provided at about the right level of abstraction. Your mileage with other frameworks/environments may vary.
* Bundler comes with a `bundle open <gemname>` command that opens up the gem's source code in whatever your shell's $EDITOR is set to. I use this at least three or four times a day.
* Pry (an IRB alternative) will tell you the file + line of the definition of the method you'd be running if you ran `show-source some_object.some_method`. You can also `ls` and `cd` your way around ruby modules/objects which has been invaluable.
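That file-plus-line lookup isn't Ruby-specific, either. As a cross-language aside, Python exposes the same information on every function object (a minimal sketch; the attribute names below are standard CPython, nothing more):

```python
# Python analog of Pry's show-source lookup: a function's code object
# records the file and line where it was defined.
def example():
    return 42

filename = example.__code__.co_filename   # defining file
lineno = example.__code__.co_firstlineno  # line number of the `def`

print(filename, lineno)
```

Handy for the same "open the source near where I'm working" workflow described above.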
My plan to solve this is to open-source my tools/libraries and switch to Flask, so at least some of the stack is familiar. Open-sourcing these tools will force me to isolate them, clean them up and write some documentation - it can benefit other people and maybe open a pool of developers that know the tools better.
Quick advice: Don't. You'll make it worse.
You're already undermining your thing in that first sentence ("switch it to X so that other people will understand it"). Turning an in-house framework into a "ready for the real world" framework is a ton of work and will invariably mean dulling down the sharp edges and generalizing a bunch of things.
The thing is, those sharp edges are what make it such a good tool. You want sharp edges on your tools. Dull tools might not hurt anybody else but they won't be as good at doing what they do.
You've got something that works really well for you. Chances are it's so focused towards your workflow and way of thinking about the world that it just won't be all that useful to anybody but you. That's just fine.
Trying to make it useful for everybody else will only succeed in making it less useful for you.
Pretty accurate… I like your idea of how to solve it… You can do it the other way around too, have/ask a coder with whom you work spend at least some time on cleaning up and documenting the code… Sounds like a lot of extra effort, but they are already trying to understand and conceptualize the thing for themselves, might as well write it down…
As soon as you bring in somebody who will mainly "use" those tools/libraries instead of actively developing them, the tools/libraries become a de-facto framework to them, just a proprietary one with no community.
If/once you open-source it, it just becomes another framework competing with the other ones.
In short, if you are not rewriting everything new from scratch for every project, you are using a framework - only your own.
If you're switching between frameworks all the time, it would be even more difficult because you wouldn't have deep knowledge, but at least there is presumably a group of people somewhere to consult. If you're lucky they even had some plan when they came up with design decisions.
I can see how you avoid a lot of cruft, but how do you discipline your own projects? How much documentation do you create for your own later reference?
1) Consistent but evolving paradigms. In other words, let's not be static, let's keep most change happening relatively slowly. But let's review and try to fix problems as they happen, cleaning old code slowly and continuously.
2) Try to document everything to at least a basic extent.
3) Let the documentation practices evolve. Don't try to document too much too soon. Start out with a basic level. As the project matures you will have a better idea of what needs more documentation and you can spend your time on the areas that help.
4) Don't underestimate person to person mentoring.
I think it is wise to learn other frameworks and languages. It will help you with making wise decisions when developing your own tools.
While things like paradox of choice and project abandonment are worth discussing, let's not forget that thousands of developers have found something that they think is lacking, fixed it, and taken the time to give it to us gratis so that we can use and learn from it. I, for one, am excited.
But I think it's also possible that the HN frontpage simply doesn't give a good impression of the average hacker. The average hacker is not welding together the latest js frameworks while tying it all together with node.js and hadoop. Perhaps more articles on "common man" hackers could be beneficial, or at least give some perspective.
For JS specifically, it's starting to look like there is no "right" clientside model, at least yet. Web apps behave so differently and have different requirements that no one framework will fix it, sort of like rails, or game engines that solve a large class of problems pretty well.
For side projects or "helper apps" (I make a lot of them), I still use Rails 2.3.8 and Prototype because I'm productive with it. Think of all the things you could accomplish if you didn't spend all that time fiddling with 0.0.1-version software.
Even if you're doing a client side app, some views are easier to implement in the server-side "full-page reload" style, and you want to have the option to do that.
Web apps behave so differently and have different requirements that no one framework will fix it
If that's the case, why stick with one framework? Why not explore as many as you can and then pick the best one for each new project?
The best framework for each new project is one that has the features I need, and I'm productive with.
Unless I REALLY need the performance improvements or the modular design of Rails 3.0, why even waste the time?
Exploring as many frameworks as I can is the exact problem I want to avoid. Generally I read enough and do research on the side because it's fun, and I know what's out there and what I'm missing out on. If I need to create a simple web app to do simple things, and I'm not scaling to a billion users, and I simply need software to help me automate or model something, Rails 2.3.8 works.
When you quoted me on different web app behavior, I simply meant the client side. Rich client-side JS driven apps need different things because the data behaves differently (especially if you are doing realtime stuff, versus not, etc). When it comes to the server, Rails 3 isn't providing anything I need for my tools. Now, are we porting to Rails 3 for our flagship product? Yes. And we are doing it because we want the performance enhancement and maintaining the gems we use might be difficult in the future.
Anything else though? Much more productive to type 'rails my_app' with my current environment and build something that works.
I want to solve problems that need solving. I don't want to reinvent wheels. As you grow in expertise, you get better at picking the right framework for the right job. If you're not there yet, just make your best pick and move forward. In the end you can make anything work, you can refactor, you can replace. Software is malleable.
Could you expand on this?
I often feel that the software industry should be much 'further along' than it is. It's 2012 and most software interactions are through kludgy, blocky web interfaces; it's not uncommon for systems with rather basic behavior to be backed by millions of lines of code.
I know this is rather a soft lament, but it doesn't feel right. It feels that with all the years we've been at this practice we should have progressed further than we have.
I'm having a lot of fun right now doing 68000 ASM for my personal projects. Going to try to figure out where we could have been if we didn't get stuck in a rut in the late 80s.
The 80s were a period of great diversity and hence progress. In that decade I used 6502 (BBC Micro), 68000 (ST), ARM (Archimedes), there were also Z80 and early x86s. There were many approaches to systems design, many competing ideas, what you might call a Cambrian Era (http://en.wikipedia.org/wiki/Cambrian_explosion).
By the 90s, things settled down. People (as in, "end users", the people who ultimately pay for software) had figured out what they wanted to do, which is forms (screens for entering data into a database) and reports (screens for getting it out again, nicely formatted). Messaging, as in email/IM, is just a special case of this (and today, FB et al are just forms and reports). What was "good enough" for this was an x86 (which happened to be running Windows, but that's irrelevant IMHO) and that became the lowest common denominator. The diversity was still there in the form of the Unix workstation market but they spent all their strength competing with one hand and trying to standardize with the other. Meanwhile the relentless march of hardware continued on, Intel was swimming in cash and none of the Unix vendors noticed, they were all preoccupied. Now there is one architecture to rule them all (tho' ARM still clings on).
And we the developers are to blame. We fell into a comfort zone. The users wanted forms and reports, and that's what we gave them. Then the hardware gave us more power and we gave them bells and whistles too. But really, that is solved, has been solved for 2 decades now. Nearly everything we do with computers now, could have been done in 1990, and what they did then, we do the same now, except arguing over syntax, which of a dozen functionally equivalent "frameworks" to use, which ALGOL-derived language to use, which desktop to use with slightly different-looking widgets that do the same thing, etc etc etc. All this is just procrastination. None of these things matter. But comfort zones are, well, comfortable. It's like we climbed a small hill and stopped to enjoy the view because we are afraid of the mountain we have to climb still.
There are a few tantalizing glimmers now, of the "next level". But we wasted 20 years dicking about.
Interviews are nothing short of awkward.
Interviewer: So what do you like to do?
Me: I'm a Python/Ruby guy. I also have worked in Java/C. Currently playing around with node.js, machine learning algorithms, some functional programming and pet project with my Arduino Duemilanove.
Interviewer: Sounds good, you'll hear from us in the next few days.
Part of it is the language. Technical interviewers love hearing about how I wrote a cool program in Lisp, for example, but then never seem to want to hire me to actually write Lisp code. (I didn't learn Lisp just to show off.)
Part of it is the project. Virtually no software project I see these days looks interesting enough that I'd want to work on it. (Most software I don't even want to use, and even if it's free.)
There may well be other parts. But thinking back, I had one job where I picked the language and the project, and did have quite a bit of fun there, even though it was by far my lowest salary.
I would love to be paid for programs I want to write anyway, the way I want to write them. Still working out how to do that. :-)
Interviews are a drag, and none of them like hearing that it will take one month to become fully accustomed to (in-depth knowledge of) their framework. They'd prefer someone with limited knowledge who knows their framework rather than one who's well versed in multiple frameworks/languages.
It seems to me that a better approach would be to put new hires into code maintenance first with a mentor who can help them get up to speed.....
But most of us know that HR departments are not always the best judges of talent.
Programmers who advertise themselves as "Ruby programmers" or "Python hackers" or what-have-you come off as inexperienced and one-dimensional. The best hackers have used the right tool for the task at hand, and the best hackers have solved a wide enough variety of problems to require different tools.
Finally I have committed what might sound like the ultimate sin and given in to the Java/Spring framework. I'm learning various aspects of Spring, should have a small project up on github and plan on "specializing" in this framework for good.
All my other fun work is going to be done on my time alone. I look at it as the 'get serious, time to separate work and play.'
The reason is there seems to be a large number of programmers who "want to be good programmers" that do this. All it really results in is a shallow understanding of many things with an accompanying attitude that they're smart, well cultured, or have seen a lot of things.
Dabbling in many different things is killing your learning curve because of all the context switching and you don't ever get awesome at any one thing. Putting those technologies or frameworks on your resume and blog like Foursquare badges doesn't mean anything. What have you done? Then and only then do I care about what you know.
Your project is not the framework you use, nor does it matter which language you chose. It is a non-issue whether Facebook or Google uses the same stuff you do. Your project's value is not a combination of buzzwords.
Choose technology based on what gets the job done, and all the better if it makes the project fun to do. Don't be paralyzed by the choice, focus on solving the problems of your project instead.
PS. The last bit is a joke
See my point? It doesn't matter what you use as long as you're making things. Pick something and do something. Iteration can only come after you've completed projects. Eventually you'll figure out what you prefer, and even then it won't hold a candle to what you've made.
Your whole comment is centered on the technology you use, not what you're doing with it. Also, you're talking about each of those pieces of tech like it's easy to do something in it. But see, when you switch technology with every project, even if you learn plenty of things, you:
1. Prevent yourself from becoming really fluent in one piece of technology.
2. Force yourself to learn a myriad of details that are really useless to what you ultimately want to do.
Ultimately you can burn yourself out, learning 1000 things, and all you have to show for it at the end is a collection of unfinished crap projects.
My advice for somebody who feels like this would be quite the opposite:
- Find what you want to do
- Think about it, in a technology agnostic way
- Pick your technology: the one you're most familiar with at the moment you start the project, even if it seems dull to you. (If you're susceptible to the kind of syndrome outlined in the OP, it probably isn't dull, just familiar :)
- Do it. Don't switch techs.
If you aren't in a tech hotspot and you aren't chasing jobs, it doesn't matter what you learn. It matters what you make and how you can sell yourself.
And a quote from the same article:
>>Think of the history of data access strategies to come out of Microsoft. ODBC, RDO, DAO, ADO, OLEDB, now ADO.NET - All New! Are these technological imperatives? The result of an incompetent design group that needs to reinvent data access every goddamn year? (That's probably it, actually.) But the end result is just cover fire. The competition has no choice but to spend all their time porting and keeping up, time that they can't spend writing new features. Look closely at the software landscape. The companies that do well are the ones who rely least on big companies and don't have to spend all their cycles catching up and reimplementing and fixing bugs that crop up only on Windows XP. The companies who stumble are the ones who spend too much time reading tea leaves to figure out the future direction of Microsoft. People get worried about .NET and decide to rewrite their whole architecture for .NET because they think they have to. Microsoft is shooting at you, and it's just cover fire so that they can move forward and you can't, because this is how the game is played, Bubby.<<
Except that in this case the wounds are self inflicted.
Inspired by Crockford's talks, in which he mentioned a couple of times the lack of history and historical knowledge among developers, I suddenly remembered that I got an education in the humanities before I even went into programming, and really did start with the classics - Greeks and Romans and Philosophy - and I asked myself why on earth I never really considered doing something similar in computing. (Someone, in some article I've sadly forgotten, called it the Oxford way - you learn Latin and Math and from there you can learn anything anyway ;)
So I decided to ignore all fashions, new things, upcoming frameworks and such for some time until I got what I would consider (totally subjectively) a solid foundation (not there yet, will probably take another year at least) of knowledge.
I personally decided to define "classics" along the lines of "knowing Unix and its history and concepts well" (re-learning shell and command-line wizardry on top), "understanding decent C and Assembler", and "becoming familiar with the influential languages of important concepts like functional and OO programming (Scheme/Lisp and Smalltalk)". This includes really understanding SQL (which I suddenly started to really like, to my own surprise). Maybe I'll add some PostScript and TeX along the way. I also found a new appreciation for Perl's text-processing capabilities and its influence from the 1990s on.
Setting aside that I constantly have to fight kind of a "bad conscience" exactly BECAUSE I'm not hurrying along to try out the latest and greatest new fashion, I'm starting to feel a deep change in my programming skills and in my thinking about design, and I'm constantly marveling at where computing has already been and how much there is to learn from the classics. I also started to get a distinct feeling of "Man, I just don't NEED all this clutter and stuff" and a newfound appreciation of "simplicity" (watch Rich Hickey's talk on "simple versus easy").
I started to slow down, to think more carefully, to read a lot more on concepts and ideas. I recently finished Christopher Alexander's A Pattern Language, for example, to get a better feel for the "original" idea, read up on the history of "lean production" and Toyota's influence, and looked again at Dieter Rams' design principles (I'm German and I basically grew up with Braun appliances; I didn't even realize how influential his designs have been on me).
All this changed me and my thinking about programming deeply and made me find kind of a "central theme" or "essence" in programming and design I like and I'm starting to strive for.
In the long run, this also gives me a foundation of how I'm judging new tools, ideas, frameworks and programming languages.
On top of that, I'm better able to place myself into a certain "style" or "culture" of programming that I'm not going to give up on as long as I can afford it.
Anyways - I personally think there actually is a choice whether one tries to catch up with everything new or peek into new things selectively or just steps back a little and watches how all this will unfold in the long run.
(I also have a long-held, un-proven personal theory that people simply like _writing_ frameworks a lot more than _using_ them - some are just faster to release theirs into the public.. :)
Only... less Latin, more Math :-)
I recently had a similar experience - on a whim I decided I would learn Python by taking an ASP.NET MVC app I built and have maintained for the last 3 years and porting it. So I just needed to choose a Python web framework, right? omg. So many: micro, mini, full stack, no stack, template engines, WSGI, uWSGI, whoa! My head was spinning.
You can be completely overwhelmed just trying to make a decision. I haven't even looked at storage yet, but I'll probably just use SQLite (the old .NET one used db4o, which ended up being a huge mistake - great tool, but a pain to do maintenance on).
On a side note, I'm loving Python, and I settled on Bottle, but I wrote my own template framework called canvas, inspired by the Seaside component/tag/canvas classes.. essentially it's all just Python code, no HTML.. example usage is https://gist.github.com/3087622 (I'll push it to github once I'm happy with it, hah!)
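The gist has the real usage; as a toy illustration of the "all just Python code, no HTML" idea (these names are made up for this sketch, not the actual canvas classes), the core trick is just functions that build markup strings:

```python
# Toy sketch of a Seaside-style "no templates" approach: HTML is produced
# by composing plain Python function calls instead of template files.
def tag(name, *children, **attrs):
    """Render an element with optional attributes and child content."""
    attr_s = "".join(f' {k}="{v}"' for k, v in attrs.items())
    return f"<{name}{attr_s}>{''.join(children)}</{name}>"

html = tag("div",
           tag("h1", "Hello"),
           tag("p", "No templates."),
           id="main")
# html == '<div id="main"><h1>Hello</h1><p>No templates.</p></div>'
```

The appeal is that composition, loops, and refactoring all use ordinary Python rather than a separate template language.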
But is "canvas" (and "HtmlCanvas") meant to refer to the HTML5 <canvas> element? If not, I can see that being a point of huge confusion.
canvas.canvas() would definitely look weird
I for one am glad for the existence of frameworks, but this can lead to pretty schizophrenic programming. For example, one of my latest projects has 3 front-facing ends. I somehow made a wrong judgment call and ended up with Flask, Bottle and Web.py on each end (the web.py end has been culled just today).
One has to use what the customer or project requires.
If it is something that we can enjoy, great; if not, it is a job, and the customer should always be a happy customer.
The same is true for projects: there are relatively few projects that force you to use some particular language.
As people like to endlessly repeat, you should choose the best tool for the job. And, unless there are strong reasons against it, the best tool is one you like.
On a more cynical note: the job market is great right now. If you don't like the technology you're using, it's a prime time to look at new opportunities. Apply to some cool startups doing cool things with cool technologies--at the very least, it'll be exciting :P.
In the corporate consulting world where I work, the technology is always part of the RFP sent by the customer to the consulting companies.
It also happens with other tech areas. So I simply take the road around them unless it looks like I need to use one.
My personal work is done with Common Lisp, and I would prefer to keep it that way, by and large. It's a sufficiently extensible language that I can feel comfortable that (1) needed features will be programmed in, using CL, and (2) it will be stable for a long time.
But I guess that's a bit eccentric of me. :-)
Getting into a new platform or familiarizing with a new tool is something you get better at the more you do it. Learning your eleventh programming language will be easier than learning the third. Getting familiar with yet another framework is easier if you know a few already and can compare the similarities and differences.
And let's not forget about the fundamentals. Having a strong background in web plumbing, understanding http and html, etc well is the key to being a good web programmer, not which frameworks you know by heart. Knowing a little theory about programming will make it easier to grok new languages faster.
If you're tired of constantly learning new frameworks and languages, go learn C and systems programming. That stuff is going to stick around for a long time and will provide secure jobs for the foreseeable future.
Back in the dino days, the rate of change was tiny compared to today's daily delta-V. We mostly had monthly and a few weekly rags to tell us about the next big thing; there weren't that many next big things, anyway. Hey, it was a really big deal when cartridge tapes came out and we didn't have to thread a 1/2-inch magtape by hand anymore... Geez, who could forget 10BASE-T networking? Or 9600-baud modems? Whew.
Don't get me started on software innovations like Oracle SQL*Forms! ManOhMan! Fergit CICS!
Seriously - the barrier of entry is so low now, that the least of us can throw something up on the wall and if it sticks or even just leaves a little residue, the flies are all over it, preaching it up, "This is the best shit ever."
There really isn't much new out there, mostly it's just lots of new flavor wheels. Think "Hudsucker Proxy" at times like this...
Exhausting as it may seem, it's far cooler than 10 years ago! As another person pointed out, use what you like.
ASP.NET MVC - check
LINQ - check
EntityFramework - check
next step, SignalR!
next next step: Metro! :)
ASP.NET MVC 4.0 projects now include knockout.js by default. So that may be around for a while if only to support .NET devs who are just using out-of-the-box technologies.
At the end of the day, what is important is to deliver a piece of software that works according to the customer requirements.
If the customer dictates the operating system, language, etc. Then it is already decided what to use.
If some technology liberty is given, then it should be something that is working as desired at the given deadline, regardless of the technology.
Compare and contrast the number of iPhone or iPad models with the rest of their respective industries. They're delineated along good/better/best and the tradeoffs are obvious. It's a lot easier to shop for an iPad than it is an arbitrary laptop.