Hacker News
The Crazy World of Code (tilomitra.com)
150 points by hobonumber1 1959 days ago | 87 comments



I myself have reverted to doing "everything" myself... I don't use frameworks besides my own crap.

The downfalls... 1) stuff can take a LOT longer to develop, 2) sometimes I feel like I'm stuck implementing middleware when I should be developing features, 3) there is more room for bugs to sneak in, 4) scaling could be tougher when/if the need arises

The positives... 1) I for once feel like I actually know the full flow of execution, and this at times makes certain solutions much easier, 2) I don't feel that I'm on the treadmill of hopping from learning one framework to another, 3) it grows my skills, 4) I feel more like I'm chiseling something from one stone rather than tying together rocks

I'm not entirely sure I'd recommend what I'm doing - in some ways it's really stupid - but I personally get gratification from it and oddly feel more productive than when I'm fusing together multiple frameworks.


> ... in programming there is a wide-spread 1st order theory that one shouldn't build one's own tools, languages, and especially operating systems. This is true—an incredible amount of time and energy has gone down these ratholes. On the 2nd hand, if you can build your own tools, languages and operating systems, then you absolutely should because the leverage that can be obtained (and often the time not wasted in trying to fix other people's not quite right tools) can be incredible.

Alan Kay, "The Power Of The Context", p.8, https://docs.google.com/viewer?url=http://www.vpri.org/pdf/m...

EDIT: Google search (mostly) provides a "Quick View" link for PDFs - there are also addons for Chrome/FF that add a right-click context menu option to open a file in Google's viewer, useful for checking out lecture notes/slides, which often are a list of PDF links.


Thank you for posting a Google Docs view link!


A good middle ground is using a collection of well-written libraries that achieve the function you're looking for. Doing that instead of using a monolithic framework has some nice benefits:

1) The libraries are guaranteed to be decoupled.

2) The libraries are better maintained individually.

3) It's much easier (sometimes painless) to replace them if they don't suit your needs.

4) ... probably many more.

The hardest part of bootstrapping a new application is the boilerplate. Let someone else take care of managing session state, generating SQL queries, URL routing, DI containers, buffered logging, templating, etc. If you haven't found something that suits your needs, sure, write it yourself.
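And if nothing suits your needs and you do write a piece yourself, it can stay small. Here's a hypothetical sketch of the URL-routing case in Ruby (all names invented for illustration):

```ruby
# A minimal exact-match URL router -- the kind of boilerplate the
# comment above suggests you usually let a library handle.
class TinyRouter
  def initialize
    @routes = {}
  end

  # Register a handler block for an exact path.
  def get(path, &handler)
    @routes[path] = handler
  end

  # Dispatch: returns [status, body], falling back to a 404.
  def call(path)
    handler = @routes[path]
    handler ? [200, handler.call] : [404, "not found"]
  end
end

router = TinyRouter.new
router.get("/hello") { "hello, world" }
puts router.call("/hello").inspect   # [200, "hello, world"]
puts router.call("/nope").inspect    # [404, "not found"]
```

Real routers add pattern matching, HTTP verbs, and middleware hooks, which is exactly where the "let someone else take care of it" argument kicks in.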

I wouldn't say what you're doing is stupid. Do what makes you successful and happy. Just don't be stubborn ;)


I would agree with this.

One of the directions LedgerSMB (http://www.ledgersmb.org) is moving is from a framework originally written by one guy to something that outsources as much as possible to CPAN modules. This poses some issues, but in general it works pretty well. We probably have as much framework code as we used to, but the code does a lot more. We decided not to go with other frameworks because of the dependency issues that sometimes arise.

Also, I will personally vouch for #3. We started off using Config::Std for processing INI files. But it isn't well supported in distros or well maintained, so we switched to Config::IniFiles, and it was a pretty painless effort.
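A swap like that is usually only painless because all config access already goes through one narrow wrapper, so only the wrapper changes when the parsing module does. A rough sketch of that isolation in Ruby (LedgerSMB itself is Perl; these names and the hand-rolled parsing are invented for illustration):

```ruby
# All config access goes through this one class; swapping the
# underlying INI parser later means editing only this file.
class AppConfig
  def initialize(text)
    # Hand-rolled INI-ish parsing; a real app would delegate
    # to whichever parsing library is currently maintained.
    @data = {}
    section = nil
    text.each_line do |line|
      line = line.strip
      next if line.empty? || line.start_with?(";")
      if line =~ /\A\[(.+)\]\z/
        section = Regexp.last_match(1)
        @data[section] = {}
      elsif section && line.include?("=")
        key, value = line.split("=", 2).map(&:strip)
        @data[section][key] = value
      end
    end
  end

  def fetch(section, key)
    @data.fetch(section).fetch(key)
  end
end

cfg = AppConfig.new("[db]\nhost = localhost\nport = 5432\n")
puts cfg.fetch("db", "host")   # localhost
```

Callers only ever see `AppConfig#fetch`, so the parser behind it is replaceable without touching the rest of the codebase.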

The bigger thing, though, is that you don't have to do it all yourself. :-)


Out of curiosity, did you ever spend much time reading the code of the various frameworks on offer? Especially since you have the experience of building everything from scratch, I think you could have the same level of understanding of any frameworks you put together by browsing the code.

In Rails land at least, a lot of relatively inexperienced developers tend to have trouble when they can't google for the thing they want to implement or find it in the docs. I find that if I open up the source in roughly the area in which I need to implement something, a hook point is provided at about the right level of abstraction. Your mileage with other frameworks/environments may vary.


I jump into the Rails (or other Ruby libraries) source for understanding quite often. This is good advice, but for folks coming from static languages it's really a challenge to figure out how to find stuff. I've resorted to running it all under a debugger and setting breakpoints, just so I can find out which monkey-patch (or non-namespaced, duplicate class) is being called.
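Before reaching for breakpoints, plain Ruby can often answer the "which monkey-patch is this?" question directly: `Method#owner` names the module that defined the method that would actually run, and `Module#ancestors` shows the lookup order. A small sketch with invented names:

```ruby
# Two modules both define #report; which one wins depends on
# include order. Method#owner tells you without a breakpoint.
module AuditPatch
  def report; "audited"; end
end

module LegacyPatch
  def report; "legacy"; end
end

class Invoice
  include LegacyPatch
  include AuditPatch   # included last, so it comes first in lookup
end

inv = Invoice.new
puts inv.report                    # "audited"
puts inv.method(:report).owner     # AuditPatch
puts Invoice.ancestors.inspect     # full lookup order, winner first
```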


Two things in particular I use:

* Bundler comes with a `bundle open <gemname>` command that opens up the gem's source code in whatever your shell's `$EDITOR` is set to. I use this at least three or four times a day.

* Pry (an IRB alternative) will tell you the file + line of the definition of the method you'd be running if you ran `show-source some_object.some_method`. You can also `ls` and `cd` your way around ruby modules/objects which has been invaluable.
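Pry's `show-source`, as far as I know, builds on Ruby's own `Method#source_location` (via the method_source gem), which you can also call directly:

```ruby
# Method#source_location returns [file, line] for methods defined
# in Ruby source; it returns nil for C-implemented methods.
class Greeter
  def hello
    "hi"
  end
end

file, line = Greeter.instance_method(:hello).source_location
puts "Greeter#hello is defined at #{file}:#{line}"
```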


If you use Pry's `edit-method some_object.some_method`[1], you can actually open the method in an editor, and it'll even fast-forward the cursor to the first line of the method.

[1] https://github.com/pry/pry/wiki/Editor-integration#wiki-Edit...


Definitely check out Pry (http://pry.github.com). It's designed specifically for your use case... Just start Pry, load the gems you want, and then navigate to the method/class you want to know about using Pry's `ls` and `cd` commands. Once you find what you want, either use `show-source` to view the source directly in Pry, or use `edit-method` to open it in an editor.


Beyond a point, mastery over a particular framework comes only when you look at the actual code. Just fire up a debugger with a breakpoint set in the problem area. Figure out what happens and understand why your code is breaking. Rinse and repeat for ninja level in your framework of choice.


I do this as well -- everything from the JavaScript framework, to the ORM, to the web framework is built or assembled by me. I even use my own special databases built on top of Tokyo Tyrant and Redis. It's basically my world where I know every part of the system. This makes me super productive and I can change anything. It's also a great learning experience. That said, there is a huge problem with this approach, and it's scaling your development team... It's hard for people to figure out how stuff works in an environment where nothing is standard.

My plan to solve this is to open-source my tools/libraries and switch to Flask, so at least some of the stack is familiar. Open-sourcing these tools will force me to isolate them, clean them up, and write some documentation -- it can benefit other people and maybe open up a pool of developers who know the tools better.


"My plan to solve this is to open-source my tools/libraries"

Quick advice: Don't. You'll make it worse.

You're already undermining your thing in that first sentence ("switch to X so that other people will understand it"). Turning an in-house framework into a "ready for the real world" framework is a ton of work and will invariably mean dulling down the sharp edges and generalizing a bunch of things.

The thing is, those sharp edges are what make it such a good tool. You want sharp edges on your tools. Dull tools might not hurt anybody else but they won't be as good at doing what they do.

You've got something that works really well for you. Chances are it's so focused towards your workflow and way of thinking about the world that it just won't be all that useful to anybody but you. That's just fine.

Trying to make it useful for everybody else will only succeed in making it less useful for you.


Hey Jason. The general problem is hiring people when you use a lot of internal tools -- some of them were made in haste and are hackish, and most of them don't really have good documentation. But maybe you have a point, and the first step could be to clean the tools up and write some internal documentation for them.


> My plan to solve this is to open-source my tools/libraries and switch to Flask, so at least some of the stack is familiar. Open-sourcing these tools will force me to isolate them, clean them up and write some documentation

Pretty accurate… I like your idea of how to solve it… You can do it the other way around too: ask a coder you work with to spend at least some time on cleaning up and documenting the code… Sounds like a lot of extra effort, but they are already trying to understand and conceptualize the thing for themselves, so they might as well write it down…


A no-framework approach only works for a one-person team or a very small team, and by design it cannot scale.

As soon as you bring in somebody who will mainly "use" those tools/libraries instead of actively developing them, the tools/libraries become a de-facto framework to them, just a proprietary one with no community.

If/once you open-source it, it just becomes another framework competing with the other ones.

Basically: http://xkcd.com/927/

In short, if you are not rewriting everything new from scratch for every project, you are using a framework - only your own.


How consistent you are about the patterns you use must have an enormous impact on how easy it is to resume work on something from three projects ago.

If you're switching between frameworks all the time, it would be even more difficult because you wouldn't have deep knowledge of any of them, but at least there is presumably a group of people somewhere to consult. If you're lucky, they even had some plan behind their design decisions.

I can see how you avoid a lot of cruft, but how do you discipline your own projects? How much documentation do you create for your own later reference?


I just reference a piece of code where I know I've done something similar in the past, even for public frameworks like rails.


I can't speak for the grandparent but my approach has developed slowly into:

1) Consistent but evolving paradigms. In other words, let's not be static, let's keep most change happening relatively slowly. But let's review and try to fix problems as they happen, cleaning old code slowly and continuously.

2) Try to document everything to at least a basic extent.

3) Let the documentation practices evolve. Don't try to document too much too soon. Start out with a basic level. As the project matures you will have a better idea of what needs more documentation and you can spend your time on the areas that help.

4) Don't underestimate person to person mentoring.


It's a really good point, and I'm not going to try to cover up the pitfalls you pointed out. I'm trying to document the reusable core as much as I can because 1) I'm forgetful and at times will need a quick refresh, 2) I'm hoping that I can open-source it soon and it might have a chance of helping someone who likes it. I know that producing core code to bundle into a framework when I don't use frameworks is pretty hypocritical, but even so, that's an intent I have for the future.


You need to draw the line somewhere. Building your own operating system, programming language or database isn't the best idea. It is a tradeoff between time and trust.

I think it is wise to learn other frameworks and languages. It will help you with making wise decisions when developing your own tools.


I understand where the author is coming from, and it can be hard to figure out where to even get started on new projects. On the other hand, I am thankful for the obscene amount of great software coming out of the open-source community every day.

While things like paradox of choice and project abandonment are worth discussing, let's not forget that thousands of developers have found something that they think is lacking, fixed it, and taken the time to give it to us gratis so that we can use and learn from it. I, for one, am excited.


Excellent critique. I feel that HN often blows by rational ignorance with some of its advice (http://en.wikipedia.org/wiki/Rational_ignorance). Sometimes it's best to do it wrong and actually do something, rather than hopping around per someone else's dogma and never actually making anything.

But I think it's also possible that the HN frontpage simply doesn't give a good impression of the average hacker. The average hacker is not welding together the latest JS frameworks while tying it all together with Node.js and Hadoop. Perhaps more articles on "common man" hackers could be beneficial, or at least give some perspective.


Completely agree with this. HN is great for technical discussions, but at the end of the day, arguing over technical details doesn't solve any real problem for 99% of the users out there.


Love it. Sometimes I feel like we nerds get caught in the weeds and forget to solve real-world problems. We have enough JS frameworks by now... The funny thing is most of them are the same, give or take a few differences.

For JS specifically, it's starting to look like there is no "right" client-side model, at least not yet. Web apps behave so differently and have such different requirements that no one framework will fix it, unlike Rails, or game engines, which solve a large class of problems pretty well.

For side projects or "helper apps" (I make a lot of them), I still use Rails 2.3.8 and Prototype because I'm productive with them. Think of all the things you could accomplish if you didn't spend all that time fiddling with 0.0.1-version software.


I would somewhat agree with this. I recently wrote a completely client-side app using a variety of different micro-frameworks. I used Parse's JS API for my backend. At the end of the day, I felt that a traditional CRUD system on a LAMP stack would have probably been easier to implement and would have fit the client's requirements just as well.


Thanks for sharing. I feel like a lot of the time we forget what problem we are solving. The client rarely asks for new technology or a specific stack as much as they ask for web presence, some system integration, or an app to fill a specific need.


I'm in exactly the same situation - I'll try to finish it anyway, but in general I regret that I didn't go with Flask and instead went with a client-side + REST framework.

Even if you're doing a client side app, some views are easier to implement in the server-side "full-page reload" style, and you want to have the option to do that.


Also, it's very funny how sometimes I think I don't need the four CRUD operations for some object ("oh no, this one works differently"), but in the end I always notice I do need them...


While there's nothing wrong with Rails 2.3.8, there's also a lot you're missing out on. Maybe you don't start your next project with bleeding edge software, but it's definitely worth exploring to see what is out there.

> Web apps behave so differently and have different requirements that no one framework will fix it

If that's the case, why stick with one framework? Why not explore as many as you can and then pick the best one for each new project?


I'm not "missing out" on anything. 99% of the time, I don't need the new features. Rails 2.3.8 and Prototype do everything I need them to.

The best framework for each new project is one that has the features I need, and I'm productive with.

Unless I REALLY need the performance improvements, or the modular design of Rails 3.0, why even waste the time?

Exploring as many frameworks as I can is the exact problem I want to avoid. Generally I read enough and do research on the side because it's fun, so I know what's out there and what I'm missing out on. If I need to create a simple web app to do simple things, and I'm not scaling to a billion users, and I simply need software to help me automate or model something, Rails 2.3.8 works.

When you quoted me on different web app behavior, I simply meant the client side. Rich client-side JS driven apps need different things because the data behaves differently (especially if you are doing realtime stuff, versus not, etc). When it comes to the server, Rails 3 isn't providing anything I need for my tools. Now, are we porting to Rails 3 for our flagship product? Yes. And we are doing it because we want the performance enhancement and maintaining the gems we use might be difficult in the future.

Anything else though? Much more productive to type 'rails my_app' with my current environment and build something that works.


As frustrated as I am with what seems like an unnecessarily immature software industry, I am grateful for the plethora of wonderfully crafted free frameworks that I have at my disposal (http://www.jmolly.com/2012/07/01/the-wide-landscape-of-java....).

I want to solve problems that need solving. I don't want to reinvent wheels. As you grow in expertise, you get better at picking the right framework for the right job. If you're not there yet, just make your best pick and move forward. In the end you can make anything work, you can refactor, you can replace. Software is malleable.


> As frustrated as I am with what seems like an unnecessarily immature software industry,

Could you expand on this?


I didn't really phrase that right.

I often feel that the software industry should be much 'further along' than it is. It's 2012, and most software interactions are through kludgy, blocky web interfaces; it's not uncommon for systems with rather basic behavior to be backed by millions of lines of code.

I know this is rather a soft lament, but it doesn't feel right. It feels that with all the years we've been at this practice we should have progressed further than we have.


We spent the 90s and 00s rewriting the 80s again and again. The 10s look like they're going the same way. Same concepts, same functionality, different syntax only, no progress.

I'm having a lot of fun right now doing 68000 ASM for my personal projects. Going to try to figure out where we could have been if we didn't get stuck in a rut in the late 80s.


As someone with an interest in this subject, where did we go wrong? Plenty of people say *nix. But I'm not old enough to have witnessed any of that firsthand.


The issue is as much psychological and cultural as it is technological.

The 80s were a period of great diversity and hence progress. In that decade I used 6502 (BBC Micro), 68000 (ST), ARM (Archimedes), there were also Z80 and early x86s. There were many approaches to systems design, many competing ideas, what you might call a Cambrian Era (http://en.wikipedia.org/wiki/Cambrian_explosion).

By the 90s, things settled down. People (as in, "end users", the people who ultimately pay for software) had figured out what they wanted to do, which is forms (screens for entering data into a database) and reports (screens for getting it out again, nicely formatted). Messaging, as in email/IM, is just a special case of this (and today, FB et al are just forms and reports). What was "good enough" for this was an x86 (which happened to be running Windows, but that's irrelevant IMHO) and that became the lowest common denominator. The diversity was still there in the form of the Unix workstation market but they spent all their strength competing with one hand and trying to standardize with the other. Meanwhile the relentless march of hardware continued on, Intel was swimming in cash and none of the Unix vendors noticed, they were all preoccupied. Now there is one architecture to rule them all (tho' ARM still clings on).

And we the developers are to blame. We fell into a comfort zone. The users wanted forms and reports, and that's what we gave them. Then the hardware gave us more power and we gave them bells and whistles too. But really, that is solved, has been solved for 2 decades now. Nearly everything we do with computers now, could have been done in 1990, and what they did then, we do the same now, except arguing over syntax, which of a dozen functionally equivalent "frameworks" to use, which ALGOL-derived language to use, which desktop to use with slightly different-looking widgets that do the same thing, etc etc etc. All this is just procrastination. None of these things matter. But comfort zones are, well, comfortable. It's like we climbed a small hill and stopped to enjoy the view because we are afraid of the mountain we have to climb still.

There are a few tantalizing glimmers now, of the "next level". But we wasted 20 years dicking about.


This has become a serious problem for me in my search for a better job.

Interviews are nothing short of awkward.

Interviewer: So what do you like to do?

Me: I'm a Python/Ruby guy. I also have worked in Java/C. Currently playing around with node.js, machine learning algorithms, some functional programming and pet project with my Arduino Duemilanove.

Interviewer: Sounds good, you'll hear from us in the next few days.

:(


Sounds like you're not applying to the right companies. Any developer that comes to me with an obvious love for code moves to the top of the list. If you're coding in your own time for fun, it's not a job, and I'd want to hire you - in some capacity :)


There's a catch-22 for me here. I love writing programs, and write my own programs for fun. But as soon as I'm hired to write somebody else's programs, it stops being fun.

Part of it is the language. Technical interviewers love hearing about how I wrote a cool program in Lisp, for example, but then never seem to want to hire me to actually write Lisp code. (I didn't learn Lisp just to show off.)

Part of it is the project. Virtually no software project I see these days looks interesting enough that I'd want to work on it. (Most software I don't even want to use, even if it's free.)

There may well be other parts. But thinking back, I had one job where I picked the language and the project, and did have quite a bit of fun there, even though it was by far my lowest salary.

I would love to be paid for programs I want to write anyway, the way I want to write them. Still working out how to do that. :-)


Funny, I feel almost exactly the opposite. If someone gives me a fairly concrete problem to solve, I love writing a program to solve it. Of course there are times when I get bogged down in tedious minutiae where I'd rather be programming some other bit, but for the most part I find it pretty motivating to be solving a concrete problem for someone.


The growing trend that I've seen nowadays is companies requiring extensive knowledge of "this" framework. It doesn't help that every company requires a different framework.

Interviews are a drag, and none of them like hearing that it will take one month to become fully accustomed to (get in-depth knowledge of) their framework. They'd prefer someone with limited knowledge who knows their framework over someone who's well versed in multiple frameworks/languages.


It's a pretty stupid trend too. Let's face it, the best programmers will learn "this" framework quickly, get up to speed etc. Not only that but the programmers you really want to hire are going to be those who love learning, thinking, and critiquing their own code, not specialists in Framework X NG.

It seems to me that a better approach would be to put new hires into code maintenance first, with a mentor who can help them get up to speed...

But most of us know that HR departments are not always the best judges of talent.


This is exactly what we do where I work - a lot of our programmers don't know Ruby when they walk in the door, but that shouldn't matter if you choose the right people. I'm scared of choosing language or framework 'specialists' because they have a limited view of what programming is, and a cemented view of what it should be. The only exceptions I can see are when the specialization is specific to the problem domain, e.g. C for high-performance server or system code.

Programmers who advertise themselves as "Ruby programmers" or "Python hackers" or what-have-you come off as inexperienced and one-dimensional. The best hackers have used the right tool for the task at hand, and the best hackers have solved a wide variety of problems that require different tools.


This is exactly my problem.

Finally I have committed what might sound like the ultimate sin and given in to the Java/Spring framework. I'm learning various aspects of Spring, should have a small project up on github and plan on "specializing" in this framework for good.

All my other fun work is going to be done on my time alone. I look at it as the 'get serious, time to separate work and play.'


To strike another balance: I'm on my guard when people say "I'm doing Node, and machine learning, and this, and that."

The reason is there seems to be a large number of programmers who "want to be good programmers" and do this. All it really results in is a shallow understanding of many things, with an accompanying attitude that they're smart, well cultured, or have seen a lot of things.

Dabbling in many different things kills your learning curve because of all the context switching, and you never get awesome at any one thing. Putting those technologies or frameworks on your resume and blog like Foursquare badges doesn't mean anything. What have you done? Then and only then do I care about what you know.


You're right and I am addressing that as I mentioned in this: http://news.ycombinator.com/item?id=4227914


If you start your day by reading your favorite software development news aggregation sites, full of news about interesting libraries and frameworks and how different people have succeeded with them, you can get a distorted sense of the importance of technology choices.

Your project is not the framework you use, nor does it matter which language you chose. It is a non-issue whether Facebook or Google uses the same stuff you do. Your project's value is not a combination of buzzwords.

Choose technology based on what gets the job done, and all the better if it makes the project fun to do. Don't be paralyzed by the choice, focus on solving the problems of your project instead.

Off to write a Clojure compiler in Haskell which can run natively on iOS and translate itself to HTML5 for live development with SocketIO backing the Closure optimized Javascript using node.js for the concurrency model.

PS. The last bit is a joke


No doubt you've heard of this? https://github.com/jspahrsummers/cocoa-clojure


Just make little things. Backbone is fun: make some little Backbone views/models; they'll work perfectly with any other JS you have. Want to try Rails? Do another little project in Rails. Is it too much? Try the next one but only use Sinatra. Like JavaScript? Do your next project in Node.js (and remember jifasnif!). Felt like you were in callback hell? (Remember: use flow control and streams!) Try out gevent in Python on the next project.

See my point? It doesn't matter what you use as long as you're making things. Pick something and do something. Iteration can only come after you've completed projects. Eventually you'll figure out what you prefer, and even then it won't hold a candle to what you've made.


I respect the sentiment, but for me, this is literally the worst advice I could get, and I suspect it is for other people too.

Your whole comment is centered on the technology you use, not what you're doing with it. Also, you're talking about each of those pieces of tech as if it's easy to do something in it. But see, when you switch technology with every project, even if you learn plenty of things, you:

1. Prevent yourself from becoming really fluent in any one piece of technology.

2. Force yourself to learn a myriad of details that are really useless to what you ultimately want to do.

Ultimately you can burn yourself out learning 1000 things, and all you have to show for it at the end is a collection of unfinished crap projects.

My advice for somebody who feels like this would be quite the opposite:

- Find what you want to do

- Think about it, in a technology agnostic way

- Pick the technology you're most familiar with at the moment you start building the project, even if it seems dull to you. If you're susceptible to the kind of syndrome outlined in the OP, it probably isn't :)

- Do it. Don't switch techs.


Completely agree with this. However, when I talk to developers who aren't based in tech hotspots, they often tell me they are lost because they don't know what they should learn, because there's so much out there. The side effect of this (which I have seen) is that people write off every new open-source project that comes out, because they feel that by the time they learn it, something new will be out there. I don't agree with this philosophy, but it's something that I have seen with my own eyes.


One of my views is that you should figure out what you want to make before you figure out what you want to learn. Once you have a functional idea of what you are doing, then it is easier to pick something similar, or look to library repositories (CPAN, gems, etc) and decide what you want to do.

If you aren't in a tech hotspot and you aren't chasing jobs, it doesn't matter what you learn. It matters what you make and how you can sell yourself.


While I tend to do it like that, the recent post by Marco about using PHP made me doubt it. For me, starting a new project tends to be a little bit painful, because I use the opportunity to learn new stuff. But I never become a full-grown specialist in any of these tools. Maybe it would be better to become an expert at one tool, so that you know it by heart.


I agree that no single person should be jumping around between all of the languages and frameworks out there, but the diversity is a sign that there is an active community experimenting with new ways of doing things. If one of them is especially useful for your problem, then you should learn it as a matter of practicality (mind the cost of transition, though). If you enjoy language development, then you should definitely switch to a 0.0.1-version language or framework; that is where the interesting research is being done. If you come up with something good, and work out the kinks, then mainstream languages will start to use it.


This article is probably appropriate: http://www.joelonsoftware.com/articles/fog0000000339.html

And a quote from the same article:

>>Think of the history of data access strategies to come out of Microsoft. ODBC, RDO, DAO, ADO, OLEDB, now ADO.NET - All New! Are these technological imperatives? The result of an incompetent design group that needs to reinvent data access every goddamn year? (That's probably it, actually.) But the end result is just cover fire. The competition has no choice but to spend all their time porting and keeping up, time that they can't spend writing new features. Look closely at the software landscape. The companies that do well are the ones who rely least on big companies and don't have to spend all their cycles catching up and reimplementing and fixing bugs that crop up only on Windows XP. The companies who stumble are the ones who spend too much time reading tea leaves to figure out the future direction of Microsoft. People get worried about .NET and decide to rewrite their whole architecture for .NET because they think they have to. Microsoft is shooting at you, and it's just cover fire so that they can move forward and you can't, because this is how the game is played, Bubby.<<

Except that in this case the wounds are self inflicted.


Well, this kind of insanity made me step back 10 steps and invest some time (almost a year by now) into the "classics" so to speak.

Inspired by Crockford's talks, in which he mentioned a couple of times the lack of history and historical knowledge among developers, I suddenly remembered that I got an education in the humanities before I even went into programming and really did start with the classics - Greeks and Romans and Philosophy - and I asked myself why on earth I never considered doing something similar in computing. (Someone, in some article I've sadly forgotten, called it the Oxford way - you learn Latin and Math, and from there you can learn anything anyway ;)

So I decided to ignore all fashions, new things, upcoming frameworks and such for some time until I got what I would consider (totally subjectively) a solid foundation (not there yet, will probably take another year at least) of knowledge.

I personally decided to define "classics" along the lines of "knowing Unix and its history and concepts well" (re-learning shell and command-line wizardry on top), "understanding decent C and Assembler", and "becoming familiar with the influential languages of important concepts like functional and OO programming (Scheme/Lisp and Smalltalk)". This includes really understanding SQL (which I suddenly started to really like, to my own surprise). Maybe I'll add some PostScript and TeX along the way. I also found a new appreciation for Perl's text processing capabilities and its influence from the 1990s on.

Leaving aside that I constantly have to fight kind of a "bad conscience" exactly BECAUSE I'm not hurrying along to try out the latest and greatest new fashion, I'm starting to feel a deep change in my programming skills and in my thinking about design, and I'm constantly marveling at where computing has already been and how much there is to learn from the classics. I also started to get a distinct feeling of "Man, I just don't NEED all this clutter and stuff" and a newfound appreciation of "simplicity" (watch Rich Hickey's talk "Simple Made Easy"..).

I started to slow down, to think more carefully, to read a lot more on concepts and ideas. I recently finished Christopher Alexander's "A Pattern Language", for example, to get a better feel for the "original" idea, read up on the history of "lean production" and Toyota's influence, and looked again at Dieter Rams' design principles (I'm German and I basically grew up with Braun appliances; I didn't even realize how influential his designs have been to me..)

All this changed me and my thinking about programming deeply and made me find kind of a "central theme" or "essence" in programming and design I like and I'm starting to strive for.

In the long run, this also gives me a foundation of how I'm judging new tools, ideas, frameworks and programming languages.

On top of that, I'm better able to place myself into a certain "style" or "culture" of programming that I'm not going to give up on as long as I can afford it.

Anyways - I personally think there actually is a choice whether one tries to catch up with everything new or peek into new things selectively or just steps back a little and watches how all this will unfold in the long run.

(I also have a long-held, un-proven personal theory that people simply like _writing_ frameworks a lot more than _using_ them - some are just faster to release theirs into the public.. :)


With relational databases, I found that when I actually took a look at logic and set theory, I started to really understand what Codd wanted to achieve.

Only... less Latin, more Math :-)
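To make that concrete, here's a minimal sketch (plain Python, with made-up relation names) of how Codd's relational operators really are just set theory - relations as sets of tuples, queries as set comprehensions:

```python
# A relation is a set of tuples; Codd's operators are set operations.
employees = {("alice", "eng"), ("bob", "sales"), ("carol", "eng")}

# Selection (sigma): SELECT * FROM employees WHERE dept = 'eng'
engineers = {row for row in employees if row[1] == "eng"}

# Projection (pi): SELECT name FROM employees
names = {row[0] for row in employees}

# Union comes for free from set theory: employees UNION former
former = {("dave", "sales")}
everyone = employees | former

print(sorted(names))  # → ['alice', 'bob', 'carol']
```

Duplicate elimination, which SQL only approximates with DISTINCT, falls out automatically because sets can't contain the same tuple twice.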


We need to rewind computing to 1990 and start again from there.


The dream of the 90s is alive in Portland: http://www.youtube.com/watch?v=TZt-pOc3moc


Just use what you enjoy.

I recently had a similar experience - on a whim I decided I would learn Python by taking an ASP.NET MVC app I built and have maintained for the last 3 years and porting it. So I just needed to choose a Python web framework, right? omg. So many: micro, mini, full stack, no stack, template engines, WSGI, uWSGI, whoa! My head was spinning.
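For what it's worth, the WSGI interface that all of those frameworks sit on is tiny - a sketch of a hand-rolled app, no framework, standard library only:

```python
def app(environ, start_response):
    # environ is a plain dict of CGI-style variables; the app announces a
    # status line and headers, then returns an iterable of bytes.
    # That's the whole contract every framework above is wrapping.
    body = f"Hello from {environ['PATH_INFO']}".encode()
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

# Any WSGI server can now host `app`, e.g. with the stdlib:
#   wsgiref.simple_server.make_server("127.0.0.1", 8000, app).serve_forever()
```

Bottle and Flask apps are themselves WSGI callables underneath, which is why you can swap servers (wsgiref, uWSGI, gunicorn) without touching application code.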

You can be completely overwhelmed just trying to make a decision. I haven't even looked at storage yet, but I'll probably just use SQLite (the old .NET one used db4o, which ended up being a huge mistake - great tool, but a pain to do maintenance on).

On a side note, I'm loving Python, and I settled on bottle, but I wrote my own template framework called canvas, inspired by the Seaside component/tag/canvas classes.. essentially it's all just Python code, no HTML.. example usage is https://gist.github.com/3087622 (I'll push it to GitHub once I'm happy with it, hah!)
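For readers who haven't followed the gist: a context-manager-based HTML builder along those lines might look roughly like this. This is a hypothetical sketch, not the actual canvas API - the class and method names are invented:

```python
from contextlib import contextmanager

class HtmlBuilder:
    """Build HTML as nested Python `with` blocks instead of templates."""

    def __init__(self):
        self.parts = []

    @contextmanager
    def tag(self, name, **attrs):
        # Opening tag on entry, closing tag on exit - nesting in the
        # Python source mirrors nesting in the generated markup.
        attr_str = "".join(f' {k}="{v}"' for k, v in attrs.items())
        self.parts.append(f"<{name}{attr_str}>")
        yield self
        self.parts.append(f"</{name}>")

    def text(self, s):
        self.parts.append(s)

    def render(self):
        return "".join(self.parts)

html = HtmlBuilder()
with html.tag("div", id="main"):
    with html.tag("p"):
        html.text("hello")
print(html.render())  # → <div id="main"><p>hello</p></div>
```

The nice property (and presumably the appeal of the Seaside style) is that mismatched tags become impossible: the `with` block closes the element for you.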


That looks pretty neat: the use of context managers seems very nice.

But is "canvas" (and "HtmlCanvas") meant to refer to the HTML5 <canvas> element? If not, I can see that being a point of huge confusion.


no it isn't, and you raise a good point, I'll change it

canvas.canvas() would definitely look weird


I've been writing Python code since 2003 and I still have that problem of there being way too many frameworks. I still remember the old days when you had to write CGI manually.

I for one am glad for the existence of frameworks, but this can lead to pretty schizophrenic programming. For example, one of my latest projects has 3 front-facing ends. I somehow made a wrong judgment call and ended up with Flask, Bottle and Web.py, one on each end (the Web.py end was culled just today).


> Just use what you enjoy.

One has to use what the customer or project requires.

If it is something that we can enjoy, great; if not, it is a job and the customer should always be a happy customer.


I think the customers in quite a lot of markets don't care about what language you use to implement your software--they only care about what it does.

The same is true for projects: there are relatively few projects that force you to use some particular language.

As people like to endlessly repeat, you should choose the best tool for the job. And, unless there are strong reasons against it, the best tool is one you like.

On a more cynical note: the job market is great right now. If you don't like the technology you're using, it's a prime time to look at new opportunities. Apply to some cool startups doing cool things with cool technologies--at the very least, it'll be exciting :P.


> The same is true for projects: there are relatively few projects that force you to use some particular language.

In the corporate consulting world where I work, the technology is always part of the RFP sent by the customer to the consulting companies.


Quite a few years ago, I decided I would simply not invest my learning efforts into MS tech due to their acronym/framework spinning. It was a waste of my time. Of course, this limits me in some ways, but frees me to study more deeply into other areas.

It also happens with other tech areas. So I simply take the road around them unless it looks like I need to use one.

My personal work is done with Common Lisp, and I would prefer to keep it that way, by and large. It's a sufficiently extensible language that I can feel comfortable that (1) needed features will be programmed in, using CL, and (2) it will be stable for a long time.

But I guess that's a bit eccentric of me. :-)


I think this post is missing the point a little. Being a good programmer, despite what recruitment ads like to portray, is not about being proficient in dozens of frameworks and languages. No, it's about learning the skill of learning new frameworks and languages.

Getting into a new platform or familiarizing with a new tool is something you get better at the more you do it. Learning your eleventh programming language will be easier than learning the third. Getting familiar with yet another framework is easier if you know a few already and can compare the similarities and differences.

And let's not forget about the fundamentals. Having a strong background in web plumbing - understanding HTTP, HTML, etc. well - is the key to being a good web programmer, not which frameworks you know by heart. Knowing a little theory about programming will make it easier to grok new languages faster.
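As one concrete example of that plumbing: the request/response cycle every web framework wraps can be exercised end to end with nothing but the Python standard library - a sketch, not production code:

```python
import threading
from http.client import HTTPConnection
from http.server import BaseHTTPRequestHandler, HTTPServer

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        # The raw mechanics a framework hides: status line, headers, body.
        body = b"hello"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the demo quiet

# Bind to port 0 so the OS picks a free port, serve in a background thread
server = HTTPServer(("127.0.0.1", 0), Hello)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/")
response = conn.getresponse()
status, payload = response.status, response.read()
server.shutdown()
print(status, payload)  # → 200 b'hello'
```

Once you've written the status line and Content-Length by hand a few times, what Rails, Django, or Bottle are actually doing for you stops being magic.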

If you're tired of constantly learning new frameworks and languages, go learn C and systems programming. That stuff is going to stick around for a long time and will provide secure jobs for the foreseeable future.


Pretty awesome distillation of the history of software tools since, oh, no later than 1980.

Back in the dino days, the rate of change was tiny compared to today's daily delta V. We mostly had monthly and a few weekly rags to tell us about the next big thing; there weren't that many next big things, anyway. Hey, it was a really big deal when cartridge tapes came out and we didn't have to thread a 1/2 inch magtape by hand anymore... Geez, who could forget 10BASE-T networking? or 9600 baud modems? Whew.

Don't get me started on software innovations like Oracle SQL*Forms! ManOhMan! Fergit CICS!

Seriously - the barrier to entry is so low now that the least of us can throw something up on the wall, and if it sticks or even just leaves a little residue, the flies are all over it, preaching it up: "This is the best shit ever."

There really isn't much new out there, mostly it's just lots of new flavor wheels. Think "Hudsucker Proxy" at times like this...


oh, JavaScript you say? I stopped at jQuery! Everything else sort of flutters by... too much! Especially since .NET is going insane right now with kick-ass magic! So I'm still trying to catch up on .NET stuff and the JavaScript is sort of passing in another lane, but I'm like the old lady who won't take my eyes off of my lane because I'm too scared to let go of the wheel that I'm barely hanging on to.

Exhausting as it may seem, it's far cooler than 10 years ago! As another poster pointed out, use what you like.

ASP.NET MVC - check. LINQ - check. Entity Framework - check.

next step: SignalR!

next next step: Metro! :)


I've used .NET from the start, and since 3.5/4.0 it's just ramped up. Feels like every day there is a new acronym to master!

ASP.NET MVC 4.0 projects now include knockout.js by default. So that may be around for a while if only to support .NET devs who are just using out-of-the-box technologies.


This is a phenomenon limited to the frothy world of webdev. See http://prog21.dadgum.com/140.html


Could not agree more.

At the end of the day, what is important is to deliver a piece of software that works according to the customer requirements.

If the customer dictates the operating system, language, etc. Then it is already decided what to use.

If some technology liberty is given, then it should be something that is working as desired at the given deadline, regardless of the technology.


In marketing, there's something called 'the paradox of choice.' The theory goes (and there's some decent research to back this up) that if you give people too much choice, it decreases the probability that they will make any decision.


Actually, there's a theory very similar to that in UI/UX design. Menus that are too large often intimidate the user and cause them to bounce out of the site.


And here I thought I was creative for applying it to UX...thanks for this! :)


This is general psychology and applies just as well to product design as it does to marketing. Too much choice can be a bad thing both because it clutters up the flow of your product and because humans just aren't very good at picking their optimal set of choices. You wouldn't release a Twitter clone to the public that required you to check 100 yes/no feature boxes just to set up an account.


This explains the popularity of the iPhone and the iPad right?


Not necessarily. I don't know if it applies to products. I feel that for products, since you get to hold them in your hands, it's a much more subjective decision. With a website, you rely only on your sense of sight to make decisions. With a product, you can use all 5 senses (maybe not taste, haha!). Does that make sense?


Didn't one of the original studies involve choices of jam at a supermarket? Seems like it applies equally well to products.

Compare and contrast the number of iPhone or iPad models with the rest of their respective industries. They're delineated along good/better/best and the tradeoffs are obvious. It's a lot easier to shop for an iPad than it is an arbitrary laptop.


It would apply in the way that there is only one iPad or iPod branded device from one company, which has marketed it as the only thing in its category. While on the other hand there is a plethora of various Android based tablets with many differing features, hence leading to confusion and analysis paralysis.


Reading your comment, I immediately had two thoughts:

1. Apple 2. Communism


Getting blown from one framework to another so easily means you're not doing a good job of getting traction on your project. Build some positive momentum and then staying the course will become natural.


It's an abstraction all the way down. Somewhat related: http://xkcd.com/676/


The article had all the buzzwords (in a positive sense) except Python.


Just use mod_perl. Duh!



