Ask HN: Were we more productive 10, 15, or 20 years ago?
98 points by dvanwag on Mar 11, 2018 | 97 comments
Just wondering how many other people are feeling overwhelmed by technological "tools" that are supposed to save us time and headache. Especially from those who are older, I am wondering if completing tasks was easier in the past than it is today.

I started professionally developing in '82. I'm 52 now. A major part of being an engineer is filtering out the noise generated by the marketing arms of our industry. As tech development grew, the number of available tools and platforms exploded, with an exponential growth in the marketing noise generated by all these tools trying to succeed.

Since web development "matured", the number of disposable tools and the hype machine surrounding them has convinced a good population of programmers (who seem to be mostly young) that they are "not cool" if they are using older tool chains.

I see so many 20 and 30 something developers caught up trying new frameworks, trying new languages, and experimenting with their dev setup WHILE ON A PAYING CLIENT'S JOB, basically just fucking over their productivity, and needlessly generating self-stress.

I am super-uber productive. I am lead developer of 3 libraries, and 2 of the 3 flagship products of the facial recognition company I work for. My background includes VFX production as a developer and digital artist for 9 major-release feature films, lead 3D game console developer for 15 years, and OS developer on the original PlayStation. Through ALL that, I still use the same "make" I used back in the '80s. I hand-write my makefiles; just give me a text editor and a compiler. That is ALL I need, that and to be left alone.

No slack/chat app bullshit, no trying out of tools, just doing the job with the tools I know very well. Delivering early, or over delivering on time, with 100% certainty of what I'm delivering because I know the libs and tool chain from years of experience with them.

I’d second that, for web devs too, a text editor is all you need.

After 20+ years, I still try out most new tools. Just in case one turns out to be something great.

In 20 years of webdev, I'd say ONLY GitHub/Git, and the cloud Linux VPS, have met the standard of Great Adds to the toolchain.

React w/ Babel is undoubtedly overhyped, and yet surely it's still a fairly significant value add over vanilla JS (when applicable) for how simple it can make state management and putting together UI.

Evergreen browsers, ES6, etc. also make development significantly smoother in my experience, and I didn't even experience much before IE9.

I've been developing Android apps for 6 years. The tools have gotten better since then, namely Kotlin, which lets us write the same thing with 30% fewer lines of code and less cognitive load.

However, I see that the "minimum" work has greatly increased. What was once 1 file for a screen now becomes 4-5 files and a folder.

Where we once had 30 lines and 5 minutes to parse a JSON file, we now use an external library and spend about 30 minutes creating a POJO for it. The new methodologies make API changes more costly too.
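For what it's worth, that tradeoff can be sketched in Python (the `Screen` class, field names, and payload are invented for illustration; the thread's actual context is Android Java/Kotlin POJOs):

```python
import json
from dataclasses import dataclass

payload = '{"id": 7, "title": "Home", "visible": true}'

# Quick-and-dirty: parse and index directly; fast to write, breaks silently
# if the API changes a key.
data = json.loads(payload)
title = data["title"]

# Model-object style: every field declared up front; an API change means
# editing the class, the parsing call site, and every caller.
@dataclass
class Screen:
    id: int
    title: str
    visible: bool

screen = Screen(**json.loads(payload))
print(title, screen.id)  # Home 7
```

The model object buys type safety at the cost of ceremony, which is roughly the "minimum work has greatly increased" complaint above.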

The added complexity also multiplies the damage from bugs, rushed schedules, team members dropping out, cost of documentation, and so on.

Most of these new tools are only useful if you want to keep on adding more engineers and still have something for them all to do. But for the most part, I feel that they just regulate productivity - keep everyone at roughly the same productivity, even if it means crippling someone from being 10x more productive.

Could you please not use allcaps for emphasis in HN comments?

This is in the site guidelines: https://news.ycombinator.com/newsguidelines.html.

The website you list yourself as CEO of is having some certificate issues: www.3D-Avatar-Store.com

The 3D Avatar Store closed 3 years ago. I still own the domain, but there is nothing at it, so the error you get is a default.

I started programming (mainly web development) about 20 years ago.

I think the lack of current day productivity is due to there being so much more input available. 15-20 years ago you were forced to get into the mix of things and try things out because there weren't 800 blog posts and 227 youtube videos on the topic you're trying to learn.

Nowadays it's too easy to research yourself to death without trying anything, because you're making decisions based not on your own experience but on the experiences of others.

In attempts to find the perfect solution, you often do nothing. Always remember that you can get a lot done without it being perfect.

I look back at some old PHP projects I did in the early 2000s. Projects that have 7,500 line PHP files with mixed in HTML, JS, PHP, SQL, etc., but the funny thing is, some of those projects are still running today, unmaintained for 10+ years but work flawlessly on some old crappy shared host. Drag / drop FTP deploys with no version control of course.

On the other hand: remember cutting out images for rounded borders, using spacer images to align everything, using Javascript for roll over image swaps, GIF for 'transparency'?

But I agree for the most part. I also have 15+ year old PHP sites running. It was upload and forget.

But I'm so glad we now have version control and better IDEs.

Haha yes. Using images for rounded borders was classic. I remember many sleepless nights of trying to do pixel-perfect Photoshop to HTML layouts too. Or using many chained &nbsp; entities to fix spacing issues.

Yeah version control is definitely a win.

I mean, back in the day my idea of testing was to take something like a blog.php file, copy it to blog2.php, make my changes, and upload it to the live production server. Since this new "2" version of the page wasn't linked anywhere, I was free to manually test it in privacy.

Then if it worked as expected, I would delete the old blog.php file, rename the blog2.php to blog.php and upload it. That was a "release".

Zero downtime deploys without Kubernetes or load balancers, circa 2003.
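That rename dance can be sketched with an atomic replace (`release` is a made-up helper for illustration, not anything from the thread):

```python
import os
import tempfile

def release(path: str, new_contents: str) -> None:
    """Stage a 'blog2.php'-style copy next to the live file, then swap it
    in with one atomic rename so visitors never see a half-written page."""
    directory = os.path.dirname(path) or "."
    fd, staging = tempfile.mkstemp(dir=directory)  # the blog2.php stand-in
    with os.fdopen(fd, "w") as f:
        f.write(new_contents)
    os.replace(staging, path)  # atomic on POSIX: readers see old or new, never half
```

The FTP version wasn't atomic, of course; the point is only that the old copy keeps serving until the moment of the swap.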

> Zero downtime deploys without Kubernetes or load balancers, circa 2003.

it always irks me when i read a sentence like that.

* if something went wrong, your site would've been down. completely. irrelevant and easy to recover from as a small hobby page, but incredibly expensive if you need to honor a high uptime SLA.

* stateless services that don't need schema migrations are still very easy to upgrade without kubernetes -- or load balancers for that matter.

There have been version control and great IDEs for about 40 years.

That's great and all, but a lot of those things had their problems, and subsequent tools were created to fix them.

- FTP - deploys where one or two files would silently “not deploy”, and since there was no automated testing or error logging, it would sit like that for days until someone found the issue

- Version control - it’s great when you’re a solo dev on a project, but once you have more people editing the same project, things begin to break very fast. And god help us if we’ve been editing the same file at the same time.

- SQL - SQL injections, anyone? Back then the auto-exploit tools might have been in their infancy, and you could get away with security vulns sitting in production for years, but not any longer. The script kiddies are real and would find that _one_ place where you forgot to escape the string

- PHP - yeah, 2018 PHP is actually awesome, but back then? If you had to do _anything_ different from just displaying a document site in English, you were in trouble: memory leaks, character encodings, image manipulation ... And someone else’s code was so hard to use that if it was not your main framework, or a single class/function, it simply wasn’t worth the bother.
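On the escaping point: the fix that stuck is parameterized queries, which remove the injection class entirely. A minimal sketch in Python's sqlite3 (table and input invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

evil = "x' OR '1'='1"  # classic injection payload

# Unsafe: string interpolation lets the input rewrite the query,
# so the WHERE clause becomes always-true and matches every row.
unsafe = conn.execute(
    "SELECT count(*) FROM users WHERE name = '%s'" % evil
).fetchone()[0]

# Safe: the driver binds the value as data, never as SQL,
# so the weird string simply matches no user.
safe = conn.execute(
    "SELECT count(*) FROM users WHERE name = ?", (evil,)
).fetchone()[0]

print(unsafe, safe)  # 1 0
```

The same idea appeared in PHP as prepared statements (mysqli/PDO), which is part of why "2018 PHP" no longer has this problem by default.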

And yes, I do have projects that were developed back then with those technologies and are used successfully to this day (deployed by burning them onto a CD and travelling to the next town to install them on a Solaris box, no less :))

I wouldn’t call dev “easier”; it had different problems, which were mostly solved by new tech. If you tried to do it the old way, you’d run into the same old issues, _on top of_ the new requirements that you’d be very hard pressed to fulfill.

This description of analysis paralysis is spot on.

On the consumer front, you can be paperless now, and travel across the country carrying nothing but your phone. That feels like progress as far as efficiency goes.

Better tools and resources like Stack Overflow etc. have made us more productive.

But this has been partially offset by endless feature adding and abstraction upon abstraction in software construction.

It's very difficult, if not impossible, for people to stop development on something and say: this is all it can and should be.

The feature creep at every level of abstraction is a significantly bigger problem than just the existence of each layer of abstraction imo

That was part of Slack's reason for choosing PHP for their backend. Use a fancy language like Java or Node if you want to do "modern" development. PHP is for getting shit done.


That's a silly blog post, to be honest. Any language provides what he describes under "State" if you run a script through some sort of CGI interface (I'm not very well informed on all the variations like FastCGI or httpd mod_<lang> stuff), which is roughly what PHP is doing too. Many languages (PHP among them) have web frameworks which do complicate applications, but for good reason: it's easier to write applications that are secure (at least against the most common attacks out there) with them. And if you're going to use PHP with some framework anyway, why bother with the stupid language instead of using a more proper one with a nice framework/library?
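A rough sketch of that "State" point in Python (`handle` is a hypothetical request handler, not a real API): under CGI the whole interpreter exits after each request, so mutable module-level state is reborn every time, which is roughly the property PHP's execution model gives you for free.

```python
import json

REQUEST_COUNT = 0  # module-level state, reborn with every CGI process

def handle(query: dict) -> str:
    # In a persistent app server this counter would keep climbing across
    # requests; under CGI the interpreter exits after one request, so
    # every response reports count 1, PHP-style.
    global REQUEST_COUNT
    REQUEST_COUNT += 1
    return "Content-Type: application/json\r\n\r\n" + json.dumps(
        {"count": REQUEST_COUNT, "echo": query}
    )
```

The upside is that nothing can leak or corrupt between requests; the downside is that any shared state has to live in a database or cache instead of the process.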

I think it is just the immediate mode, do a thing in a familiar environment and start making a web app. No dealing with tooling. No `pip install flask`. Nothing. Just write PHP and you have a web app. I was probably memeing a bit much there, but I think there were several solid points in the article. I would not reduce PHP to "run a script through some sort of CGI interface". I don't use PHP any more. Only Python so I pretty much agree on using a nice framework/library, but I still see this kind of casual dismissal of PHP. I think it has progressed enough to not be a toy language. It is no worse than JavaScript in terms of idiosyncrasies. It (PHP) has plenty of decent features these days and runs fast thanks to things like HHVM.

Not just "State." Any language with a mod_whatever or CGI interface provides all of the listed advantages, but without the horrible drawbacks.

I don't agree with the evangelism of tech companies like this. PHP has pitfalls like everything else does, and you need to include the context properly. What works for some does not work for many others due to differing constraints.


If you want to stop my productivity cold, use PHP. If you want me to just "get shit done", give me Python. Does this, by itself, make Python superior to PHP? No. It's what I'm used to using.

IOW, they are mistaking familiarity with tooling for superiority of tooling.

Heh. Maybe I was just memeing a bit myself. I pretty much only use Python and Flask/Pyramid/Django for building web apps these days and can't imagine using anything else honestly.

Not your post, but I saw someone else calling the blog post out as silly. I used to write a lot of PHP for customers when I freelanced, and I use Python a lot now. PHP is very productive for many of the reasons stated and when you are doing very CRUDy apps PHP frameworks seem quite legit now at doing things like MVC.

Feels so true... unfortunately

As a young person who was 11 years old 10 years ago, I can't really answer that question. But I believe that most people don't know how to handle today's distractions, and it takes a toll on their productivity.

Growing up in the midst of the rise of smartphones, I just now realized how much time I spent staring at a screen, being distracted by the constant flow of information and the constant need to communicate. And I still am, although I am cutting back on phone usage and senseless browsing. I just made the decision to subscribe to a weekly printed newspaper instead of fanatically checking the news every day (or sometimes every hour).

Now I don't have any data on this, but I'm observing that people start noticing the downsides of digital things and their interfaces, and start going back to well-tried "analog" stuff. So many blog posts about when to turn your phone off, apps that let you plant trees for not being on your phone (https://www.forestapp.cc/en/), products like the bullet journal (https://bulletjournal.com/) and feature reduced "dumbphones", etc.

It seems to me that this hype of making everything digital, from blackboards and car interfaces to home automation systems and coffee machines, happened without anyone ever giving thought to the real-world gains. A prime example: teachers use maybe 10% of the functions on an "ActiveBoard" (digital blackboards you can "write" on), spend a lot of time trying to get basic functions to work or calibrating the pen, and their writing looks crooked. Chalk on a blackboard worked just fine before that: the "interface" is intuitive, the writing (depending on the teacher) more readable, and it didn't cost a couple thousand € of taxpayer money. I'd argue that for the most part ActiveBoards resulted in a big productivity loss in schools.

> But I believe that most people don't know how to handle today's distractions, and it takes a toll on their productivity.

When I'm teaching, I'd love to bring a hammer and smash the phones students keep checking. They're doing it as a way to avoid thinking, as a way to always have someone entertain them. They're addicted and don't know it.

> I'd argue that for the most part ActiveBoards resulted in a big productivity loss in schools.

We don't use them here, but not that long ago, "dedicated" teachers used PowerPoint. I've visited universities where they'd put faculty in classrooms where the only way to "teach" was to use PowerPoint. No amount of technology changes the fact that you have to communicate.

Eh, before powerpoint, there were printed overhead transparencies. Before transparencies, there were teachers who spent their entire lecture writing directly from notes to the blackboard.

Rote teachers are rote teachers.

I LOVED transparencies. The teacher brought the projector to the classroom if it wasn't already there, plugged it in, and there you had the presentation. Compare that to sitting and waiting while a humanities professor dabbles with the computer corpse installed in the classroom and a projector that throws a nearly diamond-shaped image onto a random patch of wall, until they just give up after a dozen minutes. Or they set up Edmodo and send you PPTXs instead of proper notes, and you have to sit through stupid animations before you can see the 64th page with the paragraph you need to read, hopefully not coloured stupidly or otherwise illegible due to an incompatibility in whatever renders those files on your phone, so that you don't have to wait until you're back home at a computer to look at the thing.

I believe the only useful thing we can teach is how to learn (because you can't teach people, you can only motivate them to learn; they need to do the work themselves).

So regardless of your setting (kindergarten through university), demonstrating good ways to study is worthwhile.

Demanding a classroom free of smartphones / social media is one very basic step in that direction.

You might be the first person to help a student realize that they are still smarter than their smart phone - and that they are in control, not it.

I recently discovered downloading library books onto phones, have you looked into that?

I live in a city and don’t have enough cash/room for hundreds of books in my apartment and don’t usually have time to reach a library branch. But this way I get/return books instantly and I don’t have to lug anything around on the subway. I read ebooks faster for some reason, too. No way to drop the book in a puddle or pour coffee on it. And finishing a library book on my phone is one of the rare times I feel like I actually used my phone to get something done.

I love tech when it cuts down on the number of objects I need to own or schlep.

I like to at least think that, if I were starting over today, I'd have a lot less clutter and physical "stuff." I think there's a mindset along the lines of "If I'm going to have a bunch of books, CDs, and other stuff that today can all be digitized, then what's another ton or so of physical artifacts to lug around?"

Not that I would necessarily behave that way but it's at least wishful thinking.

As a godforsaken millennial in the big American city, I do see us spending more on experiences, food, memberships, digital content, locations — things that you can’t touch — than objects. This has implications for the overall productivity of society, maybe? We produce less stuff that goes unused in attics and garages. What we do produce, we’re more likely to consume immediately or scale trivially. So less work on stuff no one will ever use. Maybe. Theory.

It’s purely anecdotal, yes. I think research papers and brand acquisition patterns can substantiate the claim that we spend more on fancy food.

It could be our tastes which have been informed by analog and minimalist culture movements which I first noticed in the mid-2000s.

It could also be because we don’t have much disposable income and since home ownership is so hard where the jobs are (big cities) those with money still have nowhere to put stuff.

>since home ownership is so hard

I don't know the numbers but I doubt that home ownership has ever been all that common among twenty-somethings. And I fully expect that many current twenty-somethings will end up moving out of the city to get more space once they have families. That said, the interest in living in/near the urban cores of certain large cities is a relatively recent phenomenon. When I entered the tech industry in the mid-eighties, almost no one in my local cohort lived in the city (Boston/Cambridge) which indeed was still losing population and tech jobs to suburban/exurban areas.

Although Boston/Cambridge have long been quite good for food, culture, etc., I think it's fair to say that much of that sort of thing has been significantly upleveled over the past 20-30 years.

We just moved to a much smaller house. I have 5 boxes, probably 100kg, full of CDs and DVDs that I am ready to get rid of. Don't want to throw them out but rather have them put to a good use (but does anyone use them anymore?).

The hype of making everything digital stems from what I like to call the "oh wow cool" culture. When we see something interesting (and a big wall-mounted screen that you can touch and interact with, or a watch which can show you the weather, is interesting), we tend not to consider its real-world implications and usefulness.

> I just made the decision to subscribe to a weekly printed newspaper instead of fanatically checking the news every day (or sometimes every hour).

I'm one step ahead of you. I'm currently unsubscribing from my weekly printed newspaper to have more time to read books. :)

That said, I can totally recommend to everyone to get your news from a weekly newspaper instead of a daily or hourly medium like TV or the web. A weekly newspaper has just enough distance from the breaking news to provide a good balance between reporting and analysis/commentary.

No. Git is better, GitHub is better, most package managers for finding things are better, stack overflow is better, help forums are better, tools are often free, cloud access and compute power is better, instructional videos on youtube are better, online classes are better.

I don't want to go back to 10 years ago and definitely not 20 years ago. I have never been more productive than now.

No. You can spend 10x as long downloading random libraries and utilities looking for the perfect one than it would take to just write the function you need, and then you haul in a massive stack of dependencies that makes your project fragile and your build times long. I see projects now with literally hundreds of external dependencies. And if one of them breaks or suffers a security compromise you are toast. Say no to PyPI, NPM, Github and all the rest.

Hmm, I don't know about security being a problem nowadays. The community oversight of an open-source library vastly outweighs my own ability to write secure libraries.

> The community oversight of an open-source library

Haha really? https://mail.python.org/pipermail/security-announce/2017-Sep...

It’s the Wild West out there. And then there’s NPM...

+1 for package managers. Hard to imagine programming without them, and they're much better now than 10 years ago.

The maintainers of package managers and individual packages save humanity so many hours and are probably propelling our entire civilization forward in ways they don't even comprehend. The days before widespread package managers were really dark and terrible; arguably the developer dark ages.

And those package manager maintainers (all the way from distros to software libraries) don’t see a dime of compensation for it, while companies use open source tooling to create enormous valuations. It’s a shame.

We salute you package maintainers!

And ragging on them seems to be a favourite pastime of HN. I'm a huge fan of literally any package manager.

People only complain about things they actively use.

Anyone feel Usenet is better than Stack Overflow?

I'm 39, and I've never been more productive. I do think there are some challenges today that stem from more people being productive; we see fewer projects with great, in-depth documentation.

We tend to get decent tutorials, and not very useful machine generated reference material.

Think React vs. "The C Programming Language". Or Docker vs. the man pages for BSD jails.

I think this makes it harder to fully grasp new technologies - there's often a lack of a clearly stated vision, or of a problem statement ("this tool makes x, y and z easier, with the following trade-offs, based on experience with a, b and c").

In the case of projects like Docker, MongoDB and Puppet, I think we can blame marketing quite a bit for this.

As for distractions: just turn the stuff off at work and check in your free time. Hopefully as an adult you've formed some real and lasting relationships; people will be there even if you take a day to get back to them...

[ed: as for the related trend of "keeping up to date" (it used to mean reading Byte and Dr. Dobb's etc., then the web started to dominate with Slashdot, now it's here and reddit etc.) - I've found I'm much more comfortable taking a step back; the hype isn't interesting - what shakes out is.

It's a sad trend that news in general has become very much tabloid; real news agencies that pay for and do real analysis have been waning for a long time. Thankfully, technology is more powerful than ever, and enables things like The Intercept or citizen reporting like http://www.raqqa-sl.com/ ]

> I am wondering if completing tasks was easier in the past than it is today

Yes, definitely. Here's how I used to work back in the day: I would sit down at my desk, on which there would be one monitor, probably a 17" or, if I was lucky, a gorgeous 19" Trinitron with the blackest blacks. I would have my main tool maximised to full screen - Metrowerks CodeWarrior (Mac), Visual C++ (Windows) or SPARCworks (Sun). On my desk there would be a paper copy of the spec I was working on and a couple of reference books, all annotated by hand. I would code in two solid blocks interrupted only by lunch, and often I would lose track of time and know it was lunchtime or home time only when someone said so.

Nowadays I work in little nibbles of 15-30 minutes at a time, and it's not just me either - everyone used to work like that and now works like this. Everyone seems busier because they are juggling more things at once, but studies have shown that multitasking is a myth, we get much less done than we used to. Modern tooling without modern distractions and interruptions would be the dream.

Your task is to keep your house at a steady 70F when you're there, and 55F when you're not. Can you do this task more quickly now than you could 20 years ago?

Your task is to assess your bills and financial health, including outstanding balances, account balances, interest rates and dividends returned.

Your task is to determine the weather forecast for this afternoon.

Your task is to turn on your computer and download a full-length film.

Your task is to design, implement and deploy a data-driven application with a web UI. You must find and allocate hosting. The UI must validate user entries and provide feedback.

Can you give some examples of tasks that take longer to do today?

Your task is to create a lasting representation of the struggle and beauty and plight of the human race.

Your task is to bring meaningful, sustainable, and relevant hope and quality of life improvement to society.

We can get more low-value tasks done per unit of time now, thanks to technology. Unfortunately, tech has robbed us of something of our essential humanity.

We pay lip service to the fact that tech has made these tasks easier and faster to do and that we are now freed to use more time engaging meaningful life. Instead, we are either so information weary, addicted to tech, or just disconnected from that essence that we don't use this free time well.

We're tired, lazy, disconnected, and addicted. Our ability to think critically across a range of disciplines has been broadly compromised.

So I'd rather just keep my thermostat at 67, balance my ledgers manually, plan for seasonally appropriate weather and adapt to change, go to the theater with my family and/or friends, and maybe write a book.

The only compelling application of technology is synchronizing financial assessments. Otherwise, I think I'd be happier if I wasn't addicted.

> Your task is to create a lasting representation of the struggle and beauty and plight of the human race.

> Your task is to bring meaningful, sustainable, and relevant hope and quality of life improvement to society.

I'd be curious as to your opinion on the shape of this curve in history. Did we peak at some date in history? Perpetual downward slope? Perpetual upward slope?

> Your task is to create a lasting representation of the struggle and beauty and plight of the human race.

Compared to 20+ years ago, I think there is more popular appreciation for "meta" and pervasive performance art, which is a definite productivity booster for poets at heart. The technology isn't needed for writing, but it has forged an audience...

You may argue that you asked for "lasting" representation, and today's market is anything but. But, that's in fact part of the beauty and plight.

I think it's more that there's so many distractions now that a lot of people spend the whole day doing practically nothing.

Maybe in the past, it would take a whole night to do taxes. Today it would still take the whole night to do taxes, but three quarters of that time is going down the rabbit hole of googling tax exemptions, etc.

So presumably you’re able to optimize better given the same amount of time? That feels like a productivity increase to my way of thinking.

Robert Gordon wrote a whole book on this (“The Rise and Fall of American Growth”), and I think he’d answer your list by asking if any of those items were as transformative as going from not having electricity, to having electricity; not having indoor plumbing, to having indoor plumbing; not having a clothes washer, to having a clothes washer...

It used to take our family an hour or two to go to the video store, pick out a movie, return home — but we typically had fun doing it.

My dad got some weird pleasure out of owning the thermostat and telling us not to touch it.

Arguably people are better off if they can’t check out their investments all the time, and tax prep seems to suck just as much now as it did back then.

But honestly, yes — our digital tools are generally way faster and more advanced than they were a decade ago. I can do in an hour in Swift what would take me all day in Flash. But I’m also 10X more competent now, so it’s hard to directly compare. And I sure do spend a lot of time making sure I have the right dependencies installed any time I want to try running one of the shiny, new, JavaScript frameworks.

The real question, for me, is how distracted are we? If there have been productivity gains from technological advances, are those improvements undercut by our collective inability to focus on work for meaningful amounts of time?

Your task is to write a novel like Moby Dick.

I suspect there are not as many people with the concentration to do so as there used to be. I think that's what OP is getting at.

A smaller percentage perhaps.

There's almost certainly more people with the time and concentration to write a novel than there has ever previously been.

> Your task is to keep your house at a steady 70F when you're there, and 55F when you're not. Can you do this task more quickly now than you could 20 years ago?

I can, but I still have an old, analog thermostat that I could have had 20 years ago. Just turning the dial every time is, in my experience, much easier than fussing and babysitting a "smart" thermostat.

I only have experience with the "not very smart" programmable thermostats, which I adore. I've had one for over a decade with no issues. When I bought another house, it was one of the first things I put in. It's "slightly" easier (lower effort) to program it once and never touch it again (except for "exceptions") than to try to remember each time I move from zone to zone or go to work. Definitely pays for itself in net lower energy use (unless you're super diligent yourself, which I am not!)

Imagine every switch in your house was just a screen, you had to swipe to unlock it, and then find the light switch you're standing at in a list of all switches in the house. That would be a nightmare.

How about, your task is to program a thermostat, but also keep the thermostat from mining bitcoins because an npm module you downloaded has been compromised.

Finance guy here: productivity in finance has skyrocketed compared to 20 years ago. More computing power means quicker and better decision making. Higher-level programming languages have translated into much quicker delivery of new stuff. Quality tools have greatly reduced the need to rely on other people.

I think this is the reality for most areas; it's never been simpler to actually get projects doing something useful in the wild.

It's very difficult to say: "productivity" has changed because the "products" that we work toward have changed.

You spend an hour screwing around with a software problem while editing your video, but you're editing 4K shot on a consumer camera, not 8mm black and white film with no sound.

You lose an hour to some stupid router firmware problem, but a 30 min later, you are Facetime-ing with a friend and her new baby (not even possible for most people 10 years ago).

You reboot your phone to clear a weird Wifi problem, but 10 minutes later you're driving straight to the store that has your product in stock, while getting routed around a huge traffic jam. While listening to the radio show (er, podcast) you missed last week.

I think tech is really changing social interactions (and not entirely in a good way), but the time and headaches of life are still there -- they're just further up the abstraction stack.

No, we weren't. This is something carefully measured by economists. The workforce is consistently becoming more productive, albeit at a slower rate.


I remember that if you wanted to do Windows programming in the pre-internet era, you'd have to buy some manuals via phone, then wait for about 3 weeks to get them delivered and then start hacking. You could also buy an MSDN subscription for several thousand dollars.

Nowadays you're just one google search away. Definitely more productive.

puts on tinfoil hat

We are convinced we are not productive - to the point we aren't recognizing when we are accomplishing more than 10, 15, 20 years ago.

I'm in the started-20-odd-years-ago boat. The last few years, with the explosion in the number of tools, I've made a consciously aggressive decision to make the tools my servants whenever possible. It was easier in the past because there were fewer tools to choose from, and in line with that, a lot fewer unnecessary tools to waste time on by mistake. We've hit an inflection point, where simplicity explodes into vast specialization. That is likely to get worse before it gets better. People who used to be able to do it all are now drowning in a wave of tools and increased specialization in an attempt to hold on to that ability. Most likely, if you're one of those people (I am), you'll either have to let go of trying to be able to use every tool, or you'll have to choose to specialize more.

It probably clicks at a different point for everyone, whether 5, 10, 20 years - eventually the thing you'll find that's always going to end up being more valuable than learning a dozen new tools, is your time. Your time is an extremely scarce resource. I'd like to emphasize that 407 times in a row here.

Tools aren't really tools if they don't make you more productive in some manner. What would be the point of a fancy new hammer that caused you 5x more work and accomplished the exact same result as a traditional hammer?

To answer your question: in my opinion, yes, five or more years ago it was easier and faster to complete routine tasks when it comes to building Web/Internet services & products. You can counter the growing complexity by refusing to do unnecessary things. If you don't need a tool, do not use it just because it's the latest fad.

Python/PHP/equivalent + Postgres/MySQL/equivalent + vanilla JavaScript = 90-95% of what you'll ever need. If you're building the next juggernaut Web service, then sure, use more tools if you have to; the emphasis, though, goes on the "have to" part.
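To make the parent's point concrete, here's a toy sketch of that boring stack in plain Python, with no framework at all. Everything here (the table, the data, the use of sqlite3 standing in for Postgres/MySQL so the snippet is self-contained) is made up for illustration:

```python
# Minimal "boring stack" sketch: plain Python + SQL + string templating.
# sqlite3 is used in place of Postgres/MySQL so this runs with no setup.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO posts (title) VALUES (?)", ("Hello, world",))

# Render the "page" with an f-string instead of a templating framework.
rows = conn.execute("SELECT id, title FROM posts").fetchall()
html = "<ul>" + "".join(f"<li>{title}</li>" for _, title in rows) + "</ul>"
print(html)  # <ul><li>Hello, world</li></ul>
```

Obviously a real service needs escaping, routing, and so on, but the shape of the whole thing fits in one screen, which is rather the point.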

No, I wasn't. About 15 years ago I read about older "neckbeard" developers who allegedly could solve complex problems with a few lines of concise, clear code in a very short amount of time. At the time I thought that was a myth, but now I realize I'm becoming one.

Tooling can be overwhelming, but one doesn't have to master every single tool they encounter. To this day I use only a few shortcuts in my editor, I don't remember every command-line switch that exists for ls, cp, or git, I launch the dev tools in my browser using a mouse, etc.

But it doesn't matter. These days I make fewer mistakes, and when there's a bug I can accurately predict where its source is.

do you mean "greybeard"? unless the meaning has shifted a lot over time, you probably don't want to call yourself a "neckbeard".

But isn’t calling yourself a neckbeard instead of a greybeard kind of a neckbeard thing to do? Maybe we should just leave it alone? ;)

I mean yeah, but I just don't want the poor fellow going around announcing himself as a neckbeard irl.

Having dabbled in web development for 15+ years, I feel like there can be a lot of pain involved in getting a dev environment set up and configured correctly for a project these days. But to me it is worth it: the resulting workflow becomes as simple as it used to be in the old days, except there is now much more power and flexibility available to me as I work. I sure hate debugging some BS related to a nested dependency in some random package, but I like having my code linted and compiled to ES5 meaning I can use modern conveniences without too much panic on the cross-browser front. I think over all I'm a lot more productive with the new tools.

We can be more productive, but the bar keeps being raised as our productivity goes up. Customers expect more in terms of non-functional requirements such as usability, performance, maintainability and security. This in turn fuels an arms race in tools and technologies which a lot of developers are struggling to keep up with.

Some of these productivity gains are insignificant but developers have to follow them anyway or risk being left behind. This means we're spending more of our day learning and less of our day developing.

As a developer it has become more important to find a comfort zone with a few well chosen tools and keeping an eye open for incremental improvements.

What do you mean by "task"? It has never been easier (and cheaper) to write fault-tolerant software and deploy it on a global scale to thousands or millions of users.

Or do you mean things like shopping and cleaning?

Yes? I think we get more done in the short-to-mid-term, but the fruits of our labor are less likely to survive long-term. There's no value judgement there - there are pros & cons. At 40 years around the sun, I've got one leg in the "Get off my lawn!" camp and the other in the "Damn, these kids are wicked smart!" camp. I find that people over-focus on the seemingly "attention deficit" critiques of the modern day, but even if that is the case, I think it's a small price to pay for having orders of magnitude more people in the space - there will be a lot of failures, but there will be more successes relative to prior generations.

If we want to blame younger generations for not "thinking deeper" or "solving the hard problems", then we should shift our focus towards the economic drivers that are favoring quickness/quantity over quality. I know how to develop software, but if an employer is given the choice of fixing a buggy tool at the cost of maintaining a fork (GASP!) vs spending unbounded man-hours working around said buggy tool, my money's on them choosing the latter, since maintaining tool X is not a core competency. Meanwhile, tool X is hemorrhaging production data or producing buggy results and nobody flinches.

10 years ago I had a private office, a small team, no scheduled meetings.

Really hard to find a workplace like that anymore, and it sure as hell was more productive than open offices and agile cargo culting.

No, absolutely not. It’s myopic to even entertain the possibility. At least in web dev, which I have experienced.

For some reason people still love to criticize browsers, even though they have improved dramatically. Around 2000, every UI required two or three versions. Once you got it done in Firefox, you would start on the IE version, which would take almost as much time as the first implementation. Then, you’d repeat the process for each point release of IE.

On the server, Rails was the big revolution. The big achievement of DHH wasn't so much technical as social: here you had an extremely well-curated set of best practices, smuggled into your project under the guise of a framework. Before, you'd join a team and find the database password was in a switch block three levels deep in config/real/colors.inc.php3.~bac.php. Suddenly, you could join any team and know your way around the codebase on the second day.

Many other things that used to be major work items have become trivially easy. It would routinely take me a day or so just to get somewhat recent versions of Apache, php, and MySQL (and all their libraries) to compile on a new server. You’d encounter all sorts of difficult-to-debug networking problems when moving to production. There were certain tasks that seem like they should be trivial, but were actually terribly difficult to pull off in the web stack, such as chats.

I am 57. And 40 years in software development. And I would say yes, but with some caveats.

First, the speed with which new frameworks and tools come out definitely slows things down, because of the need to learn them all over again. Second, part of my past productivity was thanks to me building on top of what existed. Example: 20 years ago I wrote a new product in PHP/HTML etc. It had both realtime and CRUD. I predicted that in the end it would need a lot of screens, so I started by writing an extensive, data-driven screen library. This was the main reason we could crank out new features like crazy. 2 years ago I picked up Laravel, the 'best of breed' these days, and was surprised that it did not include anything like that.

On the other hand, the field has advanced considerably: concepts gaining traction like immutability, functional programming, CI/CD, and containerization really helped move everything forward. So I'm not an old grumpy greybeard who claims there's nothing new under the sun; I'm really excited about a lot of the new stuff, which I am happy to use.

I started out with mainframe assembler and C, and am now in Elixir, Docker, React Native. Higher level languages really help general productivity. So my answer is yes, we have become more productive.

I think yes, and we were more focused on the task to achieve, but on the other hand, computers were dead slow compared to today's. I remember launching tasks for a whole night (for example, generating a 720×348 MDA Mandelbrot picture around 1984 on an IBM PC) and checking the result the following day. Now we can code (IDE with watch mode) and see the result in real time, as you can see in live-coding session videos on YouTube.

Heh. I remember getting a 68000 to increment a (32-bit) zero until it was zero again. It took it 8 hours. Nowadays, it's a second (if the compiler doesn't optimize it out completely).

What's that? We're discussing productivity? Oh. Carry on...
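For anyone curious about the wraparound being described above (incrementing a 32-bit zero until it is zero again), here's a small Python sketch. Python integers are arbitrary-precision, so a mask is used to emulate the 32-bit register; the timing figures in the comments are the commenter's, not measurements:

```python
# Emulate a 32-bit register: after 2**32 increments the counter is 0 again.
MASK = 0xFFFFFFFF  # 2**32 - 1, the largest 32-bit value

x = MASK                # one increment away from wrapping
x = (x + 1) & MASK      # the wraparound increment
print(x)                # 0

# Performing all 2**32 increments one at a time is what reportedly took a
# 68000 about 8 hours; a modern CPU grinds through the same loop in seconds
# (or skips it entirely if the compiler optimizes it away).
```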

Yes, there is a lot more choice and some of those choices are questionable, especially if you aren't working at Google/Facebook scale.

From a productivity point of view you can ignore a lot of it and that is largely what I have done. However if you intend to have a long career in this industry that might not be a good idea.

I recently gave a software engineering talk at a high school's career day, and a student asked how the industry has changed, and it caused me to think about this. In a lot of ways I think it's a wash. We could do less with the tools back then but we were expected to do less. In the 80's we had Borland C++ or VC++ to write our text-based screens and BTrieve-based databases on the same machine. The tools have expanded considerably but we're expected to spin up databases, code for them, handle multi-user, multithreaded (or async) coding, GUIs or mobile web-based UIs, use different languages for each part of the puzzle.

The ability to communicate with anyone around the world within seconds surely helped in getting things done. I don’t know if it created even more tasks, but I would rather have this ability than not.

There is an overabundance of tools because there are fewer original ideas that can lead to a profitable business. In other words, people can't stay idle, so they make too many tools while searching for a business. You can browse HN from 10 years ago to see that people discussed tools far less.

I think programmers were always equally productive at any time, because their job is to constantly automate the non-productive parts, so sooner rather than later they are eliminated.

I felt that I had better tools 10 years ago...

Comparing apples with apples - could we have produced what we produce today on the WWW, only faster, 20 years ago? Absolutely not. We are much more productive today. BUT... so is all of the competition, so in terms of winning rat races, no.

What I see is that a lot of people feel the need to always look for new tools and technology to fix their problems, often causing them to get in a big loop of constantly finding new issues and challenges.

My biggest challenge these days is finding just the right amount of "new tools / solutions" that I can get control over, without losing focus on the problem I'm actually solving. I often find myself "too deep down the rabbit hole", and am getting more and more comfortable with just cutting all the "cool new stuff" and finding another angle to get things done. 9 out of 10 times you can find a very nice solution within the technology you master, and it's best to ignore that "I need to use cool new stuff" itch.

Kind of forgot to answer the question: Not at all!

The power you have these days over the complete spectrum of software is just magic! You can build a whole new platform / application in days! It's a wonderful time to be in this industry and I love it!

This is why a simplicity of style is something we should celebrate. My best known essay is "Object Oriented Programming Is An Expensive Disaster Which Must End" and I think one reason that essay remains so popular is because there are many of us who feel the way dvanwag feels: that we actually become less productive when weighed down with too many frameworks, too many tools, too many abstractions.


But it is important to realize that this has nothing to do with the passage of time. The most expensive failed software project ever was the attempt to modernize the FAA, in the USA, from 1982 to 1994, a project which cost $3.7 billion and which failed completely. And there, too, the problem was too many tools, too much abstraction, too many experiments with options:

"The project was handed over to human factor pundits, who then drove the design. Requirements became synonymous with preferences. Thousands of labor-months were spent designing, discussing, and demonstrating the possibilities: colors, fonts, overlays, reversals, serpentine lists, toggling, zooming, opaque windows, the list is huge. It was something to see. (Virtually all of the marketing brochures – produced prematurely and in large numbers – sparkled with some rendition or other of the new controller console.) It just wasn’t usable… The cost of what turned out to be a 14-year human factors study did not pay off. Shortly before the project was terminated a controller on the CBS evening news said: “It takes me 12 commands to do what I used to do with one.” I believe he spoke for everyone with common sense."


So we would be wrong to think that software developers were productive in the past, whereas now they are not productive. But rather, some paradigms of development tend towards too much abstraction, and when followed they lead to failed projects. That was true in the 1980s, and it is true now.

I do think, on the frontend, the desire to take markup languages such as HTML, and then make them work on all output devices (desktop computers, tablets, mobile phones), has led to an era where too much abstraction is the norm on frontend projects. I tried to imagine an alternative in my essay "The problem with HTML":


But feeling productive and being productive are not necessarily the same thing. Sometimes I feel really productive at a task (e.g. writing a nested data parser in Bash) with a lot of unnecessary drudge work, because I can comfortably plug away at it for a long period of time. But with the drudge work mitigated (e.g. using a real language with a parser framework), I'm left focusing on the hard problems, which causes me to become mentally drained more quickly, procrastinate more, and feel less productive overall.

Consider building sites with WordPress. Does anything in the past compare?

Assuming this is not IT-only, I'll share my thoughts as a humanities student (and prospective researcher).

For my first year in the uni, after deciding to not pursue programming professionally, I decided to go low-tech: notebooks and agenda, internet only when necessary. I definitely read more books and got more of my todo-lists done.

Then I decided I'd use the computer more; links and images were logical extensions to note-taking, and being able to hyperlink all my notes and documents was an improvement I couldn't forgo (thank you Org-mode and Emacs; if not for these two jewels, I'd not bother using a computer for anything other than the browser).

Nowadays most of my stuff is digital, and I use some "tools" to interact with it. I research in the browser, I'm slowly getting into the habit of reading shorter papers of 20 or so pages on the computer, almost all my notes are on the computer (with some waiting to be digitised), I use Emacs and other tools for processing all sorts of data (not in the statistical sense), and version control is used everywhere, from todo lists to orgs to short stories to research notes, wherever applicable (I use Mercurial or RCS generally, depending on the task).

Comparing the two ways of working, I definitely have to put up with more distraction with the digital setup, but I also know and learn more. It's hard to tame the internet to not be invasive (middle finger to content websites which disable RSS feeds for pageviews, fuck you all), given most actors are actively trying to be invasive. Recently, for example, I found a researcher whose work I wanted to follow. She had a twitter profile and an academia.com one, both platforms that provide no RSS feeds. No other profiles. Now I either have to open myself up to this sort of invasive website that is constantly trying to learn more about me and push stuff all the time, or just not follow her. And I chose the latter. But that's fucked. The business model of the internet is "we give you some stuff, often other people's stuff, pay us with your attention, and moreover we sell you to as many advertisers as we can". That's fucked, but given the utility of the internet, one has to learn to put up with it, and that's not all that easy.

I do not follow live news, even with RSS. I'm subscribed to some mailing lists from newspapers and journals, weekly or daily. Other than this, I use RSS extensively. If your page does not have an RSS feed, I'll probably not follow you. A newsletter? Only when your thing is really interesting, and you post no more often than weekly. I follow youtube channels with it, so I don't have to open the YT homepage and be subject to many interesting but distracting links I might be tempted to click. I do not enable notifications for anything, including mail, even on my mobile phone. I decide when I want to know about the outer world: check feeds or mail manually, when I want. I use no social media. I do use Reddit, but I don't subscribe to any subs; instead, I group them into multis and check the relevant multi when I think I need to see something there (and to my surprise, now that reddit.com gives me a blank page, I spend a fraction of the time I spent there in the past, even when I was subscribed to just a handful of subs).

The biggest distraction with computers is the internet. And one needs to learn how to use it defensively. Maybe we need a "Defensive Internet Users" wiki thing?

Agile sprints with barely any gap between them do not help.

Web dev, yes.

In what way, can you specify?

No doubt about it!
