Since web development "matured", the number of disposable tools and the hype machine surrounding them has convinced a good population of programmers (who seem to be mostly young) that they are "not cool" if they are using older tool chains.
I see so many 20 and 30 something developers caught up trying new frameworks, trying new languages, and experimenting with their dev setup WHILE ON A PAYING CLIENT'S JOB, basically just fucking over their productivity, and needlessly generating self-stress.
I am super-uber productive. I am lead developer of 3 libraries, and 2 of the 3 flagship products of the facial recognition company I work for. My background includes VFX production as a developer and digital artist on 9 major-release feature films, 15 years as a lead 3D game console developer, and OS development on the original PlayStation. Through ALL that, I still use the same "make" I used back in the '80s, I hand-write my makefiles, just give me a text editor and a compiler. That is ALL I need, that and to be left alone.
No slack/chat app bullshit, no trying out of tools, just doing the job with the tools I know very well. Delivering early, or over delivering on time, with 100% certainty of what I'm delivering because I know the libs and tool chain from years of experience with them.
After 20+ years, I still try out most new tools. Just in case one turns out to be something great.
In 20 years of webdev, I'd say ONLY GitHub/Git and the cloud Linux VPS have met the standard of Great Adds to the toolchain.
Evergreen browsers, ES6, etc. also make development significantly smoother in my experience, and I didn't even experience much before IE9.
However, I see that the "minimum" work has greatly increased. What was once 1 file for a screen now becomes 4-5 files and a folder.
While we once had 30 lines and 5 minutes to parse a JSON file, we now use an external library, with about 30 minutes to create a POJO for it. The new methodologies make API changes more costly too.
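The contrast above can be sketched in a few lines of Python (used here as a stand-in; the `Product` class and its fields are invented for illustration). The old style just parses and indexes into the result; the newer style maps the payload onto a typed object first:

```python
import json
from dataclasses import dataclass

raw = '{"name": "widget", "price": 9.99}'

# Old-school: parse the JSON and use the resulting dict directly.
data = json.loads(raw)
print(data["name"])  # widget

# Newer style: define a typed object (the "POJO" step) and map
# the payload onto it before use.
@dataclass
class Product:
    name: str
    price: float

product = Product(**json.loads(raw))
print(product.price)  # 9.99
```

The typed version buys you validation and editor support, but every field added to the API now has to be mirrored in the class, which is part of why API changes got more costly.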
The added complexity also multiplies the damage from bugs, rushed schedules, team members dropping out, cost of documentation, and so on.
Most of these new tools are only useful if you want to keep on adding more engineers and still have something for them all to do. But for the most part, I feel that they just regulate productivity - keep everyone at roughly the same productivity, even if it means crippling someone from being 10x more productive.
I think the lack of current-day productivity is due to there being so much more input available. 15-20 years ago you were forced to get into the mix of things and try things out, because there weren't 800 blog posts and 227 YouTube videos on the topic you're trying to learn.
Nowadays it's too easy to research yourself to death without trying anything because you're making decisions based on no experience, but instead experiences of others.
In attempts to find the perfect solution, you often do nothing. Always remember that you can get a lot done without it being perfect.
I look back at some old PHP projects I did in the early 2000s. Projects that have 7,500 line PHP files with mixed in HTML, JS, PHP, SQL, etc., but the funny thing is, some of those projects are still running today, unmaintained for 10+ years but work flawlessly on some old crappy shared host. Drag / drop FTP deploys with no version control of course.
But I agree for the most part. I also have 15+ year old PHP sites running. It was like upload and forget.
But I'm so glad we now have version control and better IDEs.
Yeah version control is definitely a win.
I mean, back in the day my idea of testing was to take something like a blog.php file and copy it to blog2.php, make my changes, and upload it to the live production server. Since this new "2" version of the page wasn't linked anywhere, I was free to manually test it in privacy.
Then if it worked as expected, I would delete the old blog.php file, rename the blog2.php to blog.php and upload it. That was a "release".
Zero downtime deploys without Kubernetes or load balancers, circa 2003.
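The swap at the end of that "release" maps onto an atomic rename. A minimal sketch in Python (file names hypothetical, assuming a POSIX filesystem where `os.replace` is atomic):

```python
import os

# Hypothetical file names, mirroring the blog.php / blog2.php story.
live, staged = "blog.php", "blog2.php"

# Stage the new version alongside the live one (the "blog2.php" step).
with open(staged, "w") as f:
    f.write("<?php /* new version */ ?>")

# After manually testing the staged copy, swap it in. os.replace is
# atomic on POSIX, so there is never a moment with no blog.php at all.
os.replace(staged, live)
```

The FTP client of 2003 was doing the equivalent rename server-side, which is why the trick worked at all.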
it always irks me when i read a sentence like that.
* if something went wrong, your site would've been down. completely. irrelevant and easy to recover from as a small hobby page, but incredibly expensive if you need to honor a high uptime SLA.
* stateless services that don't need schema migrations are still very easy to upgrade without kubernetes -- or load balancers for that matter.
- FTP - deploys where one or two files would silently “not deploy”, and since there was no automated testing or error logging, it would sit like that for days until someone found the issue
- Version control - it’s great when you’re a solo dev on a project, but once you have more people editing the same project, things begin to break very fast. And god help us if we’ve been editing the same file at the same time.
- SQL - SQL injections, anyone? Back then the auto-exploit tools might have been in their infancy, and you could get away with security vulns sitting in production for years, but not any longer. The script kiddies are real and would find that _one_ place where you forgot to escape the string
- PHP - yeah, 2018 PHP is actually awesome, but back then? If you had to do _anything_ different from just displaying a document site in English, you were in trouble: memory leaks, character encodings, image manipulation ... And someone else’s code was so hard to use that if it was not your main framework, or a single class function, it simply wasn’t worth the bother.
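The escaping problem in the SQL bullet is exactly what parameterized queries later fixed. A small demonstration in Python with sqlite3 (table and data invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")
conn.execute("INSERT INTO users VALUES ('bob')")

# A classic injection payload supplied as "user input".
evil = "' OR '1'='1"

# The old failure mode: building SQL by string concatenation. The
# attacker's input becomes part of the query and matches every row.
unsafe = "SELECT * FROM users WHERE name = '" + evil + "'"
leaked = conn.execute(unsafe).fetchall()
print(len(leaked))  # 2 -- the whole table

# Parameterized query: the driver treats the input purely as data,
# so the payload matches nothing.
safe = conn.execute("SELECT * FROM users WHERE name = ?", (evil,)).fetchall()
print(len(safe))  # 0
```

Same query, same input; the only difference is whether the input is interpolated into the SQL text or bound as a parameter.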
And yes I do have projects that were developed back then with those technologies, and are used successfully to this day (deployed by burning on a CD, and travelling to the next town to install it on a solaris box no less :))
I wouldn’t call dev “easier”; it had different problems, which were mostly solved by new tech. If you tried to do it the old way, you’d run into the same old issues, _on top of_ the new requirements that you’d be very hard pressed to fulfill.
Better tools and resources like Stack Overflow etc. have made us more productive.
But this has been partially offset by endless feature-adding and abstraction upon abstraction in software construction.
It's very difficult/impossible for people to stop development on something and say: this is all it can and should be.
IOW, they are mistaking familiarity with tooling with the superiority of tooling.
Not your post, but I saw someone else calling the blog post out as silly. I used to write a lot of PHP for customers when I freelanced, and I use Python a lot now. PHP is very productive for many of the reasons stated and when you are doing very CRUDy apps PHP frameworks seem quite legit now at doing things like MVC.
Growing up in midst of the rise of smartphones, I just now realized how much time I spent staring at a screen, being distracted by the constant flow of information and the constant need to communicate. And I still am, although I am cutting back on phone usage and senseless browsing. I just made the decision to subscribe to a weekly printed newspaper instead of fanatically checking the news every day (or sometimes every hour).
Now I don't have any data on this, but I'm observing that people start noticing the downsides of digital things and their interfaces, and start going back to well-tried "analog" stuff. So many blog posts about when to turn your phone off, apps that let you plant trees for not being on your phone (https://www.forestapp.cc/en/), products like the bullet journal (https://bulletjournal.com/) and feature reduced "dumbphones", etc.
It seems to me that this hype of making everything digital, from blackboards and car interfaces to home automation systems and coffee machines, happened without ever giving any thought to the real-world benefit actually gained by it.
A prime example: Teachers use maybe 10% of the functions on an "ActiveBoard" (digital blackboards you can "write" on), spend a lot of time trying to get basic functions to work or calibrating the pen, and their writing looks crooked. Chalk on a blackboard worked just fine before that, the "interface" is intuitive, and the writing (depending on the teacher) more readable, and it didn't cost a couple thousand € of taxpayer money. I'd argue that for the most part ActiveBoards resulted in a big productivity loss in schools.
When I'm teaching, I'd love to bring a hammer and smash phones when students check them. They're doing it as a way to avoid thinking, as a way to always have someone entertain them. They're addicted and don't know it.
> I'd argue that for the most part ActiveBoards resulted in a big productivity loss in schools.
We don't use them here, but not that long ago, "dedicated" teachers used PowerPoint. I've visited universities where they'd put faculty in classrooms where the only way to "teach" was to use PowerPoint. No amount of technology changes the fact that you have to communicate.
Rote teachers are rote teachers.
So regardless of your setting (kindergarten through university) - demonstrating good ways to study is worthwhile.
Demanding a classroom free of smartphones / social media is one very basic step in that direction.
You might be the first person to help a student realize that they are still smarter than their smart phone - and that they are in control, not it.
I live in a city and don’t have enough cash/room for hundreds of books in my apartment and don’t usually have time to reach a library branch. But this way I get/return books instantly and I don’t have to lug anything around on the subway. I read ebooks faster for some reason, too. No way to drop the book in a puddle or pour coffee on it. And finishing a library book on my phone is one of the rare times I feel like I actually used my phone to get something done.
I love tech when it cuts down on the number of objects I need to own or schlep.
Not that I would necessarily behave that way but it's at least wishful thinking.
It’s purely anecdotal, yes. I think research papers and brand acquisition patterns can substantiate the claim that we spend more on fancy food.
It could be our tastes which have been informed by analog and minimalist culture movements which I first noticed in the mid-2000s.
It could also be because we don’t have much disposable income and since home ownership is so hard where the jobs are (big cities) those with money still have nowhere to put stuff.
I don't know the numbers but I doubt that home ownership has ever been all that common among twenty-somethings. And I fully expect that many current twenty-somethings will end up moving out of the city to get more space once they have families. That said, the interest in living in/near the urban cores of certain large cities is a relatively recent phenomenon. When I entered the tech industry in the mid-eighties, almost no one in my local cohort lived in the city (Boston/Cambridge) which indeed was still losing population and tech jobs to suburban/exurban areas.
Although Boston/Cambridge have long been quite good for food, culture, etc., I think it's fair to say that much of that sort of thing has been significantly upleveled over the past 20-30 years.
I'm one step ahead of you. I'm currently unsubscribing from my weekly printed newspaper to have more time to read books. :)
That said, I can totally recommend to everyone to get your news from a weekly newspaper instead of a daily or hourly medium like TV or the web. A weekly newspaper has just enough distance from the breaking news to provide a good balance between reporting and analysis/commentary.
I don't want to go back to 10 years ago and definitely not 20 years ago. I have never been more productive than now.
Haha really? https://mail.python.org/pipermail/security-announce/2017-Sep...
It’s the Wild West out there. And then there’s NPM...
We salute you package maintainers!
We tend to get decent tutorials, and not very useful machine-generated reference material.
Think React vs "The C Programming Language". Or Docker vs the man pages for BSD jails.
I think this makes it harder to fully grasp new technologies - there's often a lack of a clearly stated vision, or a problem statement (this tool makes X, Y and Z easier with the following trade-offs, based on experience with A, B and C).
In the case of projects like Docker, MongoDB and Puppet, I think we can blame marketing quite a bit for this.
As for distractions: just turn the stuff off at work, check in your free time. Hopefully as an adult you've formed some real and lasting relationships; people will be there even if you take a day to get back to them...
[ed: as for that related trend of "keeping up to date" (it used to mean reading Byte and Dr. Dobb's etc., then the Web started to dominate with Slashdot, now it's here and Reddit etc.) - I've found I'm much more comfortable taking a step back; the hype isn't interesting - what shakes out is.
It's a sad trend that news in general has become very much tabloid, real news agencies that pay for and do real analysis has been waning for a long time. Thankfully, technology is more powerful than ever, and enables things like the intercept or citizen reporting like http://www.raqqa-sl.com/ ]
Yes definitely. Here's how I used to work back in the day: I would sit down at my desk on which there would be one monitor, probably a 17" or if I was lucky a gorgeous 19" Trinitron with the blackest blacks. I would have my main tool maximised to full screen - Metrowerks CodeWarrior (Mac), Visual C++ (Windows) or SPARCworks (Sun). On my desk there would be a paper copy of the spec I was working on and a couple of reference books, all annotated by hand. I would code in two solid blocks interrupted only by lunch, and often times I would lose track of time and know it was lunchtime or hometime only when someone said.
Nowadays I work in little nibbles of 15-30 minutes at a time, and it's not just me either - everyone used to work like that and now works like this. Everyone seems busier because they are juggling more things at once, but studies have shown that multitasking is a myth, we get much less done than we used to. Modern tooling without modern distractions and interruptions would be the dream.
Your task is to assess your bills and financial health, including outstanding balances, account balances, interest rates and dividends returned.
Your task is to determine the weather forecast for this afternoon.
Your task is to turn on your computer and download a full-length film.
Your task is to design, implement and deploy a data-driven application with a web UI. You must find and allocate hosting. The UI must validate user entries and provide feedback.
Can you give some examples of tasks that take longer to do today?
Your task is to bring meaningful, sustainable, and relevant hope and quality of life improvement to society.
We can get more low-value tasks done per unit of time now, thanks to technology. Unfortunately, there's something of our essential humanity that tech has robbed us of.
We pay lip service to the fact that tech has made these tasks easier and faster to do and that we are now freed to use more time engaging meaningful life. Instead, we are either so information weary, addicted to tech, or just disconnected from that essence that we don't use this free time well.
We're tired, lazy, disconnected, and addicted. Our ability to think critically across a range of disciplines has been broadly compromised.
So I'd rather just keep my thermostat at 67, balance my ledgers manually, plan for seasonally appropriate weather and adapt to change, go to the theater with my family and/or friends, and maybe write a book.
The only compelling application of technology is synchronizing financial assessments. Otherwise, I think I'd be happier if I wasn't addicted.
I'd be curious as to your opinion on the shape of this curve in history. Did we peak at some date in history? Perpetual downward slope? Perpetual upward slope?
Compared to 20+ years ago, I think there is more popular appreciation for "meta" and pervasive performance art, which is a definite productivity booster for poets at heart. The technology isn't needed for writing, but it has forged an audience...
You may argue that you asked for "lasting" representation, and today's market is anything but. But, that's in fact part of the beauty and plight.
Maybe in the past, it would take a whole night to do taxes. Today it would still take the whole night to do taxes, but three quarters of that time is going down the rabbit hole of googling tax exemptions, etc.
It used to take our family an hour or two to go to the video store, pick out a movie, return home — but we typically had fun doing it.
My dad got some weird pleasure out of owning the thermostat and telling us not to touch it.
Arguably people are better off if they can’t check out their investments all the time, and tax prep seems to suck just as much now as it did back then.
The real question, for me, is how distracted are we? If there have been productivity gains from technological advances, are those improvements undercut by our collective inability to focus on work for meaningful amounts of time?
I suspect there are not as many people with the concentration to do so as there used to be. I think that's what OP is getting at.
There's almost certainly more people with the time and concentration to write a novel than there has ever previously been.
I can, but I still have an old, analog thermostat that I could have had 20 years ago. Just turning the dial every time is, in my experience, much easier than fussing and babysitting a "smart" thermostat.
You spend an hour screwing around with a software problem while editing your video, but you're editing 4K shot on a consumer camera, not 8mm black and white film with no sound.
You lose an hour to some stupid router firmware problem, but a 30 min later, you are Facetime-ing with a friend and her new baby (not even possible for most people 10 years ago).
You reboot your phone to clear a weird Wifi problem, but 10 minutes later you're driving straight to the store that has your product in stock, while getting routed around a huge traffic jam. While listening to the radio show (er, podcast) you missed last week.
I think tech is really changing social interactions (and not entirely in a good way), but the time and headaches of life are still there -- they're just further up the abstraction stack.
Nowadays you're just one google search away. Definitely more productive.
We are convinced we are not productive - to the point we aren't recognizing when we are accomplishing more than 10, 15, 20 years ago.
It probably clicks at a different point for everyone, whether 5, 10, 20 years - eventually the thing you'll find that's always going to end up being more valuable than learning a dozen new tools, is your time. Your time is an extremely scarce resource. I'd like to emphasize that 407 times in a row here.
Tools aren't really tools if they don't make you more productive in some manner. What would be the point of a fancy new hammer that caused you 5x more work and accomplished the exact same result as a traditional hammer?
To answer your question: in my opinion, yes, five or more years ago it was easier and faster to complete routine tasks when it comes to building Web/Internet services & products. You can counter the growing complexity by refusing to do unnecessary things. If you don't need a tool, don't use it just because it's the latest fad.
Tooling can be overwhelming, but one doesn't have to master every single tool they encounter. To this day I use only a few shortcuts in my editor, I don't remember every command line switch that exists for ls, cp, or git. I launch the dev tools in my browser using a mouse, etc.
But it doesn't matter. These days I make fewer mistakes, and when there's a bug I can accurately predict where its source is.
Some of these productivity gains are insignificant but developers have to follow them anyway or risk being left behind. This means we're spending more of our day learning and less of our day developing.
As a developer it has become more important to find a comfort zone with a few well-chosen tools and to keep an eye open for incremental improvements.
Or do you mean things like shopping and cleaning?
If we want to blame younger generations for not "thinking deeper" or "solving the hard problems", then we should shift our focus towards the economic drivers that are favoring quick/quantity over quality. I know how to develop software, but if an employer is given the choice of fixing a buggy tool at the cost of maintaining a fork (GASP!) vs spending unbounded man-hours working around said buggy tool, my money is on them choosing the latter, since maintaining tool X is not a core competency. Meanwhile, tool X is hemorrhaging production data or producing buggy results and nobody flinches.
Really hard to find a workplace like that anymore, and it sure as hell was more productive than open offices and agile cargo culting.
For some reason people still love to criticize browsers, even though they have improved dramatically. Around 2000, every UI required two or three versions. Once you got it done in Firefox, you would start on the IE version, which would take almost as much time as the first implementation. Then, you’d repeat the process for each point release of IE.
On the server, Rails was the big revolution. The big achievement of DHH wasn’t so much technical as social: here, you had an extremely well-curated set of best practices, smuggled into your project under the guise of a framework. Before, you’d join a team and find the database password was in a switch block three levels deep in config/real/colors.inc.php3.~bac.php. Suddenly, you could join any team and know your way around the codebase on the second day.
Many other things that used to be major work items have become trivially easy. It would routinely take me a day or so just to get somewhat recent versions of Apache, php, and MySQL (and all their libraries) to compile on a new server. You’d encounter all sorts of difficult-to-debug networking problems when moving to production. There were certain tasks that seem like they should be trivial, but were actually terribly difficult to pull off in the web stack, such as chats.
First, the speed with which new frameworks and tools come out, definitely slows things down, because of the need to learn them all over again.
Second, part of my past productivity was thanks to me writing on top of what existed.
Example: 20 years ago I wrote a new product in PHP/HTML etc. Had both realtime and CRUD. I predicted that in the end it would need a lot of screens, so I started with writing an extensive, data-driven screen library. This was the main reason we could crank out new features like crazy.
2 years ago I picked up Laravel, the 'best of breed' these days, and was surprised that it did not include anything like that.
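A "data-driven screen library" in the sense described above might look like this toy sketch (screen definitions and field names are invented for illustration): each screen is just data, and one generic function renders any of them, so adding a screen means adding data, not code.

```python
# Each screen is declared as plain data: a list of field definitions.
SCREENS = {
    "customer": [
        {"name": "name", "label": "Name", "type": "text"},
        {"name": "email", "label": "E-mail", "type": "email"},
    ],
    "invoice": [
        {"name": "number", "label": "Invoice #", "type": "text"},
        {"name": "total", "label": "Total", "type": "number"},
    ],
}

def render_form(screen):
    """Render any declared screen as a simple HTML form."""
    rows = [
        '<label>%s <input name="%s" type="%s"></label>'
        % (f["label"], f["name"], f["type"])
        for f in SCREENS[screen]
    ]
    return "<form>%s</form>" % "".join(rows)

print(render_form("customer"))
```

Validation, persistence, and listing screens can hang off the same declarations, which is where the compounding productivity came from.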
On the other hand, the field has advanced considerably: concepts gaining traction like immutability, functional programming, CI/CD, and containerization have really helped move everything forward. So I'm not an old grumpy greybeard who claims there's nothing new under the sun; I'm really excited about a lot of the new stuff. Which I am happy to use.
I started out with mainframe assembler and C, and am now in Elixir, Docker, React Native. Higher level languages really help general productivity. So my answer is yes, we have become more productive.
What's that? We're discussing productivity? Oh. Carry on...
From a productivity point of view you can ignore a lot of it and that is largely what I have done. However if you intend to have a long career in this industry that might not be a good idea.
I think programmers were always equally productive at any time, because their job is to constantly automate the non-productive parts, so sooner rather than later they are eliminated.
My biggest challenge these days is finding just the right amount of "new tools / solutions" that I can get control over, without losing focus on the problem I'm actually solving. I often find myself "too deep down the rabbit hole", and am getting more and more comfortable with just cutting all the "cool new stuff" and finding another angle to get things done. 9 out of 10 times you can find a very nice solution within the technology you master, and it's best to ignore that "I need to use cool new stuff" itch.
The power you have these days over the complete spectrum of software is just magic! You can build a whole new platform / application in days! It's a wonderful time to be in this industry and I love it!
But it is important to realize that this has nothing to do with time passing. The most expensive failed software project ever was the attempt to modernize the FAA, in the USA, from 1982 to 1994, a project which cost $3.7 billion and which failed completely. And there, too, the problem was too many tools, too much abstraction, too many experiments with options:
"The project was handed over to human factor pundits, who then drove the design. Requirements became synonymous with preferences. Thousands of labor-months were spent designing, discussing, and demonstrating the possibilities: colors, fonts, overlays, reversals, serpentine lists, toggling, zooming, opaque windows, the list is huge. It was something to see. (Virtually all of the marketing brochures – produced prematurely and in large numbers – sparkled with some rendition or other of the new controller console.) It just wasn’t usable… The cost of what turned out to be a 14-year human factors study did not pay off. Shortly before the project was terminated a controller on the CBS evening news said: “It takes me 12 commands to do what I used to do with one.” I believe he spoke for everyone with common sense."
So we would be wrong to think that software developers were productive in the past, whereas now they are not productive. But rather, some paradigms of development tend towards too much abstraction, and when followed they lead to failed projects. That was true in the 1980s, and it is true now.
I do think, on the frontend, the desire to take markup languages such as HTML, and then make them work on all output devices (desktop computers, tablets, mobile phones), has led to an era where too much abstraction is the norm on frontend projects. I tried to imagine an alternative in my essay "The problem with HTML":
For my first year at uni, after deciding not to pursue programming professionally, I decided to go low-tech: notebooks and an agenda, internet only when necessary. I definitely read more books and got more of my to-do lists done.
Then I decided I'd use the computer more, links and images were logical extensions to notetaking, and being able to hyperlink all my notes and documents was an unforegoable improvement (thank you Org-mode and Emacs, if not for these two jewels, I'd not bother using a computer for anything else than the browser).
Nowadays most of my stuff is digital, and I use some "tools" to interact with them. I research in the browser, I'm slowly getting the habit of reading shorter papers with less than 20 or so pages on the computer, almost all my notes are on the computer (with some waiting to be digitised), I use Emacs and other tools for processing all sort of data (not in the statistical sense), version control is used everywhere, from todo lists to orgs to short stories to research notes, wherever applicable (I use Mercurial or RCS generally, depending on the task).
Comparing the two ways of working, I definitely have to put up with more distraction with the digital setup, but I also know and learn more. It's hard to tame the internet to not be invasive (middle finger to content websites which disable RSS feeds for pageviews, fuck you all), given most actors are actively trying to be invasive. Recently, for example, I found a researcher whose work I wanted to follow. She had a twitter profile and an academia.com one, both platforms that provide no RSS feeds. No other profiles. Now I either have to open myself to this sort of invasive website that is constantly trying to learn more about me and push stuff all the time, or just not follow her. And I chose the latter. But that's fucked. The business model of the internet is "we give you some stuff, often other people's stuff, pay us with your attention, and moreover we sell you to as many advertisers as we can". That's fucked, but given the utility of the internet, one has to learn to put up with it, and that's not all that easy.
I do not follow live news, even with RSS. I'm subscribed to some mailing lists from newspapers and journals, weekly or daily. Other than this, I use RSS extensively. If your page does not have an RSS feed, I'll probably not follow you. A newsletter? Only when your thing is really interesting, and you post no more often than weekly. I follow YouTube channels with it, so I don't have to open the YT homepage and be subject to many interesting but distracting links I might be tempted to click. I do not enable notifications from anything, including mail, even on my mobile phone. I decide when I want to know about the outer world: I check feeds or mail manually, when I want. I use no social media. I do use Reddit, but I don't subscribe to any sub; instead, I group them into multis, and check the relevant multi when I think I need to see something there (and to my surprise, with reddit.com now giving me a blank front page, I spend a fraction of the time there that I spent in the past, when I was subscribed to a handful of subs).
The biggest distraction with computers is the internet. And one needs to learn how to use it defensively. Maybe we need a "Defensive Internet Users" wiki thing?