Your expectations, and the expectations of others, are your enemy here.
At least, that's what got me out of it. I'm still disillusioned with the world, but it's manageable when I realize I'm making a difference to my son and wife every day, and that's what counts for me.
The OP said he is "disillusioned with technology", but I didn't see actual technology described as the problem at any point. So there's a conflation happening here between technology and "tech" companies. And I can only say the phrase "tech" companies with sarcasm quotes around "tech", because almost nobody at any of these companies develops actual technology. And that's a big part of the problem.
It doesn't have to work well, or be bug free, or compatible with the previous version, or address any real world need.
No! Instead it has to use fashionable technology and be extremely complicated so other programmers can see how incredibly clever they are!
Almost like they took a random bug from the issue tracker by looking at last night's lotto numbers, then opened Knuth to a random page by letting a fan blow on the pages for 5 minutes, and said "alright, I'll solve this problem in that way! Surely everyone will acknowledge my genius!"
It may be a slow lumbering buggy pile of brittle barely functional code about to implode, but boy does it look nice!
However, even I can see that a team would get burnt if they built enterprise software the way I code my personal email client. The problem here is that people forget that building software professionally is an engineering job. Like other forms of engineering, there are processes and good practices that support both the functional and non-functional aspects of the software and of the building process. While the extra burden of version control, testability, and extendability takes some of the fun away, I would have reservations working with someone who pushes directly to a release branch, doesn't write tests, and hardcodes values instead of using configuration. It's about balance and realising that a job is a job.
However, the CSS was generated. From YAML. The YAML was generated from JSON. The JSON was pulled from Mongo. The Mongo was updated through a strictly validated XSLT that had an enumeration of colors. Those colors did not include the button color we needed.
Don't worry! There was a way to reroute the XML using Ruby mixins and then add the attribute by parsing the DOM, editing it, then re-ingesting it later downstream so it gets to Mongo correctly.
Oh, and there's a cache layer at every point here. So make sure you invalidate it to see the change. Every time.
I closed the ticket and did a few more like this for 6 weeks, mostly in Ember - they were even crazier.
The product, an interface to some server software, had basic HTML with markup like this:
Like the most trivial stupidest simple code you can think of. 15 minutes of php, at most.
However, changing it to do something else was never direct. 4, 5, 6 maybe 7 different languages, servers, restrictions, databases, input and output formats ... absolute and total batshit.
I left. Company is worth over $100 million today, looks like I'm the loser I guess.
This isn't about having a dev and release branch, this is about endless layers of abstraction and insanity that make easy things 1,000 times harder and almost impossible.
I thought you were exaggerating to make a joke. Sigh.
> I left. Company is worth over $100 million today, looks like I'm the loser I guess.
Somebody played the lottery and won. 99.9999% played and lost.
You are not a "loser" for not playing the lottery.
As if we'd have 5 or so platforms and a single command to go "presto!" and build on a bunch of devices with a slight button offset change.
I always said "how about, if you want a Linux version in Qt, you fire up Qt Creator, drag your mouse around a bit, click a few times, take 20 minutes, and that's it. You then literally just walk away because the work is done".
This doesn't mean that you would have been better off without git workflows, code reviews, and tests. My experience has been the opposite. It's exactly where there are no good technical leads, technical discussions, or code reviews that I have seen this kind of mess. Each person goes and does their own thing with no clear direction or architecture.
At the end of the day, for me, to stay sane in the face of this kind of nonsense (as in bad engineers - unfortunately there are those even in senior positions): so what if the button that could have taken a few minutes to change now takes 6 days, if we are paid to do it? As a responsible engineer, you point out the issue, perhaps with a proposal for improvement. If they listen, good. If they don't, fuck it. On the other hand, if all I end up doing is changing colors of buttons regularly, each taking a week, then for the sake of my career, just say no and move on - unless of course the pay is so good that it makes sense to spend a couple years doing it; people have to do far shittier jobs for less. And spend some of the free time coding like a cowboy :)
That's the whole point. What some people consider to be "good engineering" is a different set of standards, a different set of qualifiers.
Let's go back to 2012. I was yet again doing web stuff.
We had this hodgepodge of Jasmine, JUnit, ESLint, and Selenium, and couldn't commit unless it all passed.
But the tests broke more often than the code itself, because they were far more complicated than the thing being tested. So more time was spent fixing and babysitting the tests than writing the damn software.
Alas, we finally released and it totally completely bombed. Why?
Because those test suites don't care if something "feels" clunky or "looks" wrong... The machine responded to the interface in machine time; it didn't actually test human time, which was the only thing that mattered. We should have relied on human dogfooding, like the business books say to do. I got arrogantly laughed at for suggesting it, multiple times; that simply wasn't "engineering" to this team.
Now of course tests are valuable, sometimes. But "sometimes", that's the important thing. Understanding when to make that call is actually important. When, where, what, why, and how - not just important for journalists.
But instead, like some 18th century royal court disconnected from reality, we did ceremony. So we wrote tests, most of them bullshit. One of the tests was essentially: "Does this image on the page load from s3?"
At least that one usually passed.
Except when AWS was down or our internet went out: "I guess we can't work today, the does_image_load_from_s3 test is preventing the commit." They were a waste of time and got in the way of actual work. But we HAD to have them, we MUST, right? Nonsense.
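For what it's worth, a check like that can be made hermetic so an AWS outage can't block commits: stub out the network boundary and test only your own logic. A minimal Python sketch, with hypothetical names (the real test presumably hit S3 directly):

```python
# Hypothetical example: the code under test takes its fetcher as a
# parameter, so the test can pass a stub instead of hitting S3.
def image_loads(url, fetch):
    """Return True if fetching `url` yields a non-empty body."""
    try:
        return bool(fetch(url))
    except OSError:
        return False

def test_image_loads():
    # Stub fetcher: no network, no S3, no dependency on AWS being up.
    stub = lambda url: b"\x89PNG fake bytes"
    assert image_loads("https://example.com/logo.png", stub)

def test_image_load_failure():
    def broken(url):
        raise OSError("connection refused")
    assert not image_loads("https://example.com/logo.png", broken)

test_image_loads()
test_image_load_failure()
print("ok")
```

Whether the image actually loads from S3 is then a monitoring concern, not a commit gate.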
I'm convinced the tests were there because "doing it right" was about virtue signaling. So we built a salary-defending Potemkin village composed of pure thought stuff.
I imagine it all like a Catholic mass: men in robes walk around, ring bells, and use special boxes to wash their hands with special cloths. It's all very important if you go to church, but that's the point: it's praxis and faith. We were coding from Plato's cave, creating intricate shadows of reality that represented actual work.
Symbols passing as tools: like Dumbo fetishizing the feather and being oh so worried when it falls, everything passed the most sophisticated testing I had ever seen yet the program still crashed in the user's hands almost every time. All that work was mere ceremony.
Understanding how modern computing speeds and VC capital have allowed people to be wrapped up in their own bullshit, call it programming, and get away with it, is a major insight into why technology sucks today.
It's not just you, everyone agrees. It's lame now.
I don't mean dogmatically follow unit tests (actually my statement wasn't predicated on unit tests). If you have a better approach that can validate that the software is correct, then I'm all ears. If you have a better collaboration tool than git, I'd be happy to try it. With all due respect, what I can't do is take your word for it that you can build software that works (covers all functional and non-functional specs), that continues to work as more code is added and as it's worked on by more than one or two developers, and that you can continue to validate and roll out more software over the years even when the original developers aren't around. It's difficult and costly. It can work; it's just not the best way. The industry will evolve and come up with better tools and processes than we have today, as it has before. The only thing I took from the couple of examples you gave is that you've had the misfortune of working in some terrible teams. Though I still don't see how you'd be better off without the rest of the usual practices we have today. The guys who couldn't code the tests - I can't imagine them building non-trivial software well either. I'm not defending any one process. I just don't agree that we don't need any process and that we should just write code that (seems to) work.
This is what this whole thread is about.
If tests are "taking the fun away", like you said, they're shit. Simple as that. Tests are a productivity tool; they're supposed to make your job easier by providing faster feedback. If manual testing is faster than your automated tests, your automated tests are shit. If they cause developers to write the bare minimum of tests, they only generate a false sense of security. This is even worse than saying "ok, we don't have tests, let's be careful and test manually".
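To make the "faster feedback" point concrete: a test over a pure function runs in a fraction of a millisecond, which no manual click-through can match. A tiny sketch (the `slugify` function is a hypothetical stand-in):

```python
import re
import time

# Hypothetical pure function under test.
def slugify(title):
    """Lowercase and replace runs of non-alphanumerics with single dashes."""
    return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

# Two checks complete in well under a millisecond.
start = time.perf_counter()
assert slugify("Hello, World!") == "hello-world"
assert slugify("  Spaces   everywhere ") == "spaces-everywhere"
print(f"2 checks in {(time.perf_counter() - start) * 1000:.3f} ms")
```

When the loop stays this tight, the test genuinely is a productivity tool; it's the forty-minute suites that invert the economics.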
It's like that stupid saying that bad documentation is "better than nothing". It's funny how people change their minds when they spend four hours or more on a stupid rabbit hole because of outdated documentation.
On the other hand, one of the best jobs I ever had was maintaining shitty web apps written without source control, tests, documentation, patterns. Thousands of lines of code. Some of them didn't even have source code: I had to decompile the production server DLLs. I don't have any coder friend that got burned out by "bad code". Not having autonomy to improve the bad code, on the other hand, made a lot of them change jobs.
You are describing "CV-driven development", where people want to use heavily marketed technology brands in their CV as a substitute for real skill and experience.
I've found a really nice perspective on this recently: "An app can be a home-cooked meal".
It's okay to build things that aren't popular, that don't scale, or that aren't economically viable, for the delight of a few users.
But that's more an analogue of "use your program before you release it".
At least they are 'dogfooding' it and not just letting others eat it I hope :)
If I'm cooking a dish that I've cooked at least three or so times before I generally have a good idea of how things should be going and I can get away with only tasting once or twice at the end.
You can write unit tests and Spring components with 64-character names all day long, but by the end of the day you are completely disassociated from your contribution. Rarely is anybody there to thank you, who is grateful you made their life better, or who takes some simple joy in what you created. It does happen (e.g. a major release), but it's not a regular event. It often doesn't feel like you just made your community a little better by producing something sensible.
I know people who do hobbies like carpentry and hand out their (amazing) work as gifts. You can see them oozing with fulfillment when they do, and going into their hobbyspace is an escape from the world of work-for-a-living.
I have the same attitude with apps as with the books I write: if some people enjoy them and I get to occasionally meet (probably virtually) app users and book readers, then I am very good with that.
re: "going into their hobbyspace is an escape from the world of work-for-a-living": for most people this is really important. For me, I go wilderness hiking every day and have a hobby of cooking so I have several hours a day away from technical interests.
I have nothing to show for my 14 months at [redacted], but I still use the trivial little app I made years ago to redirect my search queries to different sites. It took all of 5 hours to write.
As a graybeard, I've seen multiple generations of "how to do software" and the most recent are the least fun. A lot of this is driven by the agile approach. It's tailor-made for dev burnout: from the endless tight cycles that force people into an infinite loop of productivity with scant satisfaction that comes with "completion", to all of the tools and philosophies designed to juice that endless loop to be successful/workable. CI/CD, Git, TDD, etc. These all impose on the developer's creativity, independence, and enjoyment. They turn devs into cogs--assembly line workers who must not stop the line at any cost.
One example: back in the day there was a nightly build, not a continuous one. And, you checked out a file, worked on it, and checked it back in. If someone else needed it they had to wait. That obviously had its limitations and it seems laughable by today's standards. But, it was reflective of a human pace that considered devs as people vs. optimizable assets. That is, it was workable because the expectations on devs weren't insane. But, now we commit and merge. Think of how much less fun it is to spend your time resolving merge conflicts.
More to the point, the approach itself implies a chaotic pace wherein code that meets the standards of a certain box must be produced at all times. Devs must bear the cost of resolving any conflicts (literally) that arise from this chaotic pace.
Likewise, with CI/CD. And don't get me started on the monkey-work that is TDD. You might argue that it improves code quality. But, it's hard to make the case that it improves job satisfaction. If you move more work from the creative, problem-solving bucket into the busy-work bucket, the result will not be personal fulfillment.
Does agile increase productivity for companies? Sure. But, it comes at a high cost that's mostly paid by devs.
I think the bad started for me (long ago) when the Microsoft style management and so-called MS best practices began to conquer the PNW.
In the late 90's a company I was in started driving to an exit. First they hired an ex-MS Group Program Mgr, whatever that means. Next, came tons of PMs pulling one or two devs into their projects/features. Doing it the MS way I guess (at least the MS way back then).
I was a manager, I refused to dole out my team members to PMs -- everything my team did was run through me. I don't even think I let them enter individual devs into their project plans. Just the team.
This worked well for us (my team) because I knew their capabilities, interests, family commitments, likes/dislikes, etc. I could adjust resources as needed to meet the team commitments. We had successes and failures as a team.
PMs who tried to put pressure on my devs behind my back would really catch it from me.
My management style wasn't any new idea, it was what we did in the Army. Assign a team a task, then the team leader ensures the task is completed by the team.
It was a good time, we were the only real team in the place as the other managers embraced the MS way of doing things.
At the end (right before the dotcom bust) the company started doing some Agile-lite with two-week release cycles, absent the stories, standups, etc. I liked that enough to put it in place at the next company I worked at.
I did get burned out though, mostly because I didn't want to deal with what the industry had become as the last startup I was at petered out.
I still love programming, but not enough to do it in a modern shop.
This rings true. It's probably why I have always had a tendency to look sideways at these efforts to turn everyone into a coder. I get it: there's demand, opportunity, etc. But, for me, there's always been a cynical element of devaluing actual coders to it.
The stuff you're talking about regarding your management approach almost seems like a relic from a bygone era at this point. So many companies now allow the process to manage devs. PMs back then frequently over-focused on the work vs. people, but even many of them have been replaced with some version of a scrum master with an even more relentless focus on the never-ending storyboard. They're driving the work over people approach without apology because it's what the process demands.
This is not to say it's 100% the case across all companies. But, there's very much an inhuman element to the process that has manifested to some degree in nearly every place that employs agile.
> Likewise, with CI/CD. And don't get me started on the monkey-work that is TDD. You might argue that it improves code quality. But, it's hard to make the case that it improves job satisfaction. If you move more work from the creative, problem-solving bucket into the busy-work bucket, the result will not be personal fulfillment.
> Does agile increase productivity for companies? Sure. But, it comes at a high cost that's mostly paid by devs.
I agree wholeheartedly with you. I'm by no means suggesting we move back to waterfall, but I am really enjoying the work I'm doing more and more of lately: embedded. Nominally it's a sort of Agile-type workflow (Kanban-ish), but because there's hardware design and manufacturing in the loop, things get planned out early and the multi-month plan doesn't change very much. New algorithm ideas pop up and get scheduled, new ways of doing sensor filtering pop up and get scheduled, but the direction of the wind doesn't change at the start of every "sprint". There are no sprints, just a prioritized/sequenced task list that gets reevaluated periodically.
(Plus, I get to go back to my old days C-hacker roots. The thing builds with a Makefile and spits out a ~32kB binary that gets flashed onto the device.)
I've always thought this was a red herring for scrum people to smack dissenters with. I've never done a strict waterfall with an unyielding plan. Before agile, we would do 3-month release cycles. We prioritized what we wanted to get done in the 3 months, and what we didn't finish fell off to the next cycle. This was done at BigCompanyYou'veHeardOf and SmallSoftwareDevelopmentCo.
In my experience, waterfall project plans were never meant to be set in stone; they were just a guide to give us an idea of what the project looked like. I've done a 3-year plan too. I'm sure some places implemented it strictly, probably government or heavily bureaucratic institutions.
I started writing out the whole story, but in a nutshell: my formal background is a dual EE/CompSci bachelors, followed by a Masters in CompSci that focused on distributed systems. When I was a kid, I learned to program very young: BASIC around 8, C in DOS around 11, C on Linux around 13. I fiddled with electronics some, but didn't really get it. I took the EE part of my schooling specifically because I wanted to learn more about how computers worked "under the hood", so to speak.
From there, I ran a web consultancy for a while, and ended up with a client that had a more math-heavy project. And then another client showed up with a project where a microcontroller made sense, and then another... My business partner was moving across the country, and I was enjoying what I was working on quite a bit more than I was in the web/mobile space, so we decided to part ways. I still do the occasional web/mobile project, but they're generally hardware-related (e.g. the Bluetooth connectivity portion of an IoT-type system). Over the years I've accumulated probably $15-20k worth of equipment and software licenses, and the customers keep showing up!
We have similar backgrounds in a way. I started coding as a kid too, at age 9. First Basic, then Pascal. Picked up C in my teens. Also tinkered with electronics and was part of an online robotics mailing list that was a lot of fun. It was very hard for me to get parts, living in the middle of nowhere in rural Southern Brazil, but some folks in the mailing list were super cool and shipped me parts from the US. I live in the US now.
I'm a CS major but took electives in embedded systems in college, and those were some of the most enjoyable classes I took. I'm now working on recalling some of that. Ordered some PIC parts and I'm currently taking an edX course on ARM programming.
My only problem right now is, I have no idea how I'd get into that space having a whole career built on server-side software.
If you ever need any new hires... /u/cblum and I would like to have a word
I've always loved the carrel desk. It seems perfectly designed to encourage the state of flow in its occupant.
Is it a form of abuse, a way of keeping everyone under control? Is it a form of psychological conditioning, reminding you that you are just a resource whose top priority is to be interruptible at any time, that you don't really have any personal space, sovereignty, or privacy for your own thoughts or creativity, and that you must answer and create only for the collective?
I don't feel I have a clear picture of what it's about. Any insights?
Open offices facilitate that feeling of persistent accessibility and production. No closed doors to slow anyone down, and no notion that you are there to do anything but work every minute of every day. So why on Earth would anyone need privacy?
This is full-on agile ethos. And, for the same reason, agile is also responsible for the reversal of telecommute policies at some companies.
When I call someone and in the background I can hear 20 other people talk, I immediately assume that the person I called is not considered important in his/her company. Because high-level work needs uninterrupted quiet time.
For agile, it's similar. When you stop having different roles, then you implicitly assume that your lead architect and your junior trainee can do the same work, albeit at different speeds. If your architect has useful experience, that's an insult. Or it means that your entire product is simplistic enough to be built purely by trainees.
So both open office and agile effectively reduce your programmers to expendable grunts.
Also (and somewhat contradictory to the above), I expect that, if your colleagues are _too_ easily available, you'll be running to them with a half-baked question in your head, only to blurt it out before realising you didn't think it through, wasting both their time and your own. But before you go knocking on someone's office door to ask a question, you really want to be sure that you know what you want to ask, and that you understand the problem well enough for the answer to make sense to you. Which leads to a better conversation, less time wasted, and better learning.
The crazy extension of this is "free desks" where basically every place is a workstation, and you don't have any space to call your own.
I have similar views on wfh, fwiw. Like 1-2 days a week is fine for me but full time remote is far too isolating IMO.
Fresh out of undergrad, open office layouts feel familiar to the long hours spent sitting at big tables in library basements, so I can see why some people like it initially. After a few years, you will be pining for four walls and a door to get anything done.
> The difference is, with your own space you can do deep work on your own terms. No longer is your train of thought constantly interrupted
What's wrong with headphones for deep work? I appreciate not everyone has these, but where I work atm we have small 'quiet zone' booths where you can take your laptop if you really want to focus deeply.
As for being interrupted, again maybe it's just a company culture thing. Where I work, people are generally pretty respectful if they see you with headphones on, intensely focused on something; they'll probably just ping you a message on Slack asking when's good to talk. But equally, if people have their headphones off and open body language, everyone feels happy to strike up a conversation and there are no barriers in the way of collaboration.
So to my generation, this bouncing ideas thing is less of a requirement because we're used to doing our own work.
If you look at scientific papers from fifty years ago, most had just a few authors, now many of them list eight or ten.
The team approach seems to be taking over the world. However, I would point out that most truly great work, think Nobel prize, or truly awesome engineering work (K&R, UNIX) has till now mostly been achieved by individuals or by small teams, and that there was a lot of focused individual effort put into them.
I don't mean to sound condescending at all, because the young are going to win out by default; you guys are the future and we are the dinosaurs. Your preferences and work habits will become the correct ones (whether they are better or not).
However, I would suggest that you like open plan environments because this preference has been trained into you since early grade school.
What I really hate is filling in form A, which directs you to form B, which you fill in to see whether form C is required. Form B was put in place by a legal team, who don't provide any point of contact, and who are not your friendly local legal team.
Because the process of filling out forms is so time-consuming, your engineering team uses Asana to track it. However, your PMs use an Excel spreadsheet, the legal team uses JIRA, your copywriters use a Google Doc, and the teams that own forms B and C use separate internally created tools. You update your form-filling progress in 6 different places, some of which have bugs and others of which aren't actively monitored, so you also have to use email/Slack/Skype messages to follow up with the right people. Some of the right people are actually the wrong people. Some of the right people reject your proposal because they didn't read it properly. A few of them reject your proposal for reasons that are actually valid, but which you could not possibly have known, because they're based on tribal knowledge which is not documented anywhere.
Filling in the forms and fixing the issues takes literally 3 months and at least 4 group meetings as well as several skype/zoom calls. One day you are finally allowed to write the code. It's 150 lines including tests, across two services, and you're done in two days. Everything that caused your proposal to be rejected would have been found in development. You quietly wonder whether you would have been happier as a bricklayer.
Anyway, I think my comments on CI/CD, Git, etc are being misconstrued. I'm not saying they don't have value. I'm saying they are frequently part of an ethos that leads to developer burnout.
For instance, your CI/CD process is one you must "appease", along with the rest of the bureaucracy. It may not feel as odious (and may even seem a relief, relatively speaking), but it contributes to your total load.
And, Git is fine. In fact, perhaps even perfect for the current culture (in philosophy, if not always in execution). Each developer has his/her own repo, you can work offline, you merge instead of locking check-outs, etc. But, that's the trick: it feels perfect because it allows you to work in the always-on philosophy of agile. Its popularity sprang up because this world of high burn-out, constant productivity demands it.
So, we can see the utility of these things and even appreciate them. But, the model they enable and expectations they create can still ultimately lead devs to burnout.
A lot of the comments in this thread are pretty funny to me. CI systems are awesome when they are working. It is a real pain in the ass not knowing when something is broken or not being able to find out when it broke.
Similarly, writing good tests is really helpful for me. I do this even on my own personal projects.
These things are so helpful to me that these comments are like reading about people who hate source control.
>writing good tests is really helpful for me
I'd wager it's partly generational. If you came up in the agile world, then you likely see the upside but not the down since there's no reference point for the latter. Because, of course it's cool that these CI build processes kick off at commit points (or whatever). And, of course, the near instantaneous feedback that something is broken is more efficient than awaiting a nightly build. OTOH, we were much more cautious about the code we checked in because the stakes were higher and you didn't want to be the heel who broke the build. "Move fast and break things" was not a thing. Instead, "be thoughtful about what you're claiming to be good code" was the ethos.
But, this is not an argument about efficiency or whether these things can be made to work. The argument is about the cost to the developer of all of these things in sum, and the philosophy they serve.
Likewise with TDD. I'm not arguing that tests don't help code quality and I've heard others say they like writing them. YMMV and all that. But, it's more load on the developer and I've definitely seen it overdone.
So, again, without the reference of a "saner" world then you don't likely have the context to fully appreciate these costs. I see them, though, and frequently hear them when people complain about burnout. It's not CI/CD or TDD or whatever that's the problem. It's that these things are frequently used less as tools and more as the instruments of a philosophy that plugs developers in alongside them as just another part of the never-ending pipeline.
Then I started working at a big company with a good engineering culture, I learned about writing tests and related tools, and had a much better experience. Now working without these things feels like driving without a seat belt, or maybe using a grinder without safety glasses (car accidents are too rare to be similar to bugs being introduced...).
I still have fun writing personal projects. It's just better with tests and CI. It sounds like you are pointing at issues with management styles that just happen to exist at the same time as useful tools like this, but I think they are almost if not completely orthogonal.
Not sure I'm getting my actual point across though so, rather than repeat myself, I'll just say the "continuous" part of CI/CD has implications on us as humans that are difficult to fulfill over the long-term. So, maybe really consider that word continuous in this context. We're just not made to be cogs in an automation pipeline that never ends, which is essentially where the philosophy (and its enabling tools) places us.
So, it can feel fine and you can see the merits of the tools, etc. But, none of that precludes the burnout that so many devs ultimately face over time.
A quick test framework that doesn't require too much boilerplate, a GitHub Actions CI that runs in a few seconds and helps me, continuous delivery that has everything figured out for me?
I think those are a net positive.
When it makes my job a living hell? Nope.
A CI that takes 40 minutes to run? A CD that requires manual intervention, has "a line" of builds, and makes you babysit your build? Testing that requires multiple lines of boilerplate?
Then I'd rather live without those things.
Now... don't get me started about JIRA and other tools like it, which are tools for micromanagement.
This weekend I implemented an RS232-powered RGB LED controlled by an ATtiny85, which reacts to strings sent over that very same RS232. A gross violation of the standard, probably, but it works! It definitely added a lot of joy.
I typed “git init” only after I finished the first working version, which had a regular one-color LED and could not yet do blinking :-)
One thing that made the development of it all immensely easier was having a Saleae logic analyzer. Highly recommend it (no affiliation, just a happy customer).
- Implement a CPU and peripherals in an FPGA
- Throw together an OS
- Make a handheld game console
Why? Because it would be fun to remind myself of the "first principles" (not talking physics here) and play around with it.
For now, I'm playing around with Raylib and Allegro, which are C game libraries.
Do you have a link to your console hardware?
Does not sound like it would keep me sane…
In academia the aim is to get a proof of principle, write a short paper, and move on. If you care about reusable code you are building a foundation for someone else's success, but not necessarily your own.
Neither side is wrong, it's just a different game.
I would say to also work in a domain you care about and like. I don't like technology for technology's sake and never did.
I like solving problems and see technology as a tool to do so. I love the company I work for, the people and the problems we get to solve. I love even more when I solve them without the need to write a single line of code.
When I started my first programming job a couple of years ago I joined a team that demanded 100% code coverage. I hated them so much. I was still learning the ropes at this company and I only saw the unit tests as a barrier to getting my work done and earning my paycheck. My first solution was to create bogus tests that always passed. That was quickly discovered and I was reprimanded. My second solution was to get colleagues who shared my hate for unit tests to approve my PRs before they were reviewed by my team. That too was thwarted.
Then one day I was working with a teammate on a new feature and we discovered a bug. He quickly opened up a test file and wrote a unit test, then tried a couple of solutions until the test turned green. Then he looked at me and said, "When you are working in a pile of crap, testing makes you feel more confident about your code." That was my first insight into the value of testing. Eventually I came around and stopped trying to avoid tests. I just did the damn work. Once I established trust with my teammates they began to let the pressure off my PRs and slowly the displeasure of writing tests went away.
You pointed out a few different coding practices that frustrate you. And to be honest, those coding practices are not gospel and should be deployed only when truly needed. However, I think you have a serious problem with what a lot of us call being a good teammate. At the end of the day your goal should be to get the product shipped; once you focus on getting your features out the door, unit testing and pull requests become minor details in that process. They are just a courtesy to your teammates to show them that you are willing to be a responsible and helpful team member. Stop trying to fight everyone so much and maybe you will enjoy your job a little more.
> "Don't make enterprise software. [...] Don't accept pull requests. Simply write software for yourself and have fun doing it."
I've recently started doing that and it's been a breath of fresh air.
I don't really like the stuff I work with, which is services. I think I've become good at it given the feedback I get from my peers every review cycle, but I really don't like it.
I felt burned out for a long time because of that.
Recently I've simply been doing what I'm interested in, in my spare time. That's learning about embedded systems, something I had an interest in in college but never pursued a career. And for fun, tinkering with old stuff that makes me nostalgic. I spent this last weekend coding in Pascal and messing around with FreeDOS :)
Believe it or not, "all the edge cases" can still be perceived by the right mind. It's just that we as an industry have done seemingly everything we can to push those folks out; just look at OP as an example.
I've supported ten-digits-per-year (non-SAAS) businesses without unit tests or code reviews and oftentimes deploying straight to production. As the sole SWE for my codebase. Supporting hundreds of remote installations with nothing more than SSH tunnels relayed through an ancient, colocated linux box. And the software was very good, didn't ship with many bugs (and when it did they got fixed real quick), and there were never any catastrophic, non-recoverable issues, nor ever any questions on the integrity of the system or my reporting. We were never seriously hacked (to our knowledge). Crazy times... not sure I would do it again, but.... it can be done.
My first dev job we did most of our work in vim sessions on the development server, and more than once I was asked to hotfix live code running in production. Through the grace of God and an abundance of caution nothing ever seriously blew up as a result of all this madness. (Though, ask my boss about the time he tried to move our MySQL instance onto some very early SSDs) Again, it can be done. I'm sure most of the old hats lying around have tons of stories like this.
Most of my job is replacing stuff like you describe. And it's definitely not fine. Nobody knows how it works because that single SWE is gone. It can't scale with the business and it's a huge drag on productivity because nothing can change without a massive testing effort to ensure it's not broken.
In my last job I'd advocate tirelessly for unit tests, code reviews, all that. And it was always denied. Ironic given that the other engineers I worked with were MEs, who had notebooks full of processes describing how they were to work so that engineering issues would be caught. But the software? "I don't care how you do it, just ship the feature"
As an aside, I've always found "nobody knows how it worked because XXXXX is gone" to be kind of funny: the code knows, so go read the code. It'll mean everything takes 100x longer but the knowledge is there.
My approach to that is to not ask, just write the unit tests anyways, ask a peer to code review without management permission, etc.
We are professionals. Part of that responsibility is to know the best practices for our craft and put them into practice.
We also need to be good at getting the requirements from the technical and non-technical people we work with, and being able to show consistent, incremental progress, and a willingness to quickly change direction when the requirements change.
But we do not need to get input or permission for the process we use to produce those results.
There are plenty of SWEs that just don't know any better. I know because I was one of them while writing a lot of software for a business.
I was drawn to software at a time when it seemed like we had control over things. With the advent of the cloud I feel like that control keeps slipping away more and more. Kubernetes is my new nemesis. I seem to be in the minority that perceives it as unnecessarily complex for most tasks. Someone once commented here on HN that the k8s trend made them think people are trying to pretend their code doesn’t run on hardware anymore, and that really resonated with me.
My boss recently introduced Kubernetes to our software deployment.
The first thing he said, though, is if you don't absolutely need Kubernetes, don't use it. It is extremely complicated and finicky and difficult to deploy correctly, and can bite you in subtle ways.
Then he went on to describe, for our problems, how Kubernetes was absolutely necessary to scale without constant manual intervention and configuration and deployment processes consuming our time.
I appreciated that he had thoroughly thought through the problem before adding more complexity.
I do think there are options to achieve all of that - scale without manual intervention, etc. - without Kubernetes though. At least where I work, if I look at the deployment issues we have, those are all the product of bad decisions and lack of action due to higher priorities. There's no reason why there couldn't be more automation. They're building something new on Kubernetes which looks promising (though I really hope that as an app developer I don't have to think of Kubernetes things, which just irk me), but the current platform would work well too if investment was made into automating the parts that aren't automated.
Indeed. Haven't people been doing that since, well, cloud computing?
You don't have to always cover all the edge cases. If you write software for fun, you can often just bake-in assumptions and neglect a lot of edge cases.
Working Effectively with Legacy Code
Book by Michael C. Feathers
There are lots of weird if checks to deal with a vendor that returns bad data over the api every Sunday night. Your new clean code is going to crash and burn in all of these cases.
We need to accept change. Your company will eventually move you to a new project or dismiss you. They need to put your software baby on maintenance and squeeze out the last bucks before they shut it down. Life continues with or without you. If you become proactive, you'll have a fun ride with it.
Web development could be stupidly simple if we wanted it to be. I feel like it got too easy, and suddenly there were waves of bootcamp grads, and a lot of developers resented that.
I thought we were talking about technology. You gave a recipe for spaghetti.
Engineering has its place. But you can also make art with an engine lathe, and doing just that every now and again can be a balm to the soul.
I have written personal projects both with and without tests and every single time I don't write them I wish I had, usually pretty quickly. The time you save by not writing those first few tests always seems to be lost, and then some, pretty quickly in the extra manual testing that is required.
Pro: You can do whatever you want
Con: You can do whatever you want (including things that will bite you in the ass later)
I've also noticed that AI/ML falls into the exact same pit, because many folks there are cowboy coding.
There's a careful balance here though right? For most projects your first users or clients are the unit tests. Why not have a future of repeatable client/user tests that insulate from regressions and to be your wingman to navigate future iterations? Also for me, I still review and accept my own pull requests on solo projects, because it is that last step when working on my own where I know I'm at a good point looking at my diffs and the last step in introducing mistakes.
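The "tests are your first users" idea can be this small. A hedged sketch (`slugify` is a made-up example function, not from any comment above): the test exercises the public API exactly the way a caller would, so a regression surfaces before a real user ever sees it.

```python
# Minimal sketch of "the unit tests are your first client": call the
# function the way a real caller would, pin down the behavior you rely on.

def slugify(title: str) -> str:
    """Turn a post title into a URL slug (lowercase, hyphen-separated)."""
    return "-".join(title.lower().split())

# The first "client" of slugify is this test, not a web page.
def test_slugify():
    assert slugify("Hello World") == "hello-world"
    assert slugify("  extra   spaces  ") == "extra-spaces"

test_slugify()
print("ok")
```

On a solo project this costs a few minutes and gives you the wingman-against-regressions effect the comment describes.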
Let's change the story to be about an artist burned out by the art industry. Suddenly he has to deal with all the grant money, the politics of the gallery, etc. He feels like he doesn't want anything to do with art anymore.
And somebody in the thread suggests the artist just go out into nature and paint, and not think about art styles, the acceptance of peers, trends, etc. Just give yourself to painting and don't think about anything. Just paint with charcoal on stones and lose yourself in it.
And your comment would be something like "There's a balance here though, you still need to paint on canvas with acrylic or something, otherwise you won't be able to validate your art in the future for you to progress, etc."
Buuuuut... a lot of the personal projects I've worked on have zero unit tests. Maybe they have a couple of tests around a complicated algorithm, but mostly... no automated testing. What they do have though is a) version control, and b) a fast-iteration platform underneath them. They're also generally well factored into small chunks.
As an example, I have a package that takes an org-mode file and extracts time entries to drive into time tracking software a client uses. Written in Lisp, zero tests. Every month I fire it up, look at the table of entries it's about to post, and hit "go". Looking at the table of entries provides two sanity checks: first, that I properly logged my hours that month (I'll occasionally forget to clock out for a weekend and rack up a 48 hour time log), and second, that it didn't encounter a bug while doing the processing. As of around September of last year, this program is done, and does its job perfectly every month.
Another example, also in Lisp, is used for making estimates for my clients. I give it a list of tasks with 3-point estimates, it churns through and calculates all the means and standard deviations, generates a file for Pandoc to consume, and spits out a PDF. I use it every couple of months. No tests, all done inside the SBCL repl. I obviously proofread the output PDF before sending it to a client, but that's again to check for bugs and to check for brain farts.
I've worked on great codebases that have giant test suites, and I've worked on terrible codebases that have giant test suites. And likewise for no test suites. While I appreciate the sentiment, I think it's dangerous to talk in absolutes like that. While I agree there is probably some degree of correlation between whether or not a codebase has a unit test suite and whether it's good code, writing unit tests does not intrinsically make the codebase good, and not writing unit tests does not intrinsically make a codebase bad.
In all projects I have worked on, extensive unit tests were not a safety net against regressions, but a safety net against change.
Heh, yeah, that's fair, although I think there's some nuance in terms: accidental changes (to things that were working) is a form of regression in my mind. Depending on how the codebase is structured, changing the tests to match the new desired behaviour might be trivial or might be excruciatingly complicated. These days (mentioned elsewhere in this discussion) I'm working on more embedded stuff, and the only time I'm generally writing unit tests are for things that shouldn't change.
As an example, last year I was working on a custom LoRaWAN stack. As I was building out the various pieces, I was writing tests to verify that the output from generally-pure functions came out as expected. (This packet) + (This key) = (This encrypted packet). Those kinds of tests help a ton for catching stupid mistakes.
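The shape of those "(packet) + (key) = (encrypted packet)" tests is worth showing. Real LoRaWAN payload encryption uses AES-128, but as a hedged, dependency-free stand-in, a toy repeating-key XOR illustrates the test style: a pure function plus fixed vectors that catch stupid mistakes the moment they happen.

```python
# Sketch of fixed-vector tests for a pure encryption function. The toy
# repeating-key XOR below is NOT the real LoRaWAN cipher (that's AES-128);
# it only stands in so the example needs no crypto library. The point is
# the test style, not the cipher.

def toy_encrypt(packet: bytes, key: bytes) -> bytes:
    """XOR each payload byte with the key, repeating the key as needed."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(packet))

# Fixed test vectors: (this packet) + (this key) = (this encrypted packet).
packet = bytes([0x01, 0x02, 0x03, 0x04])
key = bytes([0xAA, 0xBB])
expected = bytes([0xAB, 0xB9, 0xA9, 0xBF])

assert toy_encrypt(packet, key) == expected
# Encrypting twice with the same key round-trips (a property of XOR).
assert toy_encrypt(toy_encrypt(packet, key), key) == packet
print("vectors pass")
```

Because the function is pure, these tests need no hardware, no radio, and no mocking, which is exactly what makes them cheap enough to write while building out a stack.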
I see good value in modern unit tests when you are building some sort of automation engine, rules engine, etc. But I think a lot of people see them as a hammer and everything is a nail.
IMO a lot of places have forgotten the value of manual testing in terms of not only finding bugs but actually understanding how the product is used. Games companies prize iteration speed in terms of how quickly you can test a change in situ because we as game makers need to verify our changes by playing the game. I'm making a multiplayer game right now so I need to make sure what I do works on the server and clients which usually necessitates three copies of the game running together. Then we playtest it with a larger group weekly and playtest with even larger groups less regularly.
My impression of a lot of modern development elsewhere is that as soon as automated tests are green the code gets punted into production which seems utterly bonkers.
BUT I think we should also acknowledge that everything that slows you down in your free time has a reason for existing and most of that is communication or knowledge sharing within a team. Naming conventions, unit tests, and general documentation all exist to help other team members keep up with the pace of changes in the repository. If you're not planning to do something in a team setting or for this to be consumed outside of the work that you do then you don't need these things. But if you want to share with everyone else it's important that you don't totally ignore these things because it will come back to bite you in the long run.
Kinda like the idea for personal projects. Why spend so much effort structuring projects so others can use them, when chances are they won't, or you don't care if they do? Work is governed by different realities.