Another solution could be encouraging personal projects. Many companies over-work their employees to the point that they cannot do anything in their spare time. Give employees free time and encourage them to play with new technology in their personal projects. The curious ones will have an outlet to channel their energies and you will get a rock solid stack. Also, they will be knowledgeable to take you forward when a problem that requires their newfound knowledge arises in your organization.
Agree. This is necessary to encourage learning, which helps keep people sharp, which in turn benefits the projects that they are working on in the day job. Even if the technology is the "same old".
A couple of decades ago, many software developers were content using the tools they had and knew. Few demonstrated the eagerness and openness to try something new. These few were also the ones who were better at coming up with out-of-the-box solutions, because they were willing to try something new -- both tools and approaches. Now we have reached a point where the bare minimum required of a software developer is familiarity and comfort with the new and shiny, leading to the reverse problem. Now, people can spend so much time and energy learning and trying out new tech that they miss the opportunity of staying with a technology long enough to learn from experience.
In other words, today we spend way too much time on accidental complexity than on essential complexity.
Hell, I'd settle for just not discouraging it so damned hard.
This. So much this!
I think the fact that we have the white-collar workforce organized after the assembly line pattern is the primary cause of unnecessary, borderline intentional complication, which includes mixing in unstable/untested components. If we could find a way to compensate engineers fairly without monopolizing their time, we'd be much better off, because the need to continually invent additional work for oneself would go away.
The bliss of a stable, low-maintenance project can be had through side projects, which often work fine for years with minimal modifications. It's really rewarding to go back to the same utility that you've written and know that the comparatively little quantity of time you spent on it is still paying dividends years later. There's an awesome sense of pride attached to that.
This does not match my experience. Slow times are for experimenting with things and catching up on less-urgent tasks.
You've led a charmed life. In my 25 years in this business, I've found that very little induces the same level of panic in management as "experimenting" with something, especially something that you can't put a solid "business case" behind and "estimate" down to the nearest half hour.
It's both a shame and moronic that often assembly line paradigms of management are applied to non-assembly line work (i.e. work that isn't the same for each iteration).
What's needed is to reframe work in terms of value instead of time wasted.
The 'Toyota style' could be argued to be more responsive to worker initiative, and it fits better philosophically with software work. (And Agile is influenced by Toyota-way thinking too, though in a mutated form.)
Although true, this is somewhat shooting the messenger. If, as a client of the recruiter, I ask for COBOL developers and all I get are Clojure resumes because the recruiter thinks it's cooler, then the recruiter won't be my recruiter for very long.
Surely, where there's an issue, it's that the recruiters are being asked for developers with those attributes. They're presumably being asked because the employer, for good or ill, is seeking developers with those attributes.
Now, I've personally seen 4 reasons for hiring devs with cutting-edge knowledge: because it's cool, because it's the only way to attract good talent, because we checked and we absolutely need that technology, and because we run a research lab looking into cutting-edge tech we may or may not use.
The latter two are very good reasons to seek hotness. They're also the least likely to be the reason, in my experience. Using shiny, shiny to attract talent is often a successful approach. And who doesn't like cool stuff (though it's a terrible justification)?
Younger coders tend to be more willing to just follow the lead of the management, rather than ask "is this really going to solve the problem?"
Younger coders will put in massively more hours -- this may result in problems like reinventing the wheel, piles of garbage code, etc. -- but it generally "feels good" to be the manager with an energetic, around-the-clock team of young coders.
Younger coders can be paid less, and are willing to forego benefits, stability, medical plans, or even pay sometimes.
your bias is showing.
I'm not sure what you consider hot tech, but I can assure you that whatever it is, it is based on tech that has been around for ages and has just been rebranded. I find that people who hop from "hot tech" to "hot tech" don't ever build up the depth of knowledge and understanding to tackle hard problems in an elegant way. It is essentially a major contributor to the "1 year of experience repeated 10 times" phenomenon.
You immediately assumed that the folks I found are just jumping from tech to tech without building anything substantial. Which is probably also bias, just an opposite of mine. You also assumed that I didn't vet them or test them on "elegant solutions". Maybe your heuristic worked for you. But this heuristic, in my current situation is thus far working for me.
Maybe that's why many of the major changes you see in tech, like Linux, Git, Vim or Redis started as a single person's work rather than a clever corporate product.
One of the markers for promotion was managing a team of x size and a > $1 million budget - so they were gaming the system, not making the best use of the shareholders' money.
If you don't work in that field, you're fine :D
Well, actually yes, yes I am. You can make a compelling case that this shouldn't be the case, or that the people paying me are shooting themselves in the foot running things the way they're running things, but the fact remains that in today's daily-standup/weekly-status-report/what-have-you-done-for-me-lately "agile" corporate environment, I am absolutely paid to write code, and if I don't write code, they'll stop paying me.
Obviously you do write code in the course of providing those results, and all of the author's examples of what you should be doing are still examples of writing code, but they are the minimum code required to deliver results.
Another example of how you're not paid to write code is the frown you receive from stakeholders every time you say you are going to refactor something, or that you are going to rewrite the API to be RESTful.
They don't care, and they shouldn't.
Workaholic Walter shows up to work 7 days a week, working 10-hour days for 3 months, busting his butt to write a piece of software that ultimately fails because he spends so much time working that he misses the big picture.
Smart Steve does some research, discovers the project already exists in open source, downloads it, and sets it up in a week. Then he decides to take a 10-week vacation because he just saved the company so much money.
In these two situations Steve objectively did far better, but I bet your sweet butt that he gets fired and Walter gets promoted. The problem is that no one ever really knows how much value you added, or how much work you did.
I've been on projects that were incredibly successful from a technical perspective. Objectively, the product performed exactly like the stakeholders wanted, the performance was great, and it could run for months at a time, even while being actively configured by people completely unfamiliar with the software, without crashing.
It was still a failure because no one wanted to buy it. We spent a year kicking butt and still added 0 value to the business. But I still got paid. And I wouldn't have been paid any more if the project had instead resulted in an extra $12 million in revenue.
You pay a hair stylist to cut your hair, not for the business deal you may or may not get because of it. Why someone pays you is different from what you are paid to do.
I wouldn't bet on that.
No one is aware that Walter overworked, because no one saw him. Plus, he didn't deliver anything in the end.
In the meantime, Steve gets sacked because he's obviously not working - so why should he get paid?
The point is, companies would want you to focus on business goals, but in fact they pay you for the amount of code produced (and punish you if it's low).
I know Walter. He made a big point of sending e-mails at 2 AM, CC-ing the recipient's boss, and then following up at 8:00 the next morning asking why you hadn't responded yet.
The question is whether it comes from Walter or from the company. Either let him go or move to a better environment.
However, when most companies' idea of an "agile" development cycle is to have smaller goals and very short deadlines, you end up being a code monkey. A project may run better if you have time to design a better object structure; instead, because of a tight deadline, we have to slap on 10 if and else-if statements and call it a day.
Making an enhancement? Add two more if statements. Fixing a bug? Put some if statements under the previous if statements.
Bottom line: because of the tight deadline, better code and better design mean nothing. We become if-statement-writing code monkeys.
Well, then you _are_ being paid to write code. We can yammer on all day about how we're adding value, but the programming skill and talent has to be there first. So you're both being paid for the value of your skills (being able to code proficiently and efficiently) _and_ for increasing the equity or profit of the business.
An analogy might help. When I pay a contractor to add improvements to my house, my goal is to increase my home equity. So I am paying for value. But I'm also paying for his skill, because if I get an unskilled contractor, I might end up underwater. An unskilled contractor can cause me to go backwards, where the cost of fixing his bad work exceeds the equity boost. The same dynamic applies to bad programmers. Skill is the primary driver of any increase in value.
> but they are the minimum code required to deliver results.
We're not looking for the minimum amount of code, but the best solution. Writing tests will add overhead to your code; at a minimum you could leave them out. But the benefits of rigorous testing can justify the investment of time.
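As a sketch of how small that "minimum plus tests" overhead can be (the greeter function is a toy stand-in, not anything from the article), a smoke test is a few lines of shell:

```shell
#!/bin/sh
# Toy unit under test: a one-line greeter.
greet() { printf 'Hello, %s\n' "$1"; }

# Minimal smoke tests: assert on output, fail loudly on mismatch.
assert_eq() {
  [ "$1" = "$2" ] || { echo "FAIL: expected '$2', got '$1'" >&2; exit 1; }
}

assert_eq "$(greet world)" "Hello, world"
assert_eq "$(greet Ada)" "Hello, Ada"
echo "all tests passed"
```

Even this crude harness catches output regressions for free on every run, which is the investment the parent is talking about.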
It's completely asinine and blaming to speak for others' actions. Stop doing it. It's ruining this place. <-- This last part is a blanket blaming statement speaking for all others, and I really shouldn't say it.
P.S. I've flagged the article given it's making "you" statements.
Dijkstra: 'My point today is that, if we wish to count lines of code, we should not regard them as "lines produced" but as "lines spent": the current conventional wisdom is so foolish as to book that count on the wrong side of the ledger.'
'Measuring programming progress by lines of code is like measuring aircraft building progress by weight.'
You are the steering wheel that aligns the overall thing's behavior with the interests of the organization that owns the thing. That's what the author calls "adding value".
But it ain't all bad, see. I hear some young folks are coming up with brand new types of toothpicks and glue, hell some of em' I hear put themselves together.
Honestly, I think programming is at the point where apprenticeships are a good idea. Thirty years ago the tools were changing too fast, but eventually we can codify the correct body of knowledge to have useful formal education. For now, education seems to default to handing people a bunch of tools and getting them to learn on their own.
This is exactly why it's a good idea to try your own implementations. You get intimately familiar with the problem. You understand the trade-offs, corner cases, api design, etc. far better that way. And there might be a slight chance that you come up with a better solution.
I've heard that Feynman had a motto, something like "If I can't build it, I can't claim to understand it."
Also, I like the idea of apprenticeships. At some point our academic culture seems to have shifted from liberal education to some kind of quasi-vocational preparation. This is probably in response to the death of apprenticeships when we started requiring a bachelor's degree for entry level professional jobs.
I strongly agree with that. There's still a lot of important "tactile" experience that is easier to transfer via mentoring than to learn from written material.
> Thirty years ago the tools were changing too fast
My impression is that the pace of change of tools is only increasing (to pretty ridiculous levels if you look at the web ecosystem).
> and eventually we can codify the correct body of knowledge to have useful formal education.
This should be the goal, yes. But I don't feel like we have identified much of the knowledge that's worth codifying. Instead, the industry seems to be running in circles, each iteration less efficient and more bloated than the previous one. I wonder what the way out for us is?
I'm paid to write code because someone wanted an application. I don't think I could tell the customers who buy our application that they should learn some Bash or Excel and solve their business problems themselves.
I assume under some rare circumstances a developer isn't faced with writing an app (site, CAD program, OS, ...) but instead to solve some business problem internal to a business - then you have the option of not writing code.
The piece does feel incomplete in that it doesn't discuss the incentives behind the decision to "write code" or not.
The simple fact that we always seem to come back to is that once you learn the basics, good engineering just comes down to good judgment, which is only gained through years of dedicated trial and error. Is it better to write some glue code to bounce data between 8 different extant programs and make final transformations or to write your own program that handles the process soup-to-nuts? The answer is now and will always be "it depends" (and quite frequently, it involves elements of both). We need to make sure that the social incentive structures align with the engineering goals (i.e., don't tell someone that if they make something stable and low-maintenance, they'll unemploy themselves) and then we can trust people to do good, iterative work.
With the turnover rate in this industry, I'm not sure anyone plans to be around when the monstrosity that they wrote starts to become a maintenance nightmare.
Your job is to solve business problems. You happen to use code as the tool to do so because it lends itself to building better, more reliable, more scalable solutions.
Nobody pays for a lump of code, they pay for a tool that solves a problem they have.
In this case the job of the developer will be to write code and it doesn't make sense to pay high salaries because the ROI is not there.
This leads to the situation where very good developers aren't paid good salaries because in such a company they are literally not worth the money.
- NIH. If you think it's programmers who primarily suffer from NIH syndrome, check again. Just think how many times you've ended up writing something that already exists because your bosses insist they must have an in-house solution.
- Solving a nonproblem. A lot of code we write is meant to solve a problem that doesn't really exist, is created by stupid decisions higher up the chain, or sometimes just shouldn't be solved in the first place (e.g. tools facilitating an unethical business model).
So yeah, I'm often paid to write code - because the decision that code must be written is made by people above me.
I don't solve my employer's business problems; I solve the customers' business problems, and the only way to do that is to code it in the application.
A more cynical situation, which I have seen all too often, is that they pay for the line items on the RFP. Whether or not those features will ever be used or are in any way beneficial.
These are questions that are relevant if you want me to add value in a non code monkey way. But, sometimes, when I ask these questions, I get looks that say "You're paid to write code, so stop asking non code questions." Or maybe I'm just projecting.
Of course they're paying for code. They're paying for code that does something useful, and which may or may not produce value for their business. You can't separate the two.
Lots of people? Everyone who ever bought, downloaded or otherwise acquired any software not written by themselves?
There is no catch-all architecture, so there is guaranteed to be some impedance mismatch between the expectations of your project, and the provisions of the 3rd party tool. Heaven help you if you need the facilities of multiple architectures, and try to marshal and connect disparate datatypes and calling/threading assumptions together.
As programmers, we work with general purpose programming languages. Many project-specific problems are not difficult to solve in a custom manner, given somebody with enough experience and hindsight to know how to write such a forward-looking thing robustly. It is a serious consideration whether or not to defer your architecture to generic external sources that were not written with your unique needs in mind. And even if you do, it is by far a best practice to ensure that such things live behind an application-specific abstraction separating out your project-specific code from entangling 3rd party code, allowing you to perform the inevitable migration to a different platform later.
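A minimal shell sketch of that application-specific abstraction, with invented names: every caller goes through one project-owned function, so the inevitable migration to a different platform touches a single place instead of the whole codebase.

```shell
#!/bin/sh
# Project-specific seam over a 3rd-party storage tool (names are illustrative).
# Application code calls store_artifact; nothing calls the vendor CLI directly.
STORAGE_BACKEND="${STORAGE_BACKEND:-local}"

store_artifact() {
  src="$1"; dest="$2"
  case "$STORAGE_BACKEND" in
    local) cp "$src" "$dest" ;;               # dev/test backend
    s3)    aws s3 cp "$src" "s3://$dest" ;;   # vendor backend, swappable later
    *)     echo "unknown backend: $STORAGE_BACKEND" >&2; return 1 ;;
  esac
}
```

Swapping the vendor then means adding one more `case` arm, not hunting down every call site.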
Our present approach is "hell yeah, if it can be outsourced it should be" and that applies to services too. We do keep an awareness of what it would take to do each thing by hand again though.
But then again I've also always found that once you start coding to it, AWS gets you 90% of what you want, and you spend the other 90% of the time fleshing out the exact retry, migration, and error-handling behaviors you need manually on top of it. :-P
But - there's also value in learning new things, in new skills, in new toolsets. Maybe it's not best for the employer, but for you the employee it can help quite a bit.
There's a fine balance - ideally your employer encourages you to learn things on the job, and to try new things that may not necessarily make it to production, but improve your skillset overall.
This is something I'm coming to terms with in my career right now (~3 years in) - do I learn something to a mastery level, even if it's not the most marketable thing, or do I focus on breadth first and shoot for depth later?
This is the core of the matter.
This is something Dijkstra, Edward Cohen, Jayadev Misra, and others in and around the formal methods camp have been saying for decades. It is more worthwhile to solve the problem than to guess your program into existence and patch the errors later. To dismiss them is to say you do not appreciate the true difficulty of programming and designing systems.
In practice I don't write formal proofs for every line of code -- how exhausting! But I do often write high-level specifications in tandem with the software that implements them, and often each informs the other as to the true nature of the problem I'm trying to solve. And I find that as I improve in different logics and the predicate calculus, I can spot design errors in the structures formed by code that you don't see if all you're doing is trying to "solve the puzzle."
The whole approach to guessing your program and patching the bugs later is far too addictive. It saddens me when otherwise good programmers fall into this trap. It creates work for oneself but it keeps you from solving the real problem at hand!
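For what it's worth, the spec-in-tandem-with-code habit can start as small as a single Hoare triple kept next to the routine it constrains -- for an integer square root, say (the function name is just an illustration):

```latex
% Hoare triple: precondition P, program S, postcondition Q written \{P\}\ S\ \{Q\}
\{\, n \ge 0 \,\} \quad r := \mathit{isqrt}(n) \quad \{\, r \ge 0 \;\wedge\; r^2 \le n < (r+1)^2 \,\}
```

Even an informal pre/postcondition pair like this forces you to state what "correct" means before you start guessing the program into existence.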
Nice article. Not sure about the allure of "Taco Bell," but the spirit is in the right place.
The TLA+ book by Leslie Lamport is also an interesting read even if it is specific to a particular logic for creating specifications. Any of his talks on TLA+ are also good for a high-level overview of the why and a bit of the how as well.
Here's an introduction to Z specification language that illustrates how it uses simple formalism to precisely specify requirements and design going from abstract to concrete:
Alloy is a model-checker in a similar vein as Z that's designed for easier use and automation:
Although TLA+ is popular now, SPIN was the goto model-checker for analyzing software for concurrency errors:
Software Foundations teaches you formal verification of the kind that is used in cutting-edge academic projects like CompCert compiler or seL4 kernel:
Language designers can also include lightweight, formal methods into their type systems like Design by Contract which helps knock out interface/integration errors:
Finally, since logic programming is so fast these days, one can even translate logical specs into logical programs to straight-up execute the specification:
Note: The Z, Alloy, SPIN, and Coq tools referenced above all have websites with FOSS software, tutorials, etc. Just Google them.
One area that is becoming really interesting to me is in dependently typed languages and their ability to encode at least some of the properties of the formal specification in the type system.
I haven't had much time to take a deep dive into Agda but what I have challenged myself with gives me hope that this may be the way we write programs in the future (even if it's not Agda per se).
The neat thing is his method was adopted commercially with years of success despite never showing up on Hacker News, etc. It combined efficiency, OOP, basic safety, and concurrency safety (SCOOP). Most talk about adoption of formal specs in theory but he got it done in practice with DbC in industry. Now, tools like SPARK, Ada 2012, and Perfect Developer are using it. You can do it in C, C# and Java, too.
Btw, in case you missed it, his Theory of Programs defines most aspects of programming in an elegant, precise way using set theory:
Learn from the master:
As far as whether they're good, I'm not sure, as formal methods people have endless arguments about it. The critics basically say you have to put in enough spec and proving work that you might as well do the real thing in a prover and then extract it. Chlipala actually does something like that, called Programs as Proofs I think. Memory getting fuzzy. Just remember that dependent types vs. proving is in dispute at the moment.
This. This is the thing that took the longest to sink in and had the biggest impact. There were a lot of cool languages, tools, platforms, and systems along the way, and I was stoked picking up each one and coding -- but after decades of that, I realized I was focusing on the wrong thing.
I think the thing that convinced me was when I got to start watching lots of technology teams in the wild across multiple organizations. So many times I would see conversations and tons of problem-solving effort being spent on the tools to solve the problem instead of the problem itself.
A couple of years ago I was teaching tech practices to a team that was deploying on iOS, Droid, and the web. After we went over TDD, ATDD, and the CI/CD pipeline, I emphasized how crucial it was not to silo. When I finished, a young developer took me aside.
"You don't understand. We have coders here who just want to do Java. They want to be the best Java programmers they possibly can be."
I told him that he didn't understand. Nobody gave a rat's ass about people wanting to program Java. They cared about whether the team had both the people and technical skills to solve a problem they have. It would be as if a carpenter refused to work on a cabinet because it wouldn't involve using his favorite hammer. You're focusing on the wrong thing.
Sadly, once you get this, the industry is all too happy to punish you for it. That's because the resume/interview/recruitment world is interested in buzzwords and coolkids tech, not actually whether or not you can make things people want. This means sadly, in a way, if you continue growing, it's entirely possible to "grow out of" programming.
Your insight really hit home, especially since I've been in a new software dev position for about a month now. This is the overwhelming issue that has already reared its ugly head; it's massively frustrating, and I feel (nearly) powerless to stop it.
Writing code is just the only way to do the above since all attempts at making it less text file driven have failed so far.
Some of us do CAD/CAM/CAE, the proven tools either don’t exist, or are targeted towards companies like Boeing and GM and cost a fortune.
Some of us work on console games. The environment is pretty close to the bare metal/hypervisor, and typically, there’re significant costs involved in integrating any third-party stuff.
Some of us program embedded firmware. Only recently the hardware became fast enough to reuse those proven tools ported from PC, you can’t use those on a PIC16, just not enough resources.
That’s just what I personally did/doing over my 16 years in the industry. I’m sure there’re other fields where the proposed approach is inapplicable for various reasons.
I sum the quote up as, "Systems are sentient beings like the One True Ring, and they will absorb you. Soon, though you believe you are thinking freely, you will actually be merely a part of the System, thinking what it wants you to think." So true!
But ... Taco Bell programming still creates a new system. That's a flaw in the article's premise.
If you solve a problem by stringing together 11 tools, then yes, you should get some benefits from reusing preexisting tools. But now, you have a system with some rube goldberg characteristics, plus you've written a bunch of "glue code" (which is "new code") in the process.
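To make "glue code is new code" concrete -- a hedged sketch, assuming an access-log format where the request path is field 7 -- a typical Taco Bell pipeline looks like this, and every stage silently depends on the output shape of the previous one:

```shell
#!/bin/sh
# Hypothetical "Taco Bell" report: top 3 requested paths from an access log.
# Each stage assumes the output format of the previous one -- that's the glue,
# and a format change anywhere in the chain breaks it.
top_paths() {
  awk '{ print $7 }' "$1" |  # assumes the path is whitespace field 7
    sort |                   # group identical paths together
    uniq -c |                # count each group
    sort -rn |               # most frequent first
    head -n 3                # top three
}
```

Nothing here was written "from scratch", yet the pipeline is a system of its own, with its own failure modes.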
Those systems can often turn out to be more complex.
And another thing worth pointing out is: most employers hire you to write code. That's the job spec. So don't be surprised that that's what we end up doing.
And expect those changes to take an hour (maybe two).
Doesn't matter what it does. It is literally impossible to sell even $20 worth of the solution the person advocates. Go ahead: link me to anywhere on the web selling a 5-line bash file without anything else. I'm waiting.
You can do a lot in 5 lines, too. lalalala. waiting. While I wait I think I'll make some money coding something people actually pay for. /s
Seriously though, while the author's point might stand in many sysadmin and even systems-integration roles, most of the software world actually pays for deliverables -- the clearest example being consumers doing so. People would rather spend 20 hours cursing, and then give up, than pay for a systems-integration script that generalizes and solves their problem. It's what the market demands. This article really would make sense if it came from the person paying -- but it doesn't. Nobody who is paying actually says the words the article chose for the title. Yes, you are paid to code.
Or when somebody can write a 5-line shell script in Bash that:
a) typical non-programmers can and will use
b) produces enough diagnostic information that anybody (programmer or not) can use to troubleshoot when something inevitably goes wrong - whether it's user input, network connectivity, a missing dependency...
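For what it's worth, requirements a) and b) are reachable, just not in five lines. A hedged sketch (the commands are real, but the script itself is invented for illustration) of what "non-programmer friendly with diagnostics" actually costs:

```shell
#!/bin/sh
# Illustrative "friendly" downloader: every failure names its cause,
# so a non-programmer has something actionable to report.
fetch_or_explain() {
  url="$1"; out="${2:-download.bin}"

  [ -n "$url" ] || {
    echo "usage: fetch_or_explain <url> [outfile]" >&2; return 2; }

  command -v curl >/dev/null 2>&1 || {
    echo "error: curl is not installed (try your package manager)" >&2
    return 3; }

  if ! curl -fsS -o "$out" "$url"; then
    echo "error: download from $url failed (check the URL and your network)" >&2
    return 4
  fi
  echo "saved to $out"
}
```

Distinct exit codes and human-readable messages are most of the line count -- exactly the part a bare one-liner omits.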
I get what he's saying, but he's talking about a pretty small subset of what we really do.
After you write something like that you don't turn around and sell it on the web.
I think it's great. My future support burden just decreased. That is a big piece of code that will result in zero bug reports.
Except they might have input in the processes.
I'm not employed as a professional software developer, so I still don't actually use the helpful UNIX commands. Takes all the fun out of it. :P
I find this very hard to believe. Are you sure your awk implementation was okay?
awk '/pattern/' file
Computers do what you tell them to, most of the time.
It's easier to do that with the customer than with the spec - specifications can't talk back to you.
Can you dig a canal with a tried and true shovel? Yes, you can, but you will need a lot of time and many shovels.
Can you cook up a bioinformatics analysis in shell and Awk? Yes, you can, but if you want a large-scale analysis to be done in a reasonable amount of time, you roll up your sleeves and reach for compiled languages, distributed systems, and so on.
There are downsides to introducing new systems, but they should be weighed against the upsides of suitability and efficiency.
In our industry? Your bioinformatic analysis can probably be done well with shell and AWK (however idiotic those tools are, but that's a topic for another day), and this solution will be more efficient and can be even scaled up. But no, most people will pick the "Big Data" solutions requiring a person full-time for just configuring it and managing the machine clusters it used, all because it's trendy these days. There's this saying that "if it fits on a consumer-grade hard drive, it's not big data", and it's absolutely true.
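A hedged sketch of why shell and AWK scale further than people expect -- assuming a hypothetical tab-separated variant file with the chromosome in column 1 -- a whole-file aggregation is a single streaming pass in constant memory, so file size is bounded by disk, not RAM:

```shell
#!/bin/sh
# Hypothetical pipeline: count records per chromosome in a TSV
# (column 1 = chromosome). Streams the file once; parallelizes
# trivially by splitting the input and merging the counts.
per_chrom_counts() {
  awk -F'\t' '{ n[$1]++ } END { for (c in n) print c, n[c] }' "$1" | sort
}
```

On a file that fits on a consumer-grade hard drive, this finishes before a cluster scheduler has even accepted the job.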
What we need is more focus on efficiency across the board, and that means things like how much it will cost (in money and time) to develop / update / maintain the solution, taking into account the actual problem that's being solved, and not some imaginary version of it.
Also, food for thought: over the past 15 years, computers got orders of magnitude faster. The software we use maxes out the hardware, yet it offers essentially the same set of features we had 15 years ago, or 30 years ago. Where did that power increase go? It definitely didn't go into valuable work we can do with our computers.
 - Coming back to probably the single biggest fail introduced by UNIX culture: doing everything in unstructured text. Every tool you use thus has to contain a half-assed parser for its input data, and produces output in a random, undocumented, and often inconsistent format; a lot of shell scripting is just adding additional parsers to glue those tools together. The sad thing is, people knew how to deal with structured data long before UNIX, but it's been forgotten, save for the PowerShell team and the web folks who try to backport JSON into UNIX-land.
 - except in scientific computing, video games and maybe CAD tools.
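As an aside, the "backport JSON into UNIX-land" move described above looks like this in practice (jq is a real tool; the payload shape here is invented) -- a structured query replacing the fragile grep/cut parser that breaks on reordering or whitespace:

```shell
#!/bin/sh
# Structured instead of ad hoc: extract a field with a jq path query
# rather than hand-parsing the text, so key order and formatting
# inside the JSON no longer matter.
name_of() {
  printf '%s' "$1" | jq -r '.user.name'
}
```

The parser lives in jq once, instead of being half-reimplemented in every script that touches the data.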
It's true, though, that you should use a more complex tool if it's necessary to get the job done. It's just not clear that the complexity users or developers are dealing with are intrinsic to the problems being solved. It seems more social than anything. :)
Also for orgs that do the developing for other orgs it isn't really optimal to offer the quick and efficient solutions unless competitors are doing the same.
The company I work for is super successful so there must be something to it.
Engineers who build things from scratch do not understand a very profound fact about software engineering: a computer performs menial, repetitive tasks so you don't have to. Be lazy and use someone else's solution.
When I was 14, I remember writing bubble sort from scratch dozens of times, as well as input-scrubbing methods and binary search trees. I didn't know about the concept of libraries. Granted, the internet (SourceForge, GitHub, etc.) wasn't around, and I was writing everything in BASIC or Pascal. By the time I graduated high school, I understood that writing code was painful.
Consequently, it's sometimes alarming that most systems I build rely on thousands of lines of code that I didn't write and rarely ever inspect. But I have to trust other developers in order to get my work done in a reasonable amount of time. I don't even consider myself a good programmer, mostly because I hate writing code. Nevertheless I do deliver useful, valuable systems. I know my problem has already been solved by someone else. I like being lazy. It's my best quality.
This isn't addressed in the article, but avoidance of Taco Bell programming is a symptom of a disease. The disease is ignorance; mostly in managers who are out of touch with the technical landscape in which their teams work. Any time I hear of a team building their own tools or doing significant amounts of de novo work, I try to limit my dependencies on that team. You're never going to ship that shit, buddy, but I solemnly admire you for trying.
I also doubt most people write software from scratch. We use a well proven language, ide, compiler and deployment tool. We use existing libraries, frameworks, databases, caches and services.
Why doesn't that count as Taco Bell programming? When's the last time you wrote code for all those things instead of just assembling it all together?
This article is forgetting the lesson we learned: highly configurable generic software that doesn't need a code change to adapt has actually failed us, and is a source of failure. That's why Domain-Driven Design was born. Write simple software that targets a single domain only. Write one of those for every domain. Do not try to use the same software, with multiple tweaks and config hacks, for all your domains.
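A minimal sketch of that contrast, with invented domains, config keys, and pricing rules purely for illustration:

```python
# The generic route: one engine, endless config tweaks. Every new domain
# adds another flag, and every flag interacts with every other flag.
def process_generic(order, config):
    total = order["amount"]
    if config.get("apply_tax"):
        total *= 1 + config["tax_rate"]
    if config.get("bulk_discount") and order["amount"] > config["bulk_threshold"]:
        total *= 0.9
    return total

# The domain-specific route: one small, readable function per domain,
# each encoding only its own rules. No config hacks, nothing to misconfigure.
def price_retail_order(order):
    return order["amount"] * 1.08          # retail domain: flat 8% tax

def price_wholesale_order(order):
    total = order["amount"]
    if total > 1000:
        total *= 0.9                       # wholesale domain: bulk discount only
    return total
```

The generic version looks like less code until you count the config files, the documentation for every flag, and the bugs from flag combinations nobody ever tested.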
> Gall’s Fundamental Theorem of Systems is that new systems mean new problems. I think the same can safely be said of code—more code, more problems. Do it without a new system if you can.
I agree with the basic idea here, yes. But I would have to disagree strongly with the last part.
The problem with "Do it without a new system if you can" is that we need judgment to decide when to make a new system and when not. Sometimes just writing a small piece from scratch instead of using (and repurposing!) a poorly-matched prior system is a hell of a lot simpler.
Resumes and interviews seem to be heavily focused on whether you can write code. And there will be lots of talk about how much code you have written in the past.
Which I guess is what the article is really railing against - but what it comes down to is: in theory you're not paid to write code, in practice you are.
Some time ago, I went for an architect role at a telco which was the largest Progress shop in the southern hemisphere. The interviewer asked if I knew Progress and I said "No, but that is only knowledge" and talked about successful projects, and later found out I was hired because of that sentence. But I knew 15 languages at the time, so what's another language, and sure enough, 3 months later I was reviewing Progress code. You have to convince them to hire you because of your perceived value. I have interviewed a lot and it always cracks me up when I ask if they know some technical thing and the job-seeker goes "No, but I've heard of it" - meaning they are unwilling to admit that they don't know one of the millions of technical languages/platforms/etc. We aren't Leonardo da Vinci, who was probably the last person on Earth to know pretty much everything. It's OK to say No. So say No and explain why that doesn't matter.
Regarding the tools we use, I agree again. I have used the same DBMS for 15 years. I know it, I feel safe using it, I know what it likes and dislikes. The majority of all my coding is in stored procedures in the DBMS and I don't really care what the front-end is, whether it's C or some new-fangled hot platform. The lower level your coding the better. Find something that works at a low level and stick with it.
The tone of these types of posts always makes me chuckle. And I see similar articles on here often. When I read these, I feel like Dad is mad at me for being a bad programmer.
It's much easier for many people to come up with these articles than it is to write something technical.
I've been writing code since forever, addicted to it. I happen to be in a field that pays good money for my addiction. So far, despite the stupidity of the profession, things turn out mostly OK. Why does someone pay me? Don't care. As long as the relationship is beneficial to me, it's OK. I feel I have more in common with starving artists in other fields than with some corporate official looking out for business interests. So I'd be a starving programmer in a parallel world where programming is not a hot profession. Now, if I had my own business, then I would have brought all these nontechnical issues upon myself. But then, I'd pay myself for whatever reason I want.
Then came the day I was supposed to enter the job market. So I looked at what companies - around me and worldwide - were doing, I read the stuff about "solving business problems" instead of "just writing code"... and I realized that if I were to really care about this, I would be unemployable. Because most of the "business needs" in our industry are trivial bullshit that's at best neutral (more often harmful) to society, and only exists to make money for some entrepreneur. I don't give a shit, and will never give a shit, about those problems. So - through many years of depression and burnout - I learned to ignore most of the business goals and focus on doing my part the best I can.
For example, right now I'm paid for developing an application that shouldn't really exist in the first place, and makes little sense to anyone - not to the developers, not to the people who will end up using it - except the management. But well, management decides. So if I'm making this application anyway, I'm focusing on making it as useful as possible. Management asks for features. I do the ones that make sense and politely suggest changes to the ones that are idiotic. They're happy. I'm paid. When they're not looking, I'm venting on Hacker News.
So yeah, to people constantly writing about focusing on delivering business value: please tell me how to look for values worth delivering in the sea of moral bankruptcy that is the current job market.
Now, if I was a partner in a business with some equity stake, then I would mop the floors if that's what I felt would increase the value of my stake. But as a wage worker, when you tell me to mop the floor, I tell you "that's not my fucking job".
People act in their own self interest. Why is this so god damn difficult to understand? Fuck!
1. Leaky abstractions
2. Software bloat (using a massive framework to solve a simple problem)
3. Oops, I picked a lemon, now let's roll back all that glue code.
4. Code dependencies.
5. Upgrade hell.
It's a fad, it too shall pass, assuming programmers still remember how to program non-trivial things after the fad.
Of course a Software engineer is paid to write code - if they're not, you probably want a different person.
Maybe we need a new job title: "Business Value Expander"
Requirements: Glue together everything else that's already been written by other people, and make the business rich.
What could possibly go wrong.
Quite the opposite! Most of the poor quality is due to excessive complexity, little care for reliability, use of too many libraries and frameworks of questionable quality.
All of this comes from the idea that a developer is paid to write code.
A lot of strong engineers that I met at Amazon and elsewhere would use the minimum amount of code and libraries required, and spend as much time reducing complexity and deleting other people's code as writing new code.
I do analysis, planning, testing, development, client presentations and when I raise this concern with my TL I always get the 'cross-functional team' argument.
You just do everything you're asked to, in order that the business makes money - code ? Meh, anyone can do that.
How many other professions suffer the need to be 'cross-functional' ?
The "full stack developers" have got it covered
(I'm gearing up to build a big distributed "fabric" and reading the SaltStack docs; it's a weird feeling of mixed happiness and disappointment as I discover all the problems they've already solved for me.)
It seems obvious but I'd never seen it put so starkly.
If this were true, then 99% of software wouldn't be such crap.
One instance on AWS... sure hope our availability zone stays up next year.
When hardware was expensive the idea was to make everything as tiny and efficient as possible. Now that hardware is cheap the idea is to make everything as scalable as possible so as to take advantage of all that extra hardware you can now afford.
The next phase of this evolution is to go back to making everything tiny and efficient again as we reach the theoretical limits of horizontal scalability. A good way to "force" such practices is to make your developers use single (or just limited) hardware for everything.
I doubt your boss is coming from this angle but I've actually witnessed this put into practice by forcing a small development team to run their code on Raspberry Pis. It really brings home the idea that, "No, really: Your code is slow and inefficient."
It makes people re-think their architectures and often forces them to divide tasks up into smaller pieces that can be written efficiently. Also, nothing says, "ditch your ridiculously complicated and slow build process" than forcing it to run on embedded hardware.
If your build forces a multi-core embedded system with a gig of RAM to thrash and crash, you need to re-think things a bit =)
Also... well I'm not sure what HN's policy is on sources/linking but let's just say that you could search right now for "Classic Shell Scripting pdf" or "Learning the Bash Shell pdf" and find those fairly quickly online...
I'm curious to know how many companies Brian has worked for as a software engineer. The majority of the places where I've worked just want you to code. They don't really care about adding value and just want the job done as quickly as possible so customers aren't upset (usually because the sales team promised a feature that should take 2 months in 2 weeks).
It's the main reason I quit and started my own company. I was tired of getting into battles with management over the well-being of their own code base, and I was tired of being forced to write terrible and hacky code.
They also want you _doing_ something every hour of every day - and learning/studying/researching/prototyping/experimenting/refactoring/migrating doesn't count as _doing_ anything, because there's no "business driver" behind it. So you have to invent things to do because there's going to be a daily standup at 8 AM tomorrow morning where management engages in the daily "who can I fire because _I_ have to _do_ something, and firing somebody is something" ritual.