Professional Corner-Cutting (2016) (ometer.com)
175 points by MississippiGary 30 days ago | 130 comments



The hook of the article leaves unanswered the question of why Steve Jobs made a big deal out of the backs of cabinets. The answer is that Steve Jobs knew how to sell luxury items to people who identified themselves with the quality of their work, whether they were accomplished creative professionals or cube-dwelling (now coworking) "knowledge workers" or MBAs who take sommelier classes or aspiring writers grinding on their novels at Starbucks. It was partly about the experience of using an Apple product, but it was just as much, if not more, about vicariously identifying with the standards of craftsmanship. If it was only about the experience, and the craftsmanship was a means to the experience, then the backs of the cabinets would not have mattered. They mattered because the craftsmanship was not just a means but was fetishized for its own sake.


The article is spot on about its topic - tradeoffs. The professionalism is in picking the right corners to cut.

Steve's marketing was IMO a whole load of bollocks, like all other marketing. It gets attention now that Apple is a trillion-dollar company, and superficial minds wanting a simple answer attribute the success to the most visible thing: Apple's marketing. No one bought Apple products for Apple's marketing or advertising. It's for the weirdos. Same theme as other Apple marketing, e.g. the bicycle-for-the-mind metaphor, the 1984 ad, etc.

This article is good, and it's not about the backs of cabinets.


> no one bought Apple products for Apple's marketing or advertising.

That statement would require quite a source. Of course a lot of people bought Apple products because of the ads, like for any company that is a serious advertiser.


Apple had been doing great ads since 1977 [1]

Apple revenue not so great till 2000s [2]

What changed ?

[1]: https://www.macworld.com/article/670956/the-14-best-apple-ad...

[2]: https://www.statista.com/chart/4574/apples-revenue-since-197...


I think the revenue chart is a bit misleading. Apple was very successful with the Apple II and the early Macintosh models (otherwise they would have gone the way of Commodore and Atari), it's just that the market was much smaller in the 1980s - also, I doubt that the graph is inflation adjusted.


You usually need multiple things combined for a successful business. A good product, a good brand, good ads, pr, etc.


Social media and its amplification of the desperation to show off. Apple products are cool/expensive and I am cool/above average/have buying power/<insert_your_wannabe_reason> to have them while I sip my pumpkin latte at Starbucks. They are also high-quality machines, but I don't think that for everyone who buys them, the motivation is solely the software/hardware stack.


It is not about the backs of cabinets. It is about telling your customers how big of a deal the backs of cabinets really are.


The backs of Apple's cabinets are actually pretty good, even in the Tim Cook era.

[1]: https://www.cultofmac.com/320883/why-samsungs-design-sucks-i...

[2]: https://ioshacker.com/iphone/image-compares-ugly-samsung-bea...


On the first, having the off-center hole helped with thicker plugs. The iPhone had fewer of these issues as its cables were thinner (and more expensive) in general, while micro USB had a flurry of cheap cables in the wild that could be much thicker.

Apple never really cared about port practicality; we've had the same discussion with the FineWoven cases that wouldn't allow for regular-thickness USB-C cables. Other makers made more of an effort on that front.

The second would be a better point if iPhone's repairability was on par with Samsung's. Granted Samsung also became worse with time, but it was in no small part because of what the market leader could get away with.

That said, I think your general point stands: Apple cares about the back of the drawer. But not in the way I personally wish they cared.


There is no excuse for the top of the phone having things like misaligned microphone holes with the sim card tray etc.

That is shown in the first link but only if you scroll down to the second photo.


The first is not really the back of the cabinet, it is more like the design on the front.

The second… more like the back of the cabinet maybe. Although the better style there might also be somewhat functional; the internals are clearly quite different, one could imagine that the nice packaging in the Apple case might be a result of only wanting people to use approved parts.


>The first is not really the back of the cabinet, it is more like the design on the front.

The metaphor seemed to have gone over your head.

>The second… more like the back of the cabinet maybe. Although the better style there might also be somewhat functional; the internals are clearly quite different, one could imagine that the nice packaging in the Apple case might be a result of only wanting people to use approved parts.

This philosophy applies to their entire product line and has essentially been the case since Jobs came back. Hell, looking at the first 1984 Mac, this philosophy was there.


Tbh from a repair perspective I think I'd prefer Samsung's design, where it's obvious what each component is, with exposed solder joints that might let you desolder something if needed, compared to Apple's nicer-looking bunch of black boxes where you're lucky if there's a label to tell you what the hell something even is.

Also TIL iPhones have two separate batteries? Quite unusual.


That was the iPhone X from 6 or so years ago. I don’t think newer iPhones have 2 separate batteries.


I don't understand why someone would even want the ports to all be on the center line.


It's aesthetically pleasing.


Helps you find ports by feel, too.


Cutting the right corners is known as job smarts in the research on expertise, and it's one of the markers of expertise.

Perhaps Steve Jobs wasn't an expert in engineering so he didn't know which corners to cut and therefore avoided cutting any corners out of fear of cutting the wrong one?


It's in the link on the relevant bit [0]. TL;DR: his dad built stuff, including as a hobby, and he liked overengineering. He liked his dad very much and also got on board with overengineering.

[0] http://thenextweb.com/apple/2011/10/24/steve-jobs-obsession-...


I’ve always found the term “overengineering” as used to mean “unnecessarily overspecc’d” a little odd. Engineering is all about appropriately selecting and sizing components. Overspeccing something is underengineering it, and usually happens because you don’t have time to accurately size things so you throw the kitchen sink at it.


Good point, overspeccing was the appropriate wording.


> If the technical debt is a problem, 1) we shouldn’t have put it in there, and 2) we should include it in our estimates and address it.

Yes. Don't tell your boss that you'll need to take time to address technical debt. The boss will always say "No don't do that, just add the feature". Sometimes they add "We'll fix that later."

Later never happens and eliminating related technical debt is part of implementing the feature.

Don't ask your boss about this stuff. It's part of your job and it's something you understand that your boss probably doesn't.


The name "tech debt" has always been a bit of a misnomer. Financial debt does come with interest, but it's structured, proportionate, and you can just go pay it off with sufficient money. Tech debt is unpredictable, needs a lot of context to comprehend, needs even more to fix, is subject to the mythical man month, and taints everything else it touches in your product. It can be worth accepting such a structural flaw into your product, for a time, but it's certainly not like asking for an extension of credit.

It's a struggle to come up with a better name, though.


Tech debt is more like being in debt to violent mobsters, who will show up at your house unannounced at the most inconvenient of times, during which they will hit you with a wrench until you pay whatever is in your wallet right now. They will return soon enough, because whatever payment you just made on the spot is never quite enough to settle the debt.


It's an incredible shame that we've accepted the "technical debt" metaphor so deeply into our vocabulary that we can't discuss the practical problem without devolving into interpreting the metaphor.

The practical definition is clear. It's a lack of maintenance of core abstractions. Leftovers from different designs, deeply embedded into our solution. Sometimes we plan for it to live in a corner somewhere, but more often we discover the failure of some abstraction all too late.

It doesn't work like debt at all. We didn't sign a termsheet, and we didn't calculate the impact. We need to move beyond this hopeless metaphor so that we can actually discuss what software needs, because right now we're staring at a field emptied of nitrogen by years of farming and saying it has a "resource debt".


> We didn't sign a termsheet, and we didn't calculate the impact.

Which is very much like going to the bank and trusting that it's going to be ok without looking at the numbers; or going to a mob loan shark, but it's actually Darth Vader and he'll make sure to alter the deal any way he sees fit.

Evaluating what tech debt is going to cost (through coupling and accidental complexity) is definitely part of the job. That responsible engineers yelling it's going to cost a lot are then ignored is part of the systemic cultural problem.


I don’t think I’d agree that financial debt is by definition structured given the concept known as “unstructured debt.”


This is not the right answer for a real business. Sometimes "we'll fix that later" is perfectly acceptable. If that was good enough for Facebook it can certainly be good enough for others.


And sometimes — quite a lot of the time, in fact — never fixing it is the way to go. It sucks, your users will keep complaining about it, and it'll nag you at the back of your mind forever. But if fixing things properly means somebody else ships first and you have no users and you get cancelled, that bug still wouldn't have been fixed.

But sometimes your product is so buggy nobody wants to use it. It's all trade-offs, just like… well, engineering.


"We'll fix that later" is at best a hopeful fantasy, and at worst a lie intended to manipulate. It doesn't happen.

We need to be clear here. If what needs to be fixed is something that will inevitably cause serious problems down the road, then it should be fixed immediately. If it may cause minor problems then maybe you should never fix it. And there's a lot in between.

The biggest problem is that bosses aren't in a good position to understand the code you're looking at. So if you're a professional, you'll make the call as to whether or not some tech debt is worth fixing now or not.

The boss's job is to communicate the urgency and importance of your task, how it fits in with the company's goals, etc. Your job is to carry out the task with a reasonably balanced ratio of quality to time.


> We need to be clear here. If what needs to be fixed is something that will inevitably cause serious problems down the road, then it should be fixed immediately. If it may cause minor problems then maybe you should never fix it. And there's a lot in between.

I feel like a lot of conversations around technical debt miss this nuance.

There are plenty of systems I’ve worked on that have been either (1) running for years without issue or (2) deprecated and deleted without ever addressing most of the “debt” issues/tickets that my team and I created in our tracking systems.

I think a lot of engineers take pride in their craft (which is good). Until you learn that a cabinet with a rough back is still a great cabinet, you’re doomed to feel uneasy about leaving something “unfinished”. So many people feel they left some “hack” in the code they want to fix. Which is different than a bug. There are endless improvements possible. YAGNI.


I'm sorry but no.

Something that will inevitably cause serious trouble down the road doesn't have to be fixed immediately. It can be fixed later. The proper decision for the business is whether the "interest" on waiting is worth it. There are many examples of this in the real world. It's simply not true that "fix later" is always a lie. What's more likely to be the case is that as new requirements are discovered there will be churn and it'll turn out it wasn't actually worth fixing this thing that you thought would cause serious trouble "down the road".

I'm a "boss" and I'm a software engineer. My bosses are also software people. My CEO is a software engineer. What I expect my team to do is to give me the tradeoffs. What I don't want them to do is make business calls. If we lose a huge business opportunity because we can't make a milestone with a customer, because an engineer decided on his own to spend weeks dealing with some corner case, and the whole company goes down the drain, that engineer is not professional.

Again we're not talking about situations where this would be unethical, we're talking about a tradeoff a business makes to go forward with certain limitations of some software in order to meet some business goal.


It's up to the engineers to factor in maintenance work as part of their estimates. Business people assume perfect software and react badly when confronted with the messiness of software development.

Also, Facebook being a real business is questionable.


Meta's market cap is $1.2T, it has $60B in cash, revenue over $140B a year... Not a real business? I mean, maybe the original team at FB should have made sure they didn't accumulate technical debt, and FB having failed would have been a better outcome for shareholders.

Engineers need to give the business the options. Not unilaterally decide that they're not going to accumulate technical debt. That's my point.

And trust me, business people don't assume perfect software (at least if they've been in business for any amount of time). What's maybe different about business people is that they a) look at things as a negotiation, b) want to get the max value for their investment, and c) have likely seen a lot of engineering projects from the outside. Engineering needs to work with the business, not go off and decide what's right for the business. This doesn't mean you don't have ethical responsibilities as a software engineer in many situations, but there are also many situations where the business needs to call the right balance between engineering effort and things like quality or feature sets.


I don’t think biz is in any position to decide that. They can tell us when something needs to be delivered by, and we can decide which corners need to be cut to achieve that.


The IEEE code of ethics

https://www.ieee.org/about/corporate/governance/p7-8.html

would seem to preclude working on a site like Facebook:

> 1. to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, to protect the privacy of others, and to disclose promptly factors that might endanger the public or the environment;

Given that sites like Facebook have been used to spread medical conspiracy theories (health of the public), that the whole business model is privacy violations, and that the site was used to enable the Rohingya massacre (safety/welfare of the public):

https://www.pbs.org/newshour/amp/world/amnesty-report-finds-...

But I mean an ethical engineer wouldn’t get in the way of a place like Facebook because they wouldn’t work there in the first place.


I'm not an expert in ethics, but to me this stance goes too far. The Internet in general is also used to spread medical conspiracy theories. If I work on network switches, is that also unethical? Compilers? The job of regulating this sort of stuff belongs to the government and the courts. There are laws concerning privacy and speech, and companies need to operate within those laws. I would say an ethical engineer should not break the law or help the company they're working for break the law.

I would interpret IEEE's statement, as it affects a single engineer, to mean e.g. ensuring that they follow best practices in protecting user information; for instance, if an engineer is asked by Meta to take some shortcut that exposes people's information publicly, that would be grounds for refusing on an ethical basis.

That said, as an individual you of course have a choice of where you want to work. I can see someone not wanting to work for Meta. I'm sure there are many ways in which Meta also supports safety and health; e.g. some (a lot) of useful and good information is also available on the platform. We should look at the totality of the company's impact, not just one aspect of it. That said, I agree that hate amplification and disinformation on the Internet are of concern. It should be up to governments to take action on that. I think Meta and other Internet companies have immunity in the US from being sued over content others publish on their platforms.


One reason to have an ethics code is that the law can’t possibly cover everything. Law is read in a fairly hostile manner (in the sense that people are looking for technicalities and loopholes), and also has to concern itself with enforceability, jurisdictions, etc.

Ethics are self-enforced mostly, and they are to be followed in good faith. An ethical engineer should not break the law, but nobody should break the law, that’s the bare minimum for existing in society.

If you thought the internet was overwhelmingly harmful and had little-to-no redeeming value, yes, working on network switches would be unethical.

> I would interpret IEEE's statement as it affects a single engineer to e.g. ensure that they are following best practices in protecting user information, e.g. if an engineer is asked by Meta to take some shortcut that exposes people's information publicly that would grounds for refusing to do that on an ethical basis.

I don’t see any particular reason to think they just mean exposing personal information to the public, when they talk about protecting privacy. I’d tend to assume Meta, like every other entity, is something we ought to protect people’s privacy from.

It is up to you to interpret the meaning personally. But nobody is going to punish you if you don’t obey the code, so why look for an out? Just don’t follow it.


Most unethical things are also illegal. I think the ethical code is reinforcing the idea of personal responsibility. If you're an engineer asked to build a bridge and you cut corners, that's unethical behavior; it's also illegal, and you will actually be punished in the event the bridge collapses. Being an ethical engineer in this context means pointing out that the bridge is not designed properly and might collapse, refusing to sign the plan, and making a complaint with the right authorities. Bribery and other similar conduct are also illegal and unethical. I guess there may be some gray areas that are not illegal but are unethical by some morals; an example might be a gambling company or a tobacco company. Tricky.


> Sometimes they add "We'll fix that later."

The answer to that answer is more questions:

- how will we fix it, properly, later when we can't fix it now? Won't there be more features to be done later?

- what if we fix it later and it creates its own issues and then someone will ask why did we touch it when it was working? If it breaks now, we can fix those breakages because we are already touching it.

- if later, as part of which release?

Don't actually ask these questions - they will piss off your boss even more.


You know I wonder… why do so many of us feel like we need our boss to understand?


Most of us work in organizations where the boss is a former engineer who understands the problem of tech debt perfectly well. If your boss is a former dev and they aren't taking it seriously the problem is usually that they're a bit too keen to say yes to other parts of the business.


Because the boss says they want to understand, drags you to endless meetings to understand, but then manages to make arbitrary decisions based on none of that information.


Perhaps it's not a close relationship. Perhaps the boss is insecure about his management skills and has to constantly check on the team, to make sure every variable is under his radar.


Because their opinion of us may contribute to how comfortable we are at work, how well we’re compensated, and whether we get promoted.

If they don’t understand what we’re doing or why things take as long as they do, that’s bad for us.


That's all well and good until you have to explain that the reason something they think should take a week will take 6 months is that you have to fix tech debt, or avoid adding new tech debt.


A professional balances the need for quality with the urgency of the task. Your example shows a clear imbalance that indicates the person is not a professional; that person is someone who can't be trusted with that balance.


Sure, you need to balance urgency and quality, but that doesn't mean you shouldn't ever communicate about that tradeoff to stakeholders.

Let me give a somewhat concrete example. Someone from business came to me to ask how hard it would be to fix a bug. It wasn't super urgent. The user experience of the bug wasn't too bad, a minor annoyance. But it seemed like such a small thing that they thought the fix couldn't take more than a few days. When I informed them it would take weeks at a minimum and more likely months, they were very confused, until I explained that a design made in the early days of the company (before I worked there) made this particular bug extremely difficult to fix. We would either have to redesign an entire system in a more robust way, or work around the limitations of the current system, which would be faster, but still much more difficult than it should be. And doing so wouldn't help at all with fixing the myriad other problems caused by this design choice.

And although I wasn't involved in the original design, I think at the time it wasn't a terrible decision. It allowed the company to get something out that worked pretty well, and helped the company grow quickly. But now that technical debt was slowing down other development.


One of my frustrations is how people often seem to drift towards one of the extremes on either side of this argument. Technically, the approach described in this article is called 'pragmatism', but in practice, people have used that term to describe the bad kind of corner cutting.

And whenever you're arguing against someone drifting too far to one of the extremes, you'll often get lumped in with those on the other: to a perfectionist, you'll be considered a lazy, sloppy worker, whereas the corner-cutter will consider you to be super pedantic and not the type to 'get things done'.

On the other hand, when you do get to have a proper discussion on whether a corner is worth cutting — i.e. if it will actually affect the user's experience — and you end up being able to save time without materially affecting the end product or future maintenance work, that is enormously enjoyable.


IME, even the word "perfectionism" is weaponized against those of us who actually mind the risks, results, and sustainability of work, beyond just doing it for the sake of completion and compensation like the other group. Why is it safe to "denounce" perfectionism in the workplace, while the opposite, calling out a shitty job, is seen as offensive and in bad taste? "Perfectionism" has become shield and shelter for the lazy and mediocre.


Neither label need be used: if someone thinks the effort isn't worth it, you can have a discussion about which risks will or won't actually play out in practice without having to label anyone anything.


Exactly. But for the managers who don't have skin in the game and only care about cost and deadlines, once someone denounces your diligence efforts as perfectionism, they will be biased against your steering proposals. That's the real problem, especially in tech, where explaining trade-offs to outsiders is not always trivial without much oversimplification. I've been in meetings where even attempting a high-level discussion of an unrealistic scope was taken as unnecessary problematization. Let alone the fact that cutting corners and costs is generally mindlessly rewarded due to lack of vision and an absolutism towards short-term gains.


Because workplaces are trying to make money. We want the most impact for the least effort.

There is absolutely a place for the kind of craftsmanship you are describing, but if you don't understand the incentives of your workplace, you are in for heartache.


Oftentimes managers will make demands that interfere with ownership of the code. Such as requiring something to be finished in half the time it would take to do it without technical debt. Over and over and over again.

I have had a lot of clients like this. It's very common, especially for lower-budget projects. You either cut corners you don't really want to cut, or you get replaced by someone who will.

They will often also insist on certain technical decisions like choice of programming language or framework.


More often than not, you have more than one possible implementation. The article said to factor the constraints into the choice of solution, not into the solution itself.

Like if it would take 6 months to implement a backend, but just 1 month with Firebase, and time is more important than data ownership, you go with Firebase. Then you prepare a plan in case you ever need to migrate off Firebase. If you don't have time to do a custom UI, you go with a component library. That also means knowing the range of possible solutions (aka mastery). If nothing else, you reduce scope.

There's a plethora of ways to respect time/monetary constraints without resorting to shoddy code and duct tape.
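The "prepare a plan to migrate off Firebase" idea usually amounts to hiding the vendor behind a narrow interface. A minimal sketch, assuming hypothetical names (`DataStore`, `InMemoryStore`; a real `FirebaseStore` would wrap the vendor SDK, which is omitted here):

```python
from abc import ABC, abstractmethod
from typing import Optional


class DataStore(ABC):
    """The narrow storage interface the rest of the app codes against."""

    @abstractmethod
    def put(self, key: str, value: dict) -> None: ...

    @abstractmethod
    def get(self, key: str) -> Optional[dict]: ...


class InMemoryStore(DataStore):
    """Stand-in backend; a FirebaseStore would subclass DataStore the same way."""

    def __init__(self) -> None:
        self._data: dict = {}

    def put(self, key: str, value: dict) -> None:
        self._data[key] = value

    def get(self, key: str) -> Optional[dict]:
        return self._data.get(key)


# Call sites depend only on DataStore, so migrating off the vendor later
# means adding one new subclass rather than rewriting the application.
store: DataStore = InMemoryStore()
store.put("user:1", {"name": "Ada"})
```

The point is that the constraint (ship in 1 month) shapes which backend you pick, not whether call sites are coupled to it.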



Now just add another one and you have the conjoined triangles of success.


A cabinet maker has the option to say no because they're the one in charge of accepting the job AND implementing it. At least that's the way the article made it sound.

Most of us are cogs in the machine and are given much less autonomy.

So what is the "professional" to do if given a job and insufficient time to do it?


Our profession doesn't always work the way you describe. Some companies want cogs, some want professionals.

I can't give you advice beyond that, there are too many different circumstances. But in most of the places I've worked, it's been developers, not managers, cutting corners. Sometimes these are good choices, informed by product context and trade-offs. Sometimes it's a perceived urgency that doesn't actually exist. But it's almost never a micromanaging boss asking for something specific.


> But it's almost never a micromanaging boss asking for something specific.

They don’t ask people to cut corners specifically, they just create a culture where speed is paramount above all else. Where the only metric of success is tickets completed per day.


I'm sure these managers exist, but there's plenty of places where they don't. My recent experience has been that the managers I work with, when asked, almost always say yes to extending technical work to do a better job and that it's inexperienced developers who either assume they can't have more time or don't know how to ask in an effective way; these are professional skills after all.

This doesn't mean that there's never a need to hurry, external deadlines exist, but these are the minority in a healthy organization.


Yah. Article’s advice about “owning the implementation” is great and all, but limited to places where you’re not blocked from or fired for doing it.

Notably, from my small sample size (only small startups) - I don’t think I’ve known anyone who was fired for doing it. So it seems like it’s not a thing we can do… but I bet it’s something we can do in a lot more situations than we think.


Advise them of the issue. If they still want it, their funeral.

Anytime I move away from this frame it's not worked out for me.


Exactly. If you choose the approach with no tech debt but takes twice as long, how are you going to hide that from your manager?


Articles in this spirit piss me off so much, and they're increasingly becoming trendy. I hate it because they promote the wrong ideas. Downscaling and re-scoping a project is NOT the same as corner cutting. You can almost always come up with an execution strategy for doing a good job while utilizing the available resources to perfectly fulfill the agreed-upon requirements. The notion of what's perfect is always relative.

I've seen and experienced first hand, over and over, people dismiss basic risk management, due diligence, and responsible planning and execution as "perfectionism". Everyone likes durable, safe, and robust products for themselves. Everyone appreciates delightful and frictionless experiences for themselves. But somehow, it's only a minority that actually cares about pursuing such qualities for others.

I'll repeat what I've said in another comment: denouncing "perfectionism" has become shield and shelter for the lazy and mediocre, and poorly worded articles like these just serve to validate and excuse poor craftsmanship. So before trying to weaponize "perfectionism" against those who actually care about results, people first need to identify their own shortcomings and biases.


Perfectionism in a beginner can often mean focusing on the wrong thing, spinning wheels and going down rabbit holes, leading to burnout.

Perfectionism to an expert often means the opposite. The difference is the expert knows what is worth time and effort.

Somewhere in the learning process, a beginner must become intermediate, and then become an expert.


I agree with everything in this article. However, software developers may be craftspeople, but an infinitesimally small fraction of them are treated like craftspeople by the businesses that employ them.

The reason software engineers want to push decisions to their managers is that their managers believe in nonsense like slicing up tasks into their tiniest constituent parts and meticulously estimating how much time they will take. Practices like lean and scrum and kanban are managerial conventional wisdom about how to manage a factory, not craftsperson conventional wisdom about how to make good things efficiently

As long as the people running the show are going to hold their people responsible to fiddly metrics and demand to know every detail of decisions that affect how their software people are spending their time, they will act according to those incentives, avoid making decisions where possible, quibble about every little detail

Software should be a craft. Treating it that way produces better software. In order to foster the mindset of a craftsperson, you must treat developers with the trust and respect that we give to craftspeople. Most modern humans don't even have a script for interacting with craftspeople these days, and business jocks drunk on rebranded Taylorism certainly don't apply one when managing people they view as a means to a better valuation.


> A professional developer does thorough work when it matters, and cuts irrelevant corners that aren’t worth wasting time on. Extremely productive developers don’t have supernatural coding skills; their secret is to write only the code that matters.

IMO the true mark of a professional, a truly talented engineer, is knowing which corners to round off before they cut you. Anyone can cut corners and leave the world full of problems for the next guy. But then the assumptions change. A queue gets really full, or we start needing UTF-8 for emojis, or someone wants to rearrange the field order in a CSV.

A great engineer would make a system that just works in those cases, because (at least in Go, Node, or Python) they can be implemented in just as many lines, with code no more complex; the only thing required is foresight (or its cousin, experience). Many will say YAGNI, but in my experience these things almost always come up (and I'm sure there are many others). Sometimes being a great engineer means reading between the lines of product designs, past sevs, and experience to figure out what the real feature ought to be.
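To make the CSV example concrete, here's a minimal Python sketch (the data and field names are made up): reading rows by header name instead of by position costs no extra lines, but survives someone rearranging the field order.

```python
import csv
import io

data = "name,email,age\nAda,ada@example.com,36\n"

# Positional access: silently reads the wrong field if columns are reordered.
rows = list(csv.reader(io.StringIO(data)))
email_fragile = rows[1][1]

# Access by header name: same line count, robust to column reordering.
records = list(csv.DictReader(io.StringIO(data)))
email_robust = records[0]["email"]

print(email_fragile, email_robust)
```

Same effort up front; only one of the two versions turns into a bug report when the upstream export changes.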


This verges on "overly general / building for the future" territory. Perhaps that is not what you meant.

There is a fine line where I do agree with you. Cases such as creating a relation table for categories, when you could have made a string array field instead. Or structuring code in a way where you are not preparing for the future, but also not painting yourself into a corner.

Examples such as that do come to mind. But as I said it's a fine line.

This becomes exceptionally tricky when you are building towards a vision, but are only 40% there, and having the code structured for that vision is hard to shake.


The given examples of utf-8 support and reading headers from CSVs are things that (depending on stack etc.) can be nearly or actually free if you build that way from the start. I read the original post as saying something like "good engineers don't shoot themselves in the foot as much".


One challenge is when, e.g., you're doing a code review and see that someone set up a non-utf8 column. In that specific case it's probably easy to fix, but it's still strictly extra work that wouldn't have existed if it had been set up correctly in the first place.

I struggle with making that trade-off in terms of what's worth pointing out. Often what I'll do is point it out, but with an explicit disclaimer that I'm mostly pointing it out because it's good to know for the future, but that it's not a blocker (for larger chunks of work).


Yeah, that's the kind of tradeoff I have a hard time with as well. It's like .. well, you don't strictly need to.. but why not?


It’s much easier in biz software, where what the customer wants is very likely the same as what everyone before them wanted, even if they haven’t come up with all the requirements yet.


Am I hallucinating, or does anyone remember the Steve Jobs story as the exact opposite? He used backs of a cabinet as an example of when it’s okay to cut corners, since nobody would ever see it. I don’t remember where I heard it or what the context was, but it was something along the lines of him trying to get one of the early Mac engineers to ship faster.

There was a lot of corner cutting to get the original iPhone launched. They barely made it in time. The parts that mattered were polished to perfection, but you wouldn’t want to use an iPhone 1 today — all the other parts you take for granted now would be painfully visible then.


You are correct. Steve definitely believed in doing things properly because you know they're there, regardless of who can see it: https://folklore.org/Signing_Party.html

> Steve came up with the awesome idea of having each team member's signature engraved on the hard tool that molded the plastic case, so our signatures would appear inside the case of every Mac that rolled off the production line. Most customers would never see them, since you needed a special tool to look inside, but we would take pride in knowing that our names were in there, even if no one else knew.


That was done in Amiga 1000 cases around 1985. I would be surprised if they were the first.

https://www.commodore-info.com/computer/item/a1000/en/deskto...


The parent folklore.org link mentions this happened at Apple in 1982, end of first paragraph.


And Jobs visited Amiga in 1983, which also suggests the idea came from him.


I think you got it wrong. That's why the motherboards on Macs, for example, are "beautiful" and black.


A cabinetmaker creates the polished front and the unfinished rear. A software/hardware company needs everyone being part of the polished front at least in spirit. I liken it to people dressing for their job.


I haven't used an Apple product in some time, but to me it always felt like their software was the unfinished rear.


lol it's true but every single tech company does it. Most DIY PC "gaming" computer hardware companies do it with their motherboard firmware/software. Microsoft does it with practically everything because their customer isn't the end user, it's the business. Apple also does it in their firmware/low end software (sometimes making laughably goofy bugs/oversights where the edge case is "you did something apple doesn't do by default").

You see it an awful lot with internet routers lol.


I think that's two very different things - a complete set of features done poorly vs. an incomplete set of features done well. In general it's best to err on the side of the latter, it's going to give the customer a better UX.

What exactly was the original iPhone missing? The only thing in the OS I recall was copy/paste. The other things we'd take for granted today likely weren't conceived then, including the App Store. The main issues were OS/processor performance (it was a frustratingly slow experience) and EDGE, which, though the best widely available network at the time, was insufficient.


> The other things we'd take for granted today likely weren't conceived then, including the App Store.

IIRC, one could get apps on other platforms like BlackBerry, even over the web. Apple's anti-innovation was the walled-garden storefront.


Better yet, you could download apps and transfer them to your phone manually, or download them from a webpage even - straight onto the phone. It's really sad how something so easy and straightforward was eliminated in the name of "security" lol.


It didn't have 3G IIRC


But they had legitimate reasons for not including it: lack of network availability and poor battery life for the modems at the time of release. It's not something they could have innovated around.


“When you’re a carpenter making a beautiful chest of drawers, you’re not going to use a piece of plywood on the back, even though it faces the wall and nobody will ever see it. You’ll know it’s there, so you’re going to use a beautiful piece of wood on the back. For you to sleep well at night, the aesthetic, the quality, has to be carried all the way through.”


Not only would a professional carpenter use plywood, it may be a sounder choice because of its stability-to-weight ratio.


Yeah, it's a weird metaphor, especially since plywood is a premium, structural material. I wish my Ikea furniture used plywood. Instead, you get fiberboard.

It's also a good example of how the metaphor this blog post is predicated on can fall apart. Your customers care about durability, but they make purchasing decisions based on cost and outward appearance. In a world like that, where you can't satisfy all requirements at once, you inevitably end up cutting corners on the things the customer cares about but can't measure. But is that right? That's how you end up with $200 furniture that lasts a year or two in a home with children or pets.

It's also easy to neglect cumulative costs. Back to software: does it matter if your app uses 100 MB of memory when it could be using 1 MB? On an individual basis, no, because RAM is cheap. Cumulatively, when every other app developer thinks the same, and when you multiply it by billions of devices, your decision might have actually cost lives if you consider the increased emissions and countless other distant externalities.

A milder version of the blog's claim is definitely true. You should pick your battles. But it's all about trade-offs, there are few problems that truly don't matter to anyone.


The point isn't about utility, it's about aesthetics - it's about how the product is finished. What you use in the back should be the same as elsewhere.


Even furniture from The Old Masters doesn’t have beautiful veneered back panels, but plain, unfinished boards.


Sure, but I'm just explaining Steve's reasoning here. People are saying it's a bad analogy without understanding what he's actually saying.


On the back of the Mona Lisa should be ... another Mona Lisa?


A plywood back would be prone to water damage, however


The best option is plywood for the dimensional stability, varnish for waterproofing, and solid wood edge trim to give some resilience against mechanical damage that would otherwise risk delamination. This also hides the ugly part of the plywood.


Thank you for finding the quote. I stand corrected. I think I misread this at the time as him saying that no one will see it, so it doesn’t really matter. It’s an interesting experience to wake up one day and see that the meaning of one of those old stories was exactly the opposite of your takeaway.

In hindsight there does seem to be some truth to this. What surprised me about the Hacker News codebase is that pg almost never cut corners. To this day I still find new features I never realized I wanted. On the other hand, pg never coded anything unless it was absolutely necessary for whatever he was trying to accomplish (or at least he presented his few public pieces of code as such), which is something I’ve had trouble emulating. Coding is just too much fun sometimes.


The correct answer is: I'm not, you're not either.

Oddly enough, I once asked a carpenter why he was using plywood. His response: I'd prefer not to, but this is what people want nowadays (meaning this specific customer).

If you are a software engineer, you, with pride, rewrite half the company in Rust and the other half in Elixir. Tell the board you won't be able to sleep.


I don’t think software is like a cabinet at all. A cabinet is a well-understood thing, the main job of the cabinet craftsman is to implement the existing cabinet idea.

Cabinet makers don’t do sprints because everybody knows what a cabinet is, there isn’t any need to aggressively iterate on the design.

The software equivalent to buying cabinets is buying existing software. You don’t employ a bunch of engineers to install cabinets.

If you went into a workshop with a bunch of mechanical engineers and asked, "I need you to reinvent the idea of holding plates," that might take some sprints. It would be a silly thing to do, but it would make the analogy fit.


“Any fool can build a bridge that stands. It takes an engineer to build a bridge that barely stands”.

Absent trade-offs I can do anything.


or as I like to tell my product folks "I can build you anything a computer can do, but you might not like the cost"


The ideas aren't as universal as they are made out to be. "Technical tradeoffs" are often not just technical, but also a business decision. They can have significantly different risks, costs, and implications for business strategy. You may not have the insight into these things that someone who makes it their job has. The "tenons" comparison frames these decisions as trivial details that happen in a vacuum, which smells like a straw man. There are generally more factors to consider in tech decisions, and unlike cabinetry there is often a great deal of uncertainty in how they will play out in the future. This advice could be reasonable in the right company and role, while completely setting someone up for failure in another. There's irony in the article being about corner cutting and tunnel vision.


I like the last paragraph. Will remember that when an engineer slows down a project talking about code smells and revisiting architecture decisions.

> Professional software developers are performing a service for others. That’s the difference between a professional and a hobbyist or an artist.


> code smells and revisiting architecture decisions

This is exactly what we're dealing with on top of shipping new features. An architectural decision that got changed 6 months later so the code had to be thrown out and re-written.

Don't blame the engineers, blame the architects. If you have any.


Why blame anyone at all? Did everyone work to the best of their knowledge and beliefs? If not, why? And if they did, how are they to blame?

I make incorrect decisions all the time. I will continue to make them. Because I do not shy from making decisions and neither should anyone else.


You’re correct, I was thinking of a specific software architect when I wrote that.


There is certainly a balance to be struck. Are the code smells a structural issue in a piece of software that's supposed to be used and modified for a long time? Then those code smells endanger delivering the service in the medium term and should be addressed.

But are we talking about a one-time migration script or a fad mobile app? Then who cares, the service will be delivered regardless of the quality of the code.


There are people working in software who will _always_ want to refactor something into "clean code", no matter how many times it has been done before. It's just never going to reach a point where that person says "I'm happy with it, let's get back to building features". If you want to build features (at all!) then you have to sometimes tell that person "no, we're not refactoring this again".


I'd probably drop the "or an artist" in the retelling though. Lots of artists are definitionally performing for others.


Found this particularly tricky with hobby projects around cost. I can spend a bit more time on clever engineering to fit into the free tier, but what does that save me? The price of a coffee? For loads of extra complexity and time.


Cabinetmakers were focused on what their customers cared about.

More likely, they focused on what their customers would pay for. People with Steve Jobs money might be willing to pay a premium for finished cabinet backs.

Most people don’t have Steve Jobs money. But people without Steve Jobs money are no more or less likely to appreciate finished cabinet backs than people with Steve Jobs money…it’s not hard to imagine Charlie Munger having a mid-western view on finished cabinet backs.

Since end users usually don't see any more or less of steaming-turd code than of well-structured code, there must be something besides "cabinet making" at work. My gut is that strong opinions about code are a way to make the banality of code writing interesting. Strong opinions add melodrama to the mundane work we all must do, work that doesn't matter much beyond the paycheck that comes with it.

Getting paid is the meaningful metric of professionalism.


> If the technical debt is a problem, 1) we shouldn’t have put it in there

I guess it depends on your definition of tech debt, but there are lots of examples of things that aren't tech debt at the time but become tech debt over time. You can make the best decision possible today, but in six months that npm dependency (for example) is still going to be incompatible with that other npm dependency you needed to upgrade, and you need a few days to figure it out.

A cabinet maker might not put "make tenons straight" in a sprint, they would include it in the estimate and not tell the customer, sure. But they would definitely tell the customer that their lathe tool broke and they needed a few days for repair before continuing work.


> You can make the best decision possible today, but in six months that npm dependency (for example) is still going to be incompatible with that other npm dependency

This is a solved problem. You either understand the dependency enough to maintain it yourself, or you firewall it off through interfaces. Or you tie yourself to sensible projects that give you time to upgrade. Or you pay for support. Every decision should be taken with a complete understanding of the pros and cons, and risk management to reduce the latter.
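As a minimal sketch of the "firewall it through interfaces" idea (all names here are hypothetical), application code depends only on an interface you own, so swapping or patching the underlying library touches a single adapter:

```python
# Sketch of firewalling a third-party HTTP client behind an interface we own.
# The rest of the codebase imports only HttpClient, never the library itself.

class HttpClient:
    """The only surface application code is allowed to depend on."""
    def get_json(self, url: str) -> dict:
        raise NotImplementedError

class FakeClient(HttpClient):
    """Test double; a real adapter would wrap requests/httpx/urllib here."""
    def get_json(self, url: str) -> dict:
        return {"url": url}

def fetch_user(client: HttpClient, user_id: int) -> dict:
    # Business logic is written against the interface, not the library,
    # so an incompatible dependency upgrade is contained to one adapter.
    return client.get_json(f"https://api.example.com/users/{user_id}")

print(fetch_user(FakeClient(), 42))
```

The cost is indirection, which is exactly the trade-off being debated here; it pays off only for dependencies risky enough to be worth containing.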

This cabinet maker should either have a backup tool or be confident they can do the repair in a way that doesn't impact their customer.


You do need to understand the pros and cons, and wow there are some incredible time and complexity costs to maintaining all dependencies yourself, or trying to firewall all dependencies through interfaces.

I wasn't talking about blindly doing stuff without understanding consequences and risk managing. I was rebutting the point from the article that you shouldn't ever have tech debt. You will, no matter what choices you make today.

Firewalling all interfaces to dependencies for example can be future tech debt if you need to move fast to stay ahead of competitors, but all the indirection is making development too slow (or causing developers to find other more enjoyable work). Can't imagine working on a React codebase where React is completely abstracted behind our own interfaces to all of it.


> I was rebutting the point from the article that you shouldn't ever have tech debt. You will, no matter what choices you make today.

I agree

> wow there are some incredible time and complexity costs to maintaining all dependencies yourself, or trying to firewall all dependencies through interfaces.

I'm not saying you should firewall everything. But if it makes sense, you do so. You also need people who understand each core dependency well enough to patch it if the maintainers won't resolve an issue (this happens more in the npm world). Even with React, you can extract the business logic and reuse it elsewhere. The goal is to avoid depending too much on projects that have less stability than yours (some bad experiences here with React Native). How you do it is contextual, but you should keep it in mind.


Ah I see now, you took issue with the example specifically, rather than the point. You're right, I see how it's a bad example, and what you've laid out is a very good way of handling dependencies.


In a lot of organizations I feel like #2 is often the difficulty. However, interestingly there is an opposite phenomenon where people are too focused on #2 (understanding customer needs) and thus fail at #1 (owning implementation).

I've seen it happen where non-technical users of internal tools request very specific features that get built, at whatever cost, without the team asking what actual problem the feature solves and how best to address that problem given knowledge of the technical side. It's often requests for tools that let people do things manually when the team should just spend some time automating away the need for manual intervention.


> A professional developer does thorough work when it matters, and cuts irrelevant corners that aren’t worth wasting time on.

The problem, it seems to me, is that, while a furniture maker may have a reasonably stable idea of what their furniture will be used for, I think the makers of truly useful software can have no reasonable idea of the uses to which their software will eventually be put, and so must design so that the program can accommodate the demands put on it, no matter how unexpected or bizarre. How many modern security issues (I ask rhetorically) stem from design decisions rooted, explicitly or implicitly, in the assumption that the software would never be exposed to hostile actors?


There's a reason everyone knows who Steve Jobs is and no one knows or cares who the vast majority of these corner-cutting "professionals" are


Everyone knows that one aspiring Austrian painter and does not know most other heads of state. I'd say not falling into disrepute is a clear secondary goal for many corner cutters.


Haha, fair point. Anyway, I reckon that for Steve Jobs that attitude was at least as much a branding strategy as a sincere personal conviction. Though then again, it's not so easy to say where Steve Jobs ends and the iPod begins.


Craftsmen work alone and are responsible for the entire product. Most software developers work in teams, sometimes large teams.

The entire Craftsman movement under Ruskin, Morris, etc. started up because of the move of craftsmen into factories, where they worked on parts of the whole and could therefore not be craftsmen anymore.


I remember a customer who would often sigh during meetings and repeat "we're not building the Space Shuttle here, guys" whenever one of the engineers would obsess over an irrelevant detail.

Exactly what I think of when I read this.


> A professional developer does thorough work when it matters, and cuts irrelevant corners that aren’t worth wasting time on.

I think the most pertinent question here is: whose time?


I'm going to have to disagree with the entire premise of this piece.

If you are a cabinet maker you have many individual clients who all want what you're producing. They each have their own budget and preferences that you can work with to get them something they want to buy. The incentives and constraints are clear here. If you can make something the client wants at a price and quality point they can afford you will do well.

Many (most?) software developers create a product to satisfy the needs of many customers and non-customers simultaneously. Furthermore you almost never interact directly with customers; instead developers interact with a panoply of different business interests such as engineering managers, product managers, project manager, product owners, etc. Even worse software doesn't have to be done to ship it as it can always be fixed "later". With the advent of Agile, Scrum, and SAFe it's clear what the business wants is not someone good at a craft; they want an assembly line.

So what are the incentive structures and constraints here? Every other person has incentives for career advancement, bonuses, raises, etc. Most people are too far removed from customers to be directly affected by them. How many times has a cabinet maker been told that the hope chest they're working on for one client also needs to double as a bank vault? Oh, and by the way, it needs to be done by the end of the quarter (right after layoffs, of course) because they are hoping to be bought out. Code bases end up being a fractured mess of bad abstractions, infinite abstractions, and rewrites because developers are trying to accommodate the impossible demands set forth by the business in a way that doesn't make their job abject misery.

TLDR: most software developers are not expected to be craftsmen (or women). The company determines, through incentives and constraints, the quality of code it will produce. An individual software developer risks burnout if they think quality above and beyond what the company allows is within their responsibilities or capabilities.


What I've seen mostly is developers bringing in complexity, then cutting corners afterward because they don't want to deal with it. In Coders at Work, Douglas Crockford said to spend the sixth cycle - whatever the cycle is - on refactoring. It's sound advice if you care about technical debt as an engineer. Spend some time revisiting the code architecture to see if it still fits the problem you're solving, instead of spending three cycles on something that could have been done in one because of how fragile everything is, or having a long bug list that is tied to the same root cause.

Sometimes it means redo your React marketing website in Node and EJS (or the equivalent) instead of trying to make it SSR too.


> In Coders at Work, Douglas Crockford said to spend the sixth cycle - whatever the cycle is - on refactoring. It's sound advice if you care about technical debt as an engineer.

My point is that very often it's not up to engineers. If a company doesn't incentivize this refactoring and engineers have to - as some sibling comments suggest - inflate their estimates then the code base will deteriorate over time. Even if your team sets up a policy of doing refactoring ~15% of the time this will be overridden by business interests more often than not.

This is essentially a corollary of Conway's Law. You should not expect code bases to be better than the business incentivizes it to be. I'm speaking from personal experience here; this is burnout territory. Keep in mind these are very general observations and every company is different. In some companies what you and Crockford suggest is possible. I'd wager it's the exception rather than the rule though.


I really love that Monty Don's in the hero image of the referenced Amy Hoy post (man on far right)


TL;DR.

Cutting corners sometimes makes sense -- that's why timeboxing is important. But always?



