For example, I worked with Bill Schelter (GNU Common Lisp). In 1980, prior to the Internet, we were working on a problem (tail recursion). He was using the emacs editor and he found a bug.
Bill stopped what he was doing, fetched the sources for emacs (using ftp), found the source of the bug, fixed it, and sent off a patch. After that he restarted emacs and continued right where he left off. It was a virtuoso performance. It was like watching Victor Borge (https://www.youtube.com/watch?v=dKeqaDSjy98).
I know a guy who plays guitar. He can listen to a performance and repeat it as easily as I can type what I read. He can transpose keys, add riffs, change tuning "on the fly", and even play thru with a broken string, adjusting chords to the new "5 string" form. He also knows guitar characteristics, pedal effects, etc., basically "all things guitar".
Great software engineers I've known seem to have complete command of all of the ideas, tools, and techniques. They understand their tools "all the way down to the metal" and all the way "up to the esthetics". You can see great people in all crafts (like woodworking).
They also write beautiful "crafted" code. I'm keyboarding a FORTRAN punched card deck (for an historical museum). You can see from the code that the author wrote clean, careful, and clear code. You can almost feel how good he was.
Greatness isn't about "team player", "hard working", "good at time estimation", or anything "social". Greatness is a property of the person, a deep love of the craft, and a focus on "doing it right".
>> Greatness isn't about "team player", "hard working", (...) Greatness is a property of the person, a deep love of the craft, and a focus on "doing it right".
In your history I see a great deal of "hard work" and "team player". Moreover, to be able to perform like that, I'm quite sure he had been working hard for a long time. Of course, there is talent and virtuosity, which I'm not downplaying. My point is that when people see such a virtuosic performance (in any field, like you mentioned), where there is clearly a lot of talent, there is a tendency to downplay the hard work.
However, hard work doesn't always produce greatness. It would be misleading to say that all hard work is equivalent. It's not. It's synonymous with the phrase "practice makes perfect." Yet the phrase should really be amended to "perfect practice makes perfect." Aimless, bad practice will lead to aimless, bad results.
Mental focus and barricades are huge components as well. The greats are just as human as the rest of us. Comparison is mentally debilitating for everyone, especially if "talented" individuals have a higher rate of growth than the average. Yet I have to wonder, does the individual who "desires greatness" aspire to achieve validation for ego-boosting fictional accomplishments, or truly wish to push the edge of their capabilities to their respective human limits? The former have no chance at putting their energy into prospective efforts. The latter may continue to develop and create something the world hasn't seen before, and/or realize that they can do things they had believed they couldn't before.
Perhaps a talent in an area is simply the ability to use your practice time on those things that actually take you further.
I do notice the paper defines it as "someone who is willing to work for more than 8-hour days to deliver the software product". I'm thinking: meh. I think hard work at some point is a requirement for achieving excellence in most disciplines. In something like martial arts or music, pretty much everyone I consider to be great has spent some years where their whole life was the discipline. I'm not really aware of anyone who got there by raw talent without the practice (there might be examples). My experience with software engineering is similar.
However that's not the same as saying you'll become great by working 10 hour days, i.e. that sense of "hard working". You need to be in the right mindset.
Also, this sort of folklore is cool, but you have to consider the context: great musicians are not necessarily great in any style of music or with any instrument. That said, some people are surprisingly skilled in broad areas, and skills do tend to transfer within a discipline to some degree. On top of that, to the novice, an intermediate may be considered great. You really need to have some perspective to be able to recognize greatness and appreciate it.
EDIT: random btw is that most 6-string guitar chords work just as well with a missing string. Most simple chords have 3 notes in them, so with 6 strings there's enough redundancy. If you know where the root, third, and fifth are in the chord (which most intermediate guitar players do), you'll immediately know whether the missing string removes any of them (and even if it does, the chord will still sound fine; you don't necessarily need all 3 notes). Transposing on the fly is a little harder, but you don't really need to be a master to do that... So stuff like that may seem like magic but isn't necessarily so... Playing by ear is cool; usually people who start playing young are better at it, especially if that's how they started. This skill, to some degree, is also reasonably common...
Of course the proliferation of free engineering tools makes it possible for them to have two lives, one doing the "hard work" that satisfies colleagues during the day, and the other "hard work" that develops talent at night.
I've also seen many young engineers plunge straight into "hard work" consisting of running the CAD terminal and fighting fires, and those engineers were recognized as being productive but lost their fundamental knowledge and did not become great engineers.
TLDR: Scott and his younger brother were both enrolled in piano class. His brother took to it and is now an internationally known musician. Scott meanwhile started off trying equally hard but never got anywhere in music - but won writing competition after writing competition without it feeling like effort to him.
To be the top of the top, you must have natural talent and nurture and bolster that with hard work. The common denominator of all the exceptional people is hard work.
In my experience the most brilliant people aren't brilliant individually or brilliant as a team player or brilliant at X or brilliant at Y, they're simply brilliant. The one exception to this observation, in my experience, is that I've met a few brilliant mathematicians who are very narrowly brilliant.
It would be also inefficient for you to spend time learning how the tool works, setting up the environment etc. when there are people who know it already.
That would be far more difficult if the source code was in a poor state. Some libraries are easier to patch than others, I can think of a couple of libraries I've wanted to quickly patch and soon (not soon enough) realised that it's unpatchable because it's a mess (or perhaps even worse, is too clever by half and has far too much indirection and not enough comments) and would need extensive surgery.
That's not to diminish the strength of your story about Bill Schelter but only to point out that it's easier to do your best work if others are doing theirs.
In your example, this Emacs bug fixing: there are a lot of people who work hard but would just assume that the bug is probably difficult to fix (they would fear it taking too long) and decide not to attempt it.
I know personally how much anxiety can block you from doing things that help you progress. It also makes decision-making slower. I think playfulness can help to alleviate it, but it's hard to maintain in a "professional" setting.
Do people like this exist today? I mean, the complexity at every level of the stack seems to have grown exponentially (maybe?) for decades. Are there engineers who are experts covering every layer across modern hardware architecture, CPU instruction sets, modern operating system architecture, cybersecurity, networking protocols, web technology, database systems, file systems, cloud infrastructure, quality control, data science, AI, AR, VR, machine learning, UI framework design, wireless interfaces, etc.? And that's all before you even consider the non-technical skills of software engineering. The bar moves higher every day, and I'm not sure this level of expertise is achievable/maintainable.
Let's say a software engineer writes a C++ application that stores documents in an inverted index and he also takes care that the program is stored in a docker container and deployed to Elastic Kubernetes Service via Jenkins. Good luck understanding your IDE, C++, Tries/Radix trees, docker, Jenkins and EKS "all the way down to the metal".
It may be possible for a small subset of those but if the expectation for great engineers really is to deeply understand "all that they touch", I wonder how many can keep up with this standard.
Edit: Maybe I read too fast. Anyways, intuition is sort of the precursor to greatness. You know those designers that pick out colors innately? No amount of color theory studying will get you to that level. You are probably describing someone with an intuition for debugging.
One of those things in life everyone has to learn eventually. Don’t fight the tide, find the waves that are certainly for you.
Obviously no Microsoft employee is going to say something like "Lazy liars who treat people like crap make great software engineers."
At the very least you'd expect a study to have a control group in order to filter out these useless questions. For example send a similarly worded survey to a group of grocery cashiers and ask them if they think attributes such as "Hard work, honesty, integrity, long term thinking, being a team player." make someone a great engineer.
My hypothesis is you'd end up getting similar answers from taxi drivers as you would from Microsoft employees and consequently that there isn't much value derived from this survey or their selection of participants.
Anyways, I'm by no means an expert and could be misjudging this research but frankly it looks to be beyond useless.
Side note: I also thought the negative phrasing methodology used was interesting. They asked “is it possible to be a great engineer without this quality?” which I think is most tuned towards qualifying and ranking attributes.
Will it tell you individually how to become a great engineer? No; there are hundreds of posts, many on medium.com, for that. This paper may not be perfect, but I found a lot to consider inside, and “beyond useless” is pretty far from my judgment.
*or cab driving
This is standard language in academic research papers. It is there to sell the importance of the research to the reader, in particular to journal editors or peers who review the article. It is mere puffery.
I wanted to tear my hair out. Not only was the ultimate deliverable of this gargantuan activity an infographic of largely non-falsifiable random words (trustworthy, reliable, predictable, etc.), I did a quick back-of-the-envelope estimate that the approach of repeatedly soliciting feedback on every employee from every business line and every customer they intersect could generate 5 trillion outbound survey “friction” events onto their customers and stakeholders in a given year, give or take 2 orders of magnitude. What does that cost in terms of non-enjoyment? Does it even work, and at what response rate at that scale? And what does it cost to maintain, since surely the Canadian treasury at least sort of requires, or at worst encourages, banks to employ trustworthy employees. Worse, how implementable are the outcomes and/or insights? For a real open requisition, is evaluating “trustworthiness” at time of hire possible? Is it possible to compare two candidates on a relative or absolute basis on this metric? And if they are hired, going forward how is that tracked? More surveys?
Perhaps the case wasn’t as bad, and I’m sure the OP HB post of this paper has subtleties in survey design or methods I don’t appreciate, but just to pick on them by cherry picking the first few attributes that stick out: passionate, systemic, data driven, focused, hardworking, persevering, etc?!?!?
Worse, where’s the actual data? When Google apparently decided to get rid of middle management, it felt like a reversible scientific experiment with measurable outcomes and metrics. This doesn’t. For example, from personal experience, in investment banking being hard working, especially at the very lowest and newest ranks, is similarly lauded and required and emphasized. But with a fairly simple app that tracks second-level activity (which app is open and when) and can assign a rule-based score, reviewing a few years of data showed that the data contradicted the folksy saying.
The data showed that above XX hours per week my work was non-restorative, that further above XX hours some incremental inefficiency crept in, and that above even that there was a “tech debt”-like factor which later required recuperation (which could also be calculated, but maybe that was an accident). In English: if (simple numbers) working 10 hours leads to 8 hours of work and 2 hours of goofing off at some productivity, maybe working 14 hours is 10/4, and maybe working 17 is 11/6 and also requires 2 hours of rest in a subsequent trailing period. Depending on what’s going on, maybe it makes sense to do that, maybe it doesn’t. The software engineer view would be much more interesting.
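That diminishing-returns pattern can be sketched in code, using the comment's illustrative simple numbers (assumptions for the sake of the example, not measured data):

```python
# A toy model of diminishing returns on desk time, using the
# illustrative numbers from the comment above (assumptions, not data).
def productive_hours(desk_hours: float) -> float:
    """Map hours at the desk to hours of actual output."""
    if desk_hours <= 10:
        return desk_hours * 0.8               # 10 desk hours -> 8 productive
    if desk_hours <= 14:
        return 8 + (desk_hours - 10) * 0.5    # 14 desk hours -> 10 productive
    return 10 + (desk_hours - 14) / 3         # 17 desk hours -> 11, plus rest debt

for h in (10, 14, 17):
    print(f"{h} desk hours -> {productive_hours(h):.0f} productive")
```

The point of the piecewise slopes is that each marginal desk hour buys less output than the one before it, which is the shape the comment describes.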
It would be fascinating to compare even arbitrary metrics of consenting individuals across a large pool. Who gets the biggest bonus relative to hours worked? What time do peak performers come in and leave, and are there any wfh days? Even trivial but theoretically transferable insights would help: “what percent of top performers drink no caffeine or tea” or “pair programming and velocity”. Heck, even BuzzFeed-bait analysis like “do top engineers type faster? A 10-part slideshow on why you should switch to a mechanical Dvorak keyboard” or “Can you type XYZ words per minute? Find out if you are dragging your team down”.
Anything but more lengthy repetitions of “trust” and “respect” and whatever the word of the day is.
It seems to me that this debate is related to another question, “Why do so many software projects fail when you don’t see any skyscrapers collapsing under their own weight?”
If you could build a skyscraper according to rules limited only by our imagination, then perhaps we’d see a lot more of them collapsing. And given that frequency of failure, we’d see a lot more debate about what makes for excellence in skyscraper-building.
Maybe a better question to ask is, “What makes for a poor software engineer?” That question seems a lot easier to answer. A poor engineer takes a long time to write unreadable, unmaintainable, buggy code that only partially solves problems that no one has. Ironically, such code often goes unnoticed by its very nature: it takes a lengthy period of time to even come into existence, when it does exist it is so buggy that its failure seems related to bugs rather than that it solves no problems, and the unreadable nature of the code is mistakenly attributed to essential complexity rather than incidental.
And that last point is where I think we get closest to an answer: Good engineers reduce complexity. Bad engineers magnify complexity.
Tolstoy wrote, “All happy families resemble one another, but each unhappy family is unhappy in its own way.” It seems to me that the same could be said of software engineers.
Why do we not ask this of other professions? Because other professions have metrics and regulatory bodies.
Why are no skyscrapers falling? Because they are well regulated and required a huge infusion of cash to build. A software company with such a large cash infusion is also likely to not fail.
Software engineers are best analyzed on a bell curve, and once they are plotted relative to other developers, the criteria become clear. The problem, for many people, is that selection bias prevents them from seeing the objective distinctions. Without those blinders, the question of what makes an excellent software engineer largely answers itself.
Absolutely. Four out of six software engineers are average and interchangeable - they get tickets of average complexity done, and make mistakes, some of which are caught by other average programmers in code reviews.
One out of six engineers is junior, or is effectively junior, and is a drag on teams until they get up to speed. They can only do simple tickets, every pull request must be carefully monitored etc.
One out of six engineers is one standard deviation above the mean. If a new system or module or library is to be built, the team defers to them for architecture. They notice subtle problems others miss. They are up to date on the stack and know what features and problems are coming down the road. They tend to do the needed work beyond what the standard ticket requires.
One feature I have noticed about engineers one or more deviations above the mean is their focus, when needed, on the build system. They work to make a simple, working, fast as possible build. If the build is breaking, or gets too complex, they will address the problem in the build system and then get back to work.
So how are you plotting this bell curve without having criteria to begin with? Sounds like you already decided what you were looking for.
I think this is a great question to use as a thought exercise. We don't see the designs that fail, as they don't pass review. Using the analogy of software engineering being the design stage (vs the build/construction stage), this would be a closer comparison. How many skyscraper designs fail before one ends up being "taken up"?
Also, just because buildings don't collapse and fail catastrophically, I assume there are many flaws in the design that get "worked around" during construction. Many flaws (bugs) do likely end up "in production", but they are more of a technical debt type of issue that will be a burden for building maintenance and/or future tenants.
The massive amount of regulation surrounding skyscraper building, including legal responsibility.
I think the whole "10×" programmer debate makes more sense when considered in context of Norris Numbers. If the task is on one side of the complexity wall, both programmers perform roughly the same. But if the task is on the other side, one programmer will flounder around while the other will succeed, resulting in the fabled 10× performance.
Sounds awfully like the Myth of the 10x Engineer.
But I don't want to derail the thread ;)
That's basically what "The Mythical Man-Month" (the origin of the 10x developer) says: "Always recruit star programmers if you can, but most of the time you can't. Therefore learn how to best utilise the average ones."
"SLOC per day" is the opposite of productivity: terrible developers can produce tons of bloated and unnecessary code.
The algorithmic HR startup that inevitably follows will be gamed into oblivion.
Greatness is by acclamation, not quantification.
That guy is clearly a top 10 engineer and he hides little about his thought process. His appearance on the Joe Rogan Experience was great too.
Personally, it reminded me of how people say "it's not a sprint, it's a marathon". Yeah, but Haile Gebreselassie runs the marathon at an average of 20 km/h. Some people really do operate at a heck of a limit.
Often it simply takes some time for solutions to new problems to percolate, and you need to take a step back regularly during the process. Forget about the problem completely and do something completely different, interspersed with short bursts of research and reading on related topics. Sooner or later the contours of a solution will start to form and you will (hopefully) be able to realize it much quicker and at higher quality compared to 'working hard' by pounding yourself and trying to force things.
I cannot count the number of times I almost gave up on a problem after working myself into multiple dead-ends, and almost instantly seeing a path forward after taking a few steps back and allowing my brain to work itself out of these dead-ends.
Not spending 1 week of 'hard work' on a bad solution can save you months of work in the future.
Absolutely love it.
I wonder if that might be missing the broader point of the saying. Yes, it's true that some people do operate "at a heck of a limit", but it's much more true to say that human beings, compared to other animals, have distinct advantages and disadvantages. Bear in mind that few other species can actually /generally/ run marathons -- we evolved the ability to sweat and engage in endurance running /in the pursuit/ of persistence hunting.
Maybe it's just a personal thing these days, but I'm even more curious about min-floors than max-ceilings. Not just because of patterns like this, but because the thought leadership trope of "let's look at a leader in the field and figure out what we can learn from them" is not very useful. Figuring out why it's so easy for me to get into decent aerobic shape pretty quickly as a human being (for millions of years, it was useful for hunting) is a lot more enlightening.
So to get back to your point, JC is probably even more of a "marathon runner" than he looks like. I've seen that with all great engineers (and founders) -- they have the patience and opportunism to pull solutions out of a different problem space from scratch for a six to seven year stretch of time.
It's not about working hard. It's not about working fast. It's about merely surviving -- and getting creative with how you go about doing that. And it is almost always about efficiency.
He is so agile he can run the marathon as a sequence of sprints. I wonder if software developers will be expected to do the same.
Like, should you hire a guy to manually set a field on new entries in the database every night instead of writing a short script to do it for you? That is a non-code solution, so your answer would be yes. But I'd say no, that is a dumb solution. You could say "But don't follow this rule when it obviously doesn't apply", and then I say that figuring out when the rule does and doesn't apply is part of being a great software engineer. Short platitudes like what you wrote sound great but are actually mostly nonsense.
A good engineer stops unrealistic/bad ideas before they even reach a point where people start to think about how the idea could be carried out in detail.
Adding to the ball of mud is often the easiest but not the best option.
You see this in many shapes and forms - whether it's a discussion of what makes one a true "10x" engineer, what lifestyle habits will result in excellent engineers, or what personality traits to look for when searching for world-class software engineers, etc.
Having been in many other (technical) fields outside software engineering, I've never seen the same mentality or obsession there.
It's possible for a single Software Engineer to stand up an entire system by themselves. Most other fields don't have anywhere near that amount of leverage.
In our own small little worlds, we build software that sort of fits the business, and collect a check. Woah, how’d that happen? I wonder if we’re like, amazing or something?
How long should it take to do X? If X is building a shelf, servicing a power plant or designing a car headlight I would imagine at least within the field the order of magnitude is pretty clear. Sure, better tools and experience may make someone twice or thrice as fast or the new guy is taking double the normal time, but such spans fit neatly into the human brain.
But now observe how neither of these statements seem off:
Getting my new web-based CRM tool written and published took me the whole afternoon.
Getting the new web-based CRM tool written and published took us 24 months.
Now add to that the fact that nobody really understands software top-to-bottom (or at all) and it is no surprise that everyone is confused all the time, builds weird mental models of things and skills while vaguely striving to achieve the first scenario and not the second.
Honestly, I envy the young mavericks who code shit and get work done. They at least sleep at night.
Ignoring the burden shifting of on call rotations, I often find the young mavericks don't sleep at night -- they're fire fighting and debugging their work.
Note: after a stroke of lucidity, I stopped accepting those kind of missions.
Being a “great engineer” is not strictly determined by technical ability; in fact, social ability plays a big role in a so-called “individual contributor role.”
I understand some people enjoy being just implementers, though. I don't think that's enough but to each their own.
Let me give an analogy. If you're paid to play basketball, wouldn't you know what makes a great basketball player? If you're playing pro basketball, you can be a great basketball player. The only reason why you can't be great is that there are certain things outside of playing basketball that you have to worry about.
I guess I'm asking because it's kinda annoying when people tell me what makes a great engineer. Anyone I know who has been doing this a few years knows what makes a great engineer. It doesn't mean they want to be a great engineer, because the trade-offs just aren't worth it.
In basketball, it is quite easy to prove that some people are simply unable to ever be a great basketball player; there has never been an NBA player shorter than 5'3, only 25 shorter than 6'0, and only 10 under 5'10. Are you really telling me NO ONE shorter than 5'3 ever decided the trade offs were worth it?
No, at some point, it doesn't matter how hard you work or whether or not you have any outside issues; some people simply don't have the genetic attributes needed to be great.
Now, everyone can get better, but we all have some limit to our abilities that we approach but don't cross as we work harder and practice. Not everyone who isn't great should feel like it is because they didn't choose to be great, some people (most of us, probably) just aren't capable of being that great.
Of course, most of us aren't that close to our limits, and it is impossible to know what those limits are.... but they are there nonetheless.
There's a simple cure for this: Height classes
You can still be a GREAT basketball player if you're only 5'6" tall. Is the best 5'6" basketball player in the world not great? They just can't compete against the physically taller players. What if you made a league exclusively for players between 5 and 6 feet tall?
Floyd Mayweather is a great boxer. But if there weren't weight classes in boxing, and you put him in the ring with some journeyman heavyweight, he'd lose. He's just too small to win.
It's the same thing.
Anecdotal: I'm 6'1". At middle school age, I was _hounded_ into playing basketball (until they realised I had no talent whatsoever, nor any inkling that I wanted to improve). There's definitely a selection bias in who gets started playing basketball in the first place.
Back on topic,
> some people simply don't have the genetic attributes needed to be great.
I think this is a massive stretch. It's fair to say that some people don't have the attributes to be in the top 0.01% of their game (in basketball, there are a few hundred(?) players in the NBA - that's the equivalent of the John Carmack/Peter Norvig types that have come up here), but is there any proof whatsoever, of any form, that says an "average-to-poor" engineer can't become a "great" engineer - in basketball terms, going from the person who can't dribble to the person who absolutely dominates your after-work league consistently?
This has had a bigger impact than you'd imagine :-) For example quite a few tall football players have been nudged when they were kids to play basketball. But luckily some more open minded coaches let them play and found out that they were actually good. The best example is Jan Koller: https://en.wikipedia.org/wiki/Jan_Koller
2.02 (6'8), so definitely towering above most other people :-)
But the thing is, genes do matter at a certain level. I really think you can't be an NBA player if you're 1.50 m (about 4'11"). No matter how fast or resilient you are, you'll just be outmuscled, and dunking will probably be almost physically impossible.
And the only example that has been given is that someone who is legally considered disabled cannot compete in the top 0.01% of professional sports. I had a bit of a search, and Jahmani Swanson is 4'5", and would absolutely wreck pretty much every single non-NBA basketball player you would ever meet in your life (I am aware that his team is an exhibition team, not an NBA team). I'm sure there are hundreds of other examples of people who won't be able to compete in the top 0.01% of the activity in question, but are unquestionably "great".
I would certainly not call someone who is merely in the top 1% of the general population great... 1% of the world's population is 70 million people... there are an estimated 27 million or so software developers, so more than half of the top 1% of the population at software development don't even write software. I would hardly call them great.
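The arithmetic behind that claim is easy to check (using the comment's rounded figures, not precise statistics):

```python
# Rounded figures from the comment above (not precise statistics).
world_population = 7_000_000_000
developers = 27_000_000

top_one_percent = world_population * 0.01          # 70 million people
# Even if every developer were in that top 1%, most of the top 1%
# still wouldn't write software for a living:
non_devs = top_one_percent - developers
print(f"{non_devs / top_one_percent:.0%} of the top 1% aren't developers")
```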
His take is that someone short may never make it to the NBA, but that shouldn't mean we discourage them from ever throwing a ball at a hoop.
You play basketball in a very static environment. The rules don't change from game to game, or team to team. The duration of a match is constant. How well the team performs is a well known metric.
People do not do engineering in a static environment. The needs asked of one changes from team to team. The quality metrics vary from application to application. The people he/she relies on varies even when you stay in the same team. An engineer who is really useful in one company may be harmful to another one.
(I don’t disagree with your larger point! Just that basketball is an example of a “very static environment“ relative to programming)
I would strongly disagree - basketball is a highly dynamic environment, simply because you have an opponent. Everything you are trying to do, you have someone else trying to stop you. They are as adaptive as you are.
Because of this dynamism, many players who were great in previous eras couldn't survive in the modern era (and vice versa, really)... for example, most centers who played even 10 years ago would be abused in the modern game by opposing centers shooting threes. It simply wasn't part of the game 10 years ago, and now everyone does it. There was literally a player (Roy Hibbert) during the transition who was a star before and suddenly became unplayable as teams adapted to the new environment. He was out of the league in 2 years.
There are 6'6" bums in the NBA, top draft picks, who had less of a career than, say, the 5'3" Muggsy Bogues.
In Basketball, part of your greatness is how well you optimized for the cards dealt to you.
Tech is not that much of a meritocracy like sports, it’s a way to have a living. I’d almost say, the tech industry is not equipped to assess greatness. Since it’s literally people’s livelihoods, we won’t (and shouldn’t) try to analyze this. In sports, if you come out of a top school as a top draft pick, fans will eventually say ‘hey you are a top draft pick but you do about the same thing as someone that went way later in the draft’. We in tech won’t ever go ‘look Facebook, your app looks like the same shit everyone else builds’. It can’t hold up to that scrutiny.
Open source is probably the only place where you can objectively assess.
It’s not a competition, because if it was, and tech was a sport, fans would tear it up as to what is a 10x engineer. Speaking as a sports fan, they are savage.
Reframe the question, who are our Bachs and Beethovens? Were they great because they created a lot of good work fast? Is good even enough? None of that shit defines greatness.
Our best developers came from open source. The stuff they made was undeniable. Is being super productive the same as making undeniably useful programs (languages, tools)?
Look at our greats and I think you’ll see they made a breed of work that sits outside workplace metrics of productivity, the same way a Hitchcock was genre defining.
- their top characteristic is "writing good code"
- they censored the specific VCS
- they describe their insights as (inter alia) "ecologically valid"
This is HN, so... I'd approach the title as "when is x10 possible?"
- insight of a far simpler or faster etc implementation (e.g. better choice of modularity, factoring out hidden commonality, dynamic programming);
- insight about a better way to achieve the end result - might be no code at all
So I'm talking about insight. What personal qualities lead to insight? (note that there isn't always a better way to be found, so personal qualities don't guarantee it)
- be really, really smart. Especially a large working memory, to be able to see connections, or at least the ability to load in (cram) the information temporarily.
- the ability to step back and notice the bigger picture. Executive function.
- ability to manage stress, so they have the breathing space to do that (the wisest engineers will choose a workplace that makes this easier)
Skill in proofs is a predictor: flexibility, handling complexity, managing the top-level purpose as well as the detail (forest-for-trees), noticing connections. Mathematical knowledge itself can help sometimes too.
But skill in any intellectual discipline is probably an equally good predictor, e.g. philosophy, history, law.
The Pygmalion effect, or Rosenthal effect, is a psychological phenomenon wherein high expectations lead to improved performance in a given area.
I'm generally liked where I end up working, but tend to just do stuff on my own. I'm polite, but don't go out of my way to interact with others typically.
I do check the boxes on the code quality stuff however, especially writing code that plans for the future.
> “At the end of the day, to make changes [to software], it still takes a developer, a butt in a seat somewhere, to type [Source Control System] commit. — Dev Manager”
Not entirely relevant, but this opening comment just struck me as funny.
I found this amusing but not necessarily wrong.
Source: ADHD engineer. I don't actually take Adderall. Tried Ritalin briefly but stopped as it doesn't affect me at all. Might try Adderall and report back.
This whole time estimation thing is akin to predicting when the next hurricane or earthquake will occur. The main problem is that business people don't understand this, so they place an unrealistic burden on engineers.
A manager or business guy who needs constant, very accurate time predictions is a bad manager: overly reliant on engineers and lacking an understanding of software. A good manager should have the technical knowledge to make a technical guesstimate himself (which will also likely be wrong) and the foresight to manage delays and allow for buffer time.
A great team of people creating a product consists of both great technical product managers and great software engineers. A rockstar software engineer alone may not have the ability to manage the politics of unrealistic expectations.
This myth needs to die. Can you predict whether an item will take closer to a month or a decade? If so, then software is far easier to predict than hurricanes or earthquakes. You might not make predictions as accurate as management wants all the time, but most can predict how long things will take within a factor of 3x or so and it will be within that margin most of the time, a person who could do that for hurricanes or earthquakes would be the greatest genius in history.
And yes as you get more skilled your predictions will become more accurate. Hence accurate predictions being a sign of skill.
Let me spell it out with an example: sports. Horse racing or basketball. You have a team of highly skilled players with a bunch of quantified information, including height, weight, scoring statistics, rebound statistics, biography, etc. These guys are in a game with a very, very controlled set of rules under exactly the same time pressure, and everyone still fails to predict the outcome.
In software you have a product. The product is usually not concretely defined, you have a complex code base, and you can never be 100% sure exactly how the new product will integrate with that code base. You're also not 100% sure how the code will be put together to define the product. Additionally, are you 100% familiar with the stack? Do you know every possible primitive of psql or Ruby or Python or C++ that you could be using to create your project? Because I can pretty much guarantee you every basketball player more or less knows every possible move and rule of a basketball game.
You're also working with a team that includes people you have much less information on than normal. You've worked with a guy for, what, at most two years? Does that give you accurate statistical information to the degree of, say, a basketball player's? There are also bound to be people you're less familiar with working on the project. Are you interacting with other teams as well? Does the outcome of your project hinge on the completion of a feature by an entire team outside of your own?
People can be experts on horse races or sports, and even then they can't predict things accurately. If you were to start making bets on software completion dates, you would also massively fail, because not only are there more variables in a software project, but you have much less information.
Chaos is a phenomenon that affects even systems we have close-to-perfect information about. If we had perfect information on every particle in a weather system except for, say, one particle, the missing information about that one particle would make our mathematical calculation wildly inaccurate.
For software we don't even have anything close to perfect information in a system with multitudes of variables. Chaos will throw any prediction off.
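The sensitivity being described here can be demonstrated with even a one-line system. A minimal sketch (the logistic map, a standard toy model of chaos, not anything from this thread): two trajectories whose starting points differ by one part in ten million end up nowhere near each other after a few dozen steps.

```python
# Logistic map: x_{n+1} = r * x * (1 - x). At r = 4 it is chaotic:
# a tiny error in the initial condition grows roughly exponentially.
def trajectory(x0, steps, r=4.0):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.2, 60)          # "perfect" information
b = trajectory(0.2 + 1e-7, 60)   # one particle's worth of missing precision

# Early on the two runs agree closely; by step ~40 they have fully diverged.
print(abs(a[5] - b[5]))                                 # still tiny
print(max(abs(x - y) for x, y in zip(a[40:], b[40:])))  # order 1
```

The point being: a prediction model can be perfect and the inputs almost perfect, and the forecast still falls apart; software estimates have neither.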
>but most can predict how long things will take within a factor of 3x or so and it will be within that margin most of the time, a person who could do that for hurricanes or earthquakes would be the greatest genius in history.
3x of what? 3x can be big or small depending on x. So if I predict a project will take one year, I can be off by 3 years under your logic. If I predict a month, then I can be off by 3 months. If I predict a week, 3 weeks. 3x is pretty horrible if you ask me; it's easy to make guesses within those parameters.
I predict that both a hurricane and an earthquake will happen in a century. I'll only be off by 3x, or 3 centuries. Actually, I can do better than that: I'm 100% sure multiple earthquakes and multiple hurricanes will happen in the next century, and I am 100% sure that I will be 0x off, let alone 3x... Look, I'm the greatest genius in history.
3x is not a reasonable margin of error.
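For what it's worth, "within a factor of 3x" is usually read as a multiplicative ratio in either direction, not as "estimate plus 3x". A minimal sketch with hypothetical numbers, just to pin down what each side of this argument is computing:

```python
def within_factor(estimate, actual, k=3.0):
    """True if actual lies between estimate/k and estimate*k."""
    return estimate / k <= actual <= estimate * k

# Under this reading, a 12-month estimate is "within 3x"
# for anything from 4 months to 36 months of actual work.
print(within_factor(12, 30))   # True:  30 months is under 36
print(within_factor(12, 40))   # False: blew past the 3x ceiling
print(within_factor(12, 3))    # False: finished "too early" to count
```

Whether a 4-to-36-month window on a one-year estimate is useful is exactly the disagreement above; the arithmetic itself is not in dispute.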
It's all politics. Estimation is a purely political game by EMs/TPMs/PMs to try and shirk responsibility for engineering outcomes.
Here's another neat tack to try in this conversation. Ask for individual examples of "good estimators," and ask for details about what makes them good estimators. You'll rarely (if ever) hear that someone was so accurate that it materially impacted a project in a positive way. What would that even look like? "I was sooo accurate that the client success people were able to say that our new product would be ready 6 months ago, and now didn't have to send a follow up email to update the timeline!!!"

The only answers I've gotten are that people who are good at estimating are the ones where nobody ever really has to look at their schedule and there is no drama because they are always getting things done. This is highly contextual and rarely has to do with that person's individual estimating skill. It has to do with their project, team, EM, PM, experience relative to teammates, etc.

Recently I was given an example of a client-side developer who is far more experienced than the back-end guy building his API, so he's always just sitting around waiting for the API to be finished to build his features. He almost always does 3/4 of his tasks ahead of time and then just waits for the API to be done and puts his ticket in as completed at that point. So he's not really estimating at all, and he's not accurate; it's just that the project is bottle-necked on the back-end dev's speed regardless, so nobody ever cares about his estimates.
It all just seems so wishy-washy and bullshit. It's just about pushing people to work hard and get more done ("you need to hit your estimates, think of them like commitments!"). Framing this all as somehow single-handedly the engineer's problem is just a sign of someone who doesn't know how to operate a software development team effectively. Estimation is a team effort in scheduling, delegation, planning, and communication.
Why in the world (if not politics) would anyone invent a concept that punishes someone who over-achieves and chooses to work harder for a short period of time? ("You should try to be more accurate about estimates, even if you have to slow down your work, and overworking like this leads straight to burnout, be careful!") It's all just nonsense invented by MBAs who have no actual experience in anything except inane "policy" and "oversight."
Software project estimation is a real thing, but it has nothing to do with one person's ability to predict the future. It's an analytical data methodology far more than an individual, experiential skill.
That could get tedious. On the other hand, we did do a lot better than normal at hitting our dates.
(I believe this shorter-than-three-weeks idea came from Extreme Programming, but I'm not quite certain of that.)
In my teams, we generally set 8 points as something that would take an entire day. Every number after that jumps up in relatively large increments, as they are more difficult to accurately determine.
Any point with 1-2 is estimated at < 1 day. Some are literally 20 minute fixes, but ... you don't always know that up front, you just generally know it may be pretty small. 2 might sometimes go up to a day.
A 3 is assumed to be a day or two.
A 5 is assumed to be 3-4 days.
An 8 would be 1-2 weeks.
Anything higher is backlogged until it's broken down into smaller segments.
Not sure how well that compares to usages by other teams, but that's one data point for your question.
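As one sketch of how a scale like the one above could be written down (the exact buckets here are this commenter's, not any standard, and the "break it down" rule is their backlog policy):

```python
# Story-point buckets as described in the parent comment.
# Values are (min_days, max_days) of expected calendar effort.
POINT_BUCKETS = {
    1: (0.1, 1),   # could be a 20-minute fix, at most a day
    2: (0.1, 1),   # sometimes stretches to a full day
    3: (1, 2),
    5: (3, 4),
    8: (5, 10),    # roughly 1-2 working weeks
}

def estimate_days(points):
    """Map a story-point value to its rough day range."""
    if points not in POINT_BUCKETS:
        # Anything bigger than 8 is backlogged until broken down.
        raise ValueError("break this story into smaller segments")
    return POINT_BUCKETS[points]

print(estimate_days(5))   # (3, 4)
```

Note the scale is roughly geometric: the ranges widen as the points grow, which is the whole argument for refusing to estimate anything above the top bucket.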
Now we are back to the original, overall question...
To your first point: Yes, that happens sometimes. When it does, your estimates can be wrong. (Hey, they're estimates - they're not prophecies.) If that happens very often, though, you might add a fudge factor for "that kind of thing". Maybe something like "unknown surprises crop up most of the time, and when they do, they take about 20% of the effort, so we'll make our best estimate, then add 20%". That won't be perfect either - sometimes it will be 40%, and sometimes 0. But, you know, estimates...
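The fudge-factor idea above, as plain arithmetic (the 20% figure is the parent's illustrative number, not a recommendation):

```python
def padded_estimate(base_days, surprise_rate=0.20):
    """Best-guess estimate plus a flat contingency for 'unknown surprises'."""
    return base_days * (1 + surprise_rate)

# A 10-day best guess with the 20% buffer becomes 12 days.
print(padded_estimate(10))
# Some projects will still blow past it (the 40% case)...
print(padded_estimate(10, 0.40))
# ...and some won't touch the buffer at all. That's fine: the buffer is
# calibrated to be right on average, not right every time.
```

The design choice here is a flat multiplier rather than per-task padding, which matches the parent's framing of "surprises crop up most of the time and cost about 20% of the effort."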
Agile exists because a very large number of people dispute this.
But at the end of the day, the question "can we do this in the allotted amount of time" still gets asked and answered.
What agile got right is realizing that the error bars increase superlinearly with duration, and that scope isn't fixed - so frequent estimates with frequent course correction. But you're still estimating.
Second, Agile doesn't ask people to estimate ("respond to change over follow a plan"). Management asks people to estimate.
Jeff Patton says it best in User Story Mapping, describing the "client-vendor anti-pattern":
> It's the client's job to know what he wants, and explain the details to the vendor. It's the vendor's job to listen, understand, and then think through a technical approach for delivering what the client asked for. The vendor then gives her estimate - which in software lingo actually means "commitment"...
> The real tragedy is the client understands their problem better than she's able to predict what will solve it. But in the anti-pattern, conversations about problems and solutions are replaced by discussions and agreements about requirements. No one wins.
> Try showing up at your doctor's office and giving her your "requirements". Tell her the prescriptions you'd like written and the operations you'd like scheduled. If she's nice, she'll smile and say, "That's interesting, tell me where it hurts."
> In my head, I picture a continuum where on one side is the word waiter, and on the other is the word doctor. Try to make your working relationships more like doctor-patient and less like waiter-diner.
Sure, we can true-scotsman, but in practice agile asks you to estimate.
In the client-vendor context, you can sidestep that with a fixed price bid. Somewhat. If you're bad at estimating your fixed price, your business will burn.
In the employee context you can sidestep that somewhat as long as you consistently deliver more value than you cost, but even then, making choices requires having an idea of opportunity cost. If you can't give that idea at all, there are usually better uses of the money.
In almost all contexts, you are compensated for your time. Almost no one likes writing blank checks.
I think it would be more accurate to say that those who pay for your time will ask you to estimate the value you expect to deliver in that time. That seems like a fair question to me.
And I could write a book on it here (several others have!), but I think your comment points at the heart of this entire problem, namely the disconnect between the work being done and "those who pay you."
If they were truly invested in the value fulfillment cycle, they would never ask for an estimate; they would be clear about which hill needed to be taken next in service of the product or customer (i.e. "here's where it hurts, doctor") and then help you agree on the smallest possible experiment to take the hill.
In most corporate environments, that won't cut it though... What management wants, and what product owners are pressured to deliver, are large wins and overall product milestones, not incremental updates (outside of bug fixes).
> the disconnect between the work being done and "those who pay you."
Yes, I think in many places there is a fundamental tension between "corporate thinking" and "agile thinking", and without real synchronization of methodologies and culture, any kind of "buy-in from management" will usually lead to dysfunction and overall disappointment.
This is a good indicator of how much time you should spend estimating.
Estimating is not and never was part of the job.
Don't be guilted into thinking you aren't great just because you don't have a crystal ball.
Every framework that seeks to scale agile is ultimately a malignant attempt to retrofit it into an older method that is more conducive to reporting.