> I’m in the relatively rare position of having worked on both Chrome and IE3. To add all the truth @aboodman wrote, both teams were amongst the best I’ve worked with, but Chrome was much healthier, happier and more supportive. Senior industry folks helped.
> I mean even at Google (on a different team) I was a "technical lead" in my 20s, and let me tell you, I had noooo business leading anything technical of any importance. But this is very common! We would never accept this in other fields. Would you live in a house built entirely by junior carpenters in their late 20s who built one or two houses that barely stood up? Would you drive cars designed and built by junior engineers?
I find this kind of funny, because this is what happens, right? I was under the assumption that architects typically design the building plans and do all the engineering, and a construction crew (which can consist mainly of people in their 20s) builds to those plans under the supervision of the lead engineers/architects.
So, in the same way that many senior software engineers don't write much code, don't architects/civil engineers typically refrain from using power tools to build the actual building? If this is the case, then software engineering is very akin to other engineering disciplines in this regard.
I feel like the author of this tweet is conflating craftsmen with senior leads. A craftsman is somebody I would expect to have been working with the medium for 10+ years, and to keep honing their craft over the years. Engineers and architects, by contrast, are typically more concerned with the abstract ideas and the overall outcome. An engineer/architect can be a craftsman, but I don't believe the two need to be synonymous.
A software architect who doesn't code anymore is worse than useless in all of my experience. They're flimflam salespeople who build PowerPoints full of buzzwords that aren't even necessary, and then delegate everything to developers who actually code to try to build their Frankenstein contraptions.
"But we have to have websockets, it's in the presentation!"
A big part of the problem here is people who become architects because they're bad at code. I agree that practice is important, but it's surprising how slowly competency degrades.
Not having coded in years, I always check with my lead dev or architect when something the team's doing doesn't make sense to me. 95% of the time, it doesn't make sense to them either.
> I was under the assumption that architects typically design the building plans and do all the engineering, and a construction crew (which can consist of people mainly in their 20s) will build those plans under the supervision of the lead engineers/architects.
Draftsmen can also draw up the plans rather than an architect. There are different educational requirements as well as trade requirements before each can begin practicing. (You can't just call yourself an architect because your last employer gave you that title; both require professional licenses too.)
Plans are approved by a city official and permits are given. A contractor is hired to build based on the plans, and they hire subcontractors. The result often has to change due to unforeseen circumstances, and usually those changes don't result in a redesign. If the contractor sucks balls, the result will be a tire fire unfit for disaster relief housing. If the contractor is good, nearly everything will go as expected, it will be delivered on time and on budget. In between those, a lot of shit gets swept under the rug.
I think the issue is that a lot of those "high level" decisions are also actually made at the code level. It is more analogous to an architect being a city planner, with coders creating entire buildings (and not just installing the drywall, but the electrical, layout, elevators, plumbing, earthquake stability, etc.)
This thread really seems to be burying the lede which isn't just "we didn't have crunch" but the more specific claim that it was engineers with at least a decade of experience having deep technical involvement that made the difference.
I have delivered successful software for 30+ years without sprints or any other ceremonies. I simply keep a prioritised “to do” list and only work on what is the #1 priority. I do it when managing development teams and when working as part of a team. It’s simple. It works. And when asked to change priorities, it is clear from the priority list what the consequences will be. And usually people will change their minds when they can easily see the consequences. It also means that you never need to say no. You simply add the new task to the priority list and negotiate whether it moves up/down relative to other tasks.
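A minimal sketch of that single prioritised list in Python (the tiny API and the task names are mine, purely illustrative): the only rule is to work whatever sits at the top, and an insertion returns the items it pushes down, so the consequences of a priority change stay visible.

    class PriorityList:
        """One ordered list; index 0 is the current #1 priority."""

        def __init__(self, tasks):
            self.tasks = list(tasks)

        def next_task(self):
            # The only rule: always work on whatever sits at the top.
            return self.tasks[0] if self.tasks else None

        def add(self, task, position=None):
            # New requests default to the bottom; moving one up is an explicit
            # negotiation, and returning the displaced items makes the
            # consequences of the change visible.
            if position is None:
                position = len(self.tasks)
            self.tasks.insert(position, task)
            return self.tasks[position + 1:]

        def complete(self):
            return self.tasks.pop(0)

    todo = PriorityList(["fix login bug", "ship reports page", "upgrade database"])
    pushed_back = todo.add("urgent demo for sales", position=0)
    print(todo.next_task())   # 'urgent demo for sales'
    print(pushed_back)        # everything that just slipped, made explicit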
Yes I give estimates when asked. I use the priority list to negotiate the number of deliverables agreed to. The list makes it easy for biz people to balance time vs. deliverables.
Yes if #1 is blocked then I work on #2 until #1 is unblocked. And if #2 is blocked then I work on #3 etc.
I use a single prioritised list to manage teams. The list is public to the team and I rearrange it as needed when new discoveries/obstacles warrant it. I also let team members choose which tasks they want to work on, as long as it is one of the high-priority tasks. I don't know if individual team members have their own lists. I don't care how they do their work as long as it gets done at high quality.
I also publicly list tasks completed and the people who completed them. It makes it very obvious to everybody who the most productive people are. It’s a great motivator for the team because everybody wants to be #1.
Unlike IE, Chrome initially built off WebKit, so a lot of the work in writing a renderer was already done. Obviously a lot of work with V8, multiprocess IPC, etc. still went into the effort but still easier than starting from scratch like what IE did.
While this is true, and I'm generally fairly anti-chrome, this does disregard a huge amount of very real, and very hard engineering work.
I think V8 was not actually that complicated - a lot of the "deep" computer science they originally wrote and talked about in marketing material was from the 80s, and things like the hidden classes were picked up almost immediately by JSC at least (as in just a few weeks of non-crunch time). The original JITs for V8 and subsequently JSC were essentially template JITs, i.e. not particularly complicated (for JSC the big issue was having to support x86, x86_64, and Thumb2, and managing the security implications on iOS). Even the GC imo wasn't super impressive.
The primary reason that SpiderMonkey and JSC didn't have the JIT or hidden-shape concepts wasn't engineering complexity, but simply that at the time the apparently important perf problems were in other parts of JS and the DOM - so huge numbers of objects (the argument for generational GC) and non-DOM property access (which, recall, was not optimized in V8 at the time) were just not considered that important compared to other parts of JS and the browser.
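For anyone who hasn't seen the hidden-class/shape idea, here's a toy sketch in Python (purely illustrative; nothing here reflects V8's or JSC's actual data structures). The point is just that objects built by assigning the same properties in the same order share a layout, so a property read becomes a fixed slot index instead of a dictionary lookup:

    class Shape:
        def __init__(self, names=()):
            self.slot_of = {name: i for i, name in enumerate(names)}
            self.transitions = {}  # property name -> the Shape you get by adding it

        def adding(self, name):
            if name not in self.transitions:
                self.transitions[name] = Shape(tuple(self.slot_of) + (name,))
            return self.transitions[name]

    class Obj:
        EMPTY = Shape()

        def __init__(self):
            self.shape = Obj.EMPTY
            self.slots = []

        def set(self, name, value):
            if name in self.shape.slot_of:
                self.slots[self.shape.slot_of[name]] = value
            else:
                self.shape = self.shape.adding(name)  # shared, cached transition
                self.slots.append(value)

        def get(self, name):
            return self.slots[self.shape.slot_of[name]]

    a, b = Obj(), Obj()
    a.set("x", 1); a.set("y", 2)
    b.set("x", 3); b.set("y", 4)
    assert a.shape is b.shape   # same hidden class, so a lookup site can cache the slot
    assert a.get("y") == 2      # "y" -> slot 1 for every object of this shape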
But when you get to WebCore and the browser, the Chrome folk did a lot of very hard work. The process separation was also conceptually simple, but practically a huge amount of very complex work. You can contrast the time it took to get JS engine performance up to the same order as V8 with the time it took to get any degree of process separation into WebKit and Gecko. The first multiprocess Safari was just two processes, the app and the render process - just getting WebKit going with a single separate render process was a lot of work, and it was at least two full releases before true multiprocess was a thing in WebKit (WebKit's multiprocess model makes the process separation part of the engine, vs. the Blink model of having the app be responsible).
Then for sandboxing, WebKit obviously got to leverage the Mac+iOS sandboxing built into the system. The Chrome folks had to build an entire sandboxing system for Windows, without kernel support, or in fact any support, which was another gigantic amount of work.
So kudos to the Chrome team: they did a lot of work, a lot of it was very hard, and it shouldn't be reduced or dismissed.
It started with Spyglass, who were Mosaic licensees, but apparently wrote their own code. The story of Spyglass is told here https://ericsink.com/Browser_Wars.html
"Management made the decision to transition our business completely and pursue the market for web browsers. Tim Krauskopf, the founder and head of development, asked me to write a web browser. I started work on Spyglass Mosaic on April 5th, 1994. The demo for our first prospective customer was already on the calendar in May.
I ended up as the Project Lead for the browser team. Yes, we licensed the technology and trademarks from NCSA (at the University of Illinois), but we never used any of the code. We wrote our browser implementations completely from scratch, on Windows, MacOS, and Unix.
We were not the first Mosaic licensee, but we were the last. Prior to us, a company called Spry took the Mosaic code and tried to sell "Internet in a Box". People still seem to get Spry and Spyglass confused because of the similar names."
"Internet Explorer 2.0 was basically Spyglass Mosaic with not too many changes. IE 3.0 was a major upgrade, but still largely based on our code. IE 4.0 was closer to a rewrite, but our code was still lingering around -- we could tell by the presence of certain esoteric bugs that were specific to our layout engine.
Licensing our browser was a huge win for Spyglass. And it was a huge loss. We got a loud wake-up call when we tried to schedule our second conference for our OEM browser customers. Our customers told us they weren't coming because Microsoft was beating them up. The message became clear: We sold our browser technology to 120 companies, but one of them slaughtered the other 119."
I'm not advocating for death marches, but a small project being built in a company that has a money machine printing billions of dollars a quarter isn't reflective of most environments.
Microsoft had a machine printing billions of dollars but their environment was quite different.
I was on the IE team working on dev tools. We shipped every 6 months, while Chrome shipped every month. In the last 2 months of each 6-month release we had a feature code freeze; even small bug fixes weren't allowed. The first month was planning. Our dev pace was slower than Chrome's; Chrome shipped a lot more in 6 months than IE did.
Chrome had an insane testing suite that tested pixel perfect rendering and Perf benchmarks. Doing this on every pull request is just amazing.
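As a rough sketch of what a pixel-exact check on every change can look like (my own illustration using Pillow, not Chrome's actual harness; the golden-image workflow and paths are assumptions):

    from PIL import Image, ImageChops

    def assert_pixel_identical(rendered_path, golden_path):
        """Fail if a freshly rendered screenshot differs from the checked-in golden image."""
        rendered = Image.open(rendered_path).convert("RGB")
        golden = Image.open(golden_path).convert("RGB")
        if rendered.size != golden.size:
            raise AssertionError("rendered size differs from the golden image")
        # difference() is black wherever the images agree; getbbox() is None
        # only when the whole diff is black, i.e. a pixel-perfect match.
        if ImageChops.difference(rendered, golden).getbbox() is not None:
            raise AssertionError(f"{rendered_path} does not match {golden_path}")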
IE had some work to do. I was surprised to learn that the E2E test results were hosted on some engineer's dev machine in his office. We used to run tests on our own machines before committing to Source Depot. IE's CI/CD left a lot to be desired.
My takeaway from IE vs Chrome was that Chrome had good leadership; they invested heavily in all kinds of tooling to encode their security, UX, and performance bars.
IE leadership was sometimes far away from the user experience. They didn’t care about the little big things.
Yeah but many of those things are a direct consequence of budgets. If you are willing to throw 100 incredibly well paid engineers at nothing but testing infrastructure then you can get that sort of result but it's implausible to expect a normal company to invest the same amount. Even today Microsoft couldn't match Google's investment in Chrome and simply gave up.
It's worth observing here that Chrome was not just unusual in having access to a huge budget and highly experienced senior engineers, but also in the fact that it was the passion project of Larry Page. The Chrome team could take their time, because Page simply wanted to do a browser and always had, end of story. I worked at Google at the time and I recall vividly that Page/Brin never emailed the entire company in the way most CEOs do. They appeared at TGIFs but emailing the whole firm at once? They just didn't do it. With one exception (that I can recall at least): the day Chrome launched. Not only did Page email the whole firm but he sent a short, informal and clearly ecstatic note proclaiming that Chrome was "spectacular".
Schmidt opposed the idea the entire time, so Page and Brin just hired a bunch of browser engineers behind his back and funded what was basically a black project. Schmidt later stated he was basically forced to change his mind "because it was so good". In reality he was forced to because he'd been there before and it would have been self destructive to try to cancel an already running project the company founders were so in love with.
Remember, at the time Chrome launched it:
1. Only ran on Windows.
2. Couldn't print, amongst many other missing features.
3. Had no extensions and wouldn't for quite some time.
4. Had numerous page compatibility problems with many major websites that simply didn't work at all, due to its KHTML heritage, UA sniffing etc.
5. Had a pitch so complicated and developer centric you needed an entire comic book with the developers in it to try and explain why anyone should care.
Not surprisingly given the above, it had a brief usage spike as people tried it out and its usage then collapsed to nearly nothing, taking (iirc) years to rebuild to its initial launch spike level despite being heavily pushed via the google.com homepage.
Also, Chrome was entering a market with largely stagnant competitors, unlike the IE3 team, who were (or believed they were) fighting a successful, energized, passionate company that posed an existential threat to Microsoft's business. The IE team faced time pressure. When Chrome appeared there was Safari, which didn't care about the Windows market and was there mostly as a feature tick box for Apple's operating systems. The IE team was still disbanded at that point. Firefox was an inspiration but basically funded by Google anyway, and was struggling with tech debt.
I'd say Chrome 1.0 was competently executed but in any kind of normal company, with normal goals (like usage or profit), in which the project wasn't the passion project of an invulnerable CEO, they would absolutely have had to crunch like crazy and would then have certainly been cancelled or gone bankrupt despite that, due to the near total focus on engineering over all else.
It worked, in the end. I'm writing this from Chrome after all because Safari is inexplicably buggy even though I'm using a Mac, and with time they did get to add plenty of compelling features. Also being able to outspend their competitors meant taking control of HTML5 and putting other companies in a position where all their budget is taken by trying to keep up with whatever random features they threw into the spec. But anyway, the reason the Chrome team could live normal 9-5 lives is nothing to do with senior engineering leadership. To claim otherwise seems kinda unfair on everyone building new products who does not have a time/money budget set by a near-obsessed CEO who also happens to control the google.com homepage. A good example of a team it'd be unfair to was the Android team, who also had very experienced senior engineers with OS dev experience, but whose position was far more similar to that of the IE3 team. Android's creation was typified by enormously punishing hours and huge amounts of crunch.
The twitter thread was literally a reply to a story of a company that had a money printing machine and still required a terrible death march to ship a browser.
> I mean even at Google (on a different team) I was a "technical lead" in my 20s, and let me tell you, I had noooo business leading anything technical of any importance. But this is very common!
So these big tech companies have a caste system. And no I don't mean the Indian caste system, which obviously has its own controversies. The caste system is really a form of social proof.
Did you go to MIT, Stanford, UW, Waterloo or CMU? Ok, you're in the club. You can join TI (Technical Infrastructure). Out of college you'll be L5 in 2-3 years (the same level an external hire with 10 years of experience will have). You will find yourself on the better projects with more promotion prospects.
This kind of premature promotion is to find the 1 in 20 of these people who are truly talented enough to continue getting promoted to L6-8+.
I worked at Google for 7 years and never went to college either. I was never hampered by that in my career. If anything people were more impressed that I was doing this without the degree than if I had actually had a degree to point at.
Sprints are only really necessary when there's a deadline to hit.
For the longest time, one of the advantages Google has had as a software house is that when you're an industry leader, deadlines are soft because all you're worried about is somebody playing catch up, not your need to catch up to somebody else. As the company has grown in size and scale, calcified a bit, and branched out into spaces where they aren't the leader, the culture around this sort of thing has changed. They are not, for example, the leader in Cloud, and the life of a Cloud SWE is markedly different than the life of an ads or artificial intelligence research SWE.
Sprints are just a formal way for management to check in that the team isn’t down a weird rabbit hole / getting them to do regular check ins with the team for visibility.
Sprints were actually invented for teams and cultures that are below average. The goal is to keep applying pressure while pretending not to. It works well on boring projects or subpar teams. If you did your recruiting right and didn't end up on pointless projects, then you don't need the sprint bullshit. A high-powered team is typically self-motivated and can get things done without a scrum master asking them what they were up to.
If you replan at each sprint iteration, how does that help ensure a multi-sprint deadline is going to be achieved?
My impression was that the sprint is the deadline, so it emphasizes not having deadlines in favor of small increments. Hence, if there is an actual deadline, I think sprints are exactly the wrong tool to help.
Sprint deadlines are like milestones. The mantra of striving to deliver something in each sprint helps prevent the failure mode where a year goes by and everything is at 85% readiness, but the last 15% needs another year (or more!)
Sure, but if you plan out 10 milestones two weeks apart, is that not just "waterfall" at that point?
To be sure, my main point is to disagree that sprints help hit deadlines. They might even hurt.
For example, I'm not sure that failure mode is really changed by sprints. If you get 10 months in with 15% left, nothing changes the fact that it could still take a year. Further, sprints discourage lifting your head up to unblock, design, and plan that 15% before you actually start on it. So it seems a bit waterfall-like: incremental stuff felt like it was getting done, but you still underscoped that last 15%, and sprints encourage you to realize that very late in the game.
I think a big flaw in scrum is the lack of error analysis on expected completion dates. Instead of compounding uncertainty across sprints, velocity is often given no error bars and is linearly extrapolated months into the future (which, perversely, makes the estimate look more precise than it actually is).
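A back-of-the-envelope sketch of that point, in Python with made-up numbers (and a deliberately crude model: it treats sprints as independent so the spread of the total grows like sqrt(n), which is itself generous):

    from math import sqrt
    from statistics import mean, stdev

    velocities = [21, 34, 18, 27, 25]   # points finished in recent sprints (made up)
    remaining = 300                      # points left in the backlog (made up)

    v_bar, v_sd = mean(velocities), stdev(velocities)
    print(f"linear extrapolation: {remaining / v_bar:.1f} sprints")

    # Ask when we're covered even if velocity runs one standard deviation low:
    # total after n sprints ~ n*v_bar with spread ~ sqrt(n)*v_sd.
    n = 1
    while n * v_bar - sqrt(n) * v_sd < remaining:
        n += 1
    print(f"with one-sigma slack: {n} sprints")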
So I've written an example in a comment here https://news.ycombinator.com/item?id=32507364 , but to try to emphasize the difference between waterfall and, um, non-waterfall is that historically (as far as I know) the scope was fixed in waterfall (and releases just got delayed more and more), whereas agile (based on "extreme programming" right?) says the opposite. Release often, in time-fixed increments, and try to always release something working that slowly but surely approximates the end goal.
This of course implies that there's an end goal (so it's not surprising that there's a rough plan with so-and-so many milestones already laid out), but it allows for moving that goal. Of course, as I vehemently argued in that comment, doing this by force in a top-to-bottom way, spending more time chasing this constantly moving end goal in planning sessions than actually working on delivering something, is folly.
(Of course there are very turbulent situations where a small company/group fights for its own survival, pivots every second week to something else, throws out half of the existing stack every quarter, etc.. etc... but in that case the problem is not with "agile". The problem is trying to do something that has clearly no market traction without sufficient money and a sane business plan.)
The author is using the word a bit differently, but the fact that "sprint" has been normalized as a unit of work for software development, and developers are expected to be in a "sprint" more or less at all times, has always been a source of the deepest absurdity to me.
A "sprint" is, almost by definition, a pace that's sustainable only for short periods. The fact that developers are expected to perform sprint after sprint endlessly, to view "sprint" as the default baseline pace, seems a ludicrous abuse of language.
I think you're being overly literal with "sprint".
And no, developers are not expected to be "in a sprint more or less at all times".
To the contrary -- in between each sprint, there's a review, a retrospective, often a weekend, then regrouping and feature analysis, planning poker, a decision on which features to implement, and getting it up on the board.
That's the whole point. Instead of a long marathon without revision/reassessment/regrouping, it's a series of short sprints interspersed with revision/reassessment/regrouping. The metaphor here is the distance traveled, not the speed. Sprints are shorter distances, that's all.
Nobody has ever intended "sprint" to mean that engineers are going faster than in a marathon, anymore than "waterfall" means engineers are all tumbling down white water to their deaths at the bottom. :)
Perhaps the word "sprint" translates primarily to "short distance" in your mind, but I suspect if that's the case you are unusual in that regard, and most people think of the defining characteristic of "sprint" as "moving fast", not "short distance". To suggest that the choice of that word does not imply a sense of urgency and moving quickly seems, to me, totally implausible on its face.
> And no, developers are not expected to be "in a sprint more or less at all times"
Again, perhaps this is your experience. In my experience, in every company I've worked for that uses sprints, the end of one sprint is separated from the start of the next by, at best, a few hours, and often less than that. Planning meetings generally take place during sprints.
And to point out that sprints may be separated by a weekend seems, at best, totally irrelevant, as in my working life, weekends are not part of working time. "You are in a constant sprint at work, but hey, you get to rest at the weekend" is not a compelling proposition to me.
> To the contrary -- in between each sprint, there's a review, a retrospective
"in between" is overstating things a bit. The current norm seems to be two week sprints and all these things are done at the beginning and end of them. To be honest, I personally do feel like this gives the feeling of 'sprinting' because it usually means at least 2-3 days of the sprint are 'wasted' on meetings and that counts for a lot when it's 2/10 or 3/10.
And a lot of the terminology around this kind of process is geared to give you a sense of urgency. Right back to "Extreme Programming".
> The current norm seems to be two week sprints and all these things are done at the beginning and end of them.
Everywhere I've worked that did "agile" did it this way. The ending of a sprint is always the start of the next one; teams are always in a sprint. If that part of the process can't be changed or removed then one of the best things to do is decouple the sprint cadence from other events. Eg don't put demo, pointing, planning, and retro on the same day. Try to do pointing mid-sprint, if you know what's coming up, so that you're not cramming it in at the end. Don't release to staging/prod at sprint boundaries, do that mid-sprint. The worst is to clump all the meetings, distractions, and stresses together.
I mean whether you clump them or not they're still time taken out of a very short period. I wouldn't (and haven't) minded so much with longer sprints, but 2 weeks minus planning/retro minus "one person takes a Friday off and another person is out one of the weeks" always leaves me feeling cramped.
I think the best structure I ever had was 3wk "actual work" sprints and 1wk to address tech-debt and do planning.
I think the language is accurate and that's the problem (or not depending on your perspective) - sprints are all or nothing, and a way to create an artificial sense of urgency. "It's the last day of the sprint, better work late/cut corners because 99% done is failure". And then if you do your reward is one more work item next sprint, to ensure everyone is working as hard as possible all the time.
I've always chuckled at the nomenclature that's crept in around software development. You've got "agile", "sprint", "scrum", "extreme programming", "sustainable pace", and probably other athletic metaphors I'm forgetting. There are various kinds of "stories", often combined into "epics", "retrospectives", "personas" and other literary metaphors that I'm likely forgetting.
"Track Star" and "Literature Major" are not stereotypes that I recall from my peers getting started in software, back then I didn't know many other computer nerds who had done more sports than my half-hearted JV swim and football, which is pretty weak. We for the most part had all been spending way too much time in front of the computer to be truly serious about athletics and literature.
But I'm old, and times change (which is a good thing).
Of course it would also be darkly funny if a bunch of literate athletes invaded the software business when it became lucrative/high-status and immediately started fogging up the windshield with a bunch of vague metaphors to obscure their lack of technicality. Who knows.
It is a poor choice of word. But what's interesting to me is nobody notices the end and beginning of sprinting requires rest, regrouping, planning, getting ready. You can't just stop sprinting and start sprinting the next day like you're ready for a brand new race. Sprints should require reflection, a re-set, getting prepared. Instead it's just "the next 2 weeks of the same bullshit", without proper preparation, without clearing the schedules to allow for streamlined work.
When sprinting, any bump in the road can make you stumble, losing the race. We need more focus on clearing the bumps out of the road before a sprint.
If memory does not fail me, the original sprint was 3 weeks long, followed by "spare" time till the end of the month (mostly meetings, planning, review, etc.)
Was the word 'sprint' chosen to describe a never ending crunch? Or just to roughly imply quick forward motion. Unlike waterfall that often slows for large portions of time until the next sudden and irreversible release?
Perhaps it would've been more accurate to use something like 'unit' in education. Which is just a subdivision of time and effort roughly equal to a week or two.
To me it is clear that Age != Experience. Early in my career I worked with many people who, while much older, were not as emotionally intelligent as I was. I have also worked with many people who were younger than I was who were more emotionally intelligent. That is what leadership mainly boils down to.
Also, technical decisions should not come from the top. A leader should look at what someone puts together and say "yeah, you covered your bases, go for it."
My current TL is older than me by a big gap, but my experience on many technical things outweighs theirs, so they defer to me. Their experience on the social side of things is far better than mine, and I always defer to them for planning and comms advice.
Basically: age is a very poor proxy for how competent a leader is at running a team, making technical decisions, and many other factors that go into building large products. Selecting by age and putting the oldest person in charge of a project is a horrible idea. Case in point: most governments are gerontocracies. How confident are you in the wise decision-making powers of our political leadership?
I did. Or rather, I saw teams that thought they were doing it, because they used Pivotal-type tracking tools and stories. But it always degraded into broken-down waterfall.
"Agile" means different things to different people.
I would argue it's agile if you release early&often to continuously incorporate feedback, even if you don't play Planning Poker in Scrum Sprint Planning every two weeks.
>"I mean even at Google (on a different team) I was a "technical lead" in my 20s, and let me tell you, I had noooo business leading anything technical of any importance. But this is very common! We would never accept this in other fields. Would you live in a house built entirely by junior carpenters in their late 20s who built one or two houses that barely stood up? Would you drive cars designed and built by junior engineers?"
one of the strangest and most baffling things about the entire industry tbh. Like, would you ever expect a 25 year old guy to command a spaceship? Yet in software you have these weekly "I'm 40, is my life over" posts. In most disciplines people correctly acknowledge that there's a sweet spot of skill and experience that overlaps somewhere in your late 30s, 40s or even 50s, yet in software very often we recreate Lord of the Flies, leading to chaotic project management.
Circa Apollo 11 the average age of a NASA engineer was 28. In 2009 it was 47.
There is this constant infantilization of people where once you’re an adult you don’t think anyone younger than you is capable of anything. It’s a big problem that this has been creeping into how we treat people like children at continually older ages.
Nobody knows how to do things when they start doing them, and they won’t until they do start. Delaying this to an older age makes this harder not easier.
There were a lot of WW2 and Korea draftees in the workforce during the 50s and 60s. Their early adult life experiences were significantly different from that of today's Americans.
If you read the OP, the word "sprint" is used in its less formal sense of "working long hours for an extended period of time", not in the formal sense of an Agile time-boxed work period.
This has never happened to me or anywhere I've worked (except see below) because we always had reasonably defined goals, maybe at the high-level only, but they were clear and we could work on them. Except for research projects you should have a pretty clear understanding of what's needed because business is a pretty dull, straightforward thing.
Okay, the exceptions: management failures. When them who should be doing their job are too fucking incompetent to pull their finger out of their arses. Years ago I worked at one place where they had 3 home-made frameworks floating around the company. A framework was clearly needed, management just dithered, so 3 different programmers just got on with it and did the necessary creation. Triple the work done, and an inevitable political war was brewing over which one would be used. And this was a medium sized company producing accounting software, not some inexperienced startup. Oh yes, I do have other stories like this... Crap management is a curse. It's always management IME. Fuckers seem to breed like rabbits too. Yes I am bitter.
GP was criticizing having entire teams including team leads who are too young to have much experience, when there are people around who do have that experience. How do you get from that to "Delaying this to an older age"?
> Circa Apollo 11 the average age of a NASA engineer was 28. In 2009 it was 47.
That has absolutely nothing to do with "treating people like children at continually older ages"
Circa Apollo 11, the average age of the American population was 29. Today it is 38.
In addition to that, NASA had its budget increase over 1000% between 1960 and 1965, so it had to hire a lot of new people for doing something nobody had done before. That is a very different situation than in 2009, when its (inflation adjusted) budget was around half the peak and had been slowly declining for almost 20 years. You don't hire a lot of people in that situation, so your workforce automatically ages.
In 1965 Margaret Hamilton was in charge of all of the Command Module software for Apollo at the age of 29. It wasn’t just young underlings led by experienced engineers.
> Circa Apollo 11 the average age of a NASA engineer was 28. In 2009 it was 47.
I find this highly dubious because circa Apollo 11 the space program employed its largest workforce ever, including bazillions of mechanical engineers making nuts and bolts.
A more meaningful comparison would be what the average age of the managers and leadership are/were.
It kind of makes sense. Space program was in its infancy, so I would also expect the workforce knowledgeable about space to be in its infancy, because it's not like there were experienced people putting satellites in orbit in the '20s, let alone an education pipeline for it. And those were back in the days when people got pensions and had loyalty to companies and whatnot.
By 2009 the industry was old enough to have people close to retirement age who had studied it in college.
>It’s a big problem that this has been creeping into how we treat people like children at continually older ages.
Isn't this a bit flimsy? Eventually, we'll treat all people younger as those who have room to grow -- isn't that to be expected? "like children" is relative.
I don’t understand how this comment relates to my tweet. I agree with you that nobody knows how to do anything and won’t until they start. That’s why you really want to spend some time working with highly experienced people, in whatever career you choose.
I’m not talking about some vague concept of “leadership”. Most of the experienced engineers I worked with on Chrome weren’t even officially “leads”. They were individual contributors just like me.
I’m talking about the nitty gritty day to day details of writing robust reliable software. Class or function here? How should I test this? How should we detangle this spaghetti code? What order should we conquer this bug list in?
If you want to make a good software product it’s best to have people on your team who have made many software products before successfully.
"Circa Apollo 11 the average age of a NASA engineer was 28. In 2009 it was 47."
That data point is from a 2006/2008 report^1; that data refers not only to NASA "engineers" but to all NASA "civil servants". It probably does not include the JPL as that is a NASA contractor, managed by Caltech. JPL employees are not civil servants.
After reading the 2006 report, as updated in 2008, I am having difficulty understanding how this single data point from 2006 is evidence of the "constant infantilization of people". Some further "evidence" would be appreciated to help me understand.
Below is how the report explains the age distribution of civil servants at NASA in 2006/2008. (This does not tell us what the age distribution at NASA is today. To get an idea of just how dated this report is, check out the quote the author pulls from the Strauss and Howe (2000) book titled Millennials Rising.^2)
"Even today, NASA has an extremely low attrition rate. With an average rate of 4.6% per year, the NASA workforce does not turn over very quickly.11 This rate is below NASAs own historical rate, far below the private sector rates, and even below those rates of other Federal Agencies.12
Most attrition comes from retirement-eligible employees. The attrition rate among Boomers in their forties is extremely low, about 1% per year and members of Generation X have about a 4% attrition rate. The Generation X attrition rate at NASA is lower than the Generation X attrition rate at all other Federal Agencies.
To fill critical positions, there has been a low level of full-time, permanent hiring at all times.13
Importantly, the demographics of this hiring have changed over time. Priority was given to fill critical positions to minimize gaps in core competencies, so the trend in hiring has been to hire older, more experienced workers who could best fill those gaps. This has meant that in 1993, NASA was hiring primarily members of the Boomer generation and in 2005 is still primarily hiring members of the Boomer generation.
2. "In obvious contrast, members of the Millennial generation tend to have starkly different views of employment and government.
Entry-level youths will be attracted to solid companies with career ladders and standardized pay and benefits. They will be less attracted to consulting, contracting, temping, freelancing, or new business startups. Millennials will be less inclined than Xers were at like age to take big career risks or turn their personal lives inside out to make more money.5"
Being a "tech lead" at Google is nothing like commanding a spaceship. It's more like being partially responsible for a team of 3-5 mid-20s engineers building the dashboard and reporting for space shuttle wind tunnel test results (or whatever they do with space shuttles).
Personally, I was a tech lead at Google pretty consistently from the ages of 26-35. I got better at it, and responsible for more, over time. It was a good learning experience for me and even when I was inexperienced at it, I was saving someone else some time.
Any advice for someone who just started as a tech lead at a new company? One of the devs interviewed for the spot but didn’t get it and is ten years older than me. It’s mostly younger Junior devs but I want to do a good job. Oh and it’s remote right now.
My plan so far is to take every engineer who is local out to lunch individually to get to know them, and encourage them to use the yearly education/convention stipend to get us all at a convention together later this year.
The main leap imo is you need to think about the project as a whole, not just your piece. You need to "see around corners" to identify anything (even if it's not technically under your purview) that could block the project, then raise the biggest stink necessary to fix it before it becomes a problem. Some lead devs do this naturally, some never seem to be able to.
Do not let your first project with a new company fail. You won't recover from that, even if it's not your fault in any way. Succeed and you will become known as a person who gets things done.
Also be a mentor to the other devs. But I assume you're already doing that if they hired you as a lead and based on your ideas.
I always did weekly 1:1s with everyone I was TLing, even when we were all in the office together 5 days a week. I never felt it was a waste of time. I'd let them set the agenda, and if they didn't have anything on their minds (rare), we'd just chat about whatever was going on in the broader team, their lives, whatever.
Figuring out people's strengths, interests, and where they need to grow is key. You want to give people projects that are going to help them grow by pushing them a little bit out of their comfort zone, but not too much, because you also want them to succeed. Someone on the team needs to work on presentation skills? Giving a presentation to 10 people is a good opportunity to learn from mistakes. But if the audience is 100, maybe don't set them up for that kind of failure. Someone wants to grow into a TL role? Encourage them to host an intern. Etc.
You also need to understand people's career goals. You might be an (in Google terms) L5 TL with two other L5s on the team and a couple L3-4s. Who wants to get promoted on which timeline? Should the high-impact, interesting L5 project go to the L4 who wants to get promoted, or to the L5 who doesn't care about getting promoted but just wants the most interesting project?
A lot of that sounds like manager stuff, I know. When you're a TL, your manager is your partner as well as your boss, because the two of you are working together to keep the team happy, productive, and successful. And you'll both have information and perspective the other doesn't.
Google has a great viral slide deck about how to take credit for things as a TL. It works better visually but I'll try anyway. Basically, if the project is to deliver a big square, the TL's job is to cut differently shaped pieces out for teammates to do based on their skill, interest, what will be a good growth challenge for them, etc. Then, as a TL, you do whatever scraps are left over (imagine cutting a big circle, a rectangle, and a smaller square out of a square). So the IC work you do might seem like weird odds and ends, but if you get the whole thing done and everyone on the team was happy and productive, you succeeded as a TL.
I interviewed at a YC startup recently and the founder was making ageist jokes to me because I was older than 25. Tech is literally insane regarding age. The assumption seems to be that once you get a single grey hair you can't use a computer any more? But I can tell you when I was writing code at like 18 or 20 everything I did was unsalvageable garbage.
It actually amazes me how much goes into writing quality code that is usable in production. I would be very surprised if any junior engineers could do it without messing up a bunch of stuff (which I certainly wouldn't fault them for as they're still learning!) You would need a good mentor if you wanted to avoid that I guess.
…that all this supposedly stable production code with tests and architecture is useless, because you throw away the stack entirely after 5-8 years: because the API has changed, because the tech is no longer fashionable, because the new SOC/PCI/Bale regulation mandates things that only exist in AWS.
Just hack the minimum together, speed of execution >> durability of the stack.
In the IDF (Israel's military) due to mandatory conscription there are plenty of young people in various leadership capacities. So yeah, you can be 19+, an officer, leading people into combat. Or flying a jet/cargo plane/helicopter. Or leading a software project.
That said, there are older people around in more senior roles and the way leaders act there is very structured but leaders are also expected to improvise when necessary.
It's somewhat amusing when you're dealing with those guys on some sort of technical collaboration. You might have a team from some big tech and then a bunch of "kids" in the room.
They also have explicit leadership training and military discipline. Most of the Israelis I met at TAU (post IDF) were much more disciplined and mature than even American grad students at similar ages. Wouldn’t surprise me at all.
Side note, sometimes I get “flashbacks” of my time in Israel and miss it. In those moments the US feels sorta like a surreal place detached from reality a bit too much. Silicon Valley/California is the worse for that.
I’m curious, which flashbacks do you have? What do you miss about Israel? If you want to share. I want to go there one day so I’m genuinely curious to know more about your experience. :)
It's really more like a "reverse" deja vu. It was my first time traveling overseas and experiencing a very different culture. Tel Aviv was also my first time living in a "major world metropolis". Many of the sights, sounds, smells, and people made a very strong impression (the mangos were just amazing). Tel Aviv is vaguely similar to a more edgy Miami with fewer mansions and more falafel.
Israel overall has a fascinating mix of different cultures from east and west (with no small amount of tension at times). Many of my assumptions about the world were shattered and your sense of history changes (for an American at least). I had an Austrian roommate and a Palestinian neighbor in the dorms. If you visit, definitely try to get off the beaten path, visit a Kibbutz, goto a Shabbat dinner, etc. Try and visit Jordan too!
Israelis are generally a passionate, lively people, but also serious and pragmatic. They can be... prickly too. It makes sense how Israel has so many startups and so much technology per capita -- lots of smart, motivated people paired with discipline gained from IDF service.
There are 20 year olds who demonstrate fine leadership skills and maturity. There are plenty of 40 year olds who do not. Find the best people you can regardless of age.
Also, often times the only way to get that experience in the first place is to be put into the positions of leadership to develop your skills.
Ok, but what is the histogram on that? The point wasn't "someone young can't possibly do X" but a combination of "it seems strange that we have an industry built almost entirely of young people" and "somehow this industry believes old people can't do things". Maybe there is a good reason for this, but it is certainly strange: it is like we actively don't want experience that I would have thought should count for a lot (in architecture and planning) while demanding sometimes impossible amounts of experience in things that I'd think wouldn't matter at all (using the new, shiny framework or programming language that those truly experienced people are probably avoiding anyway unless it really really offered something they hadn't seen in their decades of development).
Part of it was the speed of advancement of tech - at the beginning there wasn't anyone available but the kids and by the time those kids were old enough to get into management/positions of leadership, they were graybeards and some new tech was the important thing and the only people using that were the kids.
We're finally getting to the point where there's not much "new" each year or decade, so it's starting to slow down again.
Uhm… I’ve seen the problems with people climbing the corporate ladder too fast.
I used to work with this person in his early 30s who was in charge of the infrastructure. He started as a developer and was then tasked with managing infrastructure, despite never having actually worked as a sysadmin and/or done operations work.
Well… after a while it became clear that the limitations of the infrastructure were a reflection of the limitations of this person’s knowledge and understanding of infrastructure.
The percentage goes up only marginally with age/experience, and that still doesn't keep cultures from hiring older people with zero experience into leadership roles. The culture specifically opts to select older individuals despite there being enough young people with natural leadership skills in contexts where both populations have no experience.
Maybe you're a father? Own a flat? I'm 40, a little CEO, father, house owner and voter, and it feels way too young to own any of these responsibilities ;) Man, we have the world in our hands, what do we do now…
None is a bit bold. My college buddies (& lots of people in that peer cohort) founded companies in their 20s, and I think they more often than not demonstrated lots of maturity and leadership skills. Certainly more than in some stories I've heard about leaders of bigger tech companies.
My point still stands either way you interpret it. A 26-year-old has had 4 years to build leadership skills at college (or even earlier) and another 4 in industry. That's 8 years of experience, which feels substantial. Think about it. There are kids who come out of high school with more coding experience because they got attracted to it at a very early age. There are also kids who do similar kinds of things around leadership.
Lafayette was a major general in the American revolution at 19, he had plenty of leadership skills in his 20’s. Not saying it’s _common_ but historically it seems at least possible.
A title is meaningless without context. Knowing he was a major general does not tell me anything about his leadership skills.
Often at companies (And I presume in other situations as well), people are thrust into higher level positions purely because they happened to be in the right place at the right time and someone above them left.
I'm saying he had the skills because, after he was thrust into the position, he led military operations, then left for France to convince them to contribute more to the American revolution (which they did). Maybe that doesn't count as true leadership to you, but I know it would clear the bar at any company I've worked at.
Realizing how often people exaggerate their own abilities, and having worked in a "good" company seeing how Senior+ devs operate, I no longer trust any claim about someone's skill unless it comes from someone I already trust or there is sufficient context and evidence to support it.
Too many people/companies claiming to be "world class" or whatever when they are above average at best.
I have no knowledge of Lafayette (Not American) and no context or understanding what he did, so my default judgement is that his skill is exaggerated.
I'm not American, but Lafayette was certainly successful at achieving his military and political goals.
It's worth noting he had been trained in a French military academy for officers since 11, and was commissioned as an officer in the French Musketeers at 14.
Additionally he was extremely well connected in the French nobility and clearly had the expectation that men would follow him.
As for his leadership capabilities, Washington cited him in a letter to congress when he was still officially an adviser for rallying US troops during a retreat after being shot in the leg. His record is pretty good: https://en.wikipedia.org/wiki/Gilbert_du_Motier,_Marquis_de_...
I played sports at a relatively competitive level for a long time. I do not remember seeing players with significant leadership ability.
Maybe this is true on a relative scale (E.g. good leadership skills for their age), but I don't believe it's true on an absolute scale and an absolute scale is the only useful scale in a professional environment.
I also believe that leadership skill is domain-specific, so having significant experience in your domain along with good leadership skills (Real skills, not from courses or workshops or whatever) does not at all seem feasible by the age of 20.
A person with leadership skills can still benefit from experience.
My current company is led by someone with good leadership skills who successfully managed a commercial team. Now he's trying to lead a company, and it is clear that he does not have the necessary experience to make the correct decisions.
Just wanted to point out at JPL there are many 25-year-old people commanding spaceships, rovers, and landers. Sometimes room-fulls of them. Not a great example.
Yeah, I was "flight software" on a control shift for a Dragon space capsule when I was 26, and the "mission director" was probably a year or two younger than me. Other shifts had older directors though.
Flight software is a fun role because they're the first person anyone asks when something goes wrong. "What's causing that sensor to glitch out, is it a bug?" "Uh no, probably not. Hey I'm just a software engineer, but is it possible the pyrowhatzit is actually melting right now?" "Oh crap"
There were teenage lieutenants and commanders (of ships) in their 20’s in the Royal Navy in the 18th and 19th centuries. Even a sixth rate ship had well over 100 men on board and smaller, unrated ships could have a complement of about 100.
I do think there are lots of advantages from experience, both on the technical side and with organising teams but I don’t think it’s totally crazy for young people to sometimes be very useful and productive. Plenty of the companies YC funds are founded/run by people in their (often early) 20’s and some end up being managed well and others poorly. If it were the case that young people so obviously weren’t able to do it, I would expect YC would have changed their behaviour by now.
Isn't the main reason just that failures in the vast majority of software aren't that bad? Like if you build a bad house it could collapse and kill someone. If you build a bad calendar app/email client/text editor it might... be a bit difficult to use? Crash sometimes?
And then people wont use it because it's bad and your company will go under, problem solved. Some other 20 year old (or 40 year old) flying by the seat of their pants who happens to be good at development will step in and build something that works.
Obviously if you're building medical device firmware, or aircraft control systems or whatever then it's a whole different story. But generally companies building that sort of thing do have a much more systematic and dedicated approach to quality. And there are industry regulations that enforce that.
And still there are so many leaks of personal info, and startup X gets hacked, and Y million records of personal and account info get leaked on the dark web. But nobody really seems to care. It gets a few articles in tech news and some tweets, and that's it. (Now, I'm not saying that I don't think it's bad, but the millions of people around me don't seem to think it is.)
- Sure, a senior might not use that NPM package from Zorglub89 on the internet, but then they'll also move more slowly because of the overhead of vetting those packages.
- Sure, a senior will perform external security audits regularly, but a lot of seniors are also in it for money or career and will go rogue.
In most wars until comparatively recently it was common to have 25yos in charge of 100s to 1000s of men, including on occasion general officer rank. Even with modern peacetime career trajectories it is possible to have a 25yo captain commanding a company of ~200 soldiers.
99% of software cannot collapse and kill you if it is built incorrectly.
In fact, this idea that incompetent people have never built buildings before is just wrong. There are plenty of examples from history of unqualified people somehow being given the job of constructing something that then collapses killing dozens of people.
There is some safety critical software and I hope that that is written by experienced people. But basically all buildings are safety critical.
Although, I believe that info is now out-of-date and I don't want to engage in celebrity stalking behavior. Let him have his happiness. I honestly don't begrudge it.
I'm blessed down here to be sitting through stories about living the one life you have, right?
> We would never accept this in other fields. Would you live in a house built entirely by junior carpenters in their late 20s who built one or two houses that barely stood up? Would you drive cars designed and built by junior engineers?
I find that observation interesting. I can actually tell, when reading about project failures, that no one with any real-world experience had a hand in it. Someone came up with a hare-brained idea, froze out anyone who would dare to say "yeah...but," and went full-throttle.
I mean, I wouldn't bet my life or savings on it. But sometimes it is amazing to watch what some dedicated people can achieve at full throttle, even if/when they make mistakes.
> Like, would you ever expect a 25 year old guy to command a spaceship?
Genghis Khan was 20 when he started assembling his army. You can have leadership at any age. Some organisations such as the military bring in young people to directly be leaders. You need to look at people's ability, not their age.
Apparently, according to Google, most Mongol leaders died in their 30s; the demographics in the Golden Horde were somewhat different than today's. The guy who leads the Taliban in his 20s isn't exactly the Mozart of terrorism; it's just a dangerous job. More importantly, comparing world-historical figures to your average modern-day senior project manager is kind of wild. Everyone in the software industry may think they're Alexander the Great, but they're likely not. Most senior military staff are also old.
What I'm saying is obviously that if you looked at merit, on average, software teams should be older than they are, not that it's physically impossible to have a good leader who is young.
If you have that much merit in tech at 30/40 and aren't working pro bono, you already have enough money to retire, or to work on whatever you want / own your own company, instead of slaving away for a megacorp.
I think the employees in their 20s are valued because they have the time and the energy to "power through". They work hard and party harder. But this powering through results in things that, yes, work, but are engineer-hostile and hard to maintain. They also quantitatively produce more output, useful or just overly complicated.
Us fossils think more before we do anything, because we ain't got all life, and we definitely want to keep it as simple as possible, as we don't want to be getting calls after dinner time - at 6PM :)
> Yet in software you have these weekly "I'm 40, is my life over" posts.
I'm not sure, I feel like on average there's a decrease of openness/neuroplasticity or call it whatever you want with age: people sometimes get set in their ways, choose approaches that worked for them in the past and will be less likely to jump into new technologies as often - having families and other life responsibilities (and having a better ability to balance life and work) will lead to a bit less exploration.
This doesn't apply to everyone, of course, and even then the premise isn't set in stone, but I'd say that you sometimes exchange the desire to jump into bleeding edge technology and innovative solutions for picking approaches that have minimized risks and are more likely to work out long term, a bit like: https://boringtechnology.club/
Not to say that either is better than the other, there is definitely a lot of merit in having experience and having worked on lots of systems in the past. I just wish that the industry itself didn't move ahead so fast and we focused more on the quality of what we have: e.g. more like PostgreSQL, rather than the new seasonal NoSQL database, or looking at React after a few months and finding it pretty different already.
Biases in hiring, of course, are likely to just make the conditions worse for everyone. Ideally, you'd have a mix of engineers of different seniority, who could successfully collaborate and learn from each other. Disclaimer: the biases in my own post might be construed as ageism in and of itself, though I'm also kind of speaking from my own experience - as I gradually age, my focus shifts towards building rather than exploring.
I am afraid you have an unreliable source. The Constitution was ratified by the states, not signed by delegates. At the time of ratification, the average age of the delegates was 42. James Madison, for example, was born in 1751 and was 36 years of age in 1787. Alexander Hamilton was born four years later and was 31 in 1787. There were only four delegates in their twenties.
> If we're being VERY generous, we've consistently lived past 40 for the last 2,000 years.
That's not... that's not how statistics work at all.
Life expectancy was 30 because half of all babies died, and on top of that childhood diseases took out a bunch more. Eliminating this has been the vast majority of life expectancy increase.
If you made it to puberty in antiquity, you were pretty likely to make it to 60 or so. Y'know... assuming you didn't live in an area the Romans or Mongols wanted.
Do you have any sources on that? That's an interesting way to look at it. I would have expected life expectancy to be way lower than you suggest by your calculations. (i.e. 9/13 babies died*, with 60 as a full life, meaning ~18 yrs expectancy.)
* All I know is that humans used to produce a lot of babies because a lot of them would die, but my googling sucks
Until the middle of the 20th century, infant mortality was approximately 40–60% of the total mortality. Excluding child mortality, the average life expectancy during the 12th–19th centuries was approximately 55 years. If a medieval person survived childhood, they had about a 50% chance of living 50–55 years, instead of only 25–40 years.
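To make the arithmetic in these comments concrete, here is a rough back-of-the-envelope sketch; the survival fractions and ages are illustrative assumptions, not historical measurements.

```python
# Rough sketch: how infant mortality drags down life expectancy at birth.
# The fractions and ages below are illustrative assumptions, not historical data.

def life_expectancy(cohorts):
    """cohorts: list of (fraction_of_births, average_age_at_death)."""
    assert abs(sum(f for f, _ in cohorts) - 1.0) < 1e-9
    return sum(f * age for f, age in cohorts)

# Assume ~45% die in infancy (around age 1) and the survivors average ~55 years.
at_birth  = life_expectancy([(0.45, 1), (0.55, 55)])  # ~30.7 years
survivors = life_expectancy([(1.00, 55)])             # 55 years

print(f"Expectancy at birth: {at_birth:.1f} years")
print(f"Expectancy if you survive childhood: {survivors:.1f} years")
```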
9-to-5 here is figurative. No one is tracking hours for software engineers at most companies. It just means: you start working in the morning, and you stop working in the evening. As opposed to working until late at night and/or weekends.
I understand that. 9-to-5 is a common term, but I wonder why - many mention clocking out at 5; does that mean they arrive at 8 in the morning? Or maybe some eat lunch at the keyboard while continuing to work.
In my last two jobs, one in the Bay Area and one in Vancouver, my usual schedule has been arrive at 9, take a 45-60min lunch, leave at 5 and I've never had anyone tell me I'm not working enough or producing enough output.
In the Bay Area I’ve always rolled in around 11 and left around 5. Maybe 6 if I really need to get something done. When I worked for a remote company I did maybe 20-25 hours most weeks. Everyone’s always been very happy with my work. I’ve gotten offers for seed funding from founders in exit interviews. Penny pinching your hours is a cargo cult.
Bay Area here. No real hours however "business hours" are 9 to 5, that generally means any meetings need to be within that time frame (ideally actually more like 10 to 4). It is common courtesy to also not schedule 12-1 for lunch, but that gets thrown out the window a lot and so working lunches are the norm. Even if I don't work through lunch, I usually only take 20 min or so.
Most people are in the office (when they are actually in the office, which isn't often these days), from 9ish to 5ish. I've worked at many companies that offered dinner at 7 to encourage people to stay late, but that only seems to have the effect of having people start later. No one really tracks hours as long as the work is getting done. Plenty of time I'll cut out of work around 2pm for something and then check in later in the evening for a couple hours.
General trend is people in their 20s work longer and later. Most people over 30 at any tech company I've worked at are out the door before 5. But also people over 30 tend to more reliably be in the office at or before 9.
I live in a Southern European country where the standard for office work is 9-18 with a one hour break starting between 12:30 and 13:30. On my last job as employee we were more flexible. I was starting at 10 maybe with even a 90 minutes break but I was usually in office until 19 or 20, which was OK because traffic was insane before then.
It refers to "being at work", not "actively heads-down working on something".
If you come in at 9am, do work, have lunch, make coffee, work more, suffer meetings, work, chat at the water cooler, work again, and leave at 5pm, you're working 9-5.
Software engineers are usually not tracked hourly. There are common exceptions, such as work done for government programs, or contracting. Even in these scenarios, hours are not usually tracked by any authoritative system. In the end, the only feedback you get is based on softer metrics like availability during business hours, or on-time completion of work.
In my experience, lots of engineers will show up far after 9 AM and leave well before they have reached a full day of work. It's a very privileged system that exists because it is so hard to hire engineers. At least for now.
Here in Budapest (Hungary) it varies what each company requires, but the general behavior (before COVID) was that developers show up at the meetings and do whatever in between. 1-2 hour lunch breaks were common. Folks who smoke also spent at least an hour per day AFK. After lunch, getting a coffee easily used to take at least half an hour, etc.
For example at EPAM (big outsourcing company, badge tracked time spent in the building) and at LogMeIn as long as the sprints were "green" everything was fine.
I really don't understand the "sprints" approach to development - I've never worked on any project where it makes sense, either the things I've been working on take less time than a "sprint" or they take longer, and that's for any sprint length you can produce.
Real software requires some degree of planning, and sprints seem to be more an attempt to avoid that planning. I don't mean to the level of gantt chart hell (I've experienced that as well).
Sprints, at least as they actually occur in the real world, seem to actively harm any large scale projects, and increase the overhead for long term projects if you can get them to fit.
the team should not pick a task until it's clear what to do and until it's broken down into something that they think/agree can be delivered in one iteration.
this is usually called refinement. (as the raw user requirements are transformed into a concrete design and then tasks)
if something is so unclear that the design/planning itself requires prototyping something, then that can be added as a task for the sprint. then when it's done, on the next refinement the team is in a better position to find a good design.
Before Scrum we had 3x 1-hour team meetings a week to talk about what we're on and what we might need help with. After we went "agile" we moved to having 3x 1-hour team meetings a week with the same. Power to the developers. :)
Ha, I remember the first project I encountered that "went Agile." The weekly team meeting (which everyone complained about) was replaced with daily standups (30 minutes), a two-hour retrospective (useless) every 2 weeks, and a 2-hour sprint planning every 2 weeks. Power to the developers indeed.
So basically, you are able to find the time to criticise your team process in a completely unrelated HN post. But when there is dedicated time for that - in the process itself - you call it "useless" ? ;-)
Not unrelated: the entire comment thread is about developer process.
And retrospectives were useless in the most literal way. The time spent every week going over "What worked well? What could have gone better?" did not materially improve the team's effectiveness.
The next retro is the place to discuss why these retros do not yield any measurable improvement, and to agree on a better process/schedule that works for your team.
If your point is that your team processes are sooo perfect and sooo amazing that there is nothing to criticise, review or change, I'll have a hard time believing that.
If your point is that no matter how much time you spend on these meetings, nothing changes, I'd argue that you need to spend even more time and energy on them, as you clearly haven't figured out how to efficiently gather, analyse and act on self-reflection of your team processes. The retro itself might be the starting point of your thought process, since it seems so "useless" and yet is absolutely essential if you don't want an external "manager" to do that job for you.
The team was not particularly more effective after doing the Retrospective ceremonies for many months. The themes that emerged on day one were the same every sprint and the same as for any software org: more communication needed with users, more alignment with stakeholders, faster feedback helps, etc. Every team I’ve worked with has strived to improve on these, but I never saw any actual improvement that I could trace back to the Sprint Retrospective.
I think they wanted the best people on it and accepted whatever they wanted/needed? I mean, it is one of their most important products for dominating the web.
A sprint means to go as fast as you possibly can, and is associated with exertion to the point of exhaustion. There's a reason managers love the word sprint.
> Would you live in a house built entirely by junior carpenters in their late 20s who built one or two houses that barely stood up?
Well carpenters don't build houses by themselves. You might be talking about a framer, whose job is definitely important, but not really more important than most other of the trades needed to build a house. But actually by their late 20s a framer can have a decade of experience, more than enough to become a journeyman and frame with one or two assistants. It depends on when they apprenticed and became journeymen, where they learned the trade, and what their contracting and business ethics are.
> Would you drive cars designed and built by junior engineers?
Designers don't really build cars, but junior engineers are involved. Typically not leading themselves, though, as the handful of big car manufacturers can hire senior engineers to oversee them, and screwing up the launch of a car won't make it easy for you to work for one of the few competitors. People who make custom cars have probably been doing it for years as a hobby, and often aren't engineers at all.
In the non-software world, if you build something, there is a direct tangible result of that thing coming into existence. Like, if you build a cabinet, the worst case that happens is the shelves fall apart, so you don't need a whole lot of guarantees as a consumer. If you're building a car, there's a shit-ton of regulations (today, anyway) and potential lawsuits. If you're building a house, there's the code, there's 30 different specialized trades, all kinds of restrictions on who can do what and when and how, and a hundred government inspections.
But like in any trade, you can pass a test and still be a lying cheating piece of shit. (If you're a government contractor, it's hard to tell whether you're dishonest or just incompetent.) There is no magic spell or development lifecycle that stops shitty things from being shitty, or makes things automatically good. But sometimes there are regulations that enforce due diligence, and sometimes a contractor earns a good reputation through their results and word of mouth.
Sadly, in the software world, there are practically no regulations, no [serious] trade groups, no apprenticeships, no unions, no threat of class-actions, and very very rarely any substantive real-world consequence to shame a company into hiring competent workers. Bottom line: if you work somewhere and you are not satisfied or don't feel the work is challenging enough, get better and move on. The only way to escape monotony is to raise yourself up.
estimation. you can look back over the previous sprints and see how much on average has been done in one. then you do reference class estimation of the incoming tasks. (story points are used for this.)
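As a loose sketch of what that reference-class/velocity estimation looks like in practice (the point values here are made up purely for illustration):

```python
# Minimal sketch of velocity-based forecasting from past sprints.
# All numbers are made-up illustrative values, not from any real team.

completed_points_per_sprint = [21, 18, 25, 19, 22]  # story points finished in recent sprints
backlog_points = 130                                # rough estimate of remaining work

velocity = sum(completed_points_per_sprint) / len(completed_points_per_sprint)
sprints_needed = backlog_points / velocity

print(f"Average velocity: {velocity:.1f} points/sprint")
print(f"Rough forecast:   {sprints_needed:.1f} sprints to clear the backlog")
```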
wasn't it delivered with the principles? individuals before process? guys talking to each other informally and adding features they found useful (feedback from users, delivering value quickly)
If there's one much-believed software industry trope I wish would die, it's this idea that building great software requires constant heroics, crazy hours, mandatory crunch time, living at the office, and sacrificing your personal life and loved ones. That's how undisciplined and/or disorganized clowns do it, not professional software teams.
When someone says, "Wow, we worked nights and weekends, guzzled Mountain Dew, pulled 48 hour coding shifts, drained our mental health, and half of us got divorced, but the result was this kickass video game!!" it's not admirable--it's sad. That's just not how it's supposed to be done, people!
EDIT: This seemingly well-received comment seems to have ended up perma-locked to the bottom of the page. -weird!-
I don't believe for one second when people say "I worked 120-hour weeks for 6 months!" Simple math tells you this is a farce. Even 100-hour weeks is not sustainable, unless people want to claim they literally did nothing but wake-commute-work-lunch-work-commute-dinner-sleep for weeks on end. Not buying it.
Oh yeah this is absolutely true. I've voluntarily done ~100-hour weeks, and even in my 20s it destroyed me, I needed multiple weeks to recover from even short periods of intense "crunch".
The idea that you're living at the office and actually being productive is just laughable. It is absolutely not helpful except in brief emergency situations.
I've done 80-hour work weeks in blocks ranging from 6 weeks to 6 months. I did literally nothing but wake-work-sleep-wake-work-sleep, with time for food and similar necessities.
I didn't have many blocks like that, but those were some of the most productive (and personally fulfilling) times of my life. They made my career. Those allowed me to level up each time in a very significant way.
I also had long breaks after each of those -- they set me up to cruise for a while.
I did that before kids. I couldn't do that after kids. After kids, though, I have a depth of knowledge that makes me applicable for other types of productivity and work.
I have mostly worked at large companies, and in my experience this is due to the "business" people picking a deadline with no input from the people who actually have to make it happen.
If anything, that kind of behavior should give the outside world pause and raise questions about the sustainability of any product output.
That mattered less in the days of one artifact software development (and still matters less in areas like video games where that is the case), but software development these days is a process and many projects are far more marathon than sprint.
> If anything, that kind of behavior should give the outside world pause and raise questions about the sustainability of any product output.
It should give everybody pause, including software practitioners. A separate, but related pet-peeve is how these unsustainable heroics are often rewarded at work!! Boss: "Look at Chris over there--he stayed up until 4:30AM and fixed that ship-blocking bug. What a champ!" Chris gets a $1,000 spot bonus and now the rest of the team looks up to him as an example of good software development. Incredible but it happens almost everywhere!
For game developers/designers/artists, this does appear to be the case from what I can tell, but only because they are ruthlessly exploited. Otherwise it is indeed a ridiculous and pseudo-macho attitude that impresses nobody.
The quality of my code drops considerably if I don't take breaks or do something else for a couple of hours once in a while. Making up for it by coding even more sounds like a terrible idea.
Imagine you are running at 2 miles/hr and your competitor is doing 4 miles/hr. How much does the distance gap between the two of you increase as time passes? The answer is the secret to virtually all success in most companies that started as startups. In companies where people do 9-to-5, versus competitors where people sleep under their desks, the gap grows tremendously. Yes, people burn out and they will be discarded and replaced with new blood, but that's how history is made, unfortunately. The 9-to-5 companies are exactly the target to be swept away by startups.
Obviously it's very good that Chrome was delivered without people doing lots of overtime. However, a lot of his argument seems to be about the age of the management, and surely ageism is illegal; it should be about a person's skills rather than whether they're old enough to have school-aged kids or how many decades of experience they have.
Edit: Okay, I guess the kind of ageism he is suggesting isn't illegal in the US, but it is in the UK and is still generally considered unethical
He frames it in a way that kinda sounds age-ist-y, but I think it's less about age and more about experience (he was using age as a proxy for experience, which isn't always true, but is close enough, often enough).
I had my first "senior software engineer" title when I was 28, and that was after I'd only been writing code professionally for a few years (in my early 20s I had a campus coding job at my university, and then I was doing a lot of open source work through my mid 20s, but not sure I'd call any of that "professional"). At my most recent job, I saw most developers making it to the senior in their late 20s, and many even making it to "staff" (one level above senior at our shop) by 30, or soon after. That's ridiculous. In my mind, most people should be hard pressed to develop the experience to really be "senior" in something before they're in their mid to late 30s.
Now, I certainly don't mind (from the standpoint of prestige and salary) that I somehow ended up with the title of "principal software engineer" (one level above "staff") when I was 33, but... c'mon. When you've nearly tapped out your career ladder by the time you're 35 (unless you move to management), it feels like there's something not right there.
The truth is that these are all meaningless titles once you consider people change jobs. Some people won’t accept ever going to a lesser position and stay at a company(unless forced out by circumstance) but those who switch generally experience some reshuffling in “rank” when they leave.
If you left the company you work for right now(other than to start your own company) you could find yourself as a staff engineer(one level below) somewhere with an accelerated path to the next level maybe, or in an equivalent role, although this is more difficult just because there are fewer positions and more filters to being hired.
I won't try and read into whether or not there's ageism anywhere in the tweet stream, but certainly when talking about hiring the magic words are "find experienced engineers to run it". This is very much legal and ethical in the UK - we're not precluded from setting an experience-based hiring bar. I'm sure if a 25 year old had come along with two browsers under their belt they'd gladly have been hired into a leadership role too.
There is currently ageism within the software industry (esp. startups). Older people (apparently) find it hard to get jobs. Part of the justification for that refusal is that young people will allow death-marches.
His argument assumes you are aware of the youth bias, and is gently pushing against the ageism by pointing out that senior software engineers have a LOT of useful knowledge.
> Part of the justification for that refusal is that young people will allow death-marches.
Where I work the young team are sticking hard to their contracted hours (nothing wrong with that). It's the seniors that pull the extra (but not mad) hours to get shit completed.
I know that this is a real problem, but I also wonder if this perception is also perpetuated by selection bias.
People with established careers in tech often change job through their established networks, and especially when they are highly sought after.
So it may very well be that the strongest senior candidates’ resumes never reach your inbox, while it’s more likely that strong junior candidates have no other option.
Seniority doesn't mean "senior", it's a product of expertise. Obviously there is a strong age correlation because generally going up seniority ladder is going to correlate with time at company, and domain knowledge/expertise is going to be correlated with time spent work in that field.
But I know plenty of people my age (my vintage? :D) with higher and lower seniority, similarly I know people older, and people with more time at the company in the industry with substantially lower seniority, and vice versa.
But also the companies I've worked at (FAANGs, so obviously large) don't treat "seniority" at the IC level as giving some kind of priority over lower seniority ICs. Obviously seniority factors into "how reasonable/accurate is their opinion" but that has never, in my experience, been a blanket override of lower "seniority".
The primary real difference is compensation, which is why companies like to get rid of senior engineers. I assume for a competent company they're doing a trade off "how much do they cost vs. how much value do they add", but obviously where we see this is always poorly managed "get rid of all the expensive people, WCGW" policies.
Maybe the secret is not really about the age or management skills, but rather that Chrome is an insanely profitable product (+ in a monopoly) so the pressure is rather low compared to a startup.
Additionally whether a specific feature is ready or not for a specific cycle is not that important considering that there are releases every 6 weeks and even before for metrics gathering activities.
I disagree, many of us wanted this but none of us had the money to fund it... Google had a special kind of magic at the time... IBM would fund some developers to work on Mozilla... Mozilla had some money to fund development on Mozilla... Apple had a few funds for developers to maintain a browser for Apple... Microsoft was happy to maintain IE... Google was different: it was a place for innovation. It disrupted search, email (Gmail) and mapping (Google Maps)... now to support those 3 products it made sense to fund a better browser... then with the purchase of YouTube... 4 disruptive web-based platforms, it made even more sense to fund a browser. I hear the Chrome team saved YouTube billions in network costs (per year) just by ensuring more adoption of VP8/9... Today we tend to focus more on the ad business, Google's purchase of DoubleClick and the evils of its tracking... but think back to this: Google had at least 5 major disruptive technologies... search, email, mapping, video consumption, and the 5th IMO... enabling countless businesses to build successful web applications because of the development of a "good" browser.
Chrome never needed to slay itself, because there was no customer expecting delivery. It literally couldn't be late because there was no set schedule. It was done when the engineers finished it.
Like many projects at Google. My experience in general is they don't do schedule-discipline well at all. And management there seems to think throwing ever more headcount at things will make them ship faster (it rarely does).
And there's very little accounting when promised dates are missed. Even by years. I worked on the software for Home Hub, and everything was supposed to get rewritten in Fuchsia, and they promised to be ready in like two quarters, almost immediately after we shipped it. They had unlimited headcount and the blessing of upper management, but failed schedule after schedule with no consequence. It took them another two and a half years.
Wanted or not, nobody (or at least not many people) asked for google to make a browser, and nobody waited for it, which was what the parent comment said.
Also I had a really hard time reading your comment with all the ellipses making it seem like it was just a huge sentence, that might just be me though.
It was also based on existing open source. None of this is to say that Chrome wasn't an amazing accomplishment. I do wonder how much crunch time the Chrome teams face now that Chrome has customers?
Seeing as it's money-losing (like virtually all of non-ad google), it wouldn't have helped anyway. The only value to Chrome for Google is the monopolistic market-distortion through vertical integration.
At time it was a defense against Microsoft embrace-extend-extinguishing the web through their dominance with IE. Remember ActiveX, VBscript in the browser, etc?
The threat of that had already been greatly reduced by the time Chrome came out. Firefox was quite dominant, and IE had become much more standards compliant.
I just checked, and it seems like IE was down to around 60% and Firefox up to a third when Chrome launched (not sure how long before that the project was greenlit). So it probably still played a role.
I've never seen anything good come from intense pressure from above—it barely even changes the timeline. You take the pressure away and you can still do solid work in an orderly fashion.
Ever since "Agile" started taking off around 2010, it has made me very sad that many junior engineers today genuinely believe that somehow no software was ever written correctly without it. They were taught in school that there exists this bogeyman software development methodology called "WATERFALL", where pencil-pushers in a windowless office write requirements which they hand off in a printed binder to the team of engineers in the basement, who are never allowed to talk to the user.
The Agile consultants somehow convinced a large segment of the industry that they discovered and/or invented the notion of working with users, of gathering feedback from them, of checking in with your teammates, etc. And they completely disregard the possibility that maybe --just maybe-- there are some developers who can get a metric shit-ton of work done without someone poking them repeatedly for status.
And they popularized the term "Cowboy Coder" as a reckless developer who does whatever the hell he wants and dares you to mess with him. When in fact, their so-called "cowboys" are simply the best developers in the team, who write great code and don't need a scrum-master to help them plan it. But the Agile methodology resists the notion of some developers simply being awesome at their job -- in Agile, you are good at your job by meeting your "points" for every sprint.
I had this problem too. My younger developers pushed really hard to bring Agile in. One of the supposed benefits was that all developers would be treated equally (today we might say they were fungible).
We gave it a really good shot; we even hired a Certified Scrum Master. But after a while it seemed to me that we were just doing lots of tiny waterfalls. It was nothing for a developer to spend a whole sprint spinning their wheels and not making progress.
Long story short, 2 years later I took over the team, threw it all out, and set up a system based around Kanban and hands-on management. And suddenly we became productive again.
(Not saying Kanban is a solution, just that Agile is not)
You’re right, what I should have said is that Scrum doesn’t work. And I tend to use big-A Agile when talking about the formal processes and little-a agile when talking about the manifesto.
I think the manifesto is great btw. It’s just that the formal “Agile” processes, such as the way Scrum is practiced, don’t implement it.
Interesting, what I like about kanban is that (in my opinion) it explicitly recognizes that work proceeds through specific stages, and therefore tickets go through a mini-waterfall. Waterfall is great with a small enough batch size. In contrast, scrum seems to pretend that design, development, code review, QA, etc, all happen at once throughout the sprint.
We are probably on the same page here but I would describe it differently.
For me, there is a whole separate prioritisation and design process that runs ahead of the Kanban board. So by the time work arrives on the board, we have a pretty good idea of what we need to build.
The difference is that in waterfall, by definition, we would build what we’re told. But implementation of a design should instead be an ongoing conversation/negotiation between the high-level designer and the low-level implementor. And that can result in big changes to the high level design as problems, inefficiencies or new ideas come up. That is the “hands on” management I mentioned.
My way of working is heavily influenced by DDD. Design can only get you so far, and any process that imposes a top-down structure is IMO likely to fail.
Not in our case. We’re breaking work into small stories that are self-contained improvements that can (usually) move into production independently of anything else. Typically an individual or team can complete more than one in a two week period. But either way, we won’t break them any smaller than that atomic “completeness” to fit into an arbitrary time interval. The sprint is just for automatic estimating of when we’ll get to a story, how often to demo progress to interested parties, etc.
In that case, we might only be differing on terminology. In my understanding, scrum's defining feature was the fixed time box. What you described sounds like kanban to me!
I would say we are using a mix of kanban tools and the scrum process to practice agile software development philosophy. We aren’t really using the kanban process since teams shepherd their stories to completion regardless of WIP at any step. And user research is more based on the continuously evolving end goal than what is being pulled in by openings in the kanban board. But yes, we are not trying to define two weeks of work up front, just noting what gets done every two weeks.
Where I work we don't do any "agile" and I always have some fun explaining that to would-be hires. I think the overall reaction is generally positive.
Lots of awesome software was written without:
- Unit tests
- TDD
- Agile
- CI/CD
(to pick a few random practices) ... or, in Agile terminology, "there is some value in that set of practices".
The sad thing about Agile is that it reflects a certain naivety on the part of its founders: that somehow we can encourage better software (and arguably corporate) practices through a set of principles (which aren't that bad). It's literally a case of the road to hell being paved with good intentions.
Unit tests allow complex and nontrivial code to be modified without introducing nasty bugs. A lot of software that I've seen that has no unit tests and runs well got there because people slaved away cleaning up nasty bugs that would have been caught with unit tests. I cannot understand how people think unit tests aren't worth doing and I can only assume these people don't know how to prove their code works, which is quite worrying.
Unit tests are overrated. They are for algorithms and really complex functions.
Unit testing with 100% coverage is a red flag for me: someone spent insane amounts of time writing tests for useless shit.
Integration tests are the ones that determine whether stuff actually works, have more of them. There's no point in having a test to see if pushing a button produces the correct event if there is no integration test to see if pushing the button does what it's supposed to do in the backend.
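A toy sketch of the distinction being drawn here (the classes and names are hypothetical, invented for illustration): the unit test only checks that clicking the button produces the right event, while the integration test checks that clicking actually changes the backend state.

```python
# Toy illustration of unit tests vs. integration tests.
# Backend and Button are hypothetical classes invented for this example.

class Backend:
    def __init__(self):
        self.saved_documents = []

    def handle(self, event):
        if event == "save_clicked":
            self.saved_documents.append("document")

class Button:
    def __init__(self, backend):
        self.backend = backend

    def click(self):
        event = "save_clicked"
        self.backend.handle(event)
        return event

def test_click_emits_correct_event():
    # Unit-level check: the button produces the expected event.
    assert Button(Backend()).click() == "save_clicked"

def test_click_actually_saves():
    # Integration-level check: clicking changes what the backend stores.
    backend = Backend()
    Button(backend).click()
    assert backend.saved_documents == ["document"]
```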
Just like Agile, unit tests came about as a fashion. I think the history is a bit murky, but Kent Beck was one of the proponents, and he also happens to be one of the signatories of the Agile Manifesto. Let's say circa 1997.
There was plenty of very well written software in existence at that point. Large applications, operating systems, databases, games, embedded applications etc. etc.
That's not to say there's anything bad about unit tests just like there's nothing bad about the Agile principles. They don't prove your code works though, that's not really possible. A useful tool- sure.
There are bad things about unit tests compared to any other way of getting test coverage. You have to make your architecture more complex to allow them - people call this "testable" and pretend dependency injection and other weird ways of avoiding straightforward code are good. And they rarely fail, yet still have to be maintained.
It might be good to write them as you’re bringing up new code, but deleting the useless ones after that could be an improvement.
Honestly- subjectively. Does the software crash often, do I see weird behaviour as a user, what's the number of users, bug report rates. I think those systems do compare in complexity to at least pieces of todays software. A lot of this older code is still powering things today/morphed into newer/larger systems.
I've worked in more recent times on software that was used by millions of people, was extremely reliable, and didn't have a lot of unit tests. We did have other forms of automated testing though.
Interesting to think that agile will somehow lead to better software. I like agile in general, and I think that rather than being a superior way to develop software, it better accommodates the realities of many software projects. Websites and web applications are fast-moving targets, with marketing campaigns, new features, and the need to react to outside change - I think agile suits this environment.
The actual process doesn't matter that much. It can be waterfall, kanban or agile.
As long as there IS a process.
Agile is a way of protecting the coders from ad-hoc spec changes, because that's against The Process. The Process is a thing that management and stakeholders understand.
If there is no process, they can just turn up at Cowboy 1's desk and request/demand/suggest their latest invention and expect it to be done.
> The Agile consultants somehow convinced a large segment of the industry that they discovered and/or invented the notion of working with users, of gathering feedback from them
even worse, now the specialist Agile consultants have themselves been replaced by McKinsey/PWC garbage
you can imagine how well one of these multi-million dollar "Agile Transformations" go
> They were taught in school that there exists this bogeyman software development methodology called "WATERFALL" where pencil-pushers in a windowless office write requirements which they hand off in a printed binder to the team of engineers in the basement, who is not allowed to ever talk to the user.
Having worked in government and big-co companies before: sadly, this is not a bogeyman trope, but reality. Including the printed binder, although it's called "spec sheet" or "tender document" (or whatever the correct english words for "Ausschreibungsunterlagen" and "Lastenheft" are).
The amount of "silos" and "leadership" involving themselves in petty fiefdom fights is astonishing - that is partially a reason why small startups are so much more efficient, they haven't had the time to develop layers and layers of middle management wanting to justify their existence, protecting budgets or establishing their authority. Government projects tend to be the worst target for such micromanager wannabe-king types, given that they can rarely be fired from their jobs for incompetence.
"You guys forked webkit which forked khtml, so you all had a nice leg up no?"
says:
"Yes. Just like IE started from Mosaic Spyglass. But a rendering engine (like WebKit/Spyglass) is not a browser. Certainly not a multi process, sandboxed browser. Chrome v1 was a 200 person year effort."
but, come on, much work was already done and they seem not to remember this.
Nowhere in that thread is he suggesting they weren't standing on the shoulders of those who came before, but he's also right that a rendering engine is not a browser, and at the time Chrome was revolutionary in a number of ways. It was crazy fast, even compared to other KHTML browsers. Its multi-process architecture meant that, for the first time, a single page or tab couldn't take down your whole browser. And yes, that was something I experienced frequently at the time. 200 person-years to deliver a full browser that was a leap forward compared to the rest of the browsers is seriously impressive. His point stands: they did it without a death march.
Also, Chrome was using WebKit long before they forked it. IIRC, for several years they used the exact same engine used by Safari, and both Apple and Google were contributing to it.
Sprints are to knock down the high achievers and provide an opportunity for substandard developers to all appear as if there is progress. A place to hide, if you will.
All of the high functioning teams I’ve worked on didn’t have any kind of agile structure.
Agile can be done well, but more often than not it isn’t.
I mean, that's true to an extent, but there are large volumes of software that have no use for high achievers and are better off with process consistency and predictability.
The development of something like Chrome isn't one of those projects, but, say, a banking website or an inventory management system could be.