When I was working as a junior doctor, I had colleagues who "saw" 10 patients in the space I saw 3.
Whenever I was on call at night, I could spot the patients who were "seen" by those colleagues, because it was always the patients who crashed or needed their meds re-writing to remove the allergy risks or needed to have their bloods ordered and taken, or even had to be re-clerked from scratch because the history and exam were utter bullshit, not to mention dangerously so. (I once re-clerked an old lady who was admitted for confusion affecting her ability to communicate and was about to receive all sorts of scans and radiation. When I talked to her, it turned out she had just lost her hearing aid and couldn't hear anything.)
Still, it was always the "tenners" who were praised.
(PS. I no longer work as a doctor. Fuck that shit.)
(PS2. Obviously this isn't unique to medicine. I have the same problems inheriting crappy untested code now, haha.)
Next appt, mum goes with her and lo and behold, she doesn't have dementia, she needs a new hearing aid.
(deafness runs in my family, if you're interested: https://medium.com/@veb/my-journey-to-a-cochlear-implant-8ce... - had to remove part 2 because lawyers...)
That shows why "we are rolling downhill": loss of quality as a value ("decadence"), loss of purpose (what did those colleagues think they were there for?), loss of good sense, loss of perception.
I have already told on HN the story of an evaluation in which the subject was criticized for not responding as fast as a colleague - the colleague who gave wrong answers.
I think I have also read on these pages our moderator dang noting the qualitative difference that generally separates hasty posts from considered ones. It's not that the concept of "reflection" is gone; it is that it has generally declined. Surely, over ten years of "bad school" social networks helped with that, plus economic and sociopolitical factors promoting the use of cheaper personnel under looser monitoring.
Edit: plus, overly literal-minded people dealing with overly sloppy compilers of (Key) Performance Indicators...
Hence if your side of the story is out at the same time as the news of the event, you get a good chance of shaping people's perception of the event.
If you wait a week, a day, or even an hour, it may be too late.
I am not sure whether that is good, bad, or something else. Maybe it does not matter to you that millions hear one side, as long as the "right" people hear your more considered side. (Perhaps we call that lobbying.)
Speed has meant it's harder to get a considered reaction. But then again, by middle age most things in the world are either things we have seen before, so we have an instant reaction, or frankly are not worth reacting to.
> I don't think there is something inherently bad about social media
That may well be your opinion, but I hope you have not taken it to be the position of the parent post, which wrote «over ten years of "bad school" social networks helped [the decline ... of the concept of "reflection"]». It meant that people have been encouraged to "respond fast", which boosts the diffusion of low quality: lack of depth, cheap processing...
> the first time people hear is when people shape their perception of the event
And due reflection - the point here - exists precisely so that new pieces of information entering the Weltaufbau are properly criticized before getting entrenched.
Why should «speed [in the time from an event to when people hear about it]» have «meant it's harder to get a considered reaction»? One's reaction to information is not really a function of the time elapsed between the event and the information. The problem noted is with those ethological drives working against a «considered reaction».
I assume by emphasising "bad school" you mean that the parent was also saying social media is not inherently bad, but the way it was implemented was?
I think that we could design (regulate?) better social media. I have high hopes that something similar to "this looks like anti-vaxxer posting; here are some links to the scientific consensus on the issue" would help move us from Facebook to Wikipedia. But I may be blinded by my own privilege.
I will rephrase a core concept more plainly: people are supposed to be trained to be "reflective": whatever input they get (intellectual, emotional, informational, ideal - external or internal, etc.), it should be put in a buffer labelled "tentative, provisional, waiting for further validation, digestion, development". Whatever incites shallowness - cheap, immature thought and reaction - works against civilization.
> social media not inherently bad, but the way it was implemented was
Yes indeed. Some environments clearly "point towards evil".
I was mainly thinking about the incitement to rush, quick consumption, contrived emotional response, etc. But already the policy of one of the biggest boards out there - encouraging upvoting and downvoting according to "feeling" - (mis-)educates people to give value to a knee-jerk reaction (which should at most serve as a preliminary orientation), to consolidate it through the psychological trap of a stated position, and to create identities in the worse sense: self-images based on stagnating, undiscussed assumptions.
> design (regulate?)
You need to promote, through function and policy, better examples and work recursively from them (let them lead and criticize). In fact, the very example you provide is all about credibility of the proponents.
I believe the US does not align financial incentives with the quality of care provided.
Healthcare in the US is a runaway scam: https://ourworldindata.org/grapher/life-expectancy-vs-health...
There are many posts online by doctors and pharmacists about the pros and cons of tying their reimbursement to patient outcomes.
The odds are against her earning a similar amount, even per hour, if she switches to programming, which may make sticking with being a doctor the more sensible choice.
In the end it was about prioritising my personal satisfaction and 'happiness' over what a more conservative mind would think 'I should do' (one aspect of which was the financial sense). I am not super rich nor super successful at my choice of profession now (partly because I'm 10 years older than everyone else in my position right now, which can be an issue in itself), but I largely enjoy what I do despite the stressors and pressure that my particular position puts me in, and I'm trying to use my richer experience to 'catch up' career-wise as quickly as I can, so to speak.
It's not a black and white decision of course, and I'd be lying if I said I have zero doubts. There are still moments where I think "what if I hadn't left?". Would I have been happier staying miserable but with lots of money to choose my preferred form of misery? :p
Is it better to be 'poor' (by comparison), due to the cost of the 10-year career lag in the new field, but in a more fulfilling profession? In general, nine times out of ten when I think about it, I'm happy it was the right decision and not one that I regret. But there's still that one in ten which nags at you, especially when you get together with your old colleagues who are driving Ferraris and talking about how delicious that £1,000 bottle of champagne was last week while skiing in the Alps. (I actually despise that lifestyle, but for some reason it still stings when it's rubbed right in your face...)
As for how I made the decision to leave, for me the big moment was when I realised I was bound to rebel sooner or later, because my psychology, sense of satisfaction, social life, and quality of life were all crap. This gives the sunk cost fallacy new context: your options are not just "not leave" vs "leave now". That framing is itself the fallacy of the false dichotomy! Other options also included "endure another 10 years before giving up", "endure another 20 years", "endure another 30", and so on.
So the question got reframed into "leave now and waste 10 years of medicine? Or leave in 10 years' time and waste 20 years of medicine?". Similarly, "If I leave now, do I still have a sporting chance of achieving something meaningful in the new field before I have to retire? How does this change if I delay this start by another 10 years instead?"
It was easy to make the decision to leave right away after that. :)
Also, if OP lives in a low income country, programming gives him better options to sell his skills globally, whereas a doctor is normally constrained to the local market.
I'm curious if you'd be willing to discuss why?
In the end, I think what it boils down to is that I wasn't happy doing it, and I found something else that I felt more passionate about and found a way to do that instead (while retaining some of the medical context, which sets me apart from my peers in the new profession).
"Explode when you push, but slow your overall pace to 20 strokes per minute. Take time to breathe and rest on the way back in."
After finding the new cadence, my pace was 1:55 per 500m. A 30% reduction in stroke rate improved my pace by 8%.
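A quick sanity check on those numbers (a sketch; the old split is not stated in the comment, so it is inferred from the claimed 8% improvement):

```python
def pace_seconds(minutes, seconds):
    """Convert an m:ss split into total seconds."""
    return minutes * 60 + seconds

new = pace_seconds(1, 55)   # new split: 115 s per 500 m
old = new / (1 - 0.08)      # implied old split: 125 s, i.e. about 2:05 per 500 m
saved = (old - new) / old   # fractional time saved per 500 m, recovering the 8%
```

So an 8% improvement onto a 1:55 split implies the earlier pace was roughly 2:05 per 500m, which squares with the "slow down to about 2:00" advice elsewhere in the thread.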
Slow down. A good pace is 2min/500m. Any faster and you will feel spent.
You can do intervals to build endurance: 30s fast work/90-120s slow and repeat 5 or more times. If that’s too much try 15s/60-90s. Build up from there.
If you're feeling particularly motivated - bring a bucket ;)
Sometimes I do a 5k at about 1:55 instead. Between that and weight training, trying to build back my endurance before doing 2k sprints again. Maybe I'll get back on the water next year.
What? Assuming you are talking about running, the current world record for 10K is 26:24 for men and 29:38 for women. That makes you quite a bit faster. Am I missing something?
That goes the other way as well: I've had to cut FAANG hires from startup projects because they couldn't agree on any corners to cut in the name of getting an MVP out quickly. As a senior person you need to be able to make that tradeoff effectively, and figure out what level of quality/testing is appropriate to the business situation. It's a tricky skill to learn, and it's a big reason why a lot of engineers that thrive in startups can't survive the transition to "real business" and vice versa.
Building a weekend prototype? Move fast. Just get it working. Building an MVP? Move fast but keep an eye on which areas will need to be redesigned in 6 months. Pay down tech debt as the code matures and you know which features will stick around. Adding features to software with an established user base? Move slowly and test.
If you can’t adjust your workflow based on the needs of the product, you will only be able to add value in a single niche.
I have never met a single developer who knew how to write correct code.
There is always a bug somewhere.
Someone who's fresh to the field and is eager to take feedback and act on it, great! They'll learn quickly and you can mold them, they'll slurp up all the feedback they can get and end up fantastic after very little time.
But someone who already has 20 years of experience and is bitter that YOU, a young'n with just 10 years under your belt has the AUDACITY to condescend to THEM about how they could improve THEIR PERFECT CODE? You might as well just fire this person unless you're happy with their output, they are going to be locked in their ways and while it's possible to "break" a person like this and get them to bend a bit, it's going to be your full-time job which you'd damn well better have political cover for within the org because it can get nasty.
It's particularly important when you're designing data structure changes more than code. Getting an API or a database table design right first time (or even just 'more right') is worth the time invested.
Better that than doing it twice and having your clients give up and throw it away because they can't use either result.
Most difficult things in life turn out to be balancing acts between two opposing forces, yet junior devs always seem surprised about that fact.
you don’t want to assume anything. Don’t assume. Stop assuming. Stop guessing
I've also had this experience in some consulting gigs, where they asked me to help out with something "because don't you know SSL?". "Sure" I said (coz yeah, I had set up a self-signed cert for my home server once...). All it took to get the developer who had been stuck for a day unstuck (don't remember the specifics) was to have him take a step back and check all the assumptions he had made. I basically just asked him very simple, basic questions about the problem, and whenever he told me that he'd checked something, I made him show me that whatever he claimed was true. Sometimes it was. One time it wasn't, and we found the problem. Once we had rooted out the bad assumption, it was easy to fix the actual problem. More than a day of a highly paid senior developer wasted, and figured out by the "dumb rookie consultant". Sounds a bit like the recent story about "asking dumb questions" ;)
Actually, at this point I’m always pretty well aware that it must be one of my assumptions, it’s just a matter of figuring out what implicit assumptions I’ve made.
Other people are really good for that.
"What you don't know, you don't know - and you can't make it up".
Attributed to former Raytheon vice president Bruce Dunn.
Alas, the fast hacker gets recognized for their hustle, despite them spending twice the effort to get to the same place, and others having to go back and fix/cleanup their mistakes.
It depends as always on the length of the race, though. The fast hacker gets to the first corner first. The 'slow' hacker finishes the first lap first. The engineer might miss the first season and then show up in a fighter jet.
Agile (are we calling it Scrum now or is Scrum something different these days?) is a good methodology for a very specific scenario: A small team working on a tight schedule towards an ill-defined goal for a client who doesn't know what they want. Unfortunately some people (especially those getting paid to preach it) took this new hammer and hammered every nail, screw, bolt and tube of epoxy they could find while trying to convince everyone they met that Hammering Is The Way.
Honestly, I'm not sure Scrum is good for anything. (And well, we did not technically rename "Agile" into "Scrum", we just have a lot of lying propaganda saying that Scrum implements Agile and have the entire population of managers dismissing any other process. So Agile doesn't exist anymore, we successfully killed the idea.)
You have to know at which pace you won't make mistakes and try hard never to exceed that pace, and all will be good. Especially if you work with things where, in the worst case, you will go to prison (or die), while the boss who pressured you into going faster will just lose more time.
There is always a maximum tempo at which things can be done reasonably. Know that tempo, never exceed it and never promise someone to be faster than that.
That being said I am still fast compared to other people.
But in my experience, it's often my job as a developer to decide on the last so many percent of the requirements, and asking other people doesn't help much because they all have their own work to do.
And many of the things left not completely specified are often not that crucial, and can be changed later, so this is how it should be.
I like to go fast and make a test build to show someone else and get feedback, because I know one person can never see the whole picture. Code is a conversation. This is the essence of Agile (not the Agile that paid management consultants teach).
Now picture reworking almost an entire process, or a part of the system that has already been delivered (management jumped up and down congratulating everyone, promotions handed out), and that is highly complex, where new issues are discovered that were never thought of in planning. There is no "fast" in this situation, and it's where most of us live every single day. And they want it all by X date or everything goes tits up (not really, but management's metrics - dates and "done" - cause them to pressure people, and by people I mean junior devs running on the career treadmill at a corp).
Meanwhile there are devs who go fast because they like the immediate praise. They literally print bugs with their keyboard--and they bite hard.
Understanding each requirement fully does not, unfortunately, imply you know everything there is to know about the desires of the customer, but at least you are on much firmer ground than if you only barely know what you are supposed to be making in the first place.
But there is also the "writing a book" model, where a general plot outline is first, but the characters write themselves and often surprise the author.
(Note: The article addresses the issue with incremental development.)
I agree, not necessarily because people have other stuff to do, but rather because that is what programming _is_: you operationalise a vague process into something detailed and clear so a computer understands it.
It's really a spectrum that is never clear cut. The article suggests that one should take time to get close to a clearer picture by thinking, investigating and communicating before writing actual code. I tend to agree.
It’s the developer's discretionary power! Perhaps the colour of the capacitor inside the iPhone... you didn't need to run that past Jobs (back in the day...).
Assumptions go wrong sometimes, but they go right and save so much time so often that you can't just throw them all out.
The real advice is "learn when to question assumptions" but that just takes experience and you can't write blog posts about it.
Or a stitch in time saves nine.
This is definitely true but some companies are set up to reward rush ‘n’ bug fixes and love the heroism of an all hands production issue fixed. So know what you are swimming in too!
Make haste slowly
Starting to think about the requirements of a task, and how to implement it, a few days (or more) before I start work on it, whilst working on something else, lets the problem mature in my mind. I can think about or research the problem in my downtime, and it means I won't rush ahead into a non-optimal solution.
But back to the idea of many applications being emergent and/or not at all (or not entirely) what their creators intended. Maybe there's room for both approaches depending on the situation.
Often a small team doesn't have a dedicated PM to divide and conquer big issues into smaller ones that fit in a sprint timeline. It's the main reason most of my projects fail to deliver on time.
German saying: "Eile mit Weile" (literally "haste with leisure", i.e. make haste slowly).
It seems to be a pattern.