> “Kids are not stupid. They are among the sharpest,
> cleverest, most eagle-eyed creatures on God’s Earth, and
> very little escapes their notice.”
i find myself constantly questioning little details in my codebase like, "why do we thread a request all the way through the business logic?" and i think the reason i catch these things is that my brain hasn't become complacent with this style through 15 years of solving low level problems in C. all sorts of things jump out at me as accidental complexity that the most senior guys don't even notice. i think it's because it didn't used to be important, or maybe it's not important to them because they wrote it over years and are already familiar, but it sure is important when you're trying to scale your team and they're facing a brick wall of code that needs to be understood for every single little defect.
this complacency is why big codebases get so noisy that you can't see the nuances of requirements in a code review. it's not possible to ask high level questions like http://www.dustingetz.com/how-to-read-code without aggressively abstracting away all non-essential complexity. breaking experienced people out of this complacent tolerance for accidental complexity, as we tackle higher and higher complexity, is a really hard problem.
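for what it's worth, a minimal sketch of the "threading a request" smell (all names here are hypothetical, not from any real codebase): the business logic only ever reads two fields, but because it accepts the whole request, every layer and every test becomes coupled to the web framework's request type:

```python
# Hypothetical example of accidental complexity: the whole request object is
# threaded through the business logic even though only two fields are used.

def calculate_discount_coupled(request):
    """Coupled version: every caller (and every test) must construct a full
    request object, and the function's real inputs are hidden."""
    user_id = request.params["user_id"]
    total = float(request.params["total"])
    return total * (0.9 if user_id.startswith("vip") else 1.0)

def calculate_discount(user_id: str, total: float) -> float:
    """Decoupled version: states exactly which two values it depends on."""
    return total * (0.9 if user_id.startswith("vip") else 1.0)
```

the decoupled version can be called and tested without constructing a request at all, which is exactly the kind of difference that's invisible once you've grown used to the coupled style.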
I've also seen youngsters thrash around making a hash of a problem simply because they don't understand the tools available, or they don't have a sense of which problems are worth solving and which aren't, or they don't have a proper sense yet of their own limitations in the face of complexity, or they prematurely abstraculate all over a code base.
Convincing them that this shit is harder than it looks is a really hard problem.
In front of the exact same code, the veteran will bring up issues such as deliverability on schedule, integration, letting the end-user start the program at 4pm and still be on time to pick up the kids from school, and the amount of paperwork and convincing needed so that the people in marketing can understand the new tech well enough to actually sell it to a customer, or so that the people in IT can properly install and maintain it in a production environment without calling him every other night.
The novice will say: "why is there a break in the pattern here?", "this code fails when x is negative", "this monster piece of java code looks like what I saw someone solve with a one-liner in python last week".
Putting them both in front of a screen from time to time might either get you the best of both worlds, or hours of painful & pointless arguments.
Our adult brains have built up a large repertoire of pattern recognition algos that help us skip steps, lead ourselves to conclusions, ultimately interpret our world better and faster. Children see the world on a more visceral level because they don't have that pattern recognition in place yet, so they have to rely more on direct sensory feedback.
I saw this in action the other day in a local coffee shop. In front of me was a mother with a child of about 4, who pointed out to her mother that, high on a shelf above the Italian coffee machine, was a small plastic toy from the children's program "In the Night Garden".
I certainly hadn't noticed it and neither had the mother. Which isn't surprising, because the mother (and I) were in the coffee shop with a head full of other stuff (buying coffee, work thoughts, relationship thoughts, etc.). The four year old was in full observe-the-entire-world mode. Given that he didn't have a purpose (like buying coffee), he was free to look around.
What I've taken out of it is that you can only have one locus of attention at a time. I am a competent writer in my native language, but when I'm programming I can barely produce copy. I'm somewhat better when I'm writing markup, but this is because I've internalized it to the point that I no longer need to think about markup while producing copy.
I used to look up to Alan Cooper (who wrote _The Inmates are Running the Asylum_, where _the inmates_ are developers) but now I just loathe him. Programmers get a lot of unwarranted heat for producing unusable interfaces or producing cryptic error messages, but this is because programming requires a tremendous amount of mental effort, and I very much doubt that anybody is able to switch from computer to human oriented thinking instantly.
To avoid these pitfalls I started practicing waterfall methods: I'd write all my copy, gather all my assets, produce pixel perfect mockups and then get down to programming.
Nowadays I've learned to let go and get things achieved in multiple quick passes: one for design, one for programming, one for copy then iterate and repeat until the product is good enough for release. But it does take discipline, you must completely focus on the task at hand.
I always thought requiring complete focus when programming was something required with all professions, but learning to design and spending a lot of time in Photoshop I've realized that this doesn't seem to be the case (at least with design).
It's not to say that I don't get into the zone when designing, but I've found it's a different style of thinking. The goal of designing is to try a lot of things out, with the end goal of presenting information in an intuitive way, vs. programming, where you need to get something done, and your goal is to beat the machine into submission until it performs the way you want it to.
The good thing about programming, I've found, is that if you give it enough time you can do just about anything, whereas with design, spending longer on a problem doesn't always guarantee you'll find a solution ...
To give Cooper credit, his ideas have moved on quite some way since he wrote that book. See this old comment http://news.ycombinator.com/item?id=1728957 for pointers to some of his more recent thinking.
Also, we all pay attention to different things, and ignore different things. So even though children notice a lot of things we don't, we probably notice a lot of things they don't. In fact, I'd bet that we adults could notice more things than kids, if we tried. The problem is that what we notice is biased by our expectations, which have evolved over the years (for example, we've learned to ignore the zombie arm sticking out of the plant because it isn't important), whereas children look at everything with fresh eyes, which forces them to pay attention, which in turn makes them notice things we've stopped looking for long ago. So in the end, maybe kids do tend to notice things more than us, but it's just because they have yet to learn to close their eyes.
In the beginner's mind there are many possibilities, in the expert's mind there are few.
Of course, if you have the wrong idea about what is important it is very useful to have "beginner's mind".
This can happen when debugging the wrong part of the code, or when an industry is disrupted and the game changes (e.g. from efficiency to development speed, then back to efficiency again).
BTW: Employing Autism sufferers is an example of division of labour, the essence of civilization. Vernor Vinge examines the benefits of Autism with the fictitious "Focus" in A Deepness in the Sky.
Plants vs. Zombies has really really well balanced game play: many game designers, even of quite popular games, are winging the game play, while paying incredible attention to things like sprite animations.
I have personal experience at a game company where the atmosphere rendered on planets took weeks to perfect: implementing the behavior of light through atmosphere so accurately that if you flew so close to the planet that you crossed /into/ the atmosphere, it was still correct. The game, mind you, wasn't the least bit fun.
I also know people (mostly designers, sadly) who agonize over the color choice of icons (or admire the work of specific products), but completely miss that on iOS panels aren't supposed to slide from the right over others, or that highlights are supposed to fade as you go back, not stay lit or fade after click.
As someone who cares a lot about the experience of using a console, Apple's Terminal app is the epitome of "not caring": just changing some defaults would make it considerably more usable, and that's minutes of work. The fact that it fails to correctly handle certain common escape sequences, though, makes it nearly unusable.
Yet people love Apple, and they cite "the attention to detail". These same people tend to go "oh, I didn't notice" when I point out such problems. There are tons of tiny things wrong all over the place, and you get "oh, I didn't notice"... and at the same time they are playing the "did you notice" game about other things, and winning (and they often do win: Apple has such a large profit per user, and always has, that they can afford a lot of attention on otherwise inane things).
This article, sadly, plays into this same flaw. The child notices some things, but is going to be oblivious to others. People don't get worse at detail as they get older: they just focus on different things. They might focus on the amazing detail the author paid to nailing the feeling you have when you find out your father died, as opposed to the grammar.
Some people are also just different from others. I know someone who watches movies seriously and afterwards talks about the buildings. It was a love story, but they paid no heed to that: whether the architecture of the time and location was presented accurately is what they were focused on.
Developers care a lot about things like efficiency, stability, and security (with different balance depending on the person, sometimes lacking entire categories ;P). These are often the kinds of things you don't even notice unless they /don't/ work, and even then, most people will just get lucky.
People, though, then compare the product with carefully designed animation that has a common "corrupts your file" failure case with a carefully engineered one that "looks ugly" and claims the former has this property of attention to detail, and the latter somehow doesn't. :(
Go to a designer, or a film critic, or even a child, and they are not going to be saying "did you see how they were actually running nmap, from a real shell, in order to probe the enemy computers in The Matrix Reloaded?". That is what all the geeks I know paid attention to: every terminal in every movie gets extreme scrutiny.
Even if the child wanted to pay attention to that, they can't: to even know what nmap is required a peculiar technical experience. With age comes an expanded set of things we understand, and an infinitely diverse set of different and very interesting things to care about. This is not a failure of age: at worst it is just a difference.
(This was typed on my iPhone at a party; I was thereby forced to pay less attention than I normally do to high-level editing, and I trusted the iPhone's spelling corrector more than I should: it sometimes corrects words to things that are ludicrous.)
Also, I think your analysis is spot on. Another way to look at this is that perfection often kills products—people stress so much over getting every single detail right that they never ship, or fail to execute on the overall vision due to overreaching.
In the Plants vs. Zombies example, there are actually 24 different zombies. If they needed to create different artwork and game rules for every zombie interaction with the plants, the project could quickly spin out of scope.
Perfectionists can generally find an angle to criticize any product.
From the designers you get things like descriptions of the visual hierarchy, guesses as to what people want the user to do most/least, comments on why fonts were chosen, comments on the grid and vertical rhythm of the page. Questions about why some functionality is there, etc.
From the developers you get lots of comments on the what the features will be doing, pointers to edge cases that the UI doesn't handle, comments on the complexity of implementation, suggestions for features to be added, things that they personally prefer (as opposed to designer comments about what the user might prefer), etc.
They're all very detail oriented - just different details. Giving both groups some idea of, and appreciation for, the other folks' details makes it much simpler for everybody to play nicely together.
Groups, at their best, can see things that individuals don't, but groups are also astonishingly good at creating blindness... The whole "emperor has no clothes phenomenon". Both families and work teams can labor on for years with fundamental misconceptions about what they're up against.
But the next feature they work on instead of tracking down all those little errors... might.
A user can stumble across the most subtle things that take two seconds to fix that otherwise we would have never noticed and we'd keep losing customers because of it.
+1 for attention to detail.
If I had to pick between programmer-written tests and usability tests (if! you don't have to, of course) I'd pick usability tests, because programmatic tests can't tell you if you're building the wrong product.
Digesting long-form prose isn't antithetical to this semi-autistic ability that kids have.
But try to mix it up sometimes.
Also note: if you're about to complain about someone's "wall of text," please refrain. :-)
Thank you everyone for whichever combination of Reading | Upvoting | Commenting you did.
Several of you have made some insightful points about how “attention to detail” can be a double-edged sword.
In my experience, you don’t get to decide to ignore a detail as “insignificant” until you notice it in the first place. I chose my examples for novelty, not value, but I think the point still holds.
Noticing details, of any nature, is a separate event from deciding what to do with them. I was writing more to the former, at least I meant to :)