* If you're at home, turn around and look at the bookshelf you've already accumulated. Did you really read all those books you so looked forward to when you first bought them? Or do you remember all the best bits from your favorite ones? Be honest now... the unbroken spine on your Gödel, Escher, Bach suggests otherwise. Read what you have before stressing over Kay's or others' lists!
* If you're at work, have you read all the wiki/docs/etc. created by your team and neighboring teams? Do you understand the full architecture and implementation of the system you work with every day? Go read that, level up, and become the most knowledgeable person on your team.
Or learn the architecture of git. So many people write bullshit around git because they don't understand that git can already do 100x what their shell script or UI tool or plugin can do...
And one last point about "read all the content your team created": I'm not sure how big your teams are, but in many companies that's impossible. If you spend 12 hours a day reading, 7 days a week, you still can't read all the content created in that week. But you do know that the cool_library that replaces the AwesomeLibrary is just as bad at solving the problem, and that neither of them actually even knows what the problem is.
Wouldn't that apply to git? You can probably get 90-95% of the benefit of git by using mercurial, with a fraction of the learning effort. Life is too short to spend learning the internals of one version control system.
This idea of "Don't waste time on X focus on more valuable Y" has no real end. Don't waste your non-work time on learning computing related stuff and instead spend time on social connections and physical fitness first. Pays off way more than emacs, git, vim, lisp, or programming paradigms. See how easy it is to make such pronouncements?
You absolutely cannot.
Even if we pretend that Mercurial is simply outright better than Git, there's a lot of value in learning Git specifically, as it's what most development teams use. Employers value proficiency in Git. If you mention your Mercurial proficiency in an interview, it's likely to be scored as cute but irrelevant.
If you always work alone, sure, Mercurial might work great for you, but the real value of these version-control systems is in enabling teams to work effectively. Most teams these days use Git, so you need to know Git.
(Of course, if your team does use Mercurial, you'd better become proficient in Mercurial.)
> Life is too short to spend learning the internals of one version control system.
I agree that learning the intimate internals of Git's codebase isn't something that's likely to pay off in the day job, but short of that, Git's 'useful skill-ceiling' is pretty high.
> Don't waste your non-work time on learning computing related stuff and instead spend time on social connections and physical fitness first.
Depends on your working situation. If your work offers no opportunity to learn new skills, and you don't want your CV to get stale, you have little choice.
Given that the person I was responding to was advocating against spending time on things just because employers are looking for them, while advocating things almost no workplace uses, I find your comment strange. Perhaps you should be responding to the comment I'm responding to?
>Depends on your working situation. If your work offers no opportunity to learn new skills, and you don't want your CV to get stale, you have little choice.
You do have a choice, and you made your choice. In the US, in this era, SW professionals are near the top when it comes to freedom to change jobs and change locations. When I know several people who make half of what I do, are relatively unskilled, have to work much longer hours than I do, and made a very clear choice in favor of physical fitness and social relations, I am not going to claim I don't have a choice.
Having said that, this is all orthogonal to my point, which was how easy it is to make (reasonable) lists of things one should focus on that are almost mutually exclusive.
Yes, Mercurial is easier to use than git, and really Fossil is even better for me, someone who works solo. However, git is what I use. I know that if I am going to help someone with source code control (like my daughter, who is studying CS), git is what they will need help with. Personal use of git is, to me, the only way to ensure that I can give pertinent help.
I haven't used mercurial much, but it seems like you would have to understand the same amount about the core data structures to do something in mercurial as with git. The really simple activities you do day to day don't require any deep knowledge. And when you want to do more complicated things, you have to know what you are doing.
It's similar to writing simple C++ programs vs complex ones. You may not need to understand smart pointers to write small programs, but understanding smart pointers, or at least the concepts behind them, is critical to writing large programs.
There are minor differences between the two that probably make git harder to pick up, among them the staging area and a rather obtuse UI. However, the market effects of git are hard to ignore.
Git is a DAG navigation tool that can diff text, and the API shows it.
Mercurial is a version control system that happens to abstract over a DAG. The API, similarly, shows it.
That seems like a relatively accurate characterization.
> Mercurial is a version control system that happens to abstract over a DAG. The API, similarly, shows it.
I am unfamiliar with the details of mercurial. Could you explain or link to something detailing that point?
That doesn't mean either of the other multipliers isn't a multiplier, though. They are just more topic-focussed instead of generally applicable.
And no, mercurial can't do more than 5% of what git can. That's just a fact. Just like you probably wouldn't waste time explaining to a flat-earther why the earth is round, I won't explain this to you. If you are smart (and I assume so from your comment) you will go out and fact-check it yourself, starting from the assumption that you just might be mistaken and that this random dude on the net might be correct.
Hope I'm right in that assumption. Then you'll have an awesome time ahead of you, full of WOWs. Enjoy it.
And if not, there's not much lost. Some humans will fly to Mars even if some others believe the Earth is a disk.
Personal connections and social engineering are 10 times as important as your actual skills.
It's unfortunate looking at the current state of the world, but that's simply how it is. And I understand some may not agree with it, but please look around yourself and tell me I'm wrong...
Exactly how do I fact-check this? Most Google searches give two types of results: either pro-git, or "meh, they're mostly the same." Virtually every pro-git page has basic facts wrong about mercurial and criticizes the mercurial of a decade or longer ago. (0) The very few exceptions cover use cases that really do fall into the 5% category.
And I assume you meant 95%, not the 5% you wrote. If you really did mean the latter, I suspect you know little of mercurial.
(0) Branches are a classic example. Although I do think git has slightly better branch handling, pretty much every page criticizing mercurial branches exposes its ignorance of branching in mercurial.
After Googling, it's pretty dire actually: https://www.mercurial-scm.org/wiki/ProjectsUsingMercurial Many of these are dead links or links to repos that haven't been updated in years.
I remember my University days: "I don't know anyone else who doesn't use Windows. Why do you use Linux?"
I believed this too, but then Clojure happened. It's not my ideal Lisp, but I can get paid to write it at a mainstream company.
Hickey’s choice of hosting it on the JVM was ingenious; it gives access to one of the largest, most battle-tested ecosystems in modern-day enterprise engineering.
There are also rock stars, but that doesn't mean we should give advice as though anyone could become a rock star. Those who will become rock stars will become one anyway, even if you tell them they won't.
We live in a hype-oriented industry, where momentum and programmer enthusiasm (free man-hours of work), not technical merits, predict whether a given technology will thrive.
In some circles, these are equally as 'hip.' Idea: learn the tools you need to get the job done. Study theory to supplement your skills.
This jars. Why not recommend really learning Cocoa and AppleScript or VB for Applications? Or really dig into Smalltalk's code browser? vi and emacs have a long track record, and I still use both because I sank the time in to learn them years ago, but I wouldn't recommend them as fundamentally changing someone's view of computing.
I've read most of it, some parts several times, over the years.
But my spine is unbroken. Here's how:
Open the book about 20 pages in and run your finger firmly down the gutter. Flip 20 more pages and repeat until you get to the center of the book. Next, start at the back and repeat, moving towards the center. Then repeat the whole exercise.
While you are reading, occasionally run your finger down the gutter. If you do this, you'll never break a spine, and those massive paperbacks will lie open on the table.
An iPhone makes a good book weight, almost as good as Levenger's.
I am sure there are various other books like those that truly are impactful, and it doesn't hurt to look up recommendations for those from people who are experts / polymaths.
The case against wikis (with the exception of design docs) and blog posts is that they are erratic, restricted, unedited, stale, and low on pedagogical focus. Some of these books are meticulously structured; there's no comparison.
Although I have some duds, the hundreds of pages of wiki contributions I've made are nearly all designed to be what I wished existed when I was learning said details. Sadly, they only help to a certain degree. The individual really needs to want to learn to get any real lasting benefit.
This isn't what 'pedagogical focus' means though. Just writing down the stuff that helped you out very rarely covers what is necessary to help people out in general.
A pedagogically focused approach would also cover the assumptions you had going in and provide clear references to expected prerequisites. It should also cover the "why" for a specific approach if there are multiple approaches, which is very frequently left out of internal documentation wikis.
What I want from a book is to learn new concepts. For example, that functions and data are the same thing. Or that recursion and a loop plus a stack are isomorphic. Or that unit tests verify only one test case of your program, while proof-based systems like type checking can verify an entire domain of your program.
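To make the recursion/stack point concrete, here's a minimal Python sketch (the nested-tuple tree shape and function names are mine, purely for illustration): the same tree sum written both ways, with the explicit stack playing the role of the call stack.

    def tree_sum_recursive(node):
        if node is None:
            return 0
        value, left, right = node
        return value + tree_sum_recursive(left) + tree_sum_recursive(right)

    def tree_sum_iterative(node):
        total, stack = 0, [node]      # the list is the explicit stack
        while stack:
            n = stack.pop()
            if n is None:
                continue
            value, left, right = n
            total += value
            stack.extend((left, right))
        return total

    tree = (1, (2, None, None), (3, (4, None, None), None))
    assert tree_sum_recursive(tree) == tree_sum_iterative(tree) == 10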
Lisp is such an open mental playground when it comes to programming.
Versus the languages that consist of the exact same ideas with a slightly different syntax.
Right now I'm on a different route that I think is much harder: Haskell, type checking, category theory, Idris, and dependent typing.
Still interesting, and a lot to learn, but it took significantly more effort to see any kind of reward. And the total reward is definitely less than that of Lisp (from what I've encountered).
Read what you like. You're describing someone who's clearly been 'stressing lists' so much they've got books they're not interested in and are never going to read. If you've made the mistake of accumulating a bunch of books as props, well, it happens - there's no need to compound this further by guilt-tripping yourself to read them before reading something you might actually want to read.
I'm also describing that most of us (I posit) already have a great collection of great books, which aren't fully tapped. Honestly, even re-reading your college textbooks will up your level significantly. You don't usually need to seek out the "best of the best" book, unless you're really going deep or want to learn a specific topic in a very particular way.
To be entirely honest, the list is actually too narrowly focussed to support my larger point: namely, that lifting your view to the horizon, by sampling from the best that fields as far away from yours as possible have to offer, is about feeding the soul and not just the machine.
Anyway, I've only read about half the books on my bookshelf (but including GEB) and that's actually how I like it. After all, books you haven't read are worth far more to you than those you already know.
Well, in my defense, it's pretty terribly written...
The Amazon wishlist has honestly helped me a lot here; I can also annotate and sort it in case I come across a book again.
Plus it helps people buy gifts for me on holidays really easily.
Sorry for turning this into an Amazon ad but it's a great feature
Note that I'm not endorsing using this for "flipping" books, quite the opposite. I've been screwed over so many times in my search for old out-of-print technical books, that it's a bit of a sore spot.
I find it amusing that this book is chosen as an example of one people don't really read, and not, say, Penrose's The Road To Reality.
I still feel guilty about it. The book sits in my bookshelf, spine unbroken. My mom recently passed away. I'm sorry, mom.
Both are excellent reads, but I'd definitely agree IaaSL is far more approachable and a better use of your time.
I Am a Strange Loop spends an entire chapter on Hofstadter informing us how he is better than everyone else because he hears Bach better than everyone else. And this attitude fills the book; I found it very obnoxious and am not sure why no one else mentions it.
There are at least two places in the book where he talks about an interesting point, then realizes it is a rehash of something from a previous book. And aside from the self-congratulations (did you know being a vegetarian makes you more of a person?), anything interesting in this book was done better in a previous book.
I would recommend The Mind's I instead. It's a collection of stories and essays from most of his influences, see what Turing, Lucas, et al. actually said.
I still enjoyed it. It's not for everyone, but I found that his meanderings served to break up the seriousness of the subject in a way that made for an easy read, much easier than GEB.
I have read GEB, and it had some nice ideas, but I can't say I was motivated to reread it; it felt overly self-indulgent, somewhat in the manner of a friend telling you how their RPG campaign played out without realising that half the interest lay in having been there.
And a cooler cover!
Disclosure: My (unfinished) copy of GEB was given to a friend as a gift, and my (unfinished) copy of IJ taunts me as I type.
But there is a special place in my heart for certain folks...Alan Kay is one of em. Time to move some of that dust on my shelf around.
So accurate as to be creepy. There's a shelf full of unread technical books right behind me. I will save this link for later.
The act of figuring out what's next and buying the materials feels good without any commitment.
This hits close to home!
Ouch. That hurt. I literally turned around and saw this book in pristine condition.
Also, you can unload poor books that are taking up space now!
Have also read Lisp 1.5 but don't have a physical copy.
Like, SSA form is barely covered in that book.
Alan Kay is a big fan of his.
The depressing thing about reading lists is that it's hard to get through all of them. Many of the books listed (SICP) take a long time to wade through, read, and program the examples from. They are not "light reading".
(Not that there's anything wrong with those chapters. If someone happens to like them, great. It's your time. Read the book the way that works best for you.)
That gist is copied from this page on Victor's site: http://worrydream.com/#!/Links
It was discussed on HN in 2014: https://news.ycombinator.com/item?id=7578795
The constant recommendation of successful scientists is to "Go to the masters, not the commentators." It is the master who, by definition, has the right style, and often the commentators give the results without the essence—style!
— Richard Hamming, Methods of Mathematics Applied to Calculus, Probability, and Statistics, Prologue
These quotes about Gregor Kiczales and AspectJ https://en.wikipedia.org/wiki/AspectJ are a good intro to MOP:
"In Lisp, if you want to do aspect-oriented programming, you just do a
bunch of macros and you're there. In Java, you have to get Gregor
Kiczales to go out and start a new company, taking months and years
and try to get that to work. Lisp still has the advantage there, it's
just a question of people wanting that." -- Peter Norvig
"I am reminded of Gregor Kiczales at ILC 2003 displaying some AspectJ to a silent crowd, pausing, then plaintively adding, "When I show that to Java programmers they stand up and cheer." -- Kenny Tilton
Seems simple when you first slap an annotation onto a class or method. But God help you when it stops working and you actually have to debug what it's doing. Figuring out what code is actually executing and what it's doing seems nigh impossible.
(Which is all to say, Lisp is still far superior, because there is a straightforward process for figuring out what kind of code a macro will generate, or to run a macro and look at its output. Macros can become complex and convoluted, but that's still nothing to the mess that annotations in modern Java frameworks create.)
There is no way to reason about the code without actually running it. You can not read the code and understand what it's going to do.
The majority of programming books are just ephemera and arcana and details that will be irrelevant in a year, or next month when the new version of the framework comes out.
Kay points to books, like the original Lisp Programming Manual, that will help you understand deep core concepts about computing itself, that will remain applicable no matter what framework or library you need to use tomorrow.
Take an Alan Kay, a McCarthy, Norvig, Abelson, Sussman, Armstrong, Steele, etc. from their prime and drop them into a software company where they have zero familiarity with the programming languages or tools currently being used, and within days or weeks they will be the most productive developer at that company by far. They will come up with simple, elegant, high performance and correct solutions to problems none of the other developers would have even considered.
Those are the kinds of thinkers you want to emulate, if you really want to write excellent software solving real problems in the shortest amount of time.
If you want to attract those kinds of people and have them do their best work, then part of the “compelling work situation” is to let them pursue their own problems.
“I don’t run CDG, I visit it. [Xerox PARC founder Robert] Taylor didn’t want to hire anyone who needed to be managed. That’s not the way it works. I have people on my list who are already moving in great directions, according to their own inner visions. I didn’t have to explain to these people what they would be working on, because they already are. Bret Victor has already hired four people that I didn’t know about. I wanted people to fund, not manage.”
This should be on the wall of every developer's workplace.
Personally, I like this one.
Ephemera and arcana I learned a decade ago still serve me well today. Even when something is outdated, people mostly either reinvent stuff, so I can recognise new things as variations on existing ones, or build on previous solutions, so knowing the details and circumstances that led to those developments lets me understand the new things better as well (it puts them in context).
I think the usual back and forth about developer interviews may have this thought as an underlying assumption; specifically, that people who can implement a red-black tree on the whiteboard at the drop of a hat must be filled to the brim with useful but potentially obscure knowledge. (Obscure to those who haven't majored in CS or a related field.)
It would be interesting to submit an Ask HN: "What is your 'so there I was, doing x, y, z, when being able to answer obscure interview question seventeen saved the day' story?"
(That's not well worded but you get the idea).
Lastly, having breadth of knowledge means you have learned how to learn efficiently. This is a significant force multiplier.
An old-style hacker reads man pages, RFCs, specs, and programming books not to immediately know how to solve problems at hand, but to know what solutions are possible for future problems. It isn’t useful to know that the HTTP spec doesn’t specify a max header length or that the default setting on Apache is to only accept 8kb of headers, until it is suddenly extremely useful to know both things at once :)
* some would say “the other 90%”
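As a rough sketch of what knowing both at once lets you check (Python; example.com is a placeholder host, and the 400 is only what you'd expect from a server still on Apache's stock 8190-byte LimitRequestFieldSize):

    import http.client

    # One request header just over the 8 KB default limit.
    conn = http.client.HTTPConnection("example.com")
    conn.request("GET", "/", headers={"X-Padding": "a" * 9000})
    resp = conn.getresponse()
    # Against Apache defaults this comes back 400 Bad Request,
    # even though the HTTP spec itself sets no ceiling on header size.
    print(resp.status, resp.reason)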
Good luck with that, especially, but not only, with git man pages.
I tend to avoid reading these “just to know”, and read them only when I need something specific. They are mostly torture. They could have been more useful to read, but presently they aren’t.
"git checkout --detach [<branch>]
git checkout [--detach] <commit>
Prepare to work on top of <commit>, by detaching HEAD at it (see "DETACHED HEAD" section), and updating the index and the files in the working tree. Local modifications to the files in the working tree are kept, so that the resulting working tree will be the state recorded in the commit plus the local modifications.
When the <commit> argument is a branch name, the --detach option can be used to detach HEAD at the tip of the branch (git checkout <branch> would check out that branch without detaching HEAD).
Omitting <branch> detaches HEAD at the tip of the current branch.
git checkout [<tree-ish>] [--] <pathspec>…
Overwrite paths in the working tree by replacing with the contents in the index or in the <tree-ish> (most often a commit). When a <tree-ish> is given, the paths that match the <pathspec> are updated both in the index and in the working tree.
The index may contain unmerged entries because of a previous failed merge. By default, if you try to check out such an entry from the index, the checkout operation will fail and nothing will be checked out. Using -f will ignore these unmerged entries. The contents from a specific side of the merge can be checked out of the index by using --ours or --theirs. With -m, changes made to the working tree file can be discarded to re-create the original conflicted merge result."
Well yes of course.
I am looking to spend another 4 months learning ML, which I likely won't apply in any way.
That's a year down the drain with ONE cloud provider and ONE way of doing ML. It's easy to completely waste your life like this WITHOUT getting better at your job.
Carry-over is far more limited than people make it out to be, unless you REALLY know a lot, but those people are rare.
Being 35, having worked in IT for 15 years, and seeing the rapid acceleration into DevOps/Cloud/nix/Programming/Stacks, I fear for my future. I want to learn a ton of stuff, but the vast amount of stuff I need to learn in order to move up in my salary bracket is stifling: AWS/Azure plus the former I mentioned, then Python, YAML, cloud networking. I'm good at some stuff, but the industry is just moving so ultra-fast now that it's hard to keep up.
I've been contemplating getting out of IT altogether because I'm not fully confident in career growth at this point unless I murder myself with study and ignore my family.
I've been a MS SysAdmin for 15 years, moving into nix devops (the new way of sysadminning) isn't easy.
I think this is completely true as well. To move past senior developer, you fall into one of these camps:
1. Very bright.
2. Spent a LOT of time studying or messing around with the right tech on your own.
3. Sell your soul, i.e. ignore (or not have) family/kids.
A lot of people from camps #1 and #2 don't understand that for most, #3 is the only option (in the short term). There is also the very real tradeoff of not going with #3 and risking declining job prospects/salary.
I think this is doubly painful for devs, because they are generally used to quick career progression / salary bumps, and then it stalls hard at senior dev.
What kind of job would you be aiming for?
I've been a senior developer at the same company for 13 years. I feel that most of the time, I am progressing in my knowledge and experience, so, in my mind, I am making progress. It just doesn't seem that way on my LinkedIn profile.
Deputy Developer? Elder Developer? Doyen of Development? Development nestor of company X? Director of Engineering at Sub-sub-sub-sub-department that happens to be just your team? Level 20 Wizard? Does it even matter if it sounds good on LinkedIn?
Age-ism, of course. You can't be a 50yo senior developer.
The best developer I've ever had the pleasure to work with was a 50-year old senior developer. He cut his teeth doing a lot of C/C++ stuff back in the day, but was also (pretty successfully) leading the company's adoption of Angular. If you have a sharp mind, and you don't get stuck in your ways, then people will be begging for you to be their 50-year old senior developer.
I'm not ignoring the fact that ageism is a real thing (it most definitely is), and it is more difficult for many older programmers to "keep up", but that doesn't mean that no one is doing it.
It's called consulting.
Running contentious meetings and herding directors are difficult skills to even begin to practice on your own time, but navigating family life is probably as close as you can get.
The other thing I wanted to talk about is how I solve technical problems. The first thing I do is get a representation of the problem in my mind, enough that I can see a clear path forward. This leans on my ability to take a 10,000-foot survey of a problem space. My current role deals with microservice architecture. Microservices are right in my wheelhouse due to my better-than-average sysadmin skills.
But I end up having to learn a lot within a short amount of time. So in order to cut down on what I call the "sheer mass of information needed for mastery" any time I look at a new tool or tech, I make a beeline for the "architecture" or "concepts" page. This is where I work out exactly which concepts and which level of the architecture I'm working at.
I then use the problem statement and vision above to home in on a perfect implementation. Then I look at the actual state of the system and work to bring it more in line with the perfect one.
I recently was tasked with getting one of our guys unstuck. He was having a tricky issue with aquasec that he'd been beating his head against for a week. It took me five minutes to understand the problem; then I went to my desk and spent twenty minutes obtaining a reproduction. I didn't want to redo his work, so I then asked him what happened when he did X and Y. From his answers I had a clear path to being able to demonstrate that it was aquasec throwing a false positive on an npm library, and was already in talks with our devops team about next steps. It took 30 minutes for me to move his issue forward.
I feel like this manner of solving issues with techs that you don't necessarily have full understanding of could revolutionize the industry. But I can't really grasp how to teach it. It looks like magic to people when I show it to them; they think it relies on years and years of experience. I mean, it kind of does, but I was able to avoid ever getting fully stuck on problems even as a teenager.
But stretching out the problem space and treating each barrier in turn, diving a little into the complex techs along the way: I don't see a lot of coders doing that. Instead they just kind of muddle around with what they know; believing they need perfect understanding of a tech before being able to solve problems effectively with it seems to be the norm. And we have this tech landscape where years of experience in specific technologies becomes the primary determinant behind how most employers judge candidates.
I think the increasing march of devops and other techs that purport to unite the whole world into one walled, splendid garden will eventually bifurcate the tech world into supermen who know everything, and the underclass who can only work in one garden. If that's not how things already are?
Maybe a secondary school for advanced coding or bringing guilds back.
That's a real problem. I had a few years when I worked on pretty cutting edge stuff so most of my learning was on the job and could be applied quickly. In my current job there is a lot of repetition and stagnation so you have to spend a lot of time outside the job learning stuff which you then never apply. This gets really old after a while.
Interestingly, earlier in my career, most of the developers I worked with were also capable at system administration, and could easily fill that role if needed.
The sysadmins during that time were quite capable, but had no desire, or an admitted lack of ability for a development role.
It is an interesting phenomenon to observe.
Despite its clear fit with the current hotness of immutable systems and functional programming, I don't think I've seen anyone in for-profit industry on a Nix/NixOps stack.
I look at industry as a series of different waves happening where I just need to get on one so that I can find the next. A wave is going to cross the startup world before it moves into established business. If I time it right then I can hang in there for a while. It doesn't help with my anxiety, but I've done this before so I know I can do it again. I suspect I'd find the same pattern in a closer examination of your background.
There are very few people learning all of these technologies because time is required to learn the basics, as well as put it into practice within industry. That's deep learning. A lot of people are lucky to work in places that will put unproven technology into production. The number of places willing to take this risk is increasing because cool technology is a requirement for attracting top talent. If you are in a more risk-averse organization it may seem scary to move to a faster moving, less risk-averse one, but you might find they actually have more leeway for mistakes and learning.
I watch for particular patterns around how a technology is hyped and who is applying the technology. Thoughtworks publish their perspective (https://www.thoughtworks.com/radar) and I try to read this, plus other analyses to understand what direction things are moving in. You have to take a longitudinal view. It's not good enough to just compile your research and make a decision. You need to know how a technology has moved over time. Once you start doing this you start to develop some "spidey" senses when you see something at the top of HN - and you don't have a ramp-up cost each time you have to make a switch.
Other than that, focus on a few fundamentals, including one programming language you know reasonably well. I'd strongly suggest learning Python, with an incremental two-year development plan so you don't half-ass it. A lot of the Linux world is built on stable skills. Rather than focusing on AWS or Azure networking, learn networking and TCP/IP beyond a cursory level. There are a lot of folks building cloud infrastructure badly, slowly, etc. because they don't understand these fundamentals. The references are old and boring, like TCP/IP Illustrated Vol. 1.
nix devops is productizing/consumerizing the old way of nix sysadminning (at least for shops that did things the right way)
Saying that you can't move between cloud providers because you need to learn something new probably implies you don't understand the fundamentals constraining the design of e.g. databases provided by cloud.
IT is a fast-moving field. Solving business problems takes understanding which good ways to do that are currently available, what limitations and dangers each approach brings, how would it interact with other things, etc.
You still need deep knowledge, but the most efficient knowledge is that of key principles, general approaches, and ideas. A specific technology does not matter much, and can be mastered quickly enough, if you are already acquainted with the principles behind it. E.g. learn about FRP, and you will see how React, Elm, and Excel all work along the same lines.
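Here's a toy sketch of that shared idea in Python (the Cell class is invented for illustration, not any real library's API): a value that recomputes its dependents whenever an input changes, the way a spreadsheet cell, an Elm signal, or a React component does.

    class Cell:
        def __init__(self, value=None, formula=None, inputs=()):
            self.formula, self.inputs, self.dependents = formula, inputs, []
            for cell in inputs:
                cell.dependents.append(self)
            self.value = value if formula is None else formula(*(c.value for c in inputs))

        def set(self, value):          # change an input...
            self.value = value
            self._propagate()

        def _propagate(self):          # ...and everything downstream recomputes
            for dep in self.dependents:
                dep.value = dep.formula(*(c.value for c in dep.inputs))
                dep._propagate()

    a, b = Cell(1), Cell(2)
    total = Cell(formula=lambda x, y: x + y, inputs=(a, b))
    a.set(10)
    assert total.value == 12           # updated without being asked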
What you end up with after some time is like a normal distribution: deeper knowledge around some area and various levels of acquaintance with a wide range of other things.
There are still areas where you can polish the mastery of one narrow thing to utter perfection: making pizza or coffee, sports like running or weightlifting, etc. You can keep practicing them as a counterweight to the feeling you get from the view of the sheer, constantly moving IT landscape.
When talking to junior devs I always point out that they should take into account the Lindy effect and invest some of their learning time in things that have been around for at least a century, so they can get enjoyment out of their accomplishments for the rest of their lives.
You don't want to die without making at least ONE perfect pizza.
The latter arguably runs somewhat counter to a lot of MVP, etc. approaches but recognizing the difference is still useful.
You just put into words what’s been bothering me about the way my team works. The “process” people—while well-intentioned—seem to think that by breaking inherently complex tasks up in just the right way, they can make the complexity go away.
Well, no. If that’s really the idea, why are you paying me so much?
It's very useful to have enough exposure to things to be able to tell generally what's going on and to know what you need to look up.
Reading and practicing is key.
Some people are just good at seeing the top view and how all of it will fit together in the most optimal way possible.
I am saying this because I can see abstract patterns from the top and what's wrong with an entire project. But the entire IT industry is looking for people with depth. So sometimes I wonder if there is a need for people like me in IT, but then again, when I see a vast system I can immediately discern what could go wrong and the possible bottlenecks, or come up with a better solution. This gives me a little bit of hope.
He focuses on depth whereas I tend to focus on breadth. Of course I have areas and technologies I know better because I have worked in / with them (computer vision, filesystem / DB replication algorithms, Lua, etc) but I try to expand my areas of knowledge so I can understand whole systems rather than become an expert at a specific thing. For instance, right now, I'm doing mostly Web front-end stuff, which is the part of the Web stack I know the least.
> oh, it's one of those sorts of things, like the xyz I did four years ago.
Of course the specifics are different but the muscle memory can get you through. I'd say the worst bit about it is that you often feel impostor syndrome pretty bad because in many cases you're never 100% sure about stuff like you might be in a speciality.
I do tend to focus more on breadth, but I do go deeper on some subjects. However, I never give up breadth for the sake of going _Jon Skeet on C#_ level of depth. That, for me, is a waste of time. I don't need to know a language or tool quite that deep. I can be very productive with a certain depth without touching bottom.
 Note: I'm not saying Skeet is wasting his time.
I've known people who were, in one example, arguably the world expert in performance on a long ago computer architecture. There came a point where no one cared any longer and I'm not sure to what degree he successfully moved on to other pursuits.
Another example is Y2K. A lot of consultants ended up defined as being Y2K guys and they didn't necessarily successfully transition to something new.
Not arguing that going deep is necessarily wrong but, if you do, you need to keep your eye on emerging areas that could benefit from your existing skills.
They could just be branching into /r/iamverysmart territory and just parroting omgubuntu.
If you try to have a wide breadth and a deep knowledge simultaneously, you will pay a heavy price. You will not have much of a family, you will not have other hobbies, you will not know much beyond CS, you will not even have time to use the vast majority of your knowledge to its full depth. There simply isn’t enough time. You will likely die as you lived, at a keyboard with your head weighing down on keys spamming them infinitely in a code editor, or slumped in a chair or bed with a technical book resting on your chest. Very few people will notice your passing, and the world will be no different for whatever knowledge you gained.
What we should strive for instead is “Just-in-time knowledge”, where the goal is to quickly become an expert on a topic you know nothing about right when you need to be. Many people first learn to do this when they get into debates on the internet, and then extend it into their professional careers.
This simply doesn't happen for anyone.
1: Admittedly, the lottery of birth can make that more difficult.
First, the greatest book of all time, The Autobiography of Benjamin Franklin - an amazingly introspective and insightful look into how to live an examined life and improve oneself.
And then, if you want to learn lower-case "design thinking", my top 10 books:
* Design for Everyday Things - duh. I re-read chunks of it all the time.
* Tufte - hard to pick one, I might actually be iconoclastic and go with Visual Explanations which I think has more to offer programmers over pure data visualization. Again, just grab one every day, flip through 3-4 pages, rinse, repeat.
* User Story Mapping - Extremely memorable book - it gives you a pretty clear field guide on prioritization, empathy, communication ... just a great book.
* Badass by Kathy Sierra - I flip through this book again and again. It is gospel truth about what motivates humans.
* The Field Guide to Human-Centered Design - IDEO's most practical book. (Close second: Designing Interactions.)
* Universal Methods of Design - another deeply practical book, lots of good tips and examples.
* Universal Principles of Design - Sister book to the Universal Methods. Again, straightforward, flip to any page and get an idea when you're brainstorming.
* Thinking in Systems - I recommend you skim this book through, but come back to it a lot, it grows with you.
* Inspired by Marty Cagan - again, love nuts and bolts process books.
* Don't Make Me Think! - still a classic, still see these mistakes being made all the time in modern app dev.
Do you mean "The Design of Everyday Things" by Don Norman? If so, I agree that it is a great book.
Edit: Starting Forth, also by Brodie, would probably be a better “spec” for implementing a Forth.
Makes me curious about SwiftForth. I saw a bunch of other commercially available Forth implementations in the past ( ?), but since Forth looks so niche I never got motivated enough to try it more seriously.
Both Minsky and McCarthy seem like almost mythical figures in the book, and I don't think I could ever hold a candle to them, but the next best thing is probably to understand their thinking. I think it's a bit easy for us to get caught up in the Medium blog posts detailing a small segment of a new framework, when what we really ought to do to grow is go back to the basics and understand them in depth.
I eventually found my way to Minsky's class. It was nominally on one of his past books, but the class sessions often seemed to be him talking about whatever he was thinking about that day, as he worked on his next book.
Minsky's "ten-year grad students" and unofficials were also great. Two of them were especially personable, and would wander around the lab, and strike up impromptu technical conversations with random other enthusiastic students. Which seemed unusual among grad students of my dotcom era, and maybe it was more old-school greatness, like the Levy book.
Regarding multi-layer neural nets, Minsky says they're uninteresting, as they could be declared with enough complexity to reimplement basically any existing logic circuit. What made multi-layer neural nets interesting again was a multi-layer training algorithm.
There's another interesting part: shortly after showing that single-layer neural nets can't implement the XOR function, Minsky shows that all that's required for a single-layer neural net to implement XOR is to add another column to the training set with specific values, effectively encoding the hidden layer back into the training set.
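In modern terms (my own toy Python illustration, with hand-picked weights): add a third column x3 = x1 AND x2 to the data and XOR becomes linearly separable, so a single threshold unit suffices.

    def step(z, threshold=0.5):
        return 1 if z > threshold else 0

    def xor_unit(x1, x2):
        x3 = x1 & x2                  # the extra column: the AND of the inputs,
                                      # i.e. the "hidden unit" precomputed into the data
        return step(1*x1 + 1*x2 - 2*x3)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, "->", xor_unit(x1, x2))   # prints the XOR truth table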
Abstract: Freud's theory of jokes explains how they overcome the mental "censors" that make it hard for us to think "forbidden" thoughts. But his theory did not work so well for humorous nonsense as for other comical subjects. In this essay I argue that the different forms of humor can be seen as much more similar, once we recognize the importance of knowledge about knowledge and, particularly, aspects of thinking concerned with recognizing and suppressing bugs -- ineffective or destructive thought processes. When seen in this light, much humor that at first seems pointless, or mysterious, becomes more understandable.
A gentleman entered a pastry-cook's shop and ordered a cake; but he soon brought it back and asked for a glass of liqueur instead. He drank it and began to leave without having paid. The proprietor detained him. "You've not paid for the liqueur." "But I gave you the cake in exchange for it." "You didn't pay for that either." "But I hadn't eaten it."

--- from Freud (1905).
"Yields truth when appended to its own quotation"
yields truth when appended to its own quotation.
--W. V. Quine
A man at the dinner table dipped his hands in the mayonnaise and then ran them through his hair. When his neighbor looked astonished, the man apologized: "I'm so sorry. I thought it was spinach."
[Note 11] Spinach. A reader mentioned that she heard this joke about broccoli, not mayonnaise. This is funnier, because it transfers a plausible mistake into an implausible context. In Freud's version the mistake is already too silly: one could mistake spinach for broccoli, but not for mayonnaise. I suspect that Freud transposed the wrong absurdity when he determined to tell it himself later on. Indeed, he (p.139) seems particularly annoyed at this joke -- and well he might be if, indeed, he himself damaged it by spoiling the elegance of the frame-shift. I would not mention this were it not for the established tradition of advancing psychiatry by analyzing Freud's own writings.
Found it ironic that in a comment about Lisp, Kay forgot to balance his parens. :)
"Topological features are a lot more fundamental than geometric ones, in that topology is a more basic branch of mathematics than geometry in terms of symmetries and mappings. One thing being inside another is more basic than it being smaller or larger than the other, or than one being a rectangle and the other a circle. Being connected to something is more basic than being green or yellow or being drawn with a thick line or with a thin line." 
More Lisp fun.
This was archived from http://www.sics.se/~joe/thesis/armstrong_thesis_2003.pdf which now returns a 404 Error.
At least two of my items were on Alan Kay's list. The other being "The Mythical Man-Month", especially the edition with the "No Silver Bullet" article, which a depressing number of people in our industry seem not to have read. So I don't feel like I'm too far off the mark.
At VPRI (Kay's research institute) they did this: COLA. (See also STEPS.)
http://www.vpri.org/pdf/tr2012001_steps.pdf "STEPS Toward the Reinvention of Programming, 2012 Final Report"
Alan Kay is a big fan of Engelbart, and I'm surprised it wasn't listed in his answer. Also, for anyone who's interested, a Windows clone of NLS is available here: http://www.ndma.com/resources/ndm8543.htm
Minus the "journal", many of the multi-user capabilities, and the "compiler compiler" programming system of NLS. still, it's interesting to play with.
Alan Kay recommends some books; the Hacker News crowd spends most of the thread recommending books they like instead.
Anyway, this is a nice list filled with works I've not read, so I'll make certain to give them some attention next time I'm at a book store and ask them to order some things, since they only carry magazines and other drivel in-store.
Sounds like the MIT SICP course from the 1980s using scheme!
For those who are interested, I made the neural net system into a simulator:
If that is too new, then, for pete's sake, at least study 1986 Lisp; no need to go back to 1962.
Books from the mid to late 80's, like Wilensky's Common LISPCraft are decently useful.
Just like I wouldn't tell someone to read the 1978 first edition of the Kernighan and Ritchie C book.
Assembly Language Step-by-Step
Programming with DOS and Linux
Copyright © 2000 by Jeff Duntemann
Rev. ed. of: Assembly language, © 1992
However, people like to claim that what I said above is reading a book.
I met him at a conference 10 years ago - I basically blew off the whole conference to sit at his feet in the lobby while he told stories and shared pearls. There were 3-4 of us, probably 30-40 years younger than him, and we were all nobodies, but he didn't care. One of my favorite experiences ever.
This implies that he also blew off the whole conference. I wonder why he did that.
> There were 3-4 of us, probably 30-40 years younger than him, and we were all nobodies
I would hope that this isn’t the reason. Being famous and “holding court” like this is probably addictive.