Reality has a surprising amount of detail (2017) (johnsalvatier.org)
498 points by tosh 11 days ago | 115 comments

This is why I think people should still learn their job through hands-on apprenticeship, as opposed to a lot of book-learning in the classroom. In fact I think people do still learn their jobs that way. It's just delayed by several years of book-learning that has little return on investment. When you finally land your job, your first few months are when you really start learning enough to become useful.

Disclaimer: I am biased. I am biased toward book learning! In school I made A's and loved book learning.

Clarification: I think book learning is useful. It's just that the order is wrong. Nothing I read in a book sticks until I have tried my hand at it for a few weeks. But an experienced programmer (or plumber for that matter) would probably get even better after reading a few good books.

You need both. It's unfortunate that we separate out the two. It wasn't until the second semester of my third year of my accounting degree that we actually had a project where we would get fake checks we had to enter into the books, fill out, and put in a folder called "mailed." This should have been day 1 and we should have learned what it was we were doing as we were doing it. And this should have prepared us for some sort of internship.

On the other hand, I've met coders who somehow have managed to completely miss very basic CS concepts because they learned the entire thing ad hoc as they needed it and they never realized that they needed Big O, or that they needed to learn how modulo works.

For me, I flunked out of math my entire life until I learned coding and was able to find the answers to "what is this used for?" so I could put it into context. Book learning is good, but it's easier (at least for me) if you have a scaffolding of experience to know where to put each new thing you're learning.

Never neglect either!
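The modulo point above is a good example of a concept that's easy to miss when learning ad hoc. A minimal sketch of where it shows up in practice (the ring-buffer and clock examples here are my own illustrations, not from the thread):

```python
# Modulo keeps an ever-growing index inside a fixed range -- a pattern
# behind ring buffers, hash tables, and clock arithmetic.

def ring_buffer_slot(write_count: int, capacity: int) -> int:
    """Map the nth write to a slot in a fixed-size buffer."""
    return write_count % capacity

def clock_add(hour: int, delta: int) -> int:
    """Hour arithmetic on a 12-hour dial (1..12, wrapping)."""
    # Shift to 0-based, wrap with modulo, shift back to 1-based.
    return (hour + delta - 1) % 12 + 1
```

For example, the 17th write to an 8-slot buffer lands in slot 1, and 5 hours after 10 o'clock is 3 o'clock.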

>For me, I flunked out of math my entire life until I learned coding and was able to find the answers to "what is this used for?" so I could put it into context.

When all I used math for was wetlab biology work I found it tedious and annoying. Teaching myself how to code (for data analysis) made me go from disliking maths to loving it. It is easy to see how powerful and beautiful math is now.

This feels like me with category theory and types. Once I learned an ML (OCaml) I understood the power of types and higher level abstractions. I think people also confuse mathematics with arithmetic. I dislike the latter (much better automated than manually done) and love the former.

Ditto. Learning theory without real-life application makes it boring, and people forget. Learning application without theory leads to hacks. Often the real-life application happens in the last year, when it's too late for some. Maybe the interim assignments should include more real-life examples so students could see the applications and be inspired to learn more deeply.

Agreed. A half-and-half course that was half degree, half practical would be pretty ideal. Some of these do exist, but they tend to be small experimental courses that not many people take.

I would: I don't think studying for 3 years upfront makes sense for most people. It would be better to spread learning out more.

I am, sort of, doing an apprenticeship right now. We are building an indoor aquaponics pilot (290 m2 / 3100 sqft facility). Aquaponics is growing fish, bacteria, and vegetables in a recirculating, water-based system.

We are doing this “from scratch” (obviously buying pumps, sensors, tanks etc. off the shelf). At the same time I am learning aquaculture (growing fish) in a formal two-semester course, as well as reading a bunch of books on the subject. We are also planning an insect larvae farm, to use food waste as feed, to partially feed the fish.

It turns out there is a massive amount to learn. I am lucky my colleagues are full of expertise different from mine. They know how to grow plants, build buildings, mechanical engineering, electronics, software, sales; the list goes on. It is somewhat scary how complex everything is, but it is amazing to build the thing with your own hands and see it gradually come to fruition. I am learning so many things that I can’t learn from a book.

We are building this ourselves to learn by doing, as we don’t want to scale up without significant hands on experience.

That sounds incredibly interesting, do you have a site for the project?

Nothing in English yet, as we are so far aiming at local production with local customers. But the site will have at least partial translations. Working on a build log right now.


Would love to learn more about this. Any resources you can recommend?

Depends on which level you start at.

The FAO small scale aquaponics manual http://www.fao.org/in-action/globefish/publications/details-...

The Aquaponics Farmer (some quibbles, don’t sterilise the water with UV constantly) https://www.amazon.com/Aquaponic-Farmer-Complete-Operating-C...

Commercial Aquaponics systems, Wil Leonard (the book and other resources) http://www.aquaponic.com.au/index.htm

Small scale mealworm farming https://www.mealwormfarming.org/

This is mostly the system we have in Switzerland.

Only around 20% of teens go directly from school to a university. The rest complete a three- or four-year apprenticeship, starting at age 15-16. Apprentices are compensated in the range of 500 to 2000 CHF per month.

You can combine the apprenticeship with a secondary education that allows you to go to universities of applied science to study engineering, computer science or whatever it is you are interested in.

So most Swiss with a tertiary education got some practical experience before they started their tertiary education.

> Nothing I read in a book sticks until I have tried my hand at it for a few weeks.

I don't think this is universal, though. Book reading is very natural for me and sticks immediately. In fact, I often find that things I've read today and not understood become magically clear a few days later.

I'm the complete opposite. I'm convinced my brain has some kind of overly aggressive internal garbage collection. Knowledge/tools that I haven't derived utility/benefit from get purged. I will say that something I've book-learned in the past can be re-learned more rapidly if I run into that problem in the future, but it's grossly inefficient and of little benefit.

I have found this to be true, but I think the key to absorbing things is to figure out how to associate it with benefit/utility.

I can see it going both ways, depending on how narrowly you define "new thing."

I have never written a program in Go but could probably pick it up through reading, because I have years of experience with other languages. But I pity the poor programmer trying to set up everything for the first time, alone.

Or my first try at home repair was daunting. But now that I have fixed a few simple things I have a better feel for it and might be okay learning the next thing with a book.

I think this is partly why some people like watching Youtube videos and others just want an article.

Doesn't that mean it doesn't stick immediately?

I've always assumed that's because you're sort of mulling it over in the back of your head.

If I understood it the first time, it usually sticks very well. I'm refraining from saying completely and always because it's obviously not an absolute.

What I meant with that last sentence was that even for material I did not understand well (because it seemed to go over my head), I usually still finish reading the material, trying to absorb a very handwavey gist or general structure of the idea, avoiding the need for closure and leaving large chunks unclear. Then, after several days, I often find that I now almost magically understand it because it just clicked without me even being aware of it.

It is highly dependent on the person (and book) indeed. I always preferred book reading and learning like that. Carefully thinking it through and then working on it. Worked well so far.

But over time (especially reading material from my kids' classes) I see a lot of tiny mistakes, left-out information, or imprecise statements, which makes me wonder whether I just did not see that before or whether the quality of school books has degraded.

I've found I need both myself. I learn best when I read something and then shortly after get hands-on practice using what I just read. I find I'll forget little details of what I read if I don't use it right away, but they'll stick if I do something with it immediately.

I agree, I've come to appreciate the practical aspects too and feel like I'm now using them synergistically. I'd still say theoretical reading is my primary learning mode.

Wow I couldn’t agree more! I found myself bored in school and often would skip class to go to my off campus job to make money. I was poor and wanted the short-sighted immediate success of income.

However, looking back I regret not taking classes that actually interested me. My school (like, I would presume, most) had a rigorous course schedule that one needed to adhere to in order to graduate. I think in taking classes I was actually interested in, I would have done much better overall (in the curriculum classes that did interest me, I got A’s).

Maybe I should have gone to a liberal arts school? Either way I’ve supplemented my lack of A’s with lots of books. It’s made up a lot of the difference so far!

I see this with software development all the time. There are tons of online courses, books, and other self-learning material, but:

1. Completely self-taught people are prone to develop bad habits, or worse, some subtle but fundamental misunderstandings which are hard to get out of them (in my experience).

2. Often just a few months with proper guidance can have a much better effect than a year-plus of self-study or "isolated" experience. The problem most times is not the general ideas but the details, i.e. the situations for which there are no clear rules you can "just" apply. Often making a wrong decision doesn't matter that much and sometimes falls under "programming style", but there are hundreds of such small decisions, and many somewhat suboptimal decisions can easily pile up and cause serious problems.

The best example is KISS (Keep It Simple, Stupid). Seems easy, right? But what does simple mean? Is less code simpler? Are fewer sanity checks, or maybe less validation, simpler (due to less additional complexity)? Are fewer abstractions simpler, even if more abstraction gives you code which is harder to get wrong when writing or changing it, _assuming_ you got the abstraction right? Is it simpler to sometimes just repeat yourself, or should you keep strictly to DRY? Is it simpler to sometimes explicitly write out a type which would be inferred anyway? What about kind-of-unnecessary temporary variables? What if it's like `var valid = ...; if valid { }`? What about splitting functions: is it simpler to have two pseudo-independent sub-functions you will not reuse, or one slightly too large function where everything is in the same place?

The answer is always: It depends on the exact situation.

There is no clear cut definition of simple. The best would probably be based on code readability.

But then is it more readable to have one large function body or to split it into sub-functions which are well named but to which you have to explicitly navigate to "fully" understand all details of that function?

Sure, we do have guidelines, and they let you generally get it right (well, if you use good guidelines; there are many bad ones, too). But there is a limit on how much you can learn from them, and the rest needs practice. Many problems only show up if you do large projects yourself _and care to maintain and reflect on them, and come to the right conclusions unbiased by the programming-language-induced bubble you are in_, which is super time consuming. (So guidance is important, but it's hardly done, as many companies do not plan in any time for it at all.)
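To make one of those "it depends" choices concrete, here is a minimal sketch of the named-temporary-variable question from the KISS list (the `Order` and `ship` names are hypothetical, invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Order:
    total: float
    shipped: bool = False

def ship(order: Order) -> None:
    order.shipped = True

def process_inline(order: Order) -> None:
    # Version 1: the condition is written inline -- fewer lines,
    # but the reader must parse the whole predicate in place.
    if order.total > 0 and not order.shipped:
        ship(order)

def process_named(order: Order) -> None:
    # Version 2: the condition gets a name -- one extra line,
    # but the predicate documents itself.
    ready_to_ship = order.total > 0 and not order.shipped
    if ready_to_ship:
        ship(order)
```

Both versions behave identically; which one is "simpler" depends on how complex the predicate is and who has to read it next, which is exactly the kind of small decision guidelines can't settle for you.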

> 1. Completely self-taught people are prone to develop bad habits, or worse, some subtle but fundamental misunderstandings which are hard to get out of them (in my experience).

I don't find this is particularly true of self-taught people. People who are raised in bad coding environments, like working alone or project-based consulting shops, may have bad habits. Hint: the habits are not taught in school.

Isn't working alone part of being self-taught?

My school tried to teach good habits. Many of the professors are out of touch with what good habits are though, so they usually teach to books of varying quality or practices specific to how they learned a few decades ago.

Yes, but I think there's a stage learning where you learn "this is how I work in a manner that lets me get along with others." There are a lot of ways to gain the knowledge of this stage. One part of it is working on projects that you will maintain for a while--this lets you see the cost of tech debt. Another part of this evolution comes from working with others--you learn where your code falls short, and you learn better conventions.

The right environment helps a programmer evolve through this stage.

In a similar vein, stay at home moms or dads may choose to send their child to daycare a day or two a week to learn how to interact with others. Without these learnings, life, or work, can be more challenging.

Yes, though it might also happen if you are not self-taught, and even if you work in teams: as long as you don't do proper reviews, with time to fix non-critical problems in a PR, it's still like working alone. Which isn't good.

I do count school and university courses as self-taught, just with a sometimes better foundation (but sadly often with a worse one, too).

As long as you don't have someone who checks your code or does pair programming, it's all self-taught, just with different sources of input.

Also, school/uni projects tend not to be that useful either, even if you do pair programming, because the people you pair up with usually have roughly the same experience as you.

Not necessarily. Me and a friend both taught ourselves, and made projects together (games, websites, etc). I guess we partly taught each other, but there was no expert or teacher involved.

Besides, self-taught programmers have to work with others as soon as they get their first job.

I take it you are agreeing with the previous poster that hands-on apprenticeship would be a good alternative to formal study out of books. An apprenticeship (hopefully) implies a good deal of mentorship and guidance.

* * *

Personally I think “schooling” should consist much more of student-directed projects, with scope, difficulty, and autonomy gradually increasing over time.

It’s a real shame that typically the most substantial pieces of work 18-year-olds are expected to complete in school take maybe 10 or 15 hours, and most of the work students do is broken down into atomized and unrelated chunks which individually take <10 minutes with few if any decisions involved.

This is the methodology of a Gap Year[0], ideally filled with some sort of extended internship, which I personally subscribe to even though I didn't do it; I simply wish I had.

Theory is definitely fantastic, but not being exposed to the real world took me longer to come to grips with than it probably should have.

I think book-learning has a great return on investment when applied critically. The problem is when it's applied dogmatically, like it is today. Not everyone needs to go to college, and for-profit schools are scum. Universities can hardly call themselves impartial, but they still have a place and are useful.

[0] https://en.wikipedia.org/wiki/Gap_year

While I agree in theory, you are grossly underestimating the amount of racism and other forms of phobia that still exist among humans. When “apprenticeship” was the only way to learn, there was no opportunity for the unfortunate ones. Schools, granted they do a mediocre job, are at least legislated to be fair, secular, and what have you to those who are otherwise shunned, poor, and unfortunate.

I largely agree that hands on experience is much better than book learning. I hope higher education moves towards this model instead of lectures.

However, in some jobs you do need a basic understanding of terminology and history. But probably 6-12 months of concentrated book learning is more than sufficient rather than 4+ years.

> This is why I think people should still learn their job through hands-on apprenticeship

Isn't that what internships are for?

I notice this effect all the time when cooking, and the maddening part is how hard it can be to even figure out the right details.

Like, the best sausages I ever cooked were at a barbecue in a park, on a random public grill with no particular attention to the heat (I think the sausages caught fire briefly), and yet they were amazing. Juicy and delicious, with just the right amount of char on the outside. Over my home grill, with careful regulation of the grill temperature and a pro-quality instant-read thermometer, I’ve never quite managed to do better than “pretty good.”

(I highly suspect one of the details is “undercook your sausages” which is easier to do when you’re blissfully ignorant of the actual temperature.)

Hm this reminds me of reading Salt Fat Acid Heat, and the buttermilk chicken recipe, which has only 3 ingredients (chicken, buttermilk, salt).

She says you have to orient the chicken a specific way in the oven because the corners of the oven are the hottest and the different parts of the chicken will cook evenly that way (?) Not quite sure since I didn't try it. But my friend made it and it was fantastic.

So the way I look at it (without much experience) is that a thermometer measures one dimension, while a chicken is a 3D thing. And an expert chef noticed that enough to put the detail in her recipe. i.e. "reality is complicated".

Not sure if that's the problem with sausages (since they're not that 3D) but there's a reason that specific foods are cooked in specific ovens. i.e. why do people obsess over pizza ovens and import them from Italy? Why is clay pot cooking different than metal pot cooking, etc.?

Because temperature / heat aren't just one-dimensional. She says that good cooks look at the food and not at the thermometer. They're looking for signs in the food which stay constant across environments rather than measuring one aspect of the environment.

tl;dr Samin Nosrat rejects the use of a thermometer because it doesn't measure enough reality to cook well.

I think there’s a large emotional aspect: If you’re hungry and a bit tired (from a lot of exercise not stress), generally in a positive mood, and not super focused on the taste of the food then it’s going to taste a lot better than if you don’t particularly feel the need to eat, you’re a bit stressed, and you’re focused on perfecting sausage flavour.

In other words, it’s probably quite easy to enjoy an ok sausage if it’s come from some random grill in a park because you’ll be focused on having a good time with your friends or family in a park more than on whether the sausage tastes good. If you make it at home (even if you’re inviting those same people) you can easily get yourself worked up about cooking the perfect sausage and whether the temperature is right and whether the time is right and so on, that you’ll just feel like the sausage doesn’t taste so good (and you’ll probably be more careful and critical about tasting the sausage).

Then again, maybe the sausages were just undercooked in the park. Or maybe you used different sausages (I’ve found much more variance between sausages being made differently than in how they are cooked).

I've had very similar experiences and I've always wondered how much of that is expectations/perception. At home where I can control things better I have higher expectations. I didn't have any expectations for the camping grill. The 'best steak ever' from the camping trip is a great romantic story and a wonderful memory and maybe that is what makes it better.


This is the article to read.

In general I can really recommend The Food Lab as the guy who does it has a very scientific approach.

Don’t forget the source of the heat: even “the best” gas grill is an entirely different thing than a (presumably wood fire underneath) public grill

I like to think you're on a quest for cooking the perfect sausage

I agree a lot with this author, but after being in software for a long time, it still feels like one of the unique problems of software is that we keep changing the details. I imagine constructing stairs is largely the same as it was 50 years ago. Most web development does pretty much the same thing as it did 20 years ago, but almost all the details have changed. So as a programmer it gets frustrating having to learn nearly completely new details when the outcome isn't substantially different from what it was when I first learned Java servlet dev.

TBH, I'm not arguing so much against this evolution of software (the over-complication of web dev has been discussed ad nauseam on HN), but more that, just from a personal perspective, I have less patience for figuring out a new way to take stuff from a database and display it in a browser window.

I agree a lot with both of you, and add that after seeing things through the lens of software I went back and started looking at the details of people building things and found that for some reason unknown to me, even those professions are constantly changing. New materials, companies vying to be the “one true solution”, governments changing codes for various reasons.. I wouldn’t be surprised if building a basic staircase was more than 50% different than 50 years ago.

50 years ago construction often required individual skill. My grandfather used to tile kitchens back when they still floated countertops using "mud" (dry-ish concrete mix), which resulted in a solid, very level countertop (assuming skilled labor). Nowadays you use concrete board or similar and hope the cabinets were leveled well (usually off about an inch), for cost. Most things are now some degree of prefab and plug-and-play, built by less skilled workers so you can get by paying them less. They often have no experience in building and can't spot details that will cause problems. In some cases the technology makes for an overall better product or reduced waste, but often as not it's just less material resulting in flimsy construction. I'd be surprised if you couldn't get a staircase nowadays that was just precut aluminum struts with pre-etched holes you tap out (or similar). Like an upscaled Ikea part. So that's what contractors use, because otherwise they couldn't compete on labor costs with someone who did.

Oddly the overall costs for building a house have not correspondingly decreased for the consumer as banks and builders generally take a larger share. Though to be fair consumers also want larger "McMansions" as well so part of the factor is some of the savings do go into giving "more".

I think it sort of depends on what you count as different, but I think you're basically right. Power tools alone have made a big difference.

Still, I think stairs are conceptually almost identical, including in the internal details, in a way that software really isn't. Maybe the rise/run is a slightly different ratio, or the stringer is an incidentally different material, or the saw has a battery instead of elbow grease, but it's fundamentally similar.

Isn't programming all fundamentally similar too? OK, there are a few paradigms you need to learn (functional, OOP, etc.), and domain-specific stuff. But that's also true of physical construction. Knowing how to build stairs will only get you so far if you want to build a wardrobe, and probably won't help much at all if you want to build a brick wall.

No, the programming that is fundamentally similar is carried out by a compiler or by parameterizing a subroutine. What's left over for the humans is the novel part.

I agree with Kragen here. I see what you're saying, but from my perspective as a person who can build both software and stairs, they aren't in the same ballpark.

Sure a CPU is still executing instructions somewhere deep down the stack, but the semantics and concepts I am working with to make a control appear on the screen and then do something are wildly different over time in comparison to stairs.

Take a time traveling web programmer from 1995 and ask him to look at some modern web app code. He'll be completely lost for quite a while. Take a time traveling carpenter from 1900 and have him look at some residential stairs built in 2019. His comments might include "why is this lumber so thin?" and he'd probably have opinions on the newer materials (mostly positive, not always) but he would be not at all confused.

I think it would be fair to argue that a modern web app (big, complex system) is not comparable to stairs (simple, subpart of a system), but I think you can extend the analogy a lot. If that 1900s carpenter saw a modern commercial construction project he'd be a little more out of his element, but only because the materials we use are different and it would include stuff like HVAC. Fundamentally though he could easily see "oh, that span is longer than I would have made because they have stronger materials than I had." I can't think of any particularly confusing aspect of modern construction because the physics never change.

They're only details from that perspective, though; from a user's perspective they're essential, because of cultural relevancy tied to images, popularity, and familiarity. That's what counts for most people. It's really important to understand that details change depending on perspective.

Many people just expect more complicated web apps because of the alternatives they use and are familiar with (native apps).

It compares well with the stairs example of the article. If there are no people other than you that expect a stairway, you may just pile up some boxes, or use a rope and combine it with your workout. Do you really need and expect a stairway if it's only you?

A real-world construction and a software construction each have a remarkable amount of detail. The thing is that stairs or pizza dough or other real-world things have approaches that might be hard but follow specific principles; their qualities are circumscribed, so to speak.

A given computer program is often stupidly easy, much easier than building a staircase really. But often, what's involved in the computer program is fundamentally new in contrast to the way each staircase is only somewhat new.

That situation gives programming its wild, messy, unpredictable quality.

I had a similar experience while visiting my parents recently. Sitting at the kitchen table my mom mentioned they had a small weed problem, and my initial reaction was "No, your yard is fine" because I hadn't noticed any weeds.

After saying that though, I went and walked around their yard and noticed many details I never had before. There were multiple broadleaf weeds including different phases of dandelion and some weeds I couldn't name. There were patches of bare ground and areas where the grass was unusually thick. I saw different kinds of sprinklers and also startled a rabbit while I was walking around.

I was struck, while investigating the lawn, by how before I looked at it, it existed only as a simple object in my mind "Parent's lawn". On closer inspection it has an immense amount of detail.

This also probably implies that they are the only ones who notice the "problem" with their lawn. :-)

Maybe it wasn't a small problem with weeds so much as a problem with small weeds (which you didn't see until walking around ;) )

Actually you only see what you wanna see, right?

"No, your yard is fine ... oh, that kind of weed ..."

is what I was expecting

I really like the essay but I think there is a form of Stockholm syndrome playing out especially with the following statements:

> using a Haskell package for the first time, and being frustrated by how many annoying snags there were

> If you’re a programmer, you might think that the fiddliness of programming is a special feature of programming, but really it’s that everything is fiddly, but you only notice the fiddliness when you’re new, and in programming you do new things more often.

I think programmers think that the suckiness of their tools is a fact of the universe, when it's just that they don't know there's a better way.

At Repl.it, we built `upm` (Universal Package Manager) to make using packages a breeze with no fiddling. Ever since I started doing most of my scripting on Repl.it I almost never "fiddle" to get a package or a feature of the language to work.

upm goes as far as to guess your dependencies when you import them and will install and generate the spec and lock files. It's a truly magical experience. More here: https://repl.it/site/blog/upm

And we try to do a similar thing with every potential programming workflow. Can you make it "just work"? I think we can, for most things.

I agree that we can build better tools. I think what is missing in software is holistic developer tooling, and longevity of that tooling, such that we can invest in mastering it. UPM is a great example of a step in the right direction, I think.

When you go into a carpentry workshop, there are hundreds of tools, all designed to work on wood, and a carpenter will know the majority of their tools quite well. They can invest in a tool knowing they can re-use that knowledge for the rest of their career.

When I inherit a software project, it can often have two, three, even four different language ecosystems involved with hundreds of libraries and tools as dependencies. In software we have to accept that we will be surrounded by dozens of tools we need to work with but will never have time to fully master.

Most people I meet are proficient in many of their tools only enough to get the job done and no more. Soon enough the status quo will change and the software industry will happily drop thousands of man-hours of learning right into the bin.

The finer points of library X, framework Y, or tool Z are almost ephemeral. Even a company's core software product could just be thrown away and re-written in the new-hotness at the behest of a PM meeting, often losing nuanced domain knowledge in the process.

I have personally stumbled upon this paradigm (value craftsmanship on tools that will have a relevant shelf life), and I credit my ongoing satisfaction with my career to this, at least in part.

Without getting into details and specifics, something I highly enjoy doing, I'm referring to things like Vim, tmux, unix utilities and software, various Linux administration knowledge. Various electronics concepts. I now consciously and deeply evaluate a potential skill to learn based on how likely it is going to improve my life in the long term.

Despite the tool you work on, which solves (maybe) that one very specific problem the author posed, I'd say his point still stands for about 99% of other things in this universe.

There's an entire book about this, well-known to many artists, "Drawing on the Right Side of the Brain". [1]

One of the exercises is to draw a tree. Then to go outside and look at a real tree, and draw what you see.

The two could not be more different.

Many artists will talk about when they "learned to see". Which means: understanding that reality isn't the simplicity of what our brain constructs, but rather the seemingly infinite detail of what is actually out there.

It changes the entire way you look at the world.

[1] https://www.amazon.com/Drawing-Right-Side-Brain-Definitive/d...

This is basically the reason why our current approaches to AI are so woefully inadequate that they don’t even deserve the name “AI”. The manifestation of this in AI, as first discussed in the 80s, is called the “frame problem”.

This is basically the problem with “big” data, i.e. more comprehensive data (more columns) rather than more examples (more rows): it’s difficult to winnow out the relevant features from the irrelevant.

The current belief is that good priors are super important to have any hope of doing that reasonably well — but how to get those good priors is anybody’s guess.
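A toy illustration of the wide-data point above (a pure-Python sketch with made-up random data, not any real dataset): with only a handful of rows, purely random columns start looking "relevant" as soon as you have enough of them.

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
rows = 10  # few examples (rows)...
target = [random.gauss(0, 1) for _ in range(rows)]

def best_spurious_corr(n_columns):
    """Max |correlation| between the target and n columns of pure noise."""
    cols = [[random.gauss(0, 1) for _ in range(rows)] for _ in range(n_columns)]
    return max(abs(pearson(c, target)) for c in cols)

print(best_spurious_corr(3))     # a few noise columns: usually modest
print(best_spurious_corr(2000))  # many noise columns: some look strongly "predictive"
```

With 2000 noise columns and only 10 rows, the best spurious correlation is almost always very high, which is exactly the winnowing problem: without a good prior, that column is indistinguishable from a genuinely relevant feature.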

Embodied intelligence seems to break this mold, in principle. I believe the principle to get good priors is called reinforcement learning, and that currently the problem you describe is called inductive bias. Or maybe I'm just pushing your goalposts to something I can talk about...

There are some really brave ideas being kicked around among the researchers who focus on their passion of AI, as opposed to those who post ever higher benchmark scores. They don't optimize their algorithms because they feel the algo should be serviceable out of the box. Cool stuff is happening.

Embodied intelligence, inductive bias, etc. are the right words, but we rarely know how to build them into our models in all but the simplest problems :-) E.g., our reinforcement learning models are largely stupid and without any great inductive biases. That’s why it takes a crazy amount of training and reward tweaking to do even anything simple.

> There are some really brave ideas being kicked around among the researchers who focus on their passion of AI, as opposed to those who post ever higher benchmark scores. They don't optimize their algorithms because they feel the algo should be serviceable out of the box. Cool stuff is happening.

Where do I find out more about this? :-)

Tensor networks come from quantum physics but are finding use in ML research.

The papers on Neural Turing Machines, and Neural Tangent Kernels, which is related to Gaussian Processes, are getting a lot of citations. There is work on making these kernels composable.

There are some papers combining some of these approaches but not yet all of them, that I have seen.

If I have understood correctly:

TN in the context of ML seems to be a pre-optimization step which lets you deal with inductive bias.

The NTM learns to use its own memory through an attention mechanism.

The NTK builds compressed representations which should be of great interest to anyone needing interpretable models.

Gaussian Processes are a very different perspective on ML which lets you talk about composition of probability distributions.

It seems to me that these components could be assembled into an ML algo capable of general intelligence (not 'strong AI' IMO) given a curriculum and enough compute. It wouldn't compete with the smartest humans but at least it should make for useful robots, because of the cost of training.

A related concept: tacit knowledge:


Which is why attempts to "document all human knowledge" or whatever are always doomed to fail. We don't even know what we know. Yeah, people can print out Wikipedia and store it in diamond or whatever, but that's not what human knowledge really is.

Stuff like this is why we always encouraged our new hires to add to documentation anything they had to ask us about, even for what should already exist as a good set of explicit steps. It was pretty much a guarantee something would be missing.

What a great read! But the staircase example seems overwrought.

Reality often doesn't need to be that precise. If you're a few degrees off in your angle, it won't matter. The floor and/or walls are probably not perfectly flat and may not even be perpendicular. Stairs will work anyway.

This angle problem has been solved in tooling. You set the (<$100) chop saw for 30° (or whatever), mark the length of the board, and cut. It takes seconds, no tracing required. Trig is a great solution, especially given the "close enough" tolerances of wood.

If you need precision, work with metal.

Notching angles into boards is the reason framing squares exist. I've learned and forgotten this three times in my life, the last not so long ago.

Stairs are for human legs, and they care about the rise, the run, and the depth of the stair (tread). And all those can be done with a framing square. Angles are incidental to this (google around and you'll see a lot of stringers being cut with a circular saw).

A 3/5 ratio is 31°. You slap down the square, you measure 3 inches on one arm and 5 on the other, and you mark it with a pencil.
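(For the curious, the trig behind that 31° figure is a one-liner; a quick sketch, with the 3:5 rise:run from above and the 7:11 commercial ratio mentioned downthread:)

```python
import math

def stair_angle(rise, run):
    """Pitch of a stair stringer, in degrees, from its rise:run ratio."""
    return math.degrees(math.atan2(rise, run))

print(round(stair_angle(3, 5)))   # -> 31
print(round(stair_angle(7, 11)))  # -> 32
```

Which is the point of expressing pitch as a ratio: the framing square gives you the same line with zero arithmetic.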

Random dude answering question on the internet:

> In US construction stair pitch (angle) is expressed as a ratio between rise and run (vertical and horizontal distances).

> For commercial construction the minimum ratio is 7:11 but 6:12 is often used (these are expressed as rise and run in inches which is why it’s 6:12 and not 1:2)

Maximum ratio. In the ("whole") rest of the world it is probably 15:30 to 17:26.

Crappy tooling, and crappy software in general, is our legacy from Microsoft. Before Microsoft, people had expectations that software would work right. It didn't always, but it was a big deal when it didn't. Microsoft, single-handedly, got people used to rolling their eyes and rebooting, and not demanding a refund, partly by never ever refunding anything. It worked for them, and here we are.

The avoidance of detail, enabling its occasional rediscovery today, is equally new. People took pride in attention to detail, and demanded it. Watch any detective movie from the 30s or 40s and you get a glimpse of how attention to detail was respected.

As someone who has constructed a staircase once, I fully support the author. Any activity has lots of small and non-obvious details, which stop you from doing anything right when you stumble on them, and which seem easy and trivial in hindsight. I guess that's why it is called "know-how".

I remember the last time this article was posted. It's one of my favorite pieces to be posted on this site, thanks for reposting it!

I see that this is at least the 8th time it has been submitted, but only one other led to any discussion:


FWIW, the stairs drawn are still oversimplified. The top step is unusable and dangerous. As an architect I would never accept it.

Given the subject matter I had to smile at the way the author bungled the detail within the first sentence:

> "A while ago, I read this amazing piece about building a ladder."

Hint: a staircase is the same thing as a ladder, just nailed down. On ships they call staircases ladders.

Mind blown

> Mind blown

Totally. TIL that I can call my blanket a tent and everybody will know what I'm talking about, being that it's the same thing, just nailed down.

When I first read the title, I understood it more philosophically. As in––why is most of outer space just endless, indistinguishable dust, while here on Earth there's just so much _stuff_! There's nothing simple about reality at all. It's boundlessly complex and intricate for apparently no reason at all. It seems intuitively simpler that less "stuff" should exist. Another way of putting it: even if a supreme being did create the universe, why would it go to so much trouble to make sure that dust gradually accumulates on the top of my refrigerator?

EDIT: Going a step further, what's really weird is just how different human beings are from everything else. Compare a McDonald's bathroom stall to the surface of the moon. How can these things coexist? They don't go together. It's like a giant accident. Like someone spilled pink nail polish on the Mona Lisa.

I find the refrigerator and McDonald's bathroom stall both to be a fairly straightforward metaphysical deduction from the generation of life on this planet and its consequent blossoming of civilization. Both of those things are byproducts that support the functional needs of humans which are the ultimate result of evolutionary dynamics in carbon based life forms found on the planet.

These things which are crazy complex (which you did not pick particularly complex examples of, better examples might be quantum computers or nuclear reactors, but that's ok) are a natural result of the process that evolution kicked off (this is the accident that you refer to) that allowed people to be a thing, which, when they do their thing results in astounding advancements that necessitate complex designs to fulfill their function. I personally don't believe there needs to be a better reason to explain why we have refrigerators, mcdonald's bathroom stalls, quantum computers, and nuclear reactors other than "because we can". That is to say, once humans became capable of inventing those things, it was inevitable that they would proliferate.

The nuclear reactor is a very good example of the bizarre contrast between human achievements and the natural world from which we evolved. Thank you :)

I see your point about the inevitability of humanity's technical achievements. But I can draw attention to my point further by suggesting the following hypothesis: it seems intuitive that something very complex cannot be derived from something very simple. Nonetheless, the smooth, dense simplicity of the big bang has, through a process that is empirically observable, unfolded into this incredibly complex microcosm on Earth. That there is a scientific explanation is not really debatable: we know that the complexity of the tree is stored in the seed as genes. The simplicity is only apparent. So, if we wish to explain the development of the universe as we have explained the development of trees, animals, etc. then we will have to be able to identify some kind of code that contains the potential for infinite complexity; a code that was housed in the smooth, dense big bang and thereafter dictated the procedural unfolding of reality from gas to stars and eventually nuclear reactors. All of this is to say that our intuitive idea that complexity cannot be derived from simplicity is actually, in a weird way, contradicted by biology, and that this "weirdness" applies not just to biological phenomena but physical ones as well, so that new things seem to arise out of nowhere, when in fact they were stored in an imperceptible, condensed form that unfolded itself out of itself.

I personally completely agree with all the thoughts exposed in the post. Though mostly I was intrigued by the figure labeled "Some important details for colonizing the universe.": http://johnsalvatier.org/assets/colonizing-the-universe.png Unfortunately the author used it just as an illustration without giving any details. The dust and heating related constraints are self-explanatory; what surprised me most were the slowdown fission/fusion curves. Since everything else on the plot seems to make sense, I presume they just depict that, whatever amount of fuel you take with you, the amount of extracted energy will decrease following such a path as you accelerate closer to c.

I've often thought that truly ambitious projects need a certain amount of ignorance to succeed. People who can see all this detail tend to be more conservative with their plans because those details give them pause, and limit the scale of what they attempt.

The optimistic newcomer will bite off much more than they can chew, but sometimes they succeed.

"If I knew how much work this would be, I never would have started" is sometimes felt at the end of successful projects.

I dove headfirst into rebuilding an engine recently with basically zero practical experience. The amount of possible detail in the project is mind boggling. What's even more amazing though is that I'm able to just keep on chugging through the problems as they come up. I occasionally have bouts of vertigo as I stare into some abyss of minutiae, but then I just shrug my shoulders, wrench on some more bolts and keep going.

This also explains why most PhD theses are only a tiny advance towards the problem. At the frontier you have no idea which details matter and, worse, details arise where you didn’t know even mattered.

The good scientists have an intuition for problems where the hairy details are solvable and which they are good at solving. Part of research is developing this intuition.

I think this is an argument against the "we live in a simulator" idea - the amount of detail is just too much.

FWIW I think that simulation theory is fairly obviously bunk, but that's not really an argument. We have no frame of reference for the nature of the simulator, including its constraints and the level of detail it can manage.

I don't personally think the theory that we likely live in an artificial sim is true.

But I also don't think it's 'obviously' untrue. Yes, the level of detail is mind-boggling. But quite a lot of that detail follows naturally from the framework itself: the complexity of an object moving through space and time, interacting with heat and light and atmosphere and so on, is huge. However, once the framework is in place (physical laws, light models, properties of materials and so on), everything happens 'naturally' provided you have the processing power to render all the variables.

It's not difficult to imagine a point where a detailed enough framework, and the processing power to render it, could exist. Not soon, but eventually.

Add to that the established tendency of our minds to normalize (ignore things that don't seem to fit, rationalize and otherwise edit reality for palatability) and the power of our neurotransmitter systems to override rational thought... The biological interpreter of the theoretical simulation is well designed to be fooled, given a deep enough understanding of its workings. Which is also not difficult to imagine as a future possibility.

Still, not likely, but too compelling a thought experiment to call bunk IMO.

Absolutely, it could be that we are living in a simulation. It's just that there's no reason to believe that as yet. FWIW I think that a 'biological interpreter' is rather unlikely; it's probably just emergence from atomic interactions. (It could be planted there, however.)

The problem with your argument is that it makes too many assumptions. We know nothing about the nature of the simulator, nor do we have any reason to believe that (assuming we are not in a simulator) the universe works the way it does for any particular reason. Furthermore, the isomorphism you point out between the physical world and computers is much more easily explained if you flip the causality chain: we are humans living in this world, and we modeled computers after ourselves and our surroundings.

I'd like to propose two pro-simulator things that have been bouncing around my head for a while:

* Planck length/time: The granularity the simulation is capable of.

* Quantum entanglement: A deep copy/shallow copy bug (that is, shallow copy was accidentally used where deep copy should have been used, hence why quantum entanglement holds across large distances).
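Taking the parent's shallow-copy joke literally (this is a programming analogy, not physics), a minimal Python sketch: mutating the shared inner object is "visible" from the shallow copy no matter how far apart the two outer objects are, while a deep copy is fully independent.

```python
import copy

particle = {"spin": "up"}
pair = {"here": particle, "far_away": particle}  # two slots, one shared object

shallow = copy.copy(pair)      # new outer dict, but SAME inner particle objects
deep = copy.deepcopy(pair)     # fully independent clone of everything

particle["spin"] = "down"      # "measure" the original particle

print(shallow["far_away"]["spin"])  # 'down' -- the change holds "at a distance"
print(deep["far_away"]["spin"])     # 'up'   -- the deep copy is unaffected
```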

Fair point. But I think that a lot of the support for the "we live in a simulation" idea is because we see ourselves able to produce better and better simulations, and it feels like we might be able to progress to the point where we can do it. Realizing how much detail there is makes it feel much less achievable.

I use the word "feel" for a reason. I think we're dealing with emotional reasonableness, not careful analysis.

Why do you think it’s “obviously bunk”?

It's a horrible abuse of probability, which is not unlike saying that 0*∞ = 100%.

Not to say that we can't possibly be living in a simulation, but rather that the only way that can be shown, or even evidenced, is through concrete evidence. Thought experiment is insufficient evidence.

It's obvious projection. Centuries ago we lived in a dome, in the 18th century we lived in a clock, now we live in a computer, soon enough in a quantum computer. Give it 50 years and we'll live in the new fad, and giggle at the old fools who thought we lived in a simulation. So primitive!

I think this is an argument for the "we live in a simulator" idea - details appear just when they are needed and whenever we think we completely discovered something, another layer of details opens beneath us.

Higgs Bostein (or is it Bostain?)

Classic HN thread on the "we live in a simulation" idea:

Do we live in a computer simulation? UW researchers say idea can be tested: http://www.washington.edu/news/2012/12/10/do-we-live-in-a-co...

61 points by ph0rque on Dec 11, 2012 | 64 comments


I think it's one of the more fundamental meta-learnings, meaning: learn a few different things and then compare the processes (of learning) with each other and you find that they all share the quality that if you pay attention to anything the details expand. This is true of physical things like making stairs but also of mental "things" like programming, or poetry.

The fundamental "direction" is in-out; outward there's this multimodal unbounded fractal, inward there is Self-Awareness, the "infinity shot" of consciousness.

Too much for the kind of simulator you can imagine? But if your imagination is simulated how can you use it as a reference?

Too much detail in relation what? Our current computers in 2020?

What if the humans that made the simulation are from the year 3020 or 10020?

What do you think the level of detail of the simulation is? If it's simulating down to the atomic or subatomic level, then the only way to simulate that is with a universe at least as big, having turned the whole universe into a computer.

Maybe it's a lazy simulation... the atoms and subatomic particles are only simulated once someone is looking at it.
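In programming terms that's just lazy evaluation with memoization; a minimal sketch (all names here are made up for illustration):

```python
class LazyUniverse:
    """Materializes a region's state only on first observation, then caches it."""

    def __init__(self, laws):
        self._laws = laws       # function: coordinates -> state ("physics")
        self._rendered = {}     # cache of regions someone has looked at

    def observe(self, coords):
        if coords not in self._rendered:                  # nobody looked yet:
            self._rendered[coords] = self._laws(coords)   # render on demand
        return self._rendered[coords]

    @property
    def simulated_regions(self):
        return len(self._rendered)

# A stand-in "law of physics"; nothing is computed until observed.
universe = LazyUniverse(laws=lambda xyz: sum(xyz) % 7)
universe.observe((1, 2, 3))
universe.observe((1, 2, 3))        # cached: no re-simulation
print(universe.simulated_regions)  # -> 1
```

The cost then scales with what has been observed, not with the size of the universe, which is the appeal of the "lazy simulation" idea.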

Even if that's true, there's no reason to believe our universe is particularly big for a universe :)

This becomes apparent any time I try to 3D print something that will interact with an existing structure.

For this I like to use foam tape as an interface.

If you think reality has a lot of detail, ‘unreality’ will blow your mind. (I’m referring to Hilbert space.)

There is no such thing as "unreality". What you're talking about is just a concept inside our mind, which is itself inside reality. We can create any number of concepts, but they are all limited by the limitations of our perception.

Reality, in turn, exists independently and does blow your mind every moment, provided you are not lowering your attention by deciding what is important and what is not. Children know this perfectly well and can be engaged and mind-blown by every little thing, until society forces its rules on them, which develops a layer called personality, or ego, and after some time only that layer feels real. This is where one disconnects from the magic and mind-blowingness of reality, and as a result dissatisfaction, depression, and all sorts of mental problems start appearing.

But paying indiscriminate attention to everything can bring back the initial raw perception with all its joy and wonders, and is something they found out in the East, including in Hindu and Buddhist practices.

I think a better way to put it is, "People like programmers are prone to underestimating the amount of detail in things". "Amount of surprising detail" is relative to the assumed amount of detail.

I was hoping this article was going to pivot into mindfulness/psychology or something more abstract like that.

> Another way to see that noticing the right details is hard, is that different people end up noticing different details. My brother and I once built a set of stairs for the garage with my dad, and we ran into the problem of determining where to cut the long boards so they lie at the correct angle. After struggling with the problem for a while (and I do mean struggling, a 16’ long board is heavy), we got to arguing. I remembered from trig that we could figure out angle so I wanted to go dig up my textbook and think about it. My dad said, ‘no, no, no, let’s just trace it’, insisting that we could figure out how to do it.

"...different people end up noticing different details"

If this can happen with something as relatively simple as building stairs, might it also happen in more complicated scenarios like interpersonal or inter-cultural/national relations, politics, etc?

> I kept arguing because I thought I was right. I felt really annoyed with him and he was annoyed with me. In retrospect, I think I saw the fundamental difficulty in what we were doing and I don’t think he appreciated it (look at the stairs picture and see if you can figure it out), he just heard ‘let’s draw some diagrams and compute the angle’ and didn’t think that was the solution, and if he had appreciated the thing that I saw I think he would have been more open to drawing some diagrams. But at the same time, he also understood that diagrams and math don’t account for the shape of the wood, which I did not appreciate. If we had been able to get these points across, we could have come to consensus. Drawing a diagram was probably a good idea, but computing the angle was probably not. Instead we stayed annoyed at each other for the next 3 hours.

Considering the amount of detail in the world, it shouldn't be too surprising that two people can look at the same thing and see dramatically different things. Often the two parties can realize how the collaboration went off the rails after the fact. But what happens the next time such a scenario arises - can they recognize in realtime that the same thing is happening again, and consciously step in and prevent another derailing of collaboration?

In my experience, not only is the answer usually no, but even if one of the parties does happen to be aware enough at the moment, and points out the "obvious" fact of what is happening again, just like last time, the conflict will still not resolve, almost as if there is something occurring at the neurological level that prevents it.

> Before you’ve noticed important details they are, of course, basically invisible. It’s hard to put your attention on them because you don’t even know what you’re looking for. But after you see them they quickly become so integrated into your intuitive models of the world that they become essentially transparent. Do you remember the insights that were crucial in learning to ride a bike or drive? How about the details and insights you have that led you to be good at the things you’re good at?

While some skills (riding a bike, playing tennis, etc) can easily be integrated such that one can conduct them with extreme skill completely intuitively, others (interpersonal communication) seem highly resistant to the same level of integration.

It seems common for people to say "So what?" when this phenomenon is pointed out, likely because it's so obvious when being discussed in the abstract sense. But the "so what" is this: in practice, in realtime, this prior knowledge seems near impossible to access intuitively. And not only that, it often seems nearly inaccessible to the conscious mind: if it is pointed out, acknowledgement of a current manifestation of the phenomenon will be refused, typically via anger or silence.

'Interpersonal communication' is not a skill. It is a weasel word for when you don't want to say 'social status' for whatever reason.

This doesn't make any sense to me whatsoever. Of course, I don't understand what you mean, but I think there's tremendous unrealized value in arguing over seemingly minor things such as this, so if you're willing to elaborate, I'm more than willing to consider your ideas.

This essay has a taste of “Zen and the art of motorcycle maintenance”. Really enjoyable and authentic style.

Yeah, seems like the last post, does the author write elsewhere?

> If you wish to not get stuck, seek to perceive what you have not yet perceived.

Amen to that.
