I learned to program in a course that follows SICP; I spent all my college years learning how to program from first principles, building all the pieces from scratch: compilers, soft-thread implementations, graph parsing algorithms... and I was happy with that way of programming!
Today I'm an iOS developer. I spend most of my day 'poking' at the Cocoa Touch framework, whose source I can't even read because it's closed. The startup I work for moves so fast that I'm forced to use other people's open source projects without even having the time to read the source.
I feel miserable doing the kind of programming I do nowadays! I wish I could go back to simpler times, where I actually had to reason about algorithms, data structures, hardware, etc...
This is one of the reasons why I moved away from product-side programming to IT. My brain screams in pain when I do lots of context switches, and I figured this was not going to make me a better programmer. The fun stuff (for me) is in handling data, not in learning layers of APIs.
Now, working in enterprise is not for everyone (the politics, etc.), but for me it still beats the pain of working in a smelly, loud, fast-context-switching, agile-kanban startup.
> I feel miserable doing the kind of programming I do nowadays! I wish I could go back to simpler times, where I actually had to reason about algorithms, data structures, hardware, etc...
But I think the lucky ones are the people who get to work low enough on the stack, relative to their knowledge, that it doesn't feel like they are dealing with endless abstractions and layer upon layer of magic.
> The startup I work for moves so fast that I'm forced to use other people's open source projects without even having the time to read the source.
> I don't subscribe to the school of thought that values engineers lower on the stack more than those higher up.
I've done the full range, from entire games written in assembly language and embedded C code, through high level full stack development with NodeJS, Python, and other languages.
The low-level coding is far more skilled work than the high-level coding. Instead of most of the work being "how do these plug in to each other?", it's: how exactly does my code function with respect to all of the potential events and additional threads, how do all of the edge cases interact, and what is causing this obscure bug?
While that may not seem intrinsically harder, none of these are typically things you can Google or search StackOverflow for. So you're really just on your own. And developers who have grown up on "Google all the answers!" often hit a wall when they need to apply real problem-solving skills.
Luckily I can find enjoyment at many levels, since a lot of the jobs I've found recently have been more in the "full stack" or "mobile development" category. It's easy and fun work.
I also have little problem piercing the magic and understanding how things fit together, but that means I end up with opinions on many topics that diverge from the crowd's. For instance, I avoid writing Android/Java or iOS/Swift, and instead use cross-platform development stacks exclusively. Yes, it means an entire extra layer of complexity, but it also means I write the app once for all target platforms. Far too big of a win to ignore.
I've done the full range too, and I don't agree that low-level involves more skill. I think it involves different skills. When you're working high-level, you don't have the above questions so much, to be sure. Instead you have questions like, what are the relevant business rules? Which ones are stable and which ones are going to change? Does this user interface suck? Is the program giving answers that actually correspond to reality? Does anyone want this product in the first place? Different questions, but not easier to answer.
From my multi-threaded background: the problem is that the code looks simple, and the code actually doesn't get more complicated. Usually, the code you need to write 90% of the time actually becomes easier for most people to write, because they can work within rigidly defined boundaries.
However, explaining why this part of the code base is safely threaded and scales vertically across multiple cores, how we avoided multiple non-obvious bottlenecks, and why an optimization made three services ago still matters... that's going to take hours. And at that point, we haven't even touched lock-free operations yet.
And after that, understanding how to design an architecture from scratch, how to interact with the network and the kernel properly, how to enable distribution of the application in an elastic manner... yeah. Let's get a room.
And these kinds of systems are exactly the systems that benefit from the SICP approach. Thread pools, message queues, databases, the network, and lock-free data structures are all well-understood components we can combine in structured manners.
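To make that concrete, here is a minimal Python sketch (my own toy example, not anyone's production code) of that kind of composition: a fixed pool of workers draining a thread-safe queue, where the rigid boundary is that all cross-thread traffic goes through the queues.

    import queue, threading

    tasks, results = queue.Queue(), queue.Queue()

    def worker():
        while True:
            item = tasks.get()
            if item is None:              # sentinel: shut down cleanly
                break
            results.put(item * item)      # stand-in for real work

    pool = [threading.Thread(target=worker) for _ in range(4)]
    for t in pool: t.start()
    for n in range(10): tasks.put(n)
    for _ in pool: tasks.put(None)
    for t in pool: t.join()
    print(sorted(results.queue))          # squares of 0..9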
I've run into this attitude before, though. It seems to be the last bastion of denial of folks who have spent a lot of time learning one platform and who don't want to learn something new. Sorry, but there are better ways.
SICP is one example, but I bet it's even possible to be happy coding GUIs or any other rather mundane things if one does it with an elegant toolset. For example, Oz comes to mind.
Some modern tools are not pleasing to use, often because architectures are a mess. Things move forward too quickly.
Now I do Python. And with Python I "got" something too: writing tools, coding, etc. is not like mathematics (like Scheme was) anymore. It has become a social activity. I'm in the field of ERP right now (arguably not the most theory-oriented stuff), because I spend most of my time using APIs made by others and making APIs in the hope that they will be used by others (none of those APIs being worth much in terms of computer "science", i.e. things like algorithms). I'm also building tools to augment the productivity of other people, which is also quite social.
So from abstract Scheme programming, I've moved to social Python programming. That keeps me happy (though I must confess that studying Voronoi diagrams or 3D structure packing remains what really makes me tick :-) )
Join the TXR project:
I've made some 3000 commits all by myself since 2009. The main reason this exists is the above: a project where I can put ideas, algorithms, whatever, into a coherent whole that has regular releases and is well-documented externally. And this coherent whole is useful to me in many ways.
Moving fast, sprinting, only looks like you are making progress. At some point the debt becomes too much.
I will probably move to embedded soon. The problem is that I have a lot of experience with iOS but not that much experience with embedded. I'm willing to take a pay cut just to get out of iOS, but there's a limit to how low I can go.
There's also a lot more iOS jobs than embedded jobs!
edit: Other than C, I'm very curious about Rust and nim-lang. I do not know C++, though. Hope that won't be a problem.
I've found that it entirely depends on what level of embedded development you want to enter. For instance, I mainly work with C++ and MSQL databases on our platform; because our main device runs Linux, this makes sense. But other related products that we sell run on bare metal and are written in pure C.
Having a strong knowledge of C will most certainly get you far in this field, but learning the ins and outs of C++ wouldn't be such a bad idea if you have the spare time. The more niches you can fit into, the easier it will be to pick up the job you really want.
I'm naive but curious and I do not want to make guesses.
I wonder how we are going to preserve knowledge about programming from first principles if, under pressure from corporations and lazy peers, no one does it anymore.
There are also a lot more of these 'poking' jobs compared to first-principles coding jobs, especially for someone like me who has never worked for a huge corporation.
If you like videogames, learn some OpenGL ES and apply for a mobile game development position; if you like the area, you'll be able to continue your career developing games for other platforms. If you'd like to move to the MS stack, learn some .NET, then look for Xamarin iOS jobs; you'll be able to continue your career developing .NET for Windows. If you enjoy hacking, seek a job at a company that makes UI test automation tools for iOS, or iOS-related security software.
You already have experience dealing with all that iOS weirdness, so you might be able to minimize or even negate that pay cut.
"...Sussman said that in the 80s and 90s, engineers built complex systems by combining simple and well-understood parts. The goal of SICP was to provide the abstraction language for reasoning about such systems.
Today, this is no longer the case. Sussman pointed out that engineers now routinely write code for complicated hardware that they don’t fully understand (and often can’t understand because of trade secrecy.) The same is true at the software level, since programming environments consist of gigantic libraries with enormous functionality. According to Sussman, his students spend most of their time reading manuals for these libraries to figure out how to stitch them together to get a job done. He said that programming today is 'More like science. You grab this piece of library and you poke at it. You write programs that poke it and see what it does. And you say, ‘Can I tweak it to do the thing I want?''. ... "
Surely graduates of MIT, of all places, would be the ones building the libraries that ordinary programmers are stitching together and would still need the ability to build complex systems from fundamental CS abstractions? I'm absolutely baffled.
This was his polite implicit criticism of the new core, which among other things also teaches much less in the way of EE fundamentals, a topic he's cared about very much since at least the late '70s (i.e. 6.002 is no longer a required course for all EECS majors).
The bottom line is that in the post-dot-com-crash panic, which saw MIT EECS enrollment drop by more than half after it had been steady at 40% of the undergraduates for decades, one result was "MIT" deciding that what an MIT EECS degree meant would be substantially different.
There were a TON of changes to the MIT EECS curriculum at that time, so perhaps it was a holistic response to the dot-com crash that went beyond just 6.001.
Yeah, MIT has become a Javaschool, plus Python....
And just when the failure of Dennard scaling was making functional programming a lot more important.
They created 6.01, which serves as an "intro to EECS", so it involves both EE and CS, as opposed to the old intro, 6.001, which was really an introduction to CS. Some argue that this course is easier (I never took it, so I can't say), which could definitely be argued to make EECS a little more gentle and open in response to the dot-com crash.
They also broke up two very difficult courses, 6.046 (introduction to algorithms) and 6.170 (lab for computer science), and moved at least some of their coursework into new courses. Again, this could be seen as making the entire major a little gentler.
They also changed requirements. In the past there had been a lot of CS-focused students who were uninterested in doing any EE and were choosing to major in 18.c (math with computer science) to avoid an EE course load. The department rightly thought it was a little crazy that people were leaving the CS department in order to focus more on CS, so they lightened the required EE courses for 6.3 (a major focused on CS) and vice versa.
This is all my recollection from the 07-09 era and from talking to some students since. There could be some errors in details.
Yep, there's the real irony. Having functional programming skills and experience is a real asset in today's job market, I've found.
Ordinary programmers everywhere are building those libraries, just like your assumed wunderkinds are building programs out of other people's libraries. The nature of programming has changed for >everyone<.
[UPDATE:] One of the symptoms of no one understanding the fundamentals is how excited people get about things like XML and JSON, both of which are just (bad) re-inventions of S-expressions.
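To make the claim concrete, here is a toy Python converter (entirely my own illustration) rendering the same record as JSON and as an s-expression:

    import json

    def to_sexp(x):
        if isinstance(x, dict):
            return "(" + " ".join(f"({k} {to_sexp(v)})" for k, v in x.items()) + ")"
        if isinstance(x, list):
            return "(" + " ".join(to_sexp(v) for v in x) + ")"
        return json.dumps(x)   # atoms print the same in both notations

    data = {"user": {"name": "Ada", "langs": ["scheme", "python"]}}
    print(json.dumps(data))    # {"user": {"name": "Ada", "langs": ["scheme", "python"]}}
    print(to_sexp(data))       # ((user ((name "Ada") (langs ("scheme" "python")))))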
Yes, but the reason it has changed for everyone is that no one understands the fundamentals any more, because they aren't being taught to anyone.
At some point people just have to start treating some lower layers as black-box abstractions so they can actually work. Of course it is always beneficial to try to learn some of the lower-level fundamentals, but I just don't see any way for everybody to have full knowledge of an entire application stack from end to end.
To me, the best you can do is include fundamentals in an "always keep learning" mindset. That is why, for example, I am still working on learning assembly language (x86/x64) in idle bits of spare time here and there, at the age of 42 and after 20+ years of programming. And why I still go to the hackerspace and build circuits from discrete components and low-level ICs for fun. But as it happens, for most of my career, not knowing assembly or how to assemble a microcomputer from individual ICs has not prevented me from getting useful stuff done.
Nonsense. The fundamentals don't take a long time to learn. And once you know them, everything else becomes much easier to learn. That's the reason that learning the fundamentals matters: it's a huge lever.
Here's a single, small, very accessible book that takes you all the way from switches to CPUs:
SICP gets you from CPUs to Scheme (well, it goes in the other direction, but the end result is the same). That's two books to get from switches to compilers. Anyone who thinks they don't have time for that needs to learn to manage their time better.
As the old saying goes, "If I had 3 days to cut down a tree, I'd spend the first 2.5 days sharpening my axe." Sure, but at some point you have to actually start chopping. By analogy, at some point you have to quit worrying about fundamentals and learn the "stuff" you need to actually build systems in the real world.
By and large I'm in favor of spending a lot of time on fundamentals, and being able to reason things out from first principles. And when I was younger, I thought that was enough. But the longer I do this stuff, and the larger the field grows, the more I have to concede that, for some people, some of the time, it's a smart tradeoff to spend more of their time on the "get stuff done" stuff.
That's why you spend four years in a university before you start your career. If you're not going to learn all the fundamentals, you might as well go to an 8 week code camp and save your time.
I don't know, maybe that is the answer. But I suspect the MIT guys have a point in terms of making some small concessions in favor of pragmatism. Of course, one acknowledges that college isn't meant to be trade school... Hmmm... maybe there is no perfect answer.
No. You are missing the point. The fundamentals are very simple and easy to learn. That's what makes them "fundamental." It's all the random shit invented by people who didn't understand the fundamentals (or chose not to apply them) that takes a long time to learn.
Programming languages, automata theory, set theory, compilers, assembly language programming, microprocessors and system architecture, algorithms, graph theory, category theory, artificial intelligence, machine learning, operating systems, parallel and distributed programming, natural language processing, web design, network architecture, databases.
There are plenty of core fields that I've missed. Which are fundamental? Which do we teach? It simply isn't possible to adequately teach everyone the core fundamentals of all of these fields during the course of an undergraduate degree while also conveying the basic design fundamentals that a software developer needs to know. There is just too much in the field to expect every software developer to have a complete understanding of all of the fundamentals in every sub-field. Our field is getting a lot wider and a lot deeper, and with that, people's expertise is forced to narrow and deepen.
If we look at just the traditional CRUD kind of programs (i.e. database, view, domain logic), these seem to be the most in danger of becoming mudballs. Which is really strange, given that they are the oldest types of programs out there and closely mimic the second use computers were put to (accounting), just after military applications.
Perhaps because this type of program is so old, it had so much time to stick lots of mud on it. :-)
A fair question, and a full answer would be too long for a comment (though it would fit in a blog post, which I'll go ahead and write now since this seems to be an issue). But I'll take a whack at the TL;DR version here.
AI, ML, and NLP and web design are application areas, not fundamentals. (You didn't list computer graphics, computer vision, robotics, embedded systems -- all applications, not fundamentals.)
You can cover all the set theory and graph theory you need in a day. Most people get this in high school. The stuff you need is just not that complicated. You can safely skip category theory.
What you do need is some amount of time spent on the idea that computer programs are mathematical objects which can be reasoned about mathematically. This is the part that the vast majority of people are missing nowadays, and it can be a little tricky to wrap your brain around at first. You need to understand what a fixed point is and why it matters.
You need automata theory, but again, the basics are really not that complicated. You need to know about Turing-completeness, and that in addition to Turing machines there are PDAs and FSAs. You need to know that TMs can do things that PDAs can't (like parse context-sensitive grammars), and that PDAs can do things that FSAs can't (like parse context-free grammars), and that FSAs can match regular expressions, and that's all they can do.
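A tiny concrete version of that hierarchy, sketched in Python (my own example): a two-state machine recognizes the regular language a*b*, while balanced parentheses need the unbounded counter of a pushdown automaton.

    def accepts_a_star_b_star(s):
        state = "A"                          # finite state: two states suffice
        for ch in s:
            if state == "A" and ch == "a": pass
            elif ch == "b":                state = "B"
            else:                          return False  # 'a' after 'b', or junk
        return True

    def balanced(s):                         # needs unbounded memory: a counter
        depth = 0
        for ch in s:
            depth += {"(": 1, ")": -1}.get(ch, 0)
            if depth < 0: return False
        return depth == 0

    assert accepts_a_star_b_star("aabbb") and not accepts_a_star_b_star("aba")
    assert balanced("(()())") and not balanced("())(")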
You need some programming language theory. You need to know what a binding is, and that there are two types of bindings that matter: lexical and dynamic. You need to know what an environment is (a mapping between names and bindings) and how environments are chained. You need to know how evaluation and compilation are related, and the role that environments play in both processes. You need to know the difference between static and dynamic typing. You need to know how to compile a high-level language down to an RTL.
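Here is roughly what that environment model looks like as a minimal Python sketch (my own illustration, not any particular book's code): a frame of name-to-value bindings plus a pointer to the enclosing environment, with lookup walking the chain.

    class Env:
        def __init__(self, bindings, parent=None):
            self.bindings, self.parent = bindings, parent

        def lookup(self, name):
            if name in self.bindings:
                return self.bindings[name]
            if self.parent is None:
                raise NameError(name)
            return self.parent.lookup(name)   # defer to the enclosing frame

    global_env = Env({"x": 1})
    inner = Env({"y": 2}, parent=global_env)
    assert inner.lookup("x") == 1             # found by walking the chain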
For operating systems, you need to know what a process is, what a thread is, some of the ways in which parallel processes lead to problems, and some of the mechanisms for dealing with those problems, including the fact that some of those mechanisms require hardware support (e.g. atomic test-and-set instructions).
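A minimal sketch of the classic problem (my own example): several threads doing read-modify-write on a shared counter will lose updates unless the increment is protected.

    import threading

    counter = 0
    lock = threading.Lock()

    def bump(n):
        global counter
        for _ in range(n):
            with lock:              # without this, load/add/store can interleave
                counter += 1

    threads = [threading.Thread(target=bump, args=(100_000,)) for _ in range(4)]
    for t in threads: t.start()
    for t in threads: t.join()
    assert counter == 400_000       # can fail if you remove the lock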
You need a few basic data structures. Mainly what you need is to understand that what data structures are really all about is making associative maps that are more efficient for certain operations under certain circumstances.
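For instance (a toy illustration of mine), two "maps" over the same pairs with different operation costs: a hash table gives constant-time exact lookup, while a sorted list gives logarithmic lookup plus cheap ordered range scans.

    import bisect

    pairs = sorted([("b", 2), ("a", 1), ("c", 3)])   # sorted list of pairs
    keys = [k for k, _ in pairs]

    def lookup(key):                # O(log n), and range scans come for free
        i = bisect.bisect_left(keys, key)
        return pairs[i][1] if i < len(keys) and keys[i] == key else None

    assert lookup("b") == 2
    assert {k: v for k, v in pairs}["b"] == 2        # dict: O(1) exact lookup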
You need a little bit of database knowledge. Again, what you really need to know is that what databases are really all about is dealing with the architectural differences between RAM and non-volatile storage, and that a lot of these are going away now that these architectural differences are starting to disappear.
That's really about it.
Why? I know that stuff inside and out, and across multiple jobs I have used none of it ever. What chance to use this favourite bit of my education did I miss? (Or rather, might I have missed, so that you might speak to the general case)
> You need to know how to compile a high-level language down to an RTL.
Why? Same comment as above.
> You need to understand what a fixed point is and why it matters.
Well, I don't, and I don't. I request a pointer to suitable study material, noting that googling this mostly points me to a mathematical definition which I suspect is related to, but distinct from, the definition you had in mind.
Otherwise... as I read down this thread I was all ready to disagree with you, but it turns out I'd jumped to a false conclusion, based on the SICP context of this thread. Literacy, what a pain in the butt.
In particular, the idea that ((the notion of an environment) is a fundamental/foundational concept) is a new idea to me, and immediately compelling. I did not learn this in my academic training, learned it since, and have found it be very fruitful. Likewise with lexical vs dynamic binding, actually.
So you can write parsers. To give a real-world example, so you can know immediately why trying to parse HTML with a regular expression can't possibly work.
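A quick demo of that point (my own toy example): matching nested tags needs a stack, which a finite automaton doesn't have, so the regex cuts the outer element off at the first closing tag.

    import re

    html = "<div>outer <div>inner</div> tail</div>"
    # The non-greedy regex stops at the FIRST closing tag, splitting the outer div:
    print(re.search(r"<div>(.*?)</div>", html).group(1))   # "outer <div>inner"

    # A counter (a one-symbol stack) gets it right:
    depth, out = 0, []
    for token in re.split(r"(</?div>)", html):
        if token == "<div>":
            depth += 1
            if depth == 1: continue
        elif token == "</div>":
            depth -= 1
            if depth == 0: break
        if depth >= 1: out.append(token)
    print("".join(out))   # "outer <div>inner</div> tail"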
>> You need to know how to compile a high-level language down to an RTL.
So that when your compiler screws up you can look at its output, figure out what it did wrong, and fix it.
>> You need to understand what a fixed point is and why it matters.
> Well, I don't, and I don't. I request a pointer to suitable study material
Particularly lectures 2A and 7A.
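For a taste of the idea itself, here is the SICP flavor of it sketched in Python (my paraphrase, not the book's actual code): a fixed point of f is an x where f(x) = x, and you can often find one simply by iterating f.

    def fixed_point(f, guess=1.0, tol=1e-10):
        while abs(f(guess) - guess) > tol:
            guess = f(guess)              # keep applying f until it stabilizes
        return guess

    # sqrt(2) as the fixed point of the average-damped map y -> (y + 2/y) / 2:
    print(fixed_point(lambda y: (y + 2 / y) / 2))   # ~1.4142135623...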
I think this deep background is a great goal. But in another way programming is a craft. You can learn as you go. There are millions of bright people who could do useful, quality work without an MIT-level CS education. They just need some guidance, structure, and encouragement.
I wouldn't be so sure. I once applied to a company that was in the business of proving various safety properties of control and signalling applications (for trains). Sizeable applications. They run up against the halting problem and combinatorial explosions, but they get around those and do it anyway.
For those unfamiliar with it: in R, environments are first-class objects, and identifier lookup in them is dynamic. But the way those environments are chained for lookup purposes is defined by the lexical structure of the program (unless modified at runtime, which is possible but unusual); i.e. if a function's environment doesn't have a binding with a given name, the next environment that is inspected is the one from the enclosing function, not from the caller. So R functions have closure semantics, despite the dynamic nature of bindings.
It would appear that this arrangement satisfies the requirements for both dynamic scoping and for lexical scoping.
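Python happens to behave the same way on this point; a quick demo (my own): a function's free variables resolve in the environment where it was defined, not in the caller's.

    def make_f():
        x = "lexical"
        def f():
            return x        # found in the defining scope's chain
        return f

    def caller():
        x = "dynamic"       # would win under dynamic scoping; ignored here
        return make_f()()

    assert caller() == "lexical"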
On first thought, I do agree. However, they are fundamental applications. Category theory is categorizing the fundamentals. It uses a lot of the fundamentals on itself, I guess, but that doesn't mean to me that it's inaccessible or useless.
The web for instance is an unholy mess. We can do better for mass electronic publishing. We don't because that huge pile of crap is already there, and it's so damn useful.
At a minimum an education should give you a strong enough base that you can teach yourself those other things should you so desire.
That was exactly the curriculum of my CS degree, minus the web design, and I didn't even go to a first-rate CS program like MIT (PennStateU, 20 yrs ago; no idea what the curriculum is now).
Thanks, you succeeded in verbalizing succinctly the agony of so many software projects. However, I would claim it's not just ignorance of the fundamentals. It's simply more fun for a particular kind of mind to invent new abstractions and constructs than to solve actual problems. I call it cargo-cult progress.
Spolsky calls these 'architecture astronauts' (http://www.joelonsoftware.com/articles/fog0000000018.html). (As a mathematician, I have a certain leaning to this mindset, so I clarify that I am not criticising it, just reporting on existing terminology.)
I think one facet of this is that people have an ambition to build huge systems that encompass the entire universe, while in fact most software systems only need to utilize the already existing capabilities in the ecosystem.
It's like, since people are eager tinkerers, they approach software projects with the mindset of miniature-railroad engineers, while in fact the proper way to attack the task is to be as brutally simple as possible.
The reason huge theoretical systems are a problem in software engineering is that a) they tend to be isomorphic to things that already exist, b) while not implementing anything new, and c) they obfuscate the software system through the introduction of system-specific interfaces (e.g. they invent a new notation for algebra that is isomorphic in all aspects to the well-known one). And the worst sin is that this method destroys profitability and value.
How could they ever understand "simple and easy"? Their concept of simple is not based in reality.
There seems to be this idea (I wish it had a name) that one can just apply a new abstraction layer and the lower layers magically disappear or are rendered insignificant.
And of course producing 100's or 1000's of pages of "documentation". This is disrespectful toward the reader's time. The best documentation is and will always be the source code. If I cannot read that, then for my purposes there is no documentation.
This is not to say some of these old new thing higher-level of abstraction solutions do not work well or are not easy to use or do not look great. Many people love them, I'm sure. But until I see more common sense being applied it is just not worth my time to learn. I would rather focus on more basic skills that I know will always be useful.
I think what we actually disagree about is just the exact definition of "fundamentals". I personally would not say "the fundamentals are very simple and easy to learn". At least not if you're learning them to the level of depth that I would want to learn them.
Now if we're just saying that people only need a very high level overview of "the fundamentals", then that might swing the equation back the other way. But that, to me, is exactly the knob that the MIT guys were twiddling... moving away from SICP and using Python doesn't mean they aren't still teaching fundamentals, it just sounds like they aren't going quite as deep.
Anyway, it's an analog continuum, not a binary / discrete thing. We (the collective we) will probably always be arguing about exactly where on that scale to be.
That may well be, but as I am the one who first used the word in this conversation (https://news.ycombinator.com/item?id=11630205), my definition (https://news.ycombinator.com/item?id=11632368) is the one that applies here.
And please don't take anything I'm saying here as an argument against learning fundamentals... all I'm really saying is that I can understand, even appreciate, the decision MIT made. Whether or not I entirely agree with it is a slightly different issue. But I will say that I don't think they should be excoriated over the decision.
A minor point to make here: college isn't about learning practical skills; that's what trade schools and on-the-job training are for. College is about learning the fundamentals. However, computer science is not software engineering, and you don't need to learn computer science to be a good software engineer, because software engineers don't do much in the way of computer science anyway.
Yes, on paper that is true. And I alluded to that point in my answer above (or at least I meant to). But from a more pragmatic, real-world point of view, colleges are at least partially about teaching practical skills. I don't know the exact percentage, but I'm pretty sure a large percentage of college students go in expecting to learn something that will directly enable them to get a job. Of course one can argue whether that's a good idea or not, and I don't really have a position on that. But I don't think we can ignore the phenomenon.
I wish I had this book at the beginning of my career. http://www.amazon.com/Elements-Computing-Systems-Building-Pr.... Makes you design the hardware, then the software for that hardware.
It should not take more than 8-12 weeks alongside school work or a day job.
Right. No one has time to learn endless JS frameworks, injections, and reinventions of the wheel. So (1) read the fucking SICP book, (2) learn about the business problem you are trying to solve, then put 1 and 2 together and "get stuff done".
I don't recall anything about CPUs in SICP. It's more about data-driven programming and the writing of interpreters.
What I liked about SICP and Scheme programming was that it is a pretty good environment for tinkering: the REPL makes it easy to combine functions and to work in a bottom-up manner. (BTW, you had less of that in Common Lisp, and most other environments teach you to work in a top-down way; however, you can still work with the Python REPL.)
Maybe this bottom-up approach is what Sussman really has in mind when he talks about first principles, because SICP is really not about working up from the atoms to the Scheme interpreter/compiler.
And by your own admission you didn't learn all the stack elements I listed above. And that was not in any way, shape, form, or fashion an exhaustive list!
Sure, if you finish a four-year degree in CS/EE/EECS you learn a lot of stuff... and if you spend a big chunk of those four years on the really low-level stuff, you have to trade off time spent on higher levels of the stack. You can only pour so much water into a 1-gallon bucket.
And even then, you only get the fundamentals at a certain level of depth. At some point, one has to ask "how important is it that I be able to go out, buy an 8088 chip, a memory controller chip, a floppy drive controller chip, etc., solder a motherboard together, code an OS in assembly, write my own text editor, etc., etc., etc.?"
Don't get me wrong. I'm not against teaching the fundamentals, and I'm not even sure I agree with MIT's decision on this. But I will say I can understand it and cede that it has some merit.
And all of that said, I'll go back to what I said earlier: to me, the important thing is to continue learning even after college, including going back to fundamentals like building circuits from individual transistors and what-not. That stuff has value, but there's no reason to think you can't be productive even without it.
I mean, if you think about it, every field eventually segments into layers where certain practitioners treat some things as a black box. Does an engineer building a car also need to be a metallurgist or materials scientist? No, he/she just grabs a book, and looks up the parameters for the correct material for the application at hand. Etc.
S-expressions just weren't up to snuff because there's no standard way to do key-value mapping.
> S-expressions just weren't up to snuff because there's no standard way to do key-value mapping.
That's not true. There are two standard ways to do this: a-lists and p-lists.
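For the unfamiliar, here are the two shapes modeled in Python (my own sketch; `assoc` and `getf` are toy stand-ins for the Lisp operators of the same names):

    alist = [("name", "TXR"), ("year", 2009)]   # a-list: list of (key, value) pairs
    plist = ["name", "TXR", "year", 2009]       # p-list: flat alternating list

    def assoc(key, al):                         # cf. Lisp's ASSOC
        return next((v for k, v in al if k == key), None)

    def getf(pl, key):                          # cf. Lisp's GETF
        it = iter(pl)
        return dict(zip(it, it)).get(key)       # pair up alternating elements

    assert assoc("year", alist) == getf(plist, "year") == 2009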
What else would you recommend adding to the fundamentals list?
I find it funny that only last year, with MVC scaffolding generators, could I do forms to access a database just as easily as I was doing them in FoxPro 20 years ago.
Yes, now it is client-server, safe against common hacking attacks, responsive, etc. But it's been 20 years!
And the productivity valley between both points is abysmal.
I don't like it, but I think it's an unavoidable consequence of computing's evolution into ubiquity.
The fundamentals are fundamental. They don't change very fast.
Applications change all the time. So do frameworks. But there's very little genuinely new in most frameworks, or in most languages for that matter. They're mostly repackagings of the same few ideas.
Which is why it's a lot easier to pick up applications and do a good job with them if you know the fundamentals than if you're hacking away without any contextual or historical understanding.
Meanwhile someone is going to have to do the next generation of pure research, and it's a lot harder to do something creative and interesting in CS if all you've ever known is js, Python, and Ruby.
The reality is that software quality is decreasing. Never mind maintainability or even documentation - applications are becoming increasingly buggy and unreliable.
It's common in the UK now for bank systems to crash. Ten years ago it was incredibly rare, and twenty years ago it was practically unthinkable.
Software is too important to be left to improvisation and busking. So "just learn to make applications from other applications" is not a welcome move.
"I asked him whether he thought that the shift in the nature of a typical programmer’s world minimizes the relevancy of the themes and principles embodied in scheme. His response was an emphatic ‘no’; in the general case, those core ideas and principles that scheme and SICP have helped to spread for so many years are just as important as they ever were"
If anything, I would think Sussman is being more practical, and understands what the world needs/expects now.
Literally any programmer who hasn't read SICP before will benefit from it. I think the principles still apply.
Don't get me wrong: if you are going on to post-graduate studies, such a course will always be relevant. But it sounds like he is talking within the context of undergraduates, and in that context I too would be circumspect about how useful it would be for preparing you for your first job as a software engineer.
Their choice to go toward a Python-based course at the undergraduate level would also seem to reaffirm this view from afar...
What is ridiculous in the face of this "programming by experimentation" fantasy is that programming has evolved since the 1980s... to be even more about composable abstractions with provable semantics. Hindley-Milner-Damas types and monads are now everywhere.
Haven't run into those. Perhaps I know them by a different name?
Local type inference is now used in Visual Basic, Scala, Rust, probably a lot of other new languages I am missing. Gradual types are coming to Clojure and probably Python and Lua.
Erik Meijer did a lot of work on bringing monads and FRP as patterns to .NET programmers. Java 8 has monads (Optional and Stream interfaces). Bartosz Milewski has been getting a lot of attention in C++ circles (see his blog: https://bartoszmilewski.com/)
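For anyone who hasn't met these, here is a rough Python rendition (my sketch, not Meijer's work or Java's actual API) of that Optional-style chaining: bind/flatMap sequences computations that may fail, and an "empty" value short-circuits the rest of the chain.

    class Maybe:
        def __init__(self, value):
            self.value = value

        def bind(self, f):                      # flatMap: skip f entirely on empty
            return self if self.value is None else f(self.value)

    def parse_int(s):
        return Maybe(int(s)) if s.isdigit() else Maybe(None)

    def reciprocal(n):
        return Maybe(None) if n == 0 else Maybe(1 / n)

    print(Maybe("4").bind(parse_int).bind(reciprocal).value)   # 0.25
    print(Maybe("0").bind(parse_int).bind(reciprocal).value)   # None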
I am also grateful to have taken the condensed version of 6.001. You do need the ability to understand those abstractions in order to be an informed shopper though.
"MIT" doesn't necessarily mean "super-elite programmer". I work in an office that's half MIT grads, and the non-MIT half are pretty much equivalently good (though with much worse connections around Cambridge). That's not to say MIT sucks or anything, but more to say that with luck, a really solid CS or EECS degree gets the student up to being able to build important components from scratch at all, which isn't necessarily the level needed to build those components from scratch for public release or for profit. That latter goal requires a good education followed by professional training and experience.
I watched the SICP videos, and I remember Abelson specifically endorsing just that.
2) Most high schools don't offer CS. 99% of high schools offer some form of calculus.
As of 1979, the 18.01 I took matched what I understand is the AP Calculus BC sequence, and while having previous exposure to calculus certainly helped, it wasn't assumed.
I suppose the same could probably be said for any intro course or just college in general...
It's the same in economics, where you take micro and macro and then, junior or senior year, you do it all over again, but this time you take it seriously, with logical reasoning instead of just memorizing a couple of stock-phrase explanations.
SICP wants to be both. It's a great book so maybe it can do both. But it's hard for a class to do both.
1. The programming environment (an Emacs clone in Scheme) had an extraordinarily steep learning curve.
2. S-expressions were hugely difficult to visually parse and edit vs. languages with more familiar syntax.
3. There was very little exposure to practical projects in the class; it felt like constantly working on toy projects or impractical abstract constructions.
I got a lot more out of 6.170 (software engineering lab) and the other computer science classes.
I have a much greater appreciation for the class now after 15+ years and recently worked through SICP again. It's much easier with more programming experience, not to mention Emacs paredit mode.
I always thought 6.001 should have been a 2nd or 3rd year course. I would have gotten a lot more out of it.
It was also one of the most practical, useful classes they had. A CS student who didn't take it (probably most of them) would still pick up most of the material by osmosis, but I think that class was a great idea.
That's inconceivable to me now.
I'm actually writing my own shell now; I'd be interested to compare :)
The fork/exec/pipe/dup calls are definitely an unusual and powerful paradigm. I think too many programmers don't know them because we were brainwashed into making "portable" programs, and that's the least portable part of Unix.
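For the curious, here is what that plumbing looks like when driven by hand (a Unix-only sketch of mine, in Python for brevity): wiring up the equivalent of `echo hello | wc -c` with fork, pipe, dup2, and exec.

    import os

    r, w = os.pipe()
    if os.fork() == 0:            # child 1: the writer
        os.dup2(w, 1)             # point stdout at the pipe's write end
        os.close(r); os.close(w)
        os.execvp("echo", ["echo", "hello"])
    if os.fork() == 0:            # child 2: the reader
        os.dup2(r, 0)             # point stdin at the pipe's read end
        os.close(r); os.close(w)
        os.execvp("wc", ["wc", "-c"])
    os.close(r); os.close(w)      # parent drops its copies, then reaps
    os.wait(); os.wait()          # wc prints the byte count of "hello\n"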
The materials, as far as I know, are not available online. But we used Tanenbaum's book, which you can read for free here: http://stst.elia.pub.ro/news/SO/Modern%20Operating%20System%...
This looks pretty comprehensive for an undergrad project actually... they have a grammar and everything, with pipes and redirects, which are definitely the most Unix-y concepts. And they talk about interrupts, although no user-defined signal handlers or anything.
FWIW I ported the POSIX shell grammar to ANTLR, though I'm writing it in C++ (and prototyping the architecture in Python). ANTLR is actually incapable of expressing the lexer AFAICT...
More like a pedantic jerk.
It's good to know about the existence of quines. But people are (presumably) paying good money for (and more important: investing their perfectly valuable time in) their education, and there's only so much time in a systems programming class, and so much more fundamental stuff to cover (or cover more robustly).
The people who really need to figure out how to write quines will no doubt find time to do so, in the dead of night, no matter what you may try to do to stop them. (And try to make them 3 bytes shorter than the shortest known version in that particular language). The rest -- they just need to know that they're out there.
I took 6.001 as an undergrad at MIT. It changed my view of software forever.
Years later, I now spend most of my time training programmers at hi-tech companies in Python. The ideas that I got from 6.001 are pervasive in my training, and are fundamental (I think) for programmers to understand if they want to do their jobs well.
Given that I teach tons of Python, you might think that I'm happy that they switched to Python as the language of instruction. That's not the case; I think that Lisp offers more options for thinking about programming in different ways.
One of the great things about 6.001 was that it didn't try to teach you how to program for a job. It taught you how to think like a programmer, so that you could easily learn any language or technology they threw at you.
The fact is, as a self-taught programmer, I find programming intimidating. I can reason about code, and write it, and understand it if I squint at it long enough, but I still choke on any production code I read, and have trouble solving complex problems. SICP is a book all about this. It's about building better abstractions, and despite my having not yet finished it, it is probably the most helpful book on programming I've read. Many books can teach you how to program. SICP teaches you how to program /well/, and why.
Along with The Lambda Papers, it taught just how powerful Scheme could be. And I maintain that Lambda the Ultimate Declarative contains the only good explanation of what a continuation DOES.
It was the book that made me want to go to MIT. I don't know if I'll ever go there (I'm still in high school), but if the people there are advocating "programming by poking," it probably wouldn't be worth my time.
This book changed my life, and I haven't even finished it yet. It should be Required Reading™, and the thought of doing it in Java makes me sick. And not just because I despise Java. Java is probably the worst language for this sort of book. SICP is an exploration of programming paradigms and concepts. Java is married to one set of paradigms and concepts, and won't budge an inch against even a hydrogen bomb.
Besides, imagine a metacircular evaluator in Java. Yuck.
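For contrast, here is how small the skeleton of that style of evaluator can be in a malleable language; a Python sketch of my own (nothing like SICP's actual code, and nowhere near complete):

    import operator

    GLOBAL_ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul}

    def seval(expr, env=GLOBAL_ENV):
        if isinstance(expr, str):                  # variable reference
            return env[expr]
        if not isinstance(expr, list):             # self-evaluating literal
            return expr
        if expr[0] == "lambda":                    # ["lambda", [params], body]
            _, params, body = expr
            return lambda *args: seval(body, {**env, **dict(zip(params, args))})
        f, *args = [seval(e, env) for e in expr]   # application
        return f(*args)

    # ((lambda (x) (* x x)) 7)  =>  49
    print(seval([["lambda", ["x"], ["*", "x", "x"]], 7]))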
We tend to call everything "software engineering" so that everybody can feel proud of such a title ("I'm an engineer"), but engineering is certainly not about figuring out how to vertically center divs with CSS (and it's also not about proving algebra theorems either -- even if it can be essential when it comes to specific problems that require it).
I can't imagine Linux and PostgreSQL being built without "science", they use a lot of it, and I'm pretty sure the authors all have read SICP and those theoretical books.
Poking at things has proved to be efficient for building things quickly, but it's just not how one builds critical systems/software that are robust, efficient, and maintainable.
In mechanical engineering you design your artifact using off-the-shelf bearings, motors, pumps, etc.
In electrical engineering you design your artifact using off-the-shelf cables, contactors, relays, VSDs etc.
In electronic engineering you design your artifact using off-the-shelf ICs, resistors, capacitors, resonators etc.
In IC engineering you design your artifact using off-the-shelf silicon wafers, etching chemicals, core/logic designs etc.
It's turtles all the way down, and software is no different.
And many programmers aren't engineers, they're just interested tinkerers; people who play around in their free time enough to know how to make something work. Not unlike if you went to the store, bought some wires and batteries and tools, and then played with them until you got hired as an electrician.
Sharing culture is instinctive. People will do it. You might as well try to tell people they can't have sex without your permission unless they pay first. Oh wait, that's the porn industry. Everyone pays for porn, right?
Unless you restrict "authors" to the people that worked on the original Postgres95 (and maybe not even then), I'm certain that that's not generally the case (being a postgres committer and not having read SICP).
Yet, for that specific web-dev problem, I haven't seen any way of formally testing the rendering of web pages that would make it consistent on every browser. The testing process (almost) always leaves that up to the developers themselves, and refreshing pages is the norm.
Freshman double-E's take six-double-oh-one (a.k.a. six double no fun) to learn to program in LISP... This is also where they begin to leave the rest of the world behind them in their ability to solve problems and make things work.
He goes on to describe how the course instills the virtues and limits of abstraction.
And the high priests of computer science strove to build a corpus that was intimidating, complex and beautiful. They tried to be gods and birth a new form of life. That computer scientists applied Socratic and Aristotelian principles to their framework is hubris. Human individuals really cannot be gods to machines that are useful. We have constructed masses of spaghetti code into libraries that no one has the time to read or understand and it is our own fault. These are tools and should have been kept simple and open and easy. Perhaps AI will evolve to save its parents.
The article should have led with this because I believe it would have framed the rest of it more accurately.
And what's interesting is that if you look at actual scientific research around programming, say on concurrency and advanced tools like model checking, all of that is very theoretical stuff that assumes you know SICP or have equivalent foundations. So it's not really an argument that theory is dying; in the face of this new level of complexity in practice, perhaps we could benefit from theoretical research now more than ever.
The real problem, I think we can all agree, is the lack of widespread specialization of degree programs. Computer Science is still the blanket degree everyone gets, when in reality, some kind of trade program is likely sufficient to train many of today's "developers" (the "pokers").
There's room for both of course, but for people like me SICP was really a slog. Some of the exercises were hard sure, but more than that the material just wasn't that appealing to me. I don't have any comment on whether MIT's decision was the correct one, but liking SICP or working through SICP is by no means a prerequisite of being a good programmer.
Or are computers meant to execute our programs? (CISP)
Personally I lean towards the second view, for a simple reason: we design programs much more often than we design computers. Computer design doesn't take much of humanity's time, compared to programming them. So I'd rather have the hardware (and compiler suite) bend over backwards to execute our pretty programs efficiently, than having our programs bend over backwards to exploit our hardware efficiently.
Many of us are in both groups, which is how stuff like Rust came about.
The theoretical scientists play with abstractions they fully understand. Experimental scientists poke at things they don’t fully understand. They tend not to do well with each other.
When computers were in their infancy, they were viewed primarily as scientific tools to crunch numbers and play with abstraction. That's the theory-oriented view that I think is shared by the SICP authors.
Then computers became complex, fast, and capable of doing much more than crunching numbers and evaluating those S-expressions.
While you can view a modern web browser, or MS Word, or a videogame, as a program for a Turing-complete machine, that does not mean you should. There's too little value in that: sure, there are instructions and state, but now what? A more experiment-oriented mindset usually works better with those kinds of problems.
Ummm, I hardly see anyone, and by anyone I mean even people with more than 20 years of experience, doing empirical research while solving problems. Maybe the _like_ needs more emphasis? We don't really teach science or the design of experiments. This is arguably the most important intellectual tool we have ever developed, and it should be the core of all education.
The programming by poking that I see folks doing isn't science. It is stabbing in the dark until something appears to work. There is more to computational thinking than throwing some APIs together in a blender and making a cool demo.
As a longtime Python programmer, I wish more people would start with Scheme. Most Python is written like some dynamically typed Java; people are using maybe 30% of the power of the language.
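A trivial illustration of what I mean (my own toy example): the same task written Java-style, then leaning on comprehensions and first-class functions.

    words = ["poke", "sicp", "scheme"]

    # Java-flavored Python:
    result = []
    for word in words:
        if len(word) > 4:
            result.append(word.upper())

    # Idiomatic Python:
    result = [w.upper() for w in words if len(w) > 4]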
I really wish I could have experienced those SICP lectures in '86.
I imagine many of us do programming by poking. But the skill set referred to here is essential if you're going to build (create/construct) a complex system or architecture.
To say the skill set is irrelevant is to say we as an industry are doing things more or less right. But when I look around, I see that we're making a huge mess of things everywhere or are at least sorely inefficient when it comes to building and growing systems. That is to say, I believe (I know) that we could be doing much much better than we are now -- and I believe the skill set being dismissed here is a large key to doing better. So I'm sad to see it dismissed like this.
It's sad, but MIT made the right decision. MIT used to produce people who designed elegant, perfect little jewels of code. There just aren't many jobs for those people. Algorithm design just isn't that big a deal any more. Most of the useful ones have been written and are in libraries. Who reads Knuth any more?
I was greatly impressed then with 6.001 and still am today. I have worked through most of TECS  within the last year, and I want to go back through SICP next year (after I retire :-)
The Elements of Computing Systems http://nand2tetris.org
However, for now, the narrative shifts to "poking around, hoping to poke the right thing"?
I use Racket and its way of thinking for research. I think that structured thinking is still intellectually important. So it'd still survive, as there'd still be people sticking with it.
I'm sorry to say, but I feel very strongly about 6.01 being the least useful course I took at MIT.
That's an older and less polished online courseware system than edX, but it's perfectly workable. There are also VERY old videos of Sussman himself giving the .001 lectures. Those were easy to find, but I'd recommend the OCW site over the original lectures.
Old-school, understanding-based components, composable via interfaces, gave us things like R4RS, Plan9, git, LuaJIT, nginx, Erlang, you name it.
That packing-or-poking crappy mindset produced so-called enterprise software with all its OLE, JDBC, Java EE, CORBA, NodeJS, and all the other meaningless bloatware.
BTW, this is a human-universal law. Lack of understanding of fundamental principles grounded in reality will inevitably produce piles of abstract nonsense, be it philosophy (of Hegel), [Java] programming, theoretical physics, or even math (500-page Coq "proofs" of abstract nonsense).
There is the law of cancer (in 4chan terms): any branch of human endeavor will be ruined by enough cosplaying idiots.
BTW2: teaching principles is too difficult because, like good philosophy, which is the study of principles, it requires an extraordinarily trained and lucid mind, habitually trying to separate underlying principles from human-made "higher" or "pure" nonsense.
This habitual mental hygiene, which guards the principles from contamination by popular memes and waves of mass hysteria, is a very rare trait, impossible to sell or monetize. Teaching it requires a similar mindset in students. Packers would be unable to grasp what it is I am talking about.
The Motorcycle Maintenance book, and that part of Atlas about modern philosophers will be a much better illustration.
And I've also worked with plenty of very senior engineers who will blow smoke up your ass when you're really down in the weeds and you want to do "root cause" analysis.
I would recommend an entirely different set of books for someone who wants to work as a professional programmer.
That being said, TAOCP really gave me a clear understanding of algorithmic analysis. It's a hard book, but after you finish it, you won't look at programming the same way again. Especially when it comes to design decisions for systems where efficiency is a must.
How do you go about working with such a hefty tome? I only use it as a (light) reference. Any tips on how you went about it?
Re: SICP, I noticed Harvard and other schools typically have an intro CS course, then a follow-on course on abstraction using OCaml or another functional language. The syllabuses I've found for these second-semester intro CS courses look almost identical to the SICP table of contents, except with no hand-rolled compiler, which is probably the single most useful chapter I've ever read in any CS book.