"Normal is the greatest enemy with regard to creating the new. And the way of getting around this, is you have to understand normal, not as reality, but just a construct. And a way to do that, for example, is just travel to a lot of different countries -- and you'll find a thousand different ways of thinking the world is real, all of which is just stories inside of people's heads. That's what we are too. Normal is just a construct -- and to the extent that you can see normal as a construct inside yourself, you've freed yourself from the constraint of thinking this is the way the world is. Because it isn't. This is the way we are."
Choosing between musician and computer genius is bad enough, but having to stay likable on top of that is frankly too much :-)
The second option is well-bounded and safe. It requires technical skill, not creativity. It's a legible, comprehensible challenge.
The first option is unbounded and unsafe. It can't be done without creativity, originality, and technical skill.
I'm becoming convinced there should be a much stronger creative and artistic element in developer training. Invention and innovation are so much more valuable than wheel reinvention that turning out people who are only comfortable with the latter is selling everyone short.
We have largely crafted a culture where doing research before writing code is considered slow and ineffectual for whatever reason. Instead, we value "moving fast and breaking things" and whipping up the quick hack, which encourages people to propagate their computing biases and never step out of their comfort zone.
This is one of the reasons why I mostly scoff at attempts to make computer programming a compulsory schooling subject. Coding as "the new literacy" devalues computing and is the very embodiment of the programming pop culture that Alan Kay has warned about.
> This is one of the reasons why I mostly scoff at attempts to make computer programming a compulsory schooling subject. Coding as "the new literacy" devalues computing and is the very embodiment of the programming pop culture that Alan Kay has warned about.
Especially since the people behind this idea have zero idea of what programming is. Some want people to learn HTML, which is pretty much pointless.
I agree with your other points though.
This is a funny word if you think about it. It's the property where several forces on a point are in equilibrium. The definition generalises to very abstract situations, and especially to political ones. Several sides are all pulling (or pushing... the metaphor doesn't really matter) in their own direction, and where the final decision falls is where the powers are in equilibrium. It's the point that leaves each party dissatisfied in proportion to its powerlessness. Sometimes, that's what you want.
Often, balance evokes the image of the halfway point. The balance between killing every Jewish baby and no Jewish baby is not killing half of all newborns. It's also not killing exactly zero babies: if you want a balance between killing all Jewish babies and none, the neo-nazis get a say, and you end up having to kill one in every hundred million or so babies. Which is unacceptable. This example was an extremist straw man to illustrate the following point: you're not looking for balance, you're looking to maximise a utility function.
Now we come back to balance. At some rate of new frameworks being made, we balance out the amount of work spent on maintenance of the old ones with the influx of new ideas that make our work and life easier and from time to time we completely overhaul and forget an old technology. Which, in turn, maximises our utility of these frameworks. It does look like balance <=> utility, but not quite. This is a balance in the context of a specific utility function. You've defined the utility function as "I want X and Y with preference weights α and β", which looks very much like a linear programming problem, the solution to which is... weighted balance.
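To make the "weighted balance as linear programming" point concrete, here's a toy sketch of my own (the variable names and the budget constraint are made up for illustration, not from the comment): maximise α·x + β·y over a simple feasible region. An LP optimum always sits at a vertex of the region, so a brute-force vertex check is enough for a two-variable toy:

```python
# Toy LP sketch: maximise alpha*x + beta*y, where x = "new ideas pursued"
# and y = "maintenance effort kept", subject to a shared budget
# x + y <= 1 with x, y >= 0.  An LP optimum always lies at a vertex of
# the feasible region, so for this tiny region we just enumerate them.

def weighted_balance(alpha, beta):
    vertices = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]  # corners of the region
    return max(vertices, key=lambda v: alpha * v[0] + beta * v[1])

# With alpha > beta, the whole budget goes to new ideas.
print(weighted_balance(0.7, 0.3))  # -> (1.0, 0.0)
```

Note what happens: the "balance" the LP finds is a corner, not the halfway point, which is exactly the earlier point that balance is not the midpoint.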
Others would not agree to your utility function. They maybe don't want new ideas so much. Now you're balancing utility functions. But those people still agree that they want to "find the balance". It's just that their equilibrium is not your equilibrium.
So, please don't just throw out "there's balance" like that. Everyone can agree to balance and it's at a different point for different people. It's a word that causes... well, muddled thinking. If you can, avoid using it at all. Get to the root of your problem as much as you can, then say what ails you. "I like seeing new approaches tried, but it's hard to keep up with the influx of largely samey ones that are constantly cropping up. I wish they would slow down." -- or something along these lines, I'm not saying this is what you specifically want.
Sorry for the long rant. I've had to listen to people wanting to balance a lot lately.
https://en.wikipedia.org/wiki/Linear_programming: a type of math problem concerned with finding the point where the variables satisfy a set of constraints expressed as linear inequalities while a given function achieves its maximum.
 maybe there is a balance to abusing "balance"...
Harshly put, but there's some truth there.
Devs go for fads (just watch the HN homepage over time), and there are tons of snake-oil salesmen pushing their wares (e.g. Mongo) and millions of programmers without basic scientific and engineering rigor.
It would only be cynical if it wasn't an extremely accurate description (which you agree it is).
Cynicism implies that something is in a condition X, and the cynic describes it as a much worse condition Y.
"Cynicism is an attitude or state of mind characterized by a general distrust of others' motives."
Perhaps you are conflating the word cynic with critic.
And even in this case: it's only "distrust" if others' motives aren't bad in the first place.
If people in IT generally have bad motives (laziness, profit, unprofessionalism, etc.), it's not "cynicism" to say so.
It's merely calling a spade a spade.
I'm more referring to the "bitterly or sneeringly distrustful, contemptuous, or pessimistic" meaning of the lemma, though.
Where again, if contempt is warranted and the situation is dire, being "contemptuous" or "pessimistic" is not cynical; it's just a realistic description.
In physics you have many fundamentally distinct fields, each with its own models and view of the world: mechanics, thermodynamics, statistical physics, electronics, optics, quantum physics, and all sorts of fundamental theories like string theory. This is a very rich system, with many complex models. Scientists only hope for some unification there, but for the most part you have to deal with many diverse parts of a huge multi-scale puzzle, and the pieces don't always fit together nicely.
When you look at computation, at what it really is, well, a primitive Turing machine is all you can hope for, really. Some researchers push it a little further into infinite-time computability, but that's not a realistic model of computation anyway.
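Just to show how primitive that model really is, here's a minimal Turing-machine-style sketch of my own (a one-state machine that flips every bit on the tape and halts at the first blank; the tape encoding and rule are made up for illustration):

```python
# Minimal Turing machine sketch: one state, three rules.  The tape is a
# dict from position to symbol; every unwritten cell reads as blank '_'.

def run(tape_str):
    tape = dict(enumerate(tape_str))
    pos, state = 0, 'flip'
    while state != 'halt':
        sym = tape.get(pos, '_')
        if sym == '_':                  # hit the blank: halt
            state = 'halt'
        else:                           # flip the bit, move right
            tape[pos] = '1' if sym == '0' else '0'
            pos += 1
    return ''.join(tape[i] for i in range(len(tape_str)))

print(run('0110'))  # -> '1001'
```

That's the whole model: read a symbol, write a symbol, move, change state. Everything else in computing is layered on top of that.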
What you can do with computation are conditions, loops, and variable assignment. Or in lambda calculus, it's just substitution and name binding. Even worse, name binding is not really necessary if you express your program with combinators. So, fundamentally computation is substitution, a rewriting system.
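The "computation is substitution" claim can be made concrete with a tiny SKI-combinator reducer (my own sketch; the tuple encoding of application is an arbitrary choice for illustration). There are only three rewrite rules, and reduction is nothing but repeatedly substituting:

```python
# Tiny SKI-combinator rewriting system.  A term is an atom ('S', 'K',
# 'I', or any variable string) or an application tuple (f, x).

def step(t):
    """Apply one rewrite rule somewhere in t; return (new_term, changed)."""
    if isinstance(t, tuple):
        f, x = t
        if f == 'I':                                     # I x -> x
            return x, True
        if isinstance(f, tuple) and f[0] == 'K':         # (K a) b -> a
            return f[1], True
        if (isinstance(f, tuple) and isinstance(f[0], tuple)
                and f[0][0] == 'S'):                     # ((S a) b) c -> (a c)(b c)
            a, b = f[0][1], f[1]
            return ((a, x), (b, x)), True
        f2, ch1 = step(f)
        x2, ch2 = step(x)
        return (f2, x2), ch1 or ch2
    return t, False

def normalize(t):
    """Rewrite until no rule applies."""
    while True:
        t, changed = step(t)
        if not changed:
            return t

# S K K behaves like the identity: ((S K) K) y rewrites to y.
print(normalize(((('S', 'K'), 'K'), 'y')))  # -> 'y'
```

No conditions, no loops, no assignment in the model itself: just three substitution rules, which is the whole point.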
Computation is simple, almost trivial, but still, it's a great model, and you can create amazing things in it, even though its laws are trivially simple.
Programming at any level is fundamentally the same, it's about iteration, branching, composition of smaller pieces, and packaging smaller pieces into bigger ones. No matter at what level you write your code, it's always like this. The differences in API are not important.
EDIT: I think that concurrency is a bit different in this respect. It really requires a somewhat different perspective, but again it reduces to simple basic elements (like the agent model, or the pi calculus) which are reiterated and reimplemented many times.
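To illustrate that reduction, here's a minimal agent sketch of my own, loosely in the spirit of the actor/agent model mentioned above (the `Agent` class and poison-pill convention are made up for illustration): each agent is just a mailbox plus a behaviour that handles one message at a time, so all coordination reduces to sends.

```python
# Minimal agent: a mailbox (thread-safe FIFO queue) drained by one
# worker thread, so messages are handled strictly one at a time.
import queue
import threading

class Agent:
    def __init__(self, behaviour):
        self.mailbox = queue.Queue()
        self.behaviour = behaviour
        self.thread = threading.Thread(target=self._loop, daemon=True)
        self.thread.start()

    def send(self, msg):
        self.mailbox.put(msg)

    def _loop(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:            # poison pill: stop the agent
                break
            self.behaviour(msg)

results = []
adder = Agent(lambda n: results.append(n + 1))
for n in (1, 2, 3):
    adder.send(n)
adder.send(None)                       # ask the agent to stop
adder.thread.join()                    # wait until it has drained the mailbox
print(results)  # -> [2, 3, 4]
```

Because one thread drains one FIFO mailbox, there is no shared-state locking to reason about; that's the simplification the agent model buys you.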
* Smalltalk did everything better
* Software was better in the 60s and 70s
* Alan Kay really dislikes the web.
That's why it's a speech and not a slideshow.
(Besides, who said you HAVE to summarize the speech in the slides? You could use them as supplementary material and have people pay attention to your speech as opposed to PowerPoint BS.)
And it's a generic speech, not him presenting some paper. Have you seen his research papers?
>* Smalltalk did everything better
Well, compared to most things we have today, especially JS, it really did.
>* Software was better in the 60s and 70s
No, but thinking about software was better. We got far more breakthroughs in those days compared to today -- and not just because they were "low hanging fruit".
>* Alan Kay really dislikes the web.
Who doesn't? Compared to what it could be?
I don't find it particularly surprising that not listening to a talk makes it hard to divine what the speaker said. If I didn't already believe it, the tl;dr [for a video?] would be evidence upon which I might come to such a belief.
Anyway, he has a talk here which at least shows some more recent examples of what he considers a better way:
Is it really "Complex"? Or did we just make it "Complicated"?: https://www.youtube.com/watch?v=ubaX1Smg6pY
In it he talks about two programming languages, Nile and OMeta, which allow them to solve some problems in very few LOCs (among other things).
More interesting is perhaps the question of how you get the frog to sit still -- or maybe change the experiment so there's a grid over the top of the pan -- and try to see if the frog gets more and more "desperate" (flight from danger as opposed to "just a frog jumping around") as the water slowly heats?
TOO MUCH DOING, NOT ENOUGH THINKING.
"THINK" is a motto coined by Thomas J. Watson. If you don't know who that was... again, you fail.
If you want to create the next new thing, go talk to someone who has been working in IT for 40+ years. Remember, IT works in dog years. That means the person you are consulting has 320 years of experience.
maybe I should do a "Using 'Considered Harmful' Considered Harmful"
EDIT: seems somebody already beat me to it http://meyerweb.com/eric/comment/chech.html