
Is it really “Complex”? Or did we make it “Complicated”? (2014) [video] - saturnian
https://www.youtube.com/watch?v=ubaX1Smg6pY
======
TheAceOfHearts
I haven't watched the video yet, so please forgive the possibly premature
comment... But this is something that I've found myself thinking about a lot
lately. Are the things that we're currently building or maintaining truly that
complicated, or are we over-engineering them? I've been humbled on more than
one occasion where I initially thought an enterprise-y solution was over-
engineered, until all the details of the problem were explained to me.

What I wish we had was a "man" equivalent to provide everyday examples of how
to use a tool "correctly" (although I'm aware there's stuff like "bro"
pages), as well as another tool to explain why some tool / option even exists
and how it's "expected" (by the creator / maintainers) to be used. As I've
gotten into the habit of reading man pages, I've become increasingly aware of
how many options certain tools provide, but in many cases I really cannot
fathom why those options are available or in what kind of situation they might
be used.
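
Something in the spirit of what I mean, as a rough sketch (Python; "examples"
is just a made-up name, and it assumes a "man" on the PATH that emits plain
text when piped):

    # Sketch: pull just the EXAMPLES section out of a man page.
    import re
    import subprocess
    import sys

    def examples(tool: str) -> str:
        page = subprocess.run(["man", tool], capture_output=True,
                              text=True, check=True).stdout
        page = re.sub(r".\x08", "", page)  # strip backspace overstrike
        out, capture = [], False
        for line in page.splitlines():
            if re.match(r"^[A-Z][A-Z ]+$", line):  # a section heading
                capture = line.strip() == "EXAMPLES"
                continue
            if capture:
                out.append(line)
        return "\n".join(out).strip() or f"(no EXAMPLES section in man {tool})"

    if __name__ == "__main__":
        print(examples(sys.argv[1] if len(sys.argv) > 1 else "tar"))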

~~~
braveo
Most things can be done in a less complicated manner, but it costs more.

Consider SAP. It's complicated, to say the least, but a lot of that complexity
comes from its generality, its flexibility, and the quality of the solutions
it provides underneath.

Any solution you write in SAP could be built in a much simpler manner using
simpler tools, but getting your solution to the quality of what you can get in
SAP would be hugely expensive and take a lot of time.

In that way we subsidize each other, but in doing so, we often make things
much more complicated than they strictly need to be to solve any particular
problem.

Now this is a different class from things that are just shitty design. Those
exist in abundance, and it's unfortunate, but that's life.

------
goatlover
I wonder what a Smalltalk-like environment would look like if it had been
developed in the 2000s instead of the 1970s.

If you could marry up the advantages of text/files, live code, visual layouts,
data visualization, and perhaps machine learning in the future, maybe you could
come up with a huge jump in productivity and in the ability to handle
complexity.

~~~
rjeli
Mathematica gets you live code, data visualization, and ML, all with a
lispy+tacit+functional syntax. I find it much more integrated and easier to
use than Jupyter notebooks, although it's near useless for anything
imperative -- I find myself using Jupyter a lot these days to develop one-off
scripts interactively.

(please, no one mention stephen)

~~~
505
(I am pleased you brought up Stephen W, and also that you asked us not to.
That's all you'll get from me.)

~~~
hardlianotion
I would have called him the W-ster and respected the parent's wishes.

------
perfmode
Is it theoretically impossible to fit an interpreter for a dynamic programming
language in the L1 cache of a modern chip?

(I understand there are physical constraints that prohibit super-low-latency
memory lookups (of unconstrained size) in 0+epsilon time, where epsilon is
small.)

~~~
jacquesm
> Is it theoretically impossible to fit an interpreter for a dynamic
> programming language in the L1 cache of a modern chip?

I'm pretty sure Chuck Moore (yes, he's still around) would be able to fit the
interpreter _and_ an entire OS into the L1 cache with room to spare. Forth
technically is an interpreter.

~~~
mikekchar
You can get a FORTH kernel in 2K words. It's also incredibly efficient for
your own code, since you basically have a dictionary and memory addresses.
Thinking about it, in the old days dictionaries stored 8-character
identifiers, which fit handily in a 64-bit word. That means the dictionary
only needs 2 words per entry: one for the name, one for the code address.
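
(Back-of-the-envelope, in Python rather than FORTH -- the names and layout
here are mine, not any particular implementation's: an 8-character name is
exactly 8 bytes, i.e. one 64-bit word, and the code address takes the second.)

    import struct

    def dict_entry(name: str, code_addr: int) -> bytes:
        # Word 1: the name, truncated/padded to 8 ASCII bytes.
        packed_name = name.encode("ascii")[:8].ljust(8, b"\0")
        # Word 2: the code address as a 64-bit little-endian integer.
        return packed_name + struct.pack("<Q", code_addr)

    entry = dict_entry("DUP", 0x1000)
    assert len(entry) == 16  # exactly 2 x 64-bit words per entry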

As you imply, the interpreter will be dwarfed by the code needed to talk to
the rest of the OS. As an aside, this is why I was initially _very_ excited
about the JVM when Java first came around. Compiling down to a FORTH-style
language should give you pretty impressive benefits.

Virtual machines were very popular for a long time, but I'm not entirely
convinced that we've really pushed the concept as far as it can go.

------
mirimir
Maybe better: https://vimeo.com/82301919

A link to a transcript would be cool.

Edit: There's a transcript of the iPad question here:
https://news.ycombinator.com/item?id=8857113

------
lsd5you
This is a distinction I first learned about while working in France years ago.
Without any real basis, I wondered whether it is their generally more precise
use of language that makes the distinction more obvious for a French person.
At the time the two words were more or less synonyms for me, but since then
they have become very distinct, especially when talking about software!

~~~
kmicklas
> their generally more precise use of language

This isn't really true; it's just a snobby idea the French have somehow
successfully convinced us of. (It goes along with the idea that they have the
most "refined" culture or something.)

~~~
gutnor
A lot of the specialized terms in English come from common French vocabulary
and are still very (very) close to the everyday words in the French spoken
today. The common vocabulary in English is of Germanic origin. Actually, I
think you can say basically anything using only words of Germanic origin.

When learning French and its vocabulary, English speakers will find a lot
of similarities, but on the more formal side of their own vocabulary. That can
lead English speakers to think French is more precise; I don't think the
French have anything to do with this.

That's, BTW, a common mistake English speakers make when evaluating a French
speaker's proficiency. The fact that I use rarely-used words does not mean
that I have a large vocabulary; it's just the opposite.

~~~
eli_gottlieb
>When learning French and its vocabulary, English speakers will find a lot
of similarities, but on the more formal side of their own vocabulary. That can
lead English speakers to think French is more precise; I don't think the
French have anything to do with this.

Worse, the French apparently teach young students to write in a way that they
consider profound, and the Anglosphere considers imprecise drivel.

~~~
arkades
Do elaborate?

~~~
eli_gottlieb
I can't really go into much detail, but my wife took intensive/immersion
French in her school days. As she became fluent in basic spoken and
casual-written French, they taught her the French style of literary writing.
She's the one who told me it's meant to be profound or deep, but it comes
across to her Anglo brain as vague and, well, bad at saying anything at all.

------
faragon
The fool complicates the simple, while the wise simplify the complex.

~~~
goatlover
And evolution laughs at us.

------
Buge
He mentions a Microsoft Office bug that's been around since the 80s. Is there
any more information about this?

------
matt4077
Pah, "complex" is just latin for "put together". Take it apart, divide and
rule.

~~~
defined
More like, divide and be strangled by the huge web of interrelationships... :)

------
dkarapetyan
This is a fun one. But then again most Alan Kay talks are fun.

------
eternalban
Alan Kay has had a few decades to empirically demonstrate that "we" have
willfully made it complicated. I don't believe he has done so.

~~~
chadcmulligan
I had the same thought: if he had a solution, he would have demonstrated it by
now.

~~~
eternalban
His critical error is evident in his comparative analysis, which places Physics
and Programming on the same level. The systems that underlie the natural
sciences are _givens_. The whole kettle of soup that is software complexity
boils down to the fact that software engineering must first create the 'terra
firma' of computing. That is the root cause of the complexity in software: it
lacks a physics.

~~~
chriswarbo
I think it's the other way around:

In physics, we don't know what the fundamental rules are; we can only see
complicated outcomes and have to infer (guess) what the rules might be.

In computing, we know what the fundamental rules are (universal computation;
whether that's Turing machines, lambda calculus, SK logic, etc., they're all
equivalent in power), but we have to deduce what the complicated outcomes are.
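
To make that concrete, here's a toy sketch of my own (not from the talk): the
complete rules of SK logic fit in a few lines of Python, yet you still have to
run a term to find out what it does.

    # Terms: 'S', 'K', or a tuple (f, x) meaning "apply f to x".
    def reduce_step(t):
        if isinstance(t, tuple):
            f, x = t
            if isinstance(f, tuple) and f[0] == 'K':
                return f[1], True                      # K a b -> a
            if isinstance(f, tuple) and isinstance(f[0], tuple) \
                    and f[0][0] == 'S':
                a, b, c = f[0][1], f[1], x
                return ((a, c), (b, c)), True          # S a b c -> a c (b c)
            f2, changed = reduce_step(f)
            if changed:
                return (f2, x), True
            x2, changed = reduce_step(x)
            return ((f, x2) if changed else t), changed
        return t, False

    def normalize(t, limit=1000):
        for _ in range(limit):
            t, changed = reduce_step(t)
            if not changed:
                break
        return t  # reduction may not terminate in general

    I = (('S', 'K'), 'K')         # the identity combinator, I = S K K
    print(normalize((I, 'K')))    # -> 'K'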

~~~
aethertron
>In computing, we know what the fundamental rules are

In a limited way: we're making systems that involve people, and the important,
relevant aspects of human nature must go far deeper than our present
understanding.

------
ejz
This is a good line!

