
The Future of Programming (2013) - illo
http://worrydream.com/dbx/
======
fasquoika
Why does everyone seem to believe this talk is about the fact that these
problems were "solved" decades ago? My takeaway is that, no, these things
weren't solved, they were abandoned. We got tunnel vision and decided that a
handful of ideas were what programming _is_ , rather than just a tiny subset.
This talk isn't about how the early programmers were better at everything, if
anything it's the opposite, that we should question their decisions and not
take anything for granted.

"Do you know why these ideas, and so many other good ideas came about in this
particular time period, the 60's-early 70's? Why did it all happen then? It's
because it was late enough that technology kinda got to the point where you
could actually kinda do things with computers, but it was still early enough
that nobody knew what programming was. Nobody knew what programming was
supposed to be. And they knew they didn't know, so they tried everything."

~~~
david927
This talk is not saying that these problems were "solved" decades ago but
rather, decades ago we had much better instincts and in many ways we were on
the right track, and now we're not.

~~~
fasquoika
Except that the track we're on now was started then. Unix was created during
the time period he's talking about. We've just stopped looking at the other
stuff. They weren't "on the right track"; they didn't have a track. That's the
point: they were doing everything.

------
macintux
One of my all-time favorite talks. The IT industry is pretty bad in general at
knowing what problems have previously been solved.

~~~
fhood
This was one of my professor's favorite gripes. He loved to tell stories about
how companies would reinvent algorithms (usually multiple times) that had been
published in the 70's and 80's.

~~~
zvrba
The problem here is search. The problem may have been solved, but finding the
relevant paper is like looking for a needle in a haystack. It doesn't help
that terminology has changed over the years...

~~~
mrec
> It doesn't help that terminology has changed over the years...

As has hardware. Trivial example, but linked lists made a lot more sense in
the 1970s than they do today.

------
shalabhc
Bret Victor is now at HARC
([https://harc.ycr.org/member/](https://harc.ycr.org/member/)) which has an
interesting list of projects.

------
sbov
Some of the "better programming" stuff exists; programmers just don't use it
because they don't need it.

Parallel programming didn't matter as much as long as clock rates continued to
scale with transistor count. The disconnect is a relatively recent phenomenon
(the last 10 years). There are also lots of tools beyond raw threads+locks
these days.
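As one illustration of tooling a level above raw threads and locks (a minimal sketch using Python's standard `concurrent.futures` module; the `square` work function is just a placeholder):

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # The work unit. The pool handles scheduling and synchronization,
    # so no explicit locks are needed in user code.
    return n * n

# Map the work across a pool of worker threads; results come back in order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square, range(8)))

print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

The point is only that the coordination machinery (queueing, joining, result ordering) lives in the library rather than in the application code.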

------
manodocedoceu
Finding new semantics for constructing computational logic is certainly a good
middle-of-the-road innovation.

The philosophical underpinnings of my own research and practice over the past
decade are in constant existential friction with many of the most basic
assumptions that underlie information theory. I've argued about it on here
before. Since I'm terrified of general AI developing under the current power
dynamic, I haven't been too enthusiastic about contributing.

That said, a lot needs to be disbelieved-- we need an ideological grass fire
on the great plains, of sorts-- before certain types of phenomena can be
reliably conjured through our electrical counting machines.

We've mistaken the representation for the thing itself, especially in the
'information sciences.' The idea of 'information' and 'data' as 'packets'
presupposes the existence of a physical world composed of neatly delineated
objects. While this has been and continues to be a convenient and even useful
approximation of reality, it is still that-- a representation.

I've never seen this assumption questioned in CS/IS forums. At this point,
it's beyond dogma.

The unbelievable volume and density of the knowledge built in the information
sciences, all of it sewn into these base assumptions of information theory,
generates in the unwitting practitioner a belief held as hard-as-bedrock
fact, always presumed but never questioned. It's confirmation bias propping
up a simulacrum of reality.

Like physics did last century, our latest IS works (like general AI) are
hitting the hard limit on these assumptions.

What's next? Try deconstructing it yourself. That's part of the fun, and
helpful in the creative process.

In the meantime, it would be nice to see new ways of visualizing code and code
execution; that's probably a good start.

Code in Motion – An Interstellar Inspired Visualization for the JVM -
[https://vimeo.com/96317948](https://vimeo.com/96317948)

Code Galaxies Visualization -
[http://anvaka.github.io/pm/](http://anvaka.github.io/pm/)

Binary data visualization -
[https://news.ycombinator.com/item?id=15164166](https://news.ycombinator.com/item?id=15164166)

------
melling
We’re still trying to crawl out of the tar pit.

[https://github.com/papers-we-love/papers-we-love/blob/master...](https://github.com/papers-we-love/papers-we-love/blob/master/design/out-of-the-tar-pit.pdf)

------
miguelrochefort
Is anyone working on these problems, or did people just give up?

It's crazy to think that the way we create and consume software has barely
changed since the 60s/70s.

~~~
kareemamin
Yeah, it's definitely crazy! A few people are working on this from different
angles. There is research
([https://harc.ycr.org/project/realtalk/](https://harc.ycr.org/project/realtalk/)),
and there are side projects that explore Jupyter-notebook-like programming
environments: [https://www.maria.cloud/intro](https://www.maria.cloud/intro)
or [https://www.runkit.com](https://www.runkit.com). My team and I are working
on [https://www.clay.run](https://www.clay.run): making it easier to prototype
by allowing developers to write code that is instantly running without setup
or configuration, and to 'fork' other people's code to have your own running
copy.

~~~
shalabhc
Maria, Clay, and RunKit look interesting - I like the quick iterations and
incremental programming they allow.

But you're still writing 'text' to manipulate 'data'. An example of something
fundamentally different would be if you can _make_ programs without text. I
intentionally said _make_ not _write_ to avoid framing the discussion. For
instance why write `(circle 10)` when you can instead just draw a circle and
then perhaps define processes to manipulate it? I don't mean code generation -
why even have text as the canonical form that describes computation processes?

~~~
muxator
Maybe I am old school, but the fact that text is easily versionable and
comparable means a lot to me. I tend to code with two side-by-side windows,
comparing the old and the new version of my program.

It's a sort of fluid evolution where I can always be in control.

It would be hard to do that via visual programming.

~~~
shalabhc
> text is easily versionable and comparable means a lot to me.

Don't you really want to do more meaningful comparisons, though? I think text
is a poor form for versioning and comparison because you always have to
mentally extrapolate from the textual diff (lines added/deleted) to the
syntactic diff (functions added/deleted/modified) to the diff in effect (which
logic in which systems is modified). There is no reason other forms of
programs cannot also support diffs, ones that could be more useful.

Visual doesn't have to mean flowchart boxes. Even a tree structure might be
incrementally better than plain text.
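The gap between a textual diff and a syntactic diff can be sketched in a few lines (a minimal Python illustration using the standard `difflib` and `ast` modules; the `area` function is a made-up example):

```python
import ast
import difflib

# Two versions of the same function: only the layout changed.
old = "def area(r):\n    return 3.14 * r * r\n"
new = "def area(r):\n    return (3.14 * r\n            * r)\n"

# A line-oriented diff reports changed lines, even though nothing
# meaningful was modified.
line_diff = list(difflib.unified_diff(old.splitlines(),
                                      new.splitlines(), lineterm=""))
print(bool(line_diff))  # True: the textual diff is non-empty

# A syntax-aware comparison parses both versions and compares the ASTs,
# which ignore whitespace and redundant parentheses.
same_syntax = ast.dump(ast.parse(old)) == ast.dump(ast.parse(new))
print(same_syntax)  # True: syntactically, nothing changed
```

A diff computed at the tree level rather than the line level is one small step from "lines added/deleted" toward "functions added/deleted/modified".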

~~~
dragonwriter
Syntax-aware diff for text isn't hard, but it's just not usually worth the
effort because line-oriented text diff is good enough and more generally
applicable.

------
clusmore
Slightly off-topic but does anybody know of other great talks with a quirky
theme like this, where the talk is "set" in the 70s talking about the "future"
(present)?

I'll kick it off by submitting [Growing a Language]
([https://youtu.be/_ahvzDzKdB0](https://youtu.be/_ahvzDzKdB0)) by Guy Steele,
where he speaks using only monosyllabic words and words he defines in terms of
other monosyllabic (or previously defined) words.

~~~
spiralganglion
There's the Gary Bernhardt classic "The Birth & Death of Javascript"

[https://www.destroyallsoftware.com/talks/the-birth-and-death-of-javascript](https://www.destroyallsoftware.com/talks/the-birth-and-death-of-javascript)

------
aws_ls
Call me naive, but I listened to the entire talk thinking it happened in 1973.
Then I came back, and after seeing the discussion realized it's an act.

So my wonderment and awe of it definitely came down by several notches, as I
could see how every minute or two I was exclaiming in wonder at how relevant
it was to today's time. But of course, when you recreate the past in the
future, you can be very selective about ideas/topics which are relevant in
some way.

------
CyberDildonics
There is a huge disconnect between these flashy ideas and what you can
actually use to create software. Showing cool ideas is great, but there need
to be interfaces that scale to complex software and a realistic way to
integrate them with languages that already exist. Brand-new languages with no
ecosystem and no mature compilers backing them are not going to allow general
software to be written in them.

------
krmboya
I've been looking into the linked resources more and more of late, and they
reveal a perspective on computing research that I find really exciting.

I wonder how one can start studying and taking part in this direction of
computing research.

------
pknerd
Using a slide projector in 2013. Daring.

~~~
hacym
I am pretty sure that was a conscious decision based on the subject matter.

~~~
evv
No kidding! My favorite moment is at the beginning when he introduces the
talk, then pushes the slide up to reveal the date :-D

------
symstym
I think this attitude of "we solved these problems decades ago!" is rather
naive and sometimes arrogant.

I think it's a fantastic talk, and that Bret Victor and Alan Kay are geniuses.
But I feel that they both promote the idea that we definitively solved all
these important computing problems years ago, and that people are just too
clueless/resistant to catch on. Yes, I agree that many good ideas have been
culturally "forgotten". But for the most part, the reason these great past
ideas are not in use is because nobody has _made them into a compelling
product_.

Their attitude is comparable to someone saying "oh I invented the WWW in 1985
but nobody would listen to me", or "oh I invented Twitter before Twitter but
users weren't enlightened enough to appreciate it". Almost all good ideas were
already had before, but they are worth comparatively little, and unlikely to
catch on, until they are reified into something that people want to use.

I agree with them that probably more people should be working in certain areas
(e.g. new ways of programming). But if they really had it all figured out,
then why haven't they themselves made the amazing new programming language
that we all use? What if it's the case that some of their ideas are good in
theory, but are hard to translate into a usable product? Most people accept
that execution>>idea in the world of startups, but don't acknowledge that the
same may apply here.

~~~
shalabhc
> But for the most part, the reason these great past ideas are not in use is
> because nobody has made them into a compelling product.

I think you're overestimating the market's ability to select good ideas and
specifically, promote long term scientific advances. A lot of variables affect
what succeeds, e.g. marketing, coincidence, network effects etc.

You cannot leave everything to the market (i.e. what people adopt) and expect
great science to come from it. Many times, great science (and maths) comes
from the compelling drive of people to discover and create something new.
These talks are encouraging that kind of research, and I don't see them saying
they 'have it figured out', but rather pointing out ideas they think should be
explored more extensively.

~~~
symstym
I certainly don't think that the market selects good ideas or is sufficient to
promote long-term advances. Per the bit of my comment that you quoted, I'm
saying that the reason that the ideas are not _in use_ (more) is that they
haven't been incorporated into more compelling products. He feels that good
ideas are not in wide use because we didn't "get" them or forgot them. I'm
saying that sure, that may be part of the reason, but I think most of the
reason is that certain ideas that seem good on paper are really hard to put
into practice.

Quoting the talk:

> But I do think that it would be kind of a shame if in forty years we’re
> still coding in procedures and text files in a sequential programming model.
> I think that would suggest we didn’t learn anything from this really fertile
> period in computer science. So that would kind of be a tragedy.

Lots of people seem aware of the idea of coding without text files. There are
some "visual" programming environments with traction (in the game dev world,
Max/MSP). I'm even working on one myself! But there are significant
downsides/challenges associated with this approach (more difficult to version
control, often tied to one editor, etc.). So to his quote, the fact that we're
still coding in text files may not be because we didn't learn anything, but
because the idea of non-textual programming is hard to form into a product
that more people want to use.

I agree with you that his talk is very valuable in terms of drawing attention
to ideas that deserve more exploration, and I love the talk. It's just this
one facet that I take issue with, the suggestion that the ideas haven't caught
on because nobody appreciates them. His talks are frequently at the top of HN,
they are widely appreciated. People have been super excited about related
projects, like Light Table and Eve, and yet they haven't gotten much traction.
So I think it's worth acknowledging that the problem is less idea-awareness
and more compelling-implementation-difficulty.

------
bluetwo
Was it just me or did the talk not actually cover "The _Future_ of
Programming"?

~~~
miguelrochefort
It covered the future of programming from the point of view of the past.

~~~
platz
[http://tvtropes.org/pmwiki/pmwiki.php/Main/DaysOfFuturePast](http://tvtropes.org/pmwiki/pmwiki.php/Main/DaysOfFuturePast)

[https://en.wikipedia.org/wiki/Retrofuturism](https://en.wikipedia.org/wiki/Retrofuturism)

