
Ask HN: Which of our ideas about computation could survive a thousand years? - abiro
A passage in the preface to SICP [1] on the real meaning of computer science has captured my imagination ever since I first read it:

    The computer revolution is a revolution in the way we think
    and in the way we express what we think.
    The essence of this change is the emergence of what
    might best be called procedural epistemology -- the
    study of the structure of knowledge from an imperative
    point of view, as opposed to the more declarative point
    of view taken by classical mathematical subjects.
    Mathematics provides a framework for dealing precisely
    with notions of "what is." Computation provides a
    framework for dealing precisely with notions of "how to."
In this spirit, I've compiled a list of ideas about computation that I think could remain relevant in a thousand years. I've written a short justification for each item, and I'm wondering what HN would add to or remove from this list.

1. Entscheidungsproblem, halting problem, Gödel's incompleteness theorems: fundamental limitations of formal knowledge.

2. P != NP: a proof to the contrary is unlikely, and the distinction remains relevant for quantum computing.

3. Lambda calculus and combinatory logic: the richest models of computation available today.

4. First-class continuations: every stateful program must have continuations; first-class continuations let programs write themselves.

5. Fixed points: finding equilibria in complex systems.

6. Random variables: a universal abstraction for representing input to a procedure.

7. Interventional distributions: capture causality under uncertainty.

8. Distributed representations: versatile representations of patterns.

9. Church encoding: the purity of a system where everything is represented in terms of procedures is intriguing.

10. Principle of computational equivalence: we can represent everything as a computation. (Big if true!)

[1] Source: https://mitpress.mit.edu/sites/default/files/sicp/full-text/book/book-Z-H-7.html
======
ktpsns
I don't share this way of thinking about "relevance". Is Newton's Principia
(300+ years old) still relevant? Obviously yes: it forms what we today call
"classical mechanics". Every student learns it, even though the best theories
in physics are quantum mechanical and/or general relativistic. What about the
ancient Greeks? They pioneered math and philosophy, and many of their proofs
are more than 2000 years old.

That's how science works. All the items you listed are classical building
blocks of computer science and will remain so forever, even if people come
up with successor ideas.

~~~
abiro
Good to hear that we agree on the ideas in the list! These timeless ideas from
our current knowledge are exactly what I'm trying to identify. Let me know if
you can think of more.

------
abiro
I’ve included the halting problem, but omitted the Turing machine, because
while ingenious, the latter is simply a hack to make proofs work and not a
very good way to think about computation. Similarly, I’ve included distributed
representations, but omitted backpropagation since while backprop works very
well for a number of problems that are otherwise unsolvable today, it is not a
particularly enlightening way to write programs.
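To illustrate that point: the diagonalization behind the halting problem needs no Turing-machine formalism at all and can be sketched directly with procedures. A toy sketch in Python, where `halts` stands for a hypothetical decider (my naming, for illustration only):

```python
# Suppose, hypothetically, halts(f) could decide whether calling f() terminates.
def make_contrarian(halts):
    def contrarian():
        if halts(contrarian):
            while True:  # the decider said "terminates", so loop forever
                pass
        # the decider said "loops forever", so terminate immediately
    return contrarian

# Any candidate decider is wrong on its own contrarian. For example, a
# decider that claims every program loops forever is refuted because its
# contrarian terminates immediately:
always_no = lambda f: False
c = make_contrarian(always_no)
c()  # returns -- contradicting always_no's claim that it loops
```

The same self-reference works in any model with first-class procedures, which is part of why the Turing-machine packaging feels incidental.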

Also, the first 10 minutes of the introductory SICP lecture contain a
livelier presentation of the idea in the quote:
https://www.youtube.com/watch?v=-J_xL4IGhJA&list=PLE18841CABEA24090

------
rotterdamdev
I don't think humanity will be around in a thousand years. It's really a
bummer, but that's how it looks. Now I just code for entertainment, not to
build something that will last forever.

