
I like the idea of identifying ‘bit flips’ in papers, which are (if I am following along) statements which precipitate or acknowledge a paradigm shift.

Perhaps the most important bit-flip of this paper’s time (and perhaps first fully realized in it) might be summarized as ‘instructions are data.’

This got me thinking: today we are going through a bit-flip that might be seen as a follow-on to the above. After von Neumann, programs were seen to be data, but as different from problem/input data: the result/output depends on the latter, but only through channels explicitly set out by the programmer in the program.

This is still true with machine learning, but to conclude that an LLM is just another program would miss something significant, I think - it is training, not programming, that is responsible for its notable features and capabilities. A computer programmed with an untrained LLM is more closely analogous to an unprogrammed von Neumann computer than it is to one running any program from the 20th century (to pick a conservative tipping point).

One could argue that, with things like microcode and virtual machines, this has been going on for a long time, but again I personally feel that this view misses something important. Only time will tell, just as with the von Neumann paper.

This view puts a spin on the quote from Leslie Lamport in the prologue: maybe the future of a significant part of computing will be more like biology than logic?




This is essentially the paradigm that Karpathy deemed "Software 2.0" in a 2017 article:

https://karpathy.medium.com/software-2-0-a64152b37c35


And the next paradigm shift after that will probably be "programs' own outputs are their input"


That sounds like a feedback control loop. That's the basis of a programming language I'm writing ^_^
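
To make the "outputs become inputs" idea concrete, here is a minimal sketch of a feedback loop in Python; the names (step, state, target, gain) are illustrative and not from the language the commenter is writing:

  # A program whose output is fed straight back in as its next input,
  # in the style of a simple proportional feedback controller.
  def step(state: float, target: float, gain: float = 0.5) -> float:
      """One pass of the 'program': move part of the way toward the target."""
      error = target - state
      return state + gain * error

  state, target = 0.0, 10.0
  for i in range(10):
      state = step(state, target)   # previous output becomes the next input
      print(f"iteration {i}: state = {state:.3f}")

Each iteration consumes the previous iteration's result, so the program's behaviour over time is defined by the loop, not by any single run.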


> instructions are data.

is an insight I've often seen attributed to von Neumann, but isn't it just the basic idea of a universal Turing machine? - one basic machine whose data encode the instructions+data of an arbitrary machine. What was von Neumann's innovation here?


A plain Turing machine's instructions are kept separate from its data: the transition table is fixed in the machine's states, and the machine flips between those states depending on what it reads from the tape.


GP is referring to universal Turing machines (those that can emulate an arbitrary Turing machine) and not a specific Turing machine.
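
For anyone who wants the "instructions are data" point spelled out, here is a toy universal interpreter in Python. The instruction set (inc/dec/jnz/halt) is invented for this sketch; the point is only that the program being run is an ordinary data structure handed in at run time:

  def run(program, registers):
      """Execute a program given as a list of tuples; the program is just data."""
      pc = 0
      while pc < len(program):
          op, *args = program[pc]
          if op == "inc":
              registers[args[0]] += 1
          elif op == "dec":
              registers[args[0]] -= 1
          elif op == "jnz":                 # jump if register is non-zero
              if registers[args[0]] != 0:
                  pc = args[1]
                  continue
          elif op == "halt":
              break
          pc += 1
      return registers

  # Add r0 into r1: the 'machine' being emulated is just this list of tuples.
  prog = [("jnz", "r0", 2), ("halt",),
          ("dec", "r0"), ("inc", "r1"), ("jnz", "r0", 2), ("halt",)]
  print(run(prog, {"r0": 3, "r1": 4}))   # {'r0': 0, 'r1': 7}

A universal Turing machine plays the role of run() here, and the stored-program computer makes the same move in hardware: the description of the machine to emulate lives in the same memory as everything else.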



