easy there, you might want to get your facts straight before you drop vitriol like that. LC actually came first, via Alonzo Church in Annals of Mathematics. Turing didn't publish his definition for another year, and he went on to write his PhD thesis under Church in LC notation.
>LC actually came first, via Alonzo Church in Annals of Mathematics.
That's irrelevant, as early computers weren't programmed in LC, nor built on such an architecture. And of course algorithms and even programs (e.g. for Babbage's Analytical Engine) existed before LC.
No computers have ever been programmed via Turing Machines, or built on an architecture resembling TMs (the Von Neumann architecture can be described as a register machine). TMs are simply a mathematical formalisation of the notion of computation, equivalent in power to the lambda calculus.
However, most programming languages (including C, Java, etc.) look much more similar to lambda calculus than to a description of a Turing Machine, and for very good reason. Have you ever tried describing a TM that encodes even the simplest logic? It is a pain in the ass.
Indeed, most courses on the theory of computation that discuss Turing Machines don't ever expect students to describe one in full. More often than not they use a language reminiscent of the lambda calculus to describe Turing Machines.
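To make the contrast concrete, here's a minimal sketch of Church-encoded booleans in Python (Python chosen only because it has first-class lambdas; TRUE/FALSE/AND/OR/NOT are just the standard Church-encoding conventions, not anyone's API). Basic logic falls out of the lambda calculus in a handful of lines, whereas the equivalent Turing machine would be a table of states, symbols, and head moves:

```python
# Church-encoded booleans: data is represented purely as functions.
TRUE  = lambda t: lambda f: t           # select the first argument
FALSE = lambda t: lambda f: f           # select the second argument

AND = lambda p: lambda q: p(q)(FALSE)   # if p then q else FALSE
OR  = lambda p: lambda q: p(TRUE)(q)    # if p then TRUE else q
NOT = lambda p: p(FALSE)(TRUE)          # swap the two branches

to_bool = lambda p: p(True)(False)      # convert back to a native bool for display

print(to_bool(AND(TRUE)(FALSE)))  # False
print(to_bool(OR(FALSE)(TRUE)))   # True
print(to_bool(NOT(FALSE)))        # True
```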
Random-access, multi-tape Turing machines are actually very good descriptions of computer hardware. Turing machines are usually presented in a simpler form (unary or binary notation on a single tape) because it's easier to work with and no less powerful, but modern computers hew much closer to the state-transition model embodied by Turing machines than to the symbol-rewriting process embodied by the lambda calculus.
>However, most programming languages (including C, Java, etc.) look much more similar to lambda calculus than to a description of a Turing Machine, and for very good reason. Have you ever tried describing a TM that encodes even the simplest logic? It is a pain in the ass.
That's because we don't actually have a tape, but random-access memory. But a Turing machine is just a limited form of imperative programming, and much closer to Assembly, C, Fortran, BASIC, or even Java than to Lambda Calculus.
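As a rough sketch of what I mean (the 'scan'/'done' state names and the unary-increment machine below are invented purely for illustration), a single-tape TM is literally a transition table plus an imperative loop over a state, a head position, and a chunk of memory:

```python
# A single-tape Turing machine: a transition table plus a step loop.
# Example machine (made up): append a '1' to a unary number.
# delta maps (state, symbol) -> (next state, symbol to write, head move)
delta = {
    ('scan', '1'): ('scan', '1', +1),   # skip over the existing 1s
    ('scan', '_'): ('done', '1',  0),   # hit a blank: write one more 1, halt
}

def run(tape, state='scan', head=0):
    tape = list(tape)
    while state != 'done':
        symbol = tape[head] if head < len(tape) else '_'   # blank past the end
        state, write, move = delta[(state, symbol)]
        if head == len(tape):
            tape.append(write)
        else:
            tape[head] = write
        head += move
    return ''.join(tape)

print(run('111'))  # '1111' -- unary 3 becomes unary 4
```

The control flow is nothing but "look up (state, symbol), write, move, jump to the next state", which is why it reads like assembly with gotos rather than like a lambda term.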
>That resembles no description of programs that are written by humans to run on computer systems, unlike the lambda calculus.
It actually looks like a pretty run-of-the-mill description of working with memory locations, gotos, conditional jumps, and so on. Substitute random-access memory for having to run through the tape, and you're there.
The examples don't remind you of programs written by humans mostly because they're visual examples showing the whole state configuration at every step. If we similarly mapped out the memory state at each step of an ordinary imperative program's execution, it would look very much like those tables.
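For instance, here's a hypothetical little trace (the summing loop and the explicit program counter are invented for illustration) that prints an ordinary imperative program's whole "configuration" at every step, much like the tables in the TM examples:

```python
# Trace the configuration (program counter + memory cells) of a tiny
# imperative program step by step, TM-example style.
def traced_sum(n):
    pc, i, total = 0, 1, 0                  # program counter + two memory cells
    print(f"{'pc':>2} {'i':>2} {'total':>5}")
    while pc != 2:
        print(f"{pc:>2} {i:>2} {total:>5}")
        if pc == 0:                         # total += i
            total += i
            pc = 1
        elif pc == 1:                       # i += 1; loop back or halt
            i += 1
            pc = 0 if i <= n else 2
    print(f"{pc:>2} {i:>2} {total:>5}")     # final configuration
    return total

traced_sum(3)   # prints a table of configurations; returns 6
```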
I don't think that pointing out who came first is even relevant to the discussion; the real question might be "which model is better to program in?", and for many the answer would be lambda-calculus derivatives.
different problems require different tools. functional has its place and imperative has its place. that's why i was so snarky in replying (even if i did get the timeline wrong): prescribing one over the other is dogmatic and pretentious.
I find that the more competent people are as programmers, the more they treat their preferred paradigm as a religion. It's not even that they can't hack the others; it's that they've arrived at the One True Way. Less experienced programmers (and more so those who can kind of code but are faking their way with Stack Overflow) tend to be (or claim to be) language-agnostic.