Navy Electricity and Electronics Training Series: Module 13—Introduction to Number Systems and Logic
http://www.tscm.com/NEETS-v13-Logic.pdf
Boolean logic, or Boolean algebra as it is called today, was developed by an English mathematician, George Boole, in the 19th century. He based his concepts on the assumption that most quantities have two possible conditions: TRUE and FALSE... Boolean algebra is used primarily by design engineers.
I became fascinated with it. Most of my buddies went into the nuclear industry when they got out, but I got out as soon as I could and went into computers.
Fun fact: the youngest of Boole's five daughters, Ethel [2], married Wilfrid Voynich, who acquired the Voynich Manuscript in Italy in 1912 [1].
[1] The Voynich Manuscript is one of the world's most mysterious books, written in code. The manuscript gets its name from Wilfrid Michael Voynich (1865-1930), George Boole's son-in-law. Several attempts by world-class codebreakers (including Alan Turing) have failed to definitively unravel its meaning.
https://georgeboole.com/news/the-mysterious-voynich-manuscri...
[2] Ethel Lilian (1864–1960), who married the Polish scientist and revolutionary Wilfrid Michael Voynich and was the author of the novel The Gadfly. https://en.wikipedia.org/wiki/Ethel_Voynich
One thing I found interesting was that Boole’s original algebra was based on AND (×) and XOR (+, exclusive disjunction). In the finite field F2 (i.e. integer arithmetic mod 2), these are multiplication and addition, respectively.
It was later writers who rewrote his results using AND (∧) and OR (∨, inclusive disjunction) and named this (slightly different) algebra "Boolean algebra". While that version has some cute symmetries, the version with XOR is in many ways easier to work with, since it is more compatible with people's standard expectations for arithmetic (using × and +) on rational numbers. NOT x is also very easy to express: it is just 1 − x (or alternatively 1 + x, since x = −x in F2).
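A quick sketch of that reading in Python (the function names are mine, purely for illustration): multiplication mod 2 behaves as AND, addition mod 2 as XOR, and 1 + x as NOT:

    # F2 arithmetic on {0, 1}: reduce every result mod 2.
    def AND(x, y):
        return (x * y) % 2   # multiplication mod 2

    def XOR(x, y):
        return (x + y) % 2   # addition mod 2

    def NOT(x):
        return (1 + x) % 2   # same as 1 - x, since x = -x in F2

    for x in (0, 1):
        for y in (0, 1):
            assert AND(x, y) == (x and y)
            assert XOR(x, y) == (x != y)
        assert NOT(x) == 1 - x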
Boole didn't work with XOR, as such, and certainly not with F2.
Boole worked with polynomials over the integers. Not all such polynomials directly represented propositions for him. But those which sent 0 or 1 valued inputs to 0 or 1 valued outputs did represent propositional formulae (with 0 as false and 1 as true).
Thus, he used a notion of + for which, in the familiar way, 1 + 1 = 2, distinct from 0.
And then he was able to interpret the inclusive disjunction of A and B as 1 - (1 - A)(1 - B) = A + B - AB. But most often, he spoke of this in cases where A and B were already known to be disjoint from each other, so that AB = 0 and this simplifies to A + B. That presumed disjointness/exclusiveness of A and B might be confused with this being XOR. But he did not use the principle A + A = 0. He worked firmly within familiar non-modular algebra.
(He spoke of all this in a fashion a bit less familiar to modern eyes than the way I'm describing it, or anyone else in this discussion is. He never talks about "the OR operation" or any such thing, for example. He just saw himself as applying ordinary numeric algebra to the laws of thought/logic, not making up some new algebraic structure. But the above is indeed in essence what he did.)
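To make that concrete, here's a small check (Python, using ordinary integer arithmetic as described above; the helper name is mine) that 1 − (1 − A)(1 − B) = A + B − AB agrees with inclusive OR on 0/1 inputs, and collapses to plain A + B exactly when AB = 0:

    # Boole-style: ordinary (non-modular) integer arithmetic on 0/1 inputs.
    def boole_or(a, b):
        return 1 - (1 - a) * (1 - b)   # expands to a + b - a*b

    for a in (0, 1):
        for b in (0, 1):
            assert boole_or(a, b) == a + b - a * b
            assert boole_or(a, b) == (a or b)      # inclusive disjunction
            if a * b == 0:                         # A and B disjoint...
                assert boole_or(a, b) == a + b     # ...so + alone suffices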
Okay, you are right, fair enough. But his equations are consistent with working over F2 (where we can then throw in additional simplifications) and treating multiplication as AND and addition as XOR, in a way that using an algebra of ∧ and ∨ is not.
When Boole writes "x xor y" as x(1 − y) + (1 − x)y = 1 ⟺ x − 2xy + y = 1 (p. 53, eq. 31), we can simplify using 2 = 0 and write x + y = 1, treating + here as exclusive or.
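Spelled out (a quick Python check; none of this is Boole's own notation): over the integers, x(1 − y) + (1 − x)y equals x + y − 2xy, and reducing mod 2 kills the 2xy term, leaving x + y:

    for x in (0, 1):
        for y in (0, 1):
            boole_xor = x * (1 - y) + (1 - x) * y   # integer arithmetic
            assert boole_xor == x + y - 2 * x * y   # expanded form
            assert boole_xor % 2 == (x + y) % 2     # mod 2, the 2xy term vanishes
            assert boole_xor == (x != y)            # and it agrees with XOR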
Thus, by restricting himself to the idempotents of his algebra, Boole would have had Boolean algebra, and we rightfully honor him by attaching his name to it. Only Boole didn't know it. He steadfastly refused to acknowledge any operation but his +. (When Jevons claimed that x + x = x, and Boole emphatically denied this, they were really talking about different operations.)
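The disagreement is easy to exhibit (a one-line sketch in Python; the labels are mine): the three readings of + give three different answers for 1 + 1:

    x = 1
    print(x + x)        # Boole: ordinary addition         -> 2
    print(x | x)        # Jevons: inclusive OR, idempotent -> 1
    print((x + x) % 2)  # F2: addition mod 2 (XOR)         -> 0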
The NEETS training is beyond excellent! It's a true full-stack course that starts from the basics of matter and builds up from there. I'm still working through it, but it's an absolute tome of knowledge. It's also freely available: https://www.hnsa.org/manuals-documents/2575-2/.
Awesome. We learned AND gates and the like by studying how elevator controls work. Everyone is familiar with elevators, so no one had any problem learning it. Course designers could learn a lot by looking at how the Navy teaches.
Quite amazing that this great book is just sitting there on Burris' website. I came across it when researching Burris after being very impressed by his book "Logic for Mathematics and Computer Science", which is a great introductory text to logic, even covering syllogisms, equational logic and algorithmic aspects like term rewriting and Knuth-Bendix. It also has lots of interesting historical remarks. Unfortunately it is out of print, but I got it used for £17 including shipping (also, libgen ...).
In general we're better off reading modern treatments because they have more context and more convenient notation.
That said, it's important to read the old texts precisely because they elucidate the state of the art as it was, and so help us understand not only how far we've come but how we got to where we are now.
One of my professors wrote the canonical biography of Boole. I read it, and learned as much about Boole as I did about Des MacHale.
In a roundabout way, what I'm trying to say is: learn the mathematics and the mathematician; you won't be disappointed.
I personally really like Alfred Tarski's book "Introduction to Logic and to the Methodology of Deductive Sciences". The first Polish edition was written in 1936 and the English version is still available from Dover: https://store.doverpublications.com/048628462x.html
Perhaps this is the place to mention my pet peeve: that programmers think "bool" should be a type synonymous with logic values.
I would find it much more useful if language designers felt compelled to include set-theory types and operators.
And the C language shows quite brilliantly that zero/nonzero is a perfectly good space in which logic can operate; I've never understood the fetish of wishing to make sure that !NULL is an error.
So, you'll know I designed the language if declaring a bool gets you a set. (You can use them for logic: the Boolean algebra of sets maps one-to-one and onto the Boolean algebra of logic.)
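A rough sketch of that idea (in Python, since it has set literals; the universe and names are mine, just for illustration): take a one-element universe, so the power set is {∅, U}, and run logic with union, intersection, and complement:

    U = {0}                  # one-element universe; its power set is {∅, U},
    TRUE, FALSE = U, set()   # which matches the two-valued Boolean algebra

    def OR(a, b):  return a | b    # union
    def AND(a, b): return a & b    # intersection
    def NOT(a):    return U - a    # complement within U

    assert OR(TRUE, FALSE) == TRUE
    assert AND(TRUE, FALSE) == FALSE
    assert NOT(FALSE) == TRUE
    assert NOT(AND(TRUE, FALSE)) == OR(NOT(TRUE), NOT(FALSE))   # De Morgan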