This is also known as computational trinitarianism - https://ncatlab.org/nlab/show/computational+trinitarianism
Robert Harper, the computer scientist who coined "computational trinitarianism," has a series of lectures on the foundations of type theory, including some commentary on these correspondences - https://youtu.be/9SnefrwBIDc
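To make the propositions-as-types side of the correspondence concrete, here's a minimal sketch in Python (my own illustration, not from Harper's lectures): a function type A -> B plays the role of the implication A => B, and writing a total function of that type plays the role of proving it.

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def modus_ponens(f: Callable[[A], B], a: A) -> B:
    # From a "proof" of A => B (a function) and a "proof" of A (a value),
    # obtain a "proof" of B: just apply the function.
    return f(a)

def hypothetical_syllogism(f: Callable[[A], B],
                           g: Callable[[B], C]) -> Callable[[A], C]:
    # From A => B and B => C, obtain A => C: function composition.
    return lambda a: g(f(a))
```

The point is that the inference rules of propositional logic line up with ordinary typed programming constructs; richer type systems make the correspondence much deeper.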
But how do you understand type theory without some basic understanding of logic?
The program is officially only for macOS and Windows, but since it's just Java, it's possible to get it running on Linux with a little work.
The exercises are fun, and the program will not only check your work to make sure it's right, but also point out where and sometimes why it's wrong.
 - https://logiclx.humnet.ucla.edu/Logic/Download
 - https://www.amazon.com/Logic-Techniques-Reasoning-Donald-Kal...
Our everyday "folk" logic is always contextual and ambiguous. Its deductions are probabilistic rather than deterministic; they hold only sometimes.
I don't understand your confusion: you're both right. Everyday decisions are logically chosen from our perspective but still lossy in absolute terms. That manifests when we are frequently wrong about our presumptions... every damn day, more or less.
The abstract notion of compression is reducing the size of something by trading off some of its other properties. You can compress a gas, data, an image, audio; even rocks and brains can be compressed.
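For the data case specifically, the trade-off is easy to demonstrate in a few lines of Python (the lossy half is a deliberately crude stand-in for what real image/audio codecs do):

```python
import zlib

# Lossless compression: highly redundant data shrinks a lot, and the
# original is fully recoverable.
data = b"abcabc" * 50
packed = zlib.compress(data)
restored = zlib.decompress(packed)

# Lossy "compression" (toy example): keep every other byte. Half the
# size, but the trade-off is that the original can no longer be
# reconstructed exactly.
halved = data[::2]

print(len(data), len(packed), restored == data)
```

Lossless compression trades CPU time for size; lossy compression additionally trades away fidelity, which is the property being given up in the abstract sense above.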
I think there's also an element of hackers wanting to feel that the concepts they've developed, which are very powerful in computer science, must therefore be key to understanding the world as a whole. It's partly eagerness to apply one's knowledge and partly hubris and an unwillingness to defer to philosophy, sociology, or critical theory, which they view as less rigorous (and therefore bad).
On the flip side, how many philosophy, sociology, and critical theory types reach for concepts from biology and physics? And how well do you think they grasp and employ those concepts compared to specialists in those fields?
I think any specialist who believes their specialty will necessarily magically generalize is presuming too much.
FWIW, as a formerly deeply-invested philosophy/critical theory type for over a decade... I have to say I had to begrudgingly admit that math/hard science are unequivocally more rigorous than the soft ones. Each is suited to its purpose, but it doesn't help to pretend that Zizek is as rigorous as Knuth.
My issue is not the application of the concept, but the apparent lack of consideration of whether the concept is even valid for the domain in the first place, and second, if so, to what extent it models the world or ought to model the world. Are the levels of abstraction matched? Is it better to think about this with dialectical logic? Do the concepts used represent any particular ideologies? What sort of instrumental reason do they employ? The complete disregard for these questions is what makes these attempts "funny" to me. When I saw the mention of lossy compression, I (maybe unjustifiably) cracked a smile. It's not just a mismatch in ways of thinking, it's a total mismatch in the content of the concepts.
So I would say it is more pronounced among hackers, and I feel there is a strong current (perhaps also due to how industry-oriented the field is) to eschew what philosophy has to offer. This applies not only to explaining the world, but also to questions of how the world should be (several other commenters here over the years have noted a general desire to ignore issues of ethics in computer science). Maybe I just find it funny because I see it so often, and only on Hacker News and Reddit, usually leading with "This <complex real-world sociological phenomenon> can be thought of as a...". It's a trend I've also seen in some branches of philosophy itself, which try to excise the "mystical" content of "bygone" theories using new techniques, significantly weakening the original idea's applicability in favor of fitting it into a dogmatic (perhaps ideology-laden) analytical framework.
And on the last point, on rigor, I agree, though I do want to salvage some dignity in my view of the world when I say that rigor as formalised through equations has its uses in some views of the world, and descriptions of ideology, power, the state, metaphysics, etc. have their uses in others. There's also a difference between an argument (which I think must be rigorous) and a critique (to which I don't consider the concept of rigor to apply in many cases, if we want to retain its normative force).
That aside, you raised good questions. I think it would be useful if you turned your list of critical questions into a template for criticising instances of analogical reasoning anywhere. It would help greatly in evaluating these attempts.
Sure, technical people might try to apply methods from technical disciplines to social ones. Sometimes it works, sometimes it doesn't. As a rule, however, I don't think anyone assumes mathematics will completely replace social studies. We might use statistics to better understand social studies, but the social context is still distinct from pure math, and always will be.
Some of the best hackers are the ones who DO borrow ideas from other disciplines. And all other disciplines can often benefit from applying increasingly complex statistical methods, aided by computer and data scientists.
Not only is it useful to attempt to map powerful concepts from math and computer science onto the social and other sciences, but it is useful to map powerful concepts from the social and other sciences onto computer science! In the search for meaningful and useful abstractions you might just hit upon something useful among the junk, and that is a necessary part of the search.
Making assumptions is useful, and sometimes it leads you astray. There are good analogies, analogies that are okay but where something feels off, and bad analogies. As the domains get further apart and the cost of acquiring accurate models in a domain increases, the search cost of answering whether the mapping applies also increases. I would argue the search cost is effectively infinite, or at least near it: in practical terms it is too high for most people as it relates to most things. The time and other resources we can spend computing the applicability of any mapping are finite, so there is an economics to it. Metaphor/analogy is a shortcut, trading a lower search cost for useful abstractions against a greater possibility that the models are inaccurate and/or lower resolution. Guess and check is often a reasonable strategy. Sometimes it's the only one.
All people are engaged in hubris to some degree as their confidence in their conclusions is almost never proportional to the completeness of their models.
I find it likely that a vast multitude of domains contain concepts necessary to understanding the world as a whole, so it really doesn't surprise me that anyone in possession of one of those concepts feels it is a key to that understanding. It likely actually is one, though "a key" rather than "the key."
The search cost for any one person to be in possession of all the keys at once is greater than anyone can afford. So to find it amusing that one group of people tries to economize its understanding of the world by leveraging the tools in its possession, in lieu of gaining a prohibitively expensive deep understanding of all domains of knowledge, is to fail to realize how impossibly expensive an accurate view of the world is to attain.
The hubris is in thinking it's possible to compute whether something is applicable or not working from a desperately incomplete set of models and expending very few resources in the computation.
Also, though very different, I just got GEB for Christmas, so I guess I'll be reading that finally now too.
1. Using a well-known and widely accepted system of logic to produce proofs and solve problems. A working man's logic is propositional logic (AND, OR, NOT, and the material conditional) plus first-order predicate logic (the quantifiers "there exists" and "for all" over bound variables), natural deduction, and ZFC set theory.
2. "non-standard" logic systems which are somehow "better" or "more expressive:" 2nd order logics, modal logics, many-valued logics, etc.
3. Meta-mathematical investigations into the properties of different formal logic systems: model theory, proof theory, forcing, consistency, completeness, etc.
4. The philosophy of logic: Semantics, sense/reference, truth, nominalism/realism, theory of descriptions, etc.
5. Logic and computability: lambda calculus, recursive functions, Turing machines, complexity classes, etc. This is somewhat tangential to logic proper, but there are connections, and Dr. Smith has a section on it, so it's worth enumerating.
While everyone will need to know (1) as a working man's logic, the other topics are more or less independent and can be approached independently.
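As a taste of what (1) involves, here's a minimal brute-force sketch in Python (the function names and the two-variable default are my own): it defines the material conditional and checks a formula against every truth assignment.

```python
from itertools import product

def implies(p: bool, q: bool) -> bool:
    # Material conditional: P -> Q is false only when P is true and Q is false.
    return (not p) or q

def is_tautology(formula, nvars: int = 2) -> bool:
    # Brute-force truth table: the formula must hold under every
    # assignment of True/False to its variables.
    return all(formula(*vals) for vals in product([True, False], repeat=nvars))

# Modus ponens as a propositional tautology: ((P -> Q) AND P) -> Q.
print(is_tautology(lambda p, q: implies(implies(p, q) and p, q)))  # True
```

Truth tables only work for propositional logic, of course; once quantifiers enter the picture in (1)'s first-order fragment, you need proof systems like natural deduction rather than finite enumeration.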
For people just starting out and finding this study guide pretty overwhelming, let me re-emphasize two of Dr. Smith's suggestions for starting points: Smullyan's First Order Logic and Halmos's Naive Set Theory. These two books will get you to the same depth of understanding that 95% of working mathematicians need or apply in their day-to-day work, and it's a fact that 95% of modern mathematical theories are built on (formalized within) ZFC set theory and first order logic.
I also suggest the online game The Incredible Proof Machine. It's extremely helpful to have a computer formally check your proofs when you're just starting out in logic, because it's incredibly easy to "cheat" and skip steps or use rules that "make sense" but aren't part of the formal system you're using. TIPM lets you do that without learning any specialized syntax.
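The core idea behind such checkers is simple enough to sketch in a few lines of Python. This is an illustrative toy, not TIPM's actual syntax or rule set: each proof line states a formula, the rule used, and the indices of the earlier lines it cites, and the checker rejects any step that doesn't follow mechanically.

```python
def check(proof) -> bool:
    """Verify a proof: a list of (formula, rule, cited_line_indices)."""
    for i, (formula, rule, deps) in enumerate(proof):
        if any(d >= i for d in deps):
            return False  # a step may only cite earlier lines
        if rule == "premise":
            continue
        if rule == "mp":
            # Modus ponens: from P and ("->", P, Q), conclude Q.
            p = proof[deps[0]][0]
            imp = proof[deps[1]][0]
            if imp != ("->", p, formula):
                return False  # step doesn't follow; no skipping allowed
            continue
        return False  # unknown rule

    return True

proof = [
    ("P", "premise", []),
    (("->", "P", "Q"), "premise", []),
    ("Q", "mp", [0, 1]),
]
print(check(proof))  # True
```

The "no cheating" property falls out of the structure: every line must be justified by an explicit rule applied to explicit earlier lines, so a plausible-but-unjustified leap simply fails to check.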
For a 1000-level logic course that Software Engineering, Computer Engineering, and Computer Science majors must take, it is extreme.
Overall, interesting to see!
Does anyone know of any others for other domains of knowledge?