Um, this is total garbage.

"What's the difference between machine learning, deep learning, big data, statistics, decision & risk analysis, probability, fuzzy logic, and all the rest?" Not "none". Machine learning is the super-discipline, an ensemble of techniques. Deep learning is a specific type of machine learning, focusing on higher-order statistics. Big data is a term from business, not academia, but it deals with the unique challenges of doing machine learning on very large data sets. Fuzzy logic was the fashionable way of dealing with uncertainty before Bayesian methods (a particular flavour of probability) came to the fore. Probability and fuzzy logic should be considered notations.

"So what's the difference between probability and logic?" Not "not much". There are lots of types of logic, and Bayesian probability can be considered a type of logic. But when most people think of logic, they think of propositional logic, Boolean algebra, predicate calculus, formal methods, etc. That kind of logic is a billion miles away from probability, both in culture and in the kind of things you can conclude from its application. Logic is rigid truths and undecidability. Probability is a representation of uncertainty and half-truths, from which you hope to extract the most likely explanation from partial or noisy data. Formal logic is next to useless on noisy data. Probability is a good fit for noisy problems (like perception).

Logic is classic AI. Probability is modern AI. There are attempts at unification (e.g. Bayesian logic networks, but that's cutting-edge stuff). Saying they are basically the same is lunacy.

 Logic is much more than classical AI. The logic culture in classic AI was very ambitious, and that ambition was one of the reasons it crashed, not any fundamental flaw in logic.

Formal classical logic can represent all kinds of truths, and even uncertainty, with trivial syntactic mechanisms. E.g., "It is true that statement X holds with probability P" can be a syntactically correct statement representing uncertainty with certainty (and there can be levels of statements with varying levels of certainty). Classical logic is extremely good at complex nested representations, and the objects represented can be probabilistic statements. See the following paper for a syntactically classical first-order logic of probability (semantically it is a bit different from the standard model-theoretic approach): http://www.cs.cornell.edu/home/halpern/papers/first-order_pr...

Of course, researchers who posit probabilistic models reason mostly in classical logic when they write the papers explaining the math (other than for the assumptions, of course).
 I think that the author was attempting to note that the similarities between these techniques/models/tools are far greater than the differences.

For me, machine learning is a particular set of models atop the edifice of statistics, mixed with coding. I don't agree that big data poses unique challenges; I was reading a psychometrics book from the late eighties where a lot of the material was familiar from the big data stuff we read about lately.

Logic is a way of deriving conclusions from propositions/data. Probability is a method for introducing uncertainty into logical reasoning (IMO, clearly). I don't understand your point about probability and culture, can you explain? I would suggest that the last statement of your second-last paragraph rather proves the author's point, but as my perception is noisy, I can't be sure.

And seriously, both probability and logic are related to natural intelligence (or whatever we want to call ourselves) far more than to AI. In fact, the reason they have been applied to AI is that they are useful tools for humans. When you say Bayesian logic networks, do you mean Bayesian belief networks (as put forth by Pearl et al.) or something else?
 "I don't agree that big data poses unique challenges." Google does. IBM does too (Watson). Unique algorithms on unique hardware for the sake of ML. I was at a machine learning conference with a speaker from Google (sorry, can't remember who). They said they are not interested in any machine learning with complexity above O(n log n), or that has to visit a training point more than once. That's a hell of a lot of literature in the bin for the sake of big data (e.g. SVMs, backpropagation NNs). So the problems they are solving do have novel constraints (even if the overall function they are trying to approximate is the same).

Yeah, I agree that probability is a logical inference scheme. I was just commenting on the kind of logic people think of on hearing the term "LOGIC". That's classical logic, though you can also argue that any system with symbols is a logic. I was just trying to say that the Bayesian people think differently from the formal logic people. That's what I meant by culture. They are normally very different sets of people (for an exception, see below).

Bayesian Logic Networks: http://ias.cs.tum.edu/_media/spezial/bib/jain09blns.pdf (probabilistic inference upon hard logic constraints; not the same as Bayesian belief networks, the standard Bayesian formulation).
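To make the "one pass, at most O(n log n)" constraint concrete, here is a minimal sketch (my own illustration, not anything Google has published) of logistic regression trained by streaming SGD. Each training point is visited exactly once, so the cost is linear in the number of examples, unlike batch SVM solvers or multi-epoch backpropagation:

```python
import math
import random

def sgd_logistic_single_pass(stream, dim, lr=0.1):
    """Train a logistic-regression classifier with one pass of SGD.

    Each (x, y) pair from `stream` is seen exactly once, so total work
    is O(n) in the number of examples -- inside the single-pass budget
    described above, unlike batch SVMs or multi-epoch backprop.
    """
    w = [0.0] * dim
    b = 0.0
    for x, y in stream:                      # y in {0, 1}
        z = b + sum(wi * xi for wi, xi in zip(w, x))
        p = 1.0 / (1.0 + math.exp(-z))       # predicted P(y=1 | x)
        g = p - y                            # gradient of log-loss w.r.t. z
        for i in range(dim):
            w[i] -= lr * g * x[i]
        b -= lr * g
    return w, b

def predict(w, b, x):
    """Hard 0/1 prediction from the learned linear model."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1 if z > 0 else 0
```

Because the model is updated incrementally and the data is consumed as a stream, nothing ever needs to be held in memory at once, which is the other half of what makes this style of algorithm attractive at web scale.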
 Actually, the difference between fuzzy logic and probability is more than just notational. There is a nice table at the end of Section 8.2 (in the 3rd edition) of Russell and Norvig's Artificial Intelligence book comparing the epistemological commitments (what an intelligent agent believes about facts) versus the ontological commitments (what exists in the world) of various "languages" (e.g., propositional logic, first-order logic, probability theory, and fuzzy logic).

Taken straight from that table, here is the comparison:

Language | Ontological Commitment | Epistemological Commitment
Probability | facts | degree of belief in the range 0 to 1
Fuzzy logic | facts with degree of truth | known interval value
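The distinction in the table shows up directly in how the two languages combine values. A sketch (using the standard Zadeh min/max fuzzy operators; the example predicates are made up for illustration):

```python
def fuzzy_and(a, b):
    """Zadeh fuzzy conjunction: truth degree of 'A and B'."""
    return min(a, b)

def fuzzy_or(a, b):
    """Zadeh fuzzy disjunction: truth degree of 'A or B'."""
    return max(a, b)

def prob_and_independent(pa, pb):
    """Probability of 'A and B' when A and B are independent events."""
    return pa * pb

# Fuzzy logic: 'tall(John)' holds to degree 0.7, 'heavy(John)' to 0.7.
# The facts themselves are partially true, and the conjunction is
# exactly as true as the weaker conjunct: fuzzy_and(0.7, 0.7) -> 0.7.
#
# Probability: the facts are crisp (John either is tall or isn't);
# 0.7 is a degree of *belief*, and joint belief in two independent
# facts is weaker than either: prob_and_independent(0.7, 0.7) ~ 0.49.
```

So identical-looking numbers attach to different things (truth of the fact vs. belief about a crisp fact) and compose by different rules, which is the sense in which the two are ontologically and epistemologically distinct rather than just two notations.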
 Yes, they are types of notations (for different things). The author of the article lumped probability in with machine learning. Probability is a notation for machine learning, as is fuzzy logic, although they are trying to represent quite different concepts.
