Right -- I'm just saying that fuzzy logic is crap (at least compared to proper statistical inference; maybe there are other, better, uses for fuzzy sets).
Why use ad hoc schemes when you can just maintain a probability distribution?
Fuzzy logic is not crap, and the fact that you are comparing it to statistics in this manner shows that you have little understanding of either.
Fuzzy logic is just like binary logic only it allows for partial truth.
Probability relates to how likely something is to happen.
To take an example (I didn't make this up, but I don't remember the source):
If you take a series of data points to determine whether or not I am in my living room at 7:00 on any given evening and determine that the probability is 50%, that means that I am in my living room on 50% of all nights.
However, if you give me a 50% fuzzy logical value of being in my living room, this means that I am lying in the doorway between my living room and my bathroom, such that exactly half of my body is in one place and half of my body is in another.
These are two different things and the mechanisms do not apply at all to the same problem sets.
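The distinction can be made concrete in a few lines of Python (the numbers are purely illustrative):

```python
# Illustrative numbers only.
# Probability: uncertainty about a crisp, all-or-nothing event.
p_in_living_room = 0.5  # on half of all nights I am (entirely) in the living room

# Fuzzy membership: a degree of truth, with no uncertainty involved.
def membership_in_living_room(fraction_of_body_inside):
    """Degree to which 'I am in the living room' is true, in [0, 1]."""
    return fraction_of_body_inside

mu = membership_in_living_room(0.5)  # half in the doorway: truth value 0.5
```

The point is that the 0.5 in the first case quantifies ignorance about a yes/no fact, while the 0.5 in the second case is a fully known, partial truth.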
"However, if you give me a 50% fuzzy logical value of being in my living room, this means that I am lying in the doorway between my living room and my bathroom, such that exactly half of my body is in one place and half of my body is in another." Or it could mean any number of other things depending on what the "Fuzzy logician" finds convenient.
In other words, "Fuzzy logic" can mean anything vague related to numbers. In other words, it's just a buzzword that was trendy in the eighties for quantifying something without any particular logic behind it. In other words, it is crap.
I mean, seriously, the "discovery" of Fuzzy Logic involved no original or interesting mathematical machinery whatsoever; it just involved Lotfi Zadeh coining a word to cover ad-hoc quantifying processes. It's the flimsiest of "pop" mathematics, and it hasn't had much of a following for a while now. Sure, you can "use" it in the sense that you can still engage in ad-hoc quantification, but you could do that before Zadeh came around.
Fuzzy logic is useful for appliances. Let's say your dryer knows the humidity and temperature of incoming and outgoing air, approximately how dry you want your clothing, and how long it's been running. At what point should it turn off? Now, let's add that it has various sensors of limited accuracy, which you can gauge by how effective the device is at drying clothing at a given temperature and humidity.
Now you could set up a wide range of test cases with various loads, temperatures, faulty sensors, etc. Or you can figure out a reasonable approximation by hand based on fuzzy logic and ship it.
Note: your solution must run on a 4-bit, 32 kHz CPU with 400 bytes of RAM.
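For what it's worth, the kind of rule a dryer might apply can be sketched in a few lines. The membership shapes, thresholds, and the single rule here are all made up for illustration; a real controller would have several rules and a defuzzification step:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 outside (a, d), rises a->b, flat b->c, falls c->d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def dryer_stop_strength(exhaust_humidity_pct, runtime_min):
    """Strength of one made-up rule:
       IF exhaust humidity is LOW AND runtime is LONG THEN stop.
       Fuzzy AND is conventionally min()."""
    humidity_low = trapezoid(exhaust_humidity_pct, -1.0, 0.0, 10.0, 25.0)
    runtime_long = trapezoid(runtime_min, 30.0, 50.0, 240.0, 241.0)
    return min(humidity_low, runtime_long)
```

A few comparisons and a min() per rule is about as cheap as control logic gets, which is the whole appeal on hardware like that.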
Fuzzy logic can be useful when combined with a frequentist distribution for doing natural language processing (e.g. how many people would refer to someone at height X as "tall"?).
It's really just a special case of Bayesian inference: p(A calls B "tall" | B is a 6'1 man) is a combination of what you know about who is called tall in general and what you know specifically about who A thinks is tall. Unfortunately, for some reason many linguists don't like thinking in these terms, so it is easier to communicate with them using fuzzy logic vocabulary than Bayesian inference.
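One way to make that concrete is to model p(A calls B "tall") as a curve over height. The logistic shape and both parameters below are my own guesses, purely for illustration:

```python
import math

def p_called_tall(height_inches, midpoint=72.0, steepness=0.8):
    """Probability that a random speaker calls someone 'tall'.
       Logistic shape; midpoint and steepness are made-up parameters."""
    return 1.0 / (1.0 + math.exp(-steepness * (height_inches - midpoint)))
```

Read as a fuzzy membership function it gives the "degree of tallness"; read as a statistic it gives the fraction of speakers who would use the word. Same curve, two vocabularies.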
It's really just a special case of Bayesian inference: p(A calls B "tall" | B is a 6'1 man) is a combination of what you know about who is called tall in general and what you know specifically about who A thinks is tall.
As far as I know, that's not true. Fuzzy logic is meant to encapsulate the idea that someone is "sort of" tall.
I believe fuzzy logic, or something similar, is used in some handwriting recognition software. E.g. as you are looking at a letter, you start with "This letter is an A" as having value 1/26, etc., and start to change those values as you look at it. In this case it's very similar to probability. I'm not sure of any other uses.
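The updating process described above (start at 1/26 per letter and revise as evidence arrives) is essentially a Bayesian update; here is a short sketch in which the evidence numbers are entirely hypothetical:

```python
def bayes_update(prior, likelihood):
    """One Bayesian step: multiply prior by likelihood, then renormalize."""
    posterior = {k: prior[k] * likelihood.get(k, 0.0) for k in prior}
    total = sum(posterior.values())
    return {k: v / total for k, v in posterior.items()}

# Uniform prior over 26 letters, as described above.
prior = {chr(ord('a') + i): 1 / 26 for i in range(26)}
# Hypothetical evidence: a stroke feature twice as consistent with 'a'
# as with any other letter.
likelihood = {k: (2.0 if k == 'a' else 1.0) for k in prior}
posterior = bayes_update(prior, likelihood)  # posterior['a'] == 2/27
```

Which rather supports the point that, in this case, the "fuzzy" values are behaving exactly like probabilities.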
However, I remember reading studies that seemed to indicate that apes/chimps use fuzzy logic. I don't remember who wrote it or how they tested it, but it seemed fairly convincing at the time.
So, I guess I'd say it's not so useful now (at least not as an independent concept), but if it's true that humans use it, it might become useful in the future.
In the end though, fuzzy logic isn't going to solve your problems for you, at least not alone. The way you use the fuzzy logic is going to be much more important.
Or, as jey said, you can actually use probabilities to control your appliance, and then have a lot of theory behind your inference process. I.e., not something ad hoc like fuzzy logic, which can ultimately be transformed into probabilities regardless.
Fuzzy logic is not about creating actual intelligence; it's just a quick and dirty approach that happens to be useful. When selling bread makers you are very limited in your development budget and the hardware you ship to people. So yeah, it's an overly simple, ad hoc solution, but it's also cheap.
There is a lot of theory behind fuzzy logic as well, and you get the bonus of it being very simple to implement. Thus its use in appliances, where cheap/tiny processors are the norm.
I agree that fuzzy logic is crap, in the sense that it just involves adding a once-trendy buzzword to ad-hoc approaches.
However, it should be noted that statistical inference is not necessarily an effective learning approach, given that it was created to deal with random variables and the world we are trying to understand has many non-random, orderly aspects.
Humans aren't good at doing the things that statistics is good at, but statistics isn't good at doing the things humans are good at. Just as an example, a person can indeed act effectively in an uncertain but somewhat ordered environment, but virtually no human being can tell you anything like the probability distribution of the events they deal with in daily life.
So basically, we do indeed need a new approach, different from both the probabilistic and the pure-logic approaches. But the problem is that melding these various existing approaches into something coherent and usable is far more easily said than done. One clear problem with any such system is that the complexity of a formal specification explodes when it involves both probability and logical processes.
I suggest people call their approach "a general theory" after they do something impressive with it. We're waiting.
Perhaps the intended title of article was "There Ought to Be A General Theory Of AI". That I'd agree with...