

Finite State Machines in Clojure - PaulHoule
https://github.com/cdorrat/reduce-fsm/blob/master/README.md

======
sunilnandihalli
Thanks for this new library. Can you compare yours with
[https://github.com/ztellman/automat](https://github.com/ztellman/automat)?

~~~
emmelaich
It would also be interesting to compare in features and speed to

[http://www.brics.dk/automaton/](http://www.brics.dk/automaton/)

------
Reefersleep
What are the typical (or potential) use cases for FSMs?

I do web development. Is it something that could help me better model rules
for a business domain, for example?

~~~
unhammer
They are heavily used in natural language processing, e.g. for modeling
dictionaries (using finite state transducers, where the input side carries the
inflected form and the output side the dictionary form plus part of speech)
and for part-of-speech taggers (Markov chains can be implemented as finite
state machines).
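The dictionary-as-transducer idea can be sketched in a few lines. This is a toy illustration in Python, not real NLP tooling: the states, words, and tags are all made up, and real morphological transducers share structure across thousands of entries instead of spelling each word out.

```python
# Toy finite-state transducer: maps inflected forms to lemma + tag.
# Transitions: (state, input_char) -> (next_state, output_string).
# This tiny machine analyzes just "walks" and "walked".
TRANSITIONS = {
    ("q0", "w"): ("q1", "w"),
    ("q1", "a"): ("q2", "a"),
    ("q2", "l"): ("q3", "l"),
    ("q3", "k"): ("q4", "k"),
    ("q4", "s"): ("q_acc", "+V+3sg"),   # walks  -> walk+V+3sg
    ("q4", "e"): ("q5", ""),            # drop the "e" of "-ed"
    ("q5", "d"): ("q_acc", "+V+Past"),  # walked -> walk+V+Past
}
ACCEPTING = {"q_acc"}

def transduce(word):
    state, output = "q0", []
    for ch in word:
        nxt = TRANSITIONS.get((state, ch))
        if nxt is None:
            return None  # no transition: word not in the lexicon
        state, out = nxt
        output.append(out)
    return "".join(output) if state in ACCEPTING else None

print(transduce("walks"))   # walk+V+3sg
print(transduce("walked"))  # walk+V+Past
```

Running the same machine "backwards" (swapping input and output sides) would generate inflected forms from analyses, which is why transducers are so popular for morphology.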

Note also that regular expressions (the kind you find in sed/awk, not the
Perl-extended stuff) are equivalent in power to finite state machines – a
regex can be modelled by an FSM, and an FSM can be turned back into a regex.
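To make the equivalence concrete, here's a hand-written sketch (in Python, purely illustrative) of the DFA corresponding to the regex `ab*c`: missing table entries play the role of a dead state.

```python
# DFA equivalent to the regex "ab*c", written as a transition table.
# Absent entries mean "reject" (an implicit dead state).
DFA = {
    ("start",   "a"): "after_a",
    ("after_a", "b"): "after_a",  # the b* loop
    ("after_a", "c"): "accept",
}

def matches(s):
    state = "start"
    for ch in s:
        state = DFA.get((state, ch))
        if state is None:
            return False
    return state == "accept"

print(matches("abbbc"))  # True
print(matches("ac"))     # True
print(matches("abcc"))   # False
```

The reverse direction (FSM to regex, e.g. by state elimination) is mechanical too, though the resulting regexes tend to be large.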

Outside of NLP, they can be used anywhere you can implement your logic as a
transition table with rows like "fromstate,input,tostate,action". I believe
it's common for servers and network protocol implementations to do this.
[https://en.wikipedia.org/wiki/State_pattern](https://en.wikipedia.org/wiki/State_pattern)
and [https://en.wikipedia.org/wiki/Automata-based_programming#Example](https://en.wikipedia.org/wiki/Automata-based_programming#Example)
have some fairly traditional examples.
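A minimal sketch of that table-driven style, in Python with a made-up toy "protocol" (the states, events, and actions are hypothetical, chosen only to match the "fromstate,input,tostate,action" row shape described above):

```python
# FSM driven by a transition table with rows (fromstate, input, tostate, action).
# Hypothetical toy session protocol: connect / data / close.
def log(msg):
    print(msg)

TABLE = [
    ("idle",      "connect", "connected", lambda: log("opening session")),
    ("connected", "data",    "connected", lambda: log("handling payload")),
    ("connected", "close",   "idle",      lambda: log("closing session")),
]

def run(events):
    state = "idle"
    for ev in events:
        for frm, inp, to, action in TABLE:
            if frm == state and inp == ev:
                action()
                state = to
                break
        else:
            raise ValueError(f"no transition for {ev!r} in state {state!r}")
    return state

print(run(["connect", "data", "data", "close"]))  # idle
```

The appeal is that the whole behaviour lives in one data structure you can inspect, test, or even load from a config file, rather than being scattered across nested conditionals.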

One of the upsides of finite state machines is that they are formally _less
powerful_ than full Turing machines. This can make it easier to check that a
program does what it should, and they also compose in ways that full Turing
machines (or intermediate-level formalisms like context-free or context-
sensitive grammars) can't. E.g. you can take two FSMs and do
intersection/union/difference/Kleene star/concatenation/reverse and still stay
within finite state (unlike, say, context-free languages, which are not closed
under intersection or difference). But it also means there are certain kinds
of logic you can't express within finite state.
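The closure under intersection comes from the classic product construction: run both machines in lockstep, using pairs of states as the states of the combined machine. A small Python sketch with two hand-picked DFAs over the alphabet {'a', 'b'}:

```python
# Product construction: the intersection of two DFAs is itself a DFA whose
# states are pairs (s1, s2).
# A accepts strings with an even number of 'a's; B accepts strings ending in 'b'.
A = {("even", "a"): "odd",  ("odd", "a"): "even",
     ("even", "b"): "even", ("odd", "b"): "odd"}
A_START, A_ACC = "even", {"even"}

B = {("no", "a"): "no", ("no", "b"): "yes",
     ("yes", "a"): "no", ("yes", "b"): "yes"}
B_START, B_ACC = "no", {"yes"}

def accepts_intersection(s):
    s1, s2 = A_START, B_START
    for ch in s:
        s1, s2 = A[(s1, ch)], B[(s2, ch)]  # advance both machines in lockstep
    return s1 in A_ACC and s2 in B_ACC     # accept only if both accept

print(accepts_intersection("aab"))  # True: two 'a's, ends in 'b'
print(accepts_intersection("ab"))   # False: odd number of 'a's
```

No such construction exists for context-free languages, which is one reason the restricted power of finite state pays off.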

~~~
burntsushi
To add another one: Lucene uses a finite state transducer to represent part
of its term index:
[http://blog.mikemccandless.com/2010/12/using-finite-state-transducers-in.html](http://blog.mikemccandless.com/2010/12/using-finite-state-transducers-in.html)
It's essentially a dictionary as you say, but instead of a direct NLP use
case, it might map a word to some numerical value (like the file offset where
the term's postings list resides).

I actually haven't been able to find anyone else using transducers for this
purpose (except OpenSextant, I think), but I'd be curious to hear about it if
others know of examples. It seems like a really good way to represent nearly
arbitrary maps/sets of billions of strings.
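The core trick behind those numeric-output FSTs can be sketched without any library: each edge carries an integer, a key's value is the sum of the outputs along its path, and shared prefixes share edges. The tiny hand-built machine below is purely illustrative (a real FST builder like Lucene's constructs and minimizes this automatically, and pushes outputs toward the root):

```python
# Sketch of a numeric-output FST encoding the map
#   {"cat": 5, "car": 12, "dog": 7}.
# Edges: (state, char) -> (next_state, output_int); value = sum along the path.
EDGES = {
    ("root", "c"): ("c1", 5),    # shared "ca" prefix carries min(5, 12) = 5
    ("c1",   "a"): ("c2", 0),
    ("c2",   "t"): ("end", 0),   # "cat" -> 5 + 0 + 0 = 5
    ("c2",   "r"): ("end", 7),   # "car" -> 5 + 0 + 7 = 12
    ("root", "d"): ("d1", 7),
    ("d1",   "o"): ("d2", 0),
    ("d2",   "g"): ("end", 0),   # "dog" -> 7 + 0 + 0 = 7
}

def lookup(key):
    state, total = "root", 0
    for ch in key:
        nxt = EDGES.get((state, ch))
        if nxt is None:
            return None  # key absent
        state, out = nxt
        total += out
    return total if state == "end" else None

print(lookup("car"))  # 12
print(lookup("dog"))  # 7
```

Because suffixes and prefixes are shared across keys, the structure can stay compact even for huge sorted key sets, which is what makes it attractive for term indexes.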

