>Building something that isn't Turing-complete is surprisingly-hard once it's complex enough

The most basic computational device that is studied is the (deterministic) finite automaton, which corresponds to regular languages (regex, although actual implementations are usually way more powerful). If you add a stack (basically to count parentheses), you get context-free (CF) languages, which correspond to the syntax of most programming languages. Add a second stack and you're already Turing-complete (TC).
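To make the "add a stack" step concrete, here is a minimal sketch (function name is mine) of the classic example separating the two classes: balanced parentheses. No finite automaton can recognize this language, since it would need unbounded memory to track nesting depth, but a single stack suffices.

```python
def balanced(s: str) -> bool:
    """Recognize the context-free language of balanced parentheses.

    A DFA cannot do this (nesting depth is unbounded), but a single
    stack is enough -- this is essentially a one-symbol pushdown
    automaton."""
    stack = []
    for ch in s:
        if ch == "(":
            stack.append(ch)          # open paren: push
        elif ch == ")":
            if not stack:             # close with nothing to match
                return False
            stack.pop()               # close paren: pop its match
    return not stack                  # accept iff nothing left open
```

Here the stack only ever holds one kind of symbol, so it degenerates into a counter; a full pushdown automaton allows arbitrary stack symbols.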

If you know that, you can add any extra power to your machine that is strictly less than a second unbounded stack, and you get a new language class! For example, a second n-bounded stack. If you do so you will easily get an infinity of language classes. The point is, are they interesting? In particular, the language classes we focus on have some good properties that most arbitrary classes tend to lack.
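As a taste of what the second stack buys you, here is a sketch (names and structure are mine) of a two-stack machine recognizing {aⁿbⁿcⁿ}, a standard example of a language that no single-stack (pushdown) automaton can handle:

```python
def anbncn(s: str) -> bool:
    """Recognize {a^n b^n c^n}, which is not context-free, using two
    stacks: push a's onto the first, match b's against it while pushing
    them onto the second, then match c's against the second."""
    s1, s2 = [], []
    phase = 0  # 0: reading a's, 1: reading b's, 2: reading c's
    for ch in s:
        if ch == "a" and phase == 0:
            s1.append(ch)
        elif ch == "b" and phase <= 1:
            phase = 1
            if not s1:                # more b's than a's
                return False
            s1.pop()
            s2.append(ch)
        elif ch == "c" and phase >= 1:
            phase = 2
            if not s2:                # more c's than b's
                return False
            s2.pop()
        else:
            return False              # out-of-order symbol
    return not s1 and not s2          # all counts matched
```

Two unbounded stacks can simulate a Turing machine's tape (one stack for each side of the head), which is why this model is already Turing-complete.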

The Chomsky hierarchy has context-sensitive languages in between CF and TC, but it is already not a very natural class, so I've never seen it discussed anywhere, even in complexity theory research, which focuses much more on links to computability theory or on subtle distinctions between deterministic and non-deterministic classes (most famously P vs NP). For the latter, studying analogs of the complexity classes on restricted models of computation is an interesting approach, since Turing machines are difficult to work with.




In natural language processing, mildly context-sensitive languages (which sit between context-free and context-sensitive) have seen some study and use, because context-free languages fall short of modeling certain phenomena while context-sensitive ones are overkill (and not even recognizable in polynomial time).
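A textbook example of such a phenomenon is the cross-serial dependency pattern, abstracted as the copy language {ww}. It is provably not context-free, yet mildly context-sensitive formalisms such as tree-adjoining grammars generate it, and it remains easy to recognize in polynomial (here, linear) time. A direct sketch (function name is mine):

```python
def in_copy_language(s: str) -> bool:
    """Membership in the copy language {ww | w a string}, the abstract
    pattern behind cross-serial dependencies. Not context-free, but
    recognizable in linear time by splitting the string in half."""
    n = len(s)
    return n % 2 == 0 and s[: n // 2] == s[n // 2 :]
```

The contrast with context-sensitive languages in general is stark: their recognition problem is PSPACE-complete, which is part of why the full class sees so little practical use.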



