
Programming paradigms that change how you think about coding - niels
http://brikis98.blogspot.com/2014/04/six-programming-paradigms-that-will.html
======
AlexanderDhoore
The Aurora language seems very interesting. Too bad there is already another
language called Aurora...

It makes me think of Elm [1] and (functional) reactive programming [2]. Reactive
programming is fantastic. It's kind of like how a spreadsheet program works.
If a variable changes, all variables that depend on it change as well. Given "a
= b + c", if c increments by 1, so does a.

It has many advantages over event-based systems like JavaScript's. Reactive
programs don't need callbacks; the changing values propagate the "event"
through the system.

I'd love to hear what you guys think about this direction of programming. It
seems very natural to me.

Edit: I also see reactive programming as the golden way of having changing
state in functional languages. Functional languages have no problem with data
or state. They have a problem with change of state. The reactive paradigm
solves that problem. All change is implicit and code can be exactly as
functional as before.

[1] [http://elm-lang.org/](http://elm-lang.org/)

[2]
[http://en.wikipedia.org/wiki/Reactive_programming](http://en.wikipedia.org/wiki/Reactive_programming)
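
The spreadsheet analogy can be made concrete with a toy sketch (in Python, not
Elm; illustrative only, since real FRP systems also handle glitches, cycles,
and unsubscription):

```python
# A toy sketch of the spreadsheet-style reactive idea described above:
# a derived cell re-runs its formula whenever one of its inputs changes.

class Cell:
    def __init__(self, value=0):
        self.value = value
        self.formula = None        # set for derived cells like "a = b + c"
        self.dependents = []       # cells that read this one

    def set(self, value):
        self.value = value
        self._notify()

    def _notify(self):
        # Propagate the change: every dependent re-runs its formula.
        for d in self.dependents:
            d.value = d.formula()
            d._notify()

b, c, a = Cell(1), Cell(2), Cell()
a.formula = lambda: b.value + c.value
b.dependents.append(a)
c.dependents.append(a)
a.value = a.formula()              # initial evaluation: a == 3

c.set(c.value + 1)                 # increment c...
print(a.value)                     # ...and a follows automatically: 4
```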

~~~
ibdknox
we're actually in the process of changing the name :)

EDIT: re FRP, you might find this Lambda The Ultimate post insightful:
[http://lambda-the-ultimate.org/node/4900](http://lambda-the-ultimate.org/node/4900)

FRP has issues with openness and isn't really great at dealing with collections.
It also forces you to express things kind of unnaturally (e.g. instead of
"click this and increment x", you say "the counter is the count of all click
events"). There are other methods of managing time, like Glitch[1] and
Bloom[2] that seem more promising :)

[1]: [http://lambda-the-ultimate.org/node/4910](http://lambda-the-ultimate.org/node/4910)
[2]: [http://boom.cs.berkeley.edu/](http://boom.cs.berkeley.edu/)
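
The contrast between the two phrasings can be sketched in a few lines (Python,
with a plain list standing in for the click-event stream; an illustration, not
a real FRP library):

```python
from functools import reduce

# The two styles contrasted above, with a recorded list standing in
# for a stream of click events.
clicks = ["click", "click", "click"]

# Imperative: "click this and increment x".
x = 0
for _ in clicks:
    x += 1

# Declarative/FRP-style: "the counter is the count of all click events",
# i.e. a fold over the event stream.
counter = reduce(lambda n, _event: n + 1, clicks, 0)

print(x, counter)   # 3 3
```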

~~~
rybosome
"It also forces you to express things kind of unnaturally (e.g. instead of
'click this and increment x', you say 'the counter is the count of all click
events')"

I suppose it's personal preference, but I actually think that the latter of
the two statements is much more understandable. This is why I like functional
programming in general; a single value can be expressed as a declarative
transformation of other pieces of data, making it clear how the value is
derived. Using the imperative model, it's up to variable names and
documentation to make it clear what 'x' is. It's like looking at one of those
pictures that contains multiple shapes simultaneously; suddenly you see the
one you haven't been seeing, and then you wonder how you didn't see it all
along.

~~~
seanmcdirmid
Explicitly manipulating events as a stream has performance problems; mainly
time leaks. Another issue is that it makes debugging really difficult, as
you've now converted much of your control flow (that you could step through)
into data flow (that you cannot), while there is very little progress on
building good data flow debuggers.

Finally, events are often manipulated in very tricky ways. Take the problem of
implementing a drag adapter: you have to remember to capture the mouse on the
down event and release your capture on the up event, but then you also need to
capture widget and mouse positions on the down event so you can compute deltas
that avoid sampling-resolution problems on the move events. Rx always gets
these two points wrong, which is very annoying, but they have to fudge it,
because otherwise event-stream processing can't win on elegance.

~~~
rybosome
I haven't done any serious FRP, but thinking about your example of the drag
adapter, wouldn't it be possible to do something like the following?

    
    
      - Declare a stream consisting of the composite of two streams: a mouse down and a mouse up
      - Map over the mouse down portion, transforming into an (x, y) coordinate
      - Map over the mouse up portion, transforming into an (x, y) coordinate
      - Produce a tuple of the two values
      - Use the resulting signal to determine what to do with the drag
    

In Bacon.js, I think it would look something like this (haven't tested it):

    
    
      var makeCoordinate = function(v) { return {x: v.clientX, y: v.clientY}; }
      var mergeStreams = function(v1, v2) { return {down: makeCoordinate(v1), up: makeCoordinate(v2)}; };
    
      var $html = $('html');
      var mousedownStream = $html.asEventStream('mousedown');
      var mouseupStream = $html.asEventStream('mouseup');
      var dragStream = mousedownStream.flatMap(function() {
        // Ensure that we only sample a single mousedown/mouseup pair.
        return Bacon.zipWith([mousedownStream, mouseupStream], mergeStreams).
            takeUntil(mouseupStream);
      });
    

I don't mean to be pedantic - your point is well taken. This was definitely a
mental exercise to write, and I have no experience debugging (though I imagine
it would be difficult).

EDIT: For this particular example I actually made it more complicated than it
needs to be. Example JS Fiddle here:
[http://jsfiddle.net/w6mCK/](http://jsfiddle.net/w6mCK/)

~~~
seanmcdirmid
Compare with the control-flow-heavy managed-time [1] version:

    
    
      on widget.Mouse.Down:
      | var pw = widget.position
      | var pm = widget.Mouse.Position
      after:
      | widget.Mouse.Capture()
      | widget.position = pw + (widget.Mouse.Position - pm)
      | on widget.Mouse.Up:
      | | widget.position = widget.position # freeze widget position
      | | break                             # stop the after block so capture/dragging stops
    

[1]
[http://research.microsoft.com/pubs/211297/managedtime.pdf](http://research.microsoft.com/pubs/211297/managedtime.pdf)

------
jameshart
This is a substantial piece of writing with information many here would find
interesting; putting it behind a buzzfeed list style headline does it a
disservice. One step short of calling it "Six weird programming paradigms that
will blow your mind".

~~~
glifchits
Just because Buzzfeed uses list style headlines and has vacuous content does
not mean that list style headlines are an indicator of vacuous content. If you
read this and you liked it, does the title really matter?

~~~
zwieback
Exactly. Plus, while the content might be vacuous it's often quite
entertaining and the editorial style is genius, in my opinion. I think writers
can probably learn 41 things from Buzzfeed.

------
simias
In the "concurrent by default" section I would add the hardware description
languages like VHDL and Verilog.

Learning Verilog was an eye opening experience for me. It reminded me of the
time I switched from unstructured BASIC to C when I was a kid. At first it
seems complex and weird then suddenly it clicks and it all starts making
sense.

~~~
adwn
... and then suddenly you realize what a horrible, horrible language it is.
I'm not exaggerating: it isn't even well-suited for the domain it is mainly
used for (i.e., designing digital hardware circuits). For example:

1) Synthesis/simulation mismatch: Your design might work in simulation but not
in hardware, _and vice versa_. Often, this is due to X-value (representing
unknown/invalid values) problems.

2) Signed datatypes: If you mix signed and unsigned values in an expression,
the result is _unsigned_. So, if _a_ is unsigned and has the value 4, _b_ is
signed and has the value -3, then

    
    
      (a * b) < 5
    

will return 0 (false), while

    
    
      ($signed(a) * b) < 5
    

will correctly return 1 (true). That's because signed datatypes are just an
afterthought and weren't officially introduced until 2001, more than 15 years
after the language's inception.

3) If an undeclared net is used in a module instantiation, the compiler
silently creates a 1-bit net with that name. No error message, no warning. You
can turn off that stupid behavior with a compiler directive, but you have to
turn it back on at the end of your file, because otherwise its effects are
active in _all files compiled afterwards_, and many third-party sources don't
work correctly with this setting.

VHDL is also a bad language, but for very different reasons. And yet,
alternative HDLs have a hard time getting traction, for various reasons and
non-reasons.
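
The arithmetic in point 2 can be reproduced outside Verilog. Here is a Python
sketch that mimics the unsigned evaluation by masking to a word size (the
32-bit width is an assumption for illustration):

```python
# Reproducing the signed/unsigned pitfall from point 2: in Verilog,
# mixing signed and unsigned operands makes the whole expression
# unsigned. Here we mimic 32-bit unsigned arithmetic by masking.
WIDTH = 32
MASK = (1 << WIDTH) - 1

def as_unsigned(v):
    """Reinterpret a (possibly negative) value as a 32-bit unsigned word."""
    return v & MASK

a = 4    # unsigned
b = -3   # signed

# Mixed expression: b is reinterpreted as a huge unsigned value first,
# so (a * b) < 5 evaluates to false.
mixed_product = as_unsigned(a * as_unsigned(b))
print(mixed_product < 5)      # False

# With $signed(a), the multiply happens in signed arithmetic: -12 < 5.
print((a * b) < 5)            # True
```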

~~~
Igglyboo
I wholeheartedly agree.

I'm a CSE major, and I took a hardware design class last fall in which we
programmed a Xilinx FPGA using VHDL.

It was a huge change from everything I had learned before; I had no idea what
a latch was or why it was bad to imply one. The thing that bothered me the
most was the simulation/synthesis disconnect: the only efficient way to debug
a program is with the simulator, because debugging it on the board takes a
very long time every time you change the code. But even after you've debugged
it in simulation, it might not even synthesize, which is an even harder "bug"
to debug.

~~~
noobiemcfoob
Not knowing what a latch is or why it is bad seems more like a failure of the
class than an inherent problem with Verilog. There are times when a latch is
useful and desired (though most often it isn't, and a normal flop structure is
better).

With hardware design, you have to keep in mind that you are modeling hardware.
A language that did it all for you might be nice, but you still have to know
what framework you are working in. The simulation/synthesis disconnect is
inherent to that. You can think of a ton of structures that seem fine at a
code level but just don't have a workable analog in silicon. Synthesis in that
sense is just another debug tool.

~~~
adwn
> _Not knowing what a latch is or why it is bad seems like more a failure of
> the class than an inherent problem with Verilog._

You misunderstood the problem: It's not a problem that you _can_ describe a
latch, but that it's so _easy to make the mistake_ of describing a latch
instead of combinational logic. It's a common mistake and clearly a failure of
the language, because two very different intents are described by very similar
code.

> _The simulation /synthesis disconnect is inherent to that. [...]_

No, you're confusing simulation/synthesis mismatch with non-synthesizable
code. The former means that simulation gives you different results than the
hardware at runtime, which is a very bad thing and should be avoided wherever
possible (that's the point of simulation, after all). The latter is
unavoidable to a certain degree if the HDL should be usable for simulation.

~~~
noobiemcfoob
On latches: from experience, I've never had an issue accidentally defining a
latch. Simple guidelines (fully describe any case statements, always have an
else branch on if statements, and use * from Verilog 2005 rather than trying
to list all relevant signals) make it more an issue of typos than of true
unintentional latches, and a basic lint flow will identify them quickly.

While I'd accept that perhaps exposing this so easily might be an annoyance in
the language, it is hardly a failing.

Synthesis: Again, from experience, this is an issue with your synthesis flow,
not the HDL. Synthesis is absurdly complicated, to the point where any team
I've been on has at least one guy where that is the entirety of his job. If
the generated netlist breaks simulation when your HDL made it all the way
through all other flows, more than likely it's an issue with synthesis.

------
jarrett
A thought on dependent types:

Can a dependent type system catch _all_ type errors at compile time? For
example, suppose I write the following (in pseudo-code):

    
    
      // Variable x is an integer greater than or equal to 0 and less than 256.
      int x (>=0, <256) 
    
      x = 128 // This is valid.
      x = x * 3 // This violates the type.
    

I can imagine how a compiler could catch that kind of error. But that's
trivial. What happens in programs like this:

    
    
      int x (>= 0, <=10)
      x = parseInt(getKeyboardInput)
    

Now the compiler can't know for sure whether the type has been violated,
because the value of getKeyboardInput could be anything. To take a page from
Haskell, you could do something like this (which is still pseudocode, not
valid Haskell):

    
    
      // x is a value that is either 1) an int from 0 to 10, or 2) nothing at all.
      maybe (int (>=0, <=10)) x
      
      // applyConstraint recognizes that parseInt may return a value violating x's constraints.
      // Thus it transforms the return type of parseInt from int to maybe (int (>=0, <=10)).
      x = applyConstraint(parseInt(getKeyboardInput))
    

Or perhaps applyConstraint wouldn't have to be called explicitly, but would be
implicitly added by the compiler as needed. I'm not sure which is better
stylistically.

Either way, applyConstraint would be required any time a computation could
return an invalid value. That would get tricky, because the compiler would
have to track the constraints on every variable, even where those constraints
aren't declared. For example:

    
    
      int w (>= 0, <= 10)
      int x (>= 0, <= 2)
      int y
      int z (>= 0, <= 20)
    
      y = w * x
      z = y
    

Here, the compiler would have to infer from the assignment "y = w * x" that y
is always between 0 and 20.

Do any languages currently take the idea this far (or farther)?
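
The inference described here is essentially interval arithmetic over value
ranges. A toy checker might look like this (hypothetical, not modeled on any
real dependently-typed language):

```python
# A toy sketch of the range inference described above: track a (lo, hi)
# interval per variable and propagate it through multiplication.

class RangedInt:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __mul__(self, other):
        # The bounds of a product are the min/max over corner products.
        corners = [self.lo * other.lo, self.lo * other.hi,
                   self.hi * other.lo, self.hi * other.hi]
        return RangedInt(min(corners), max(corners))

    def fits_in(self, other):
        """Would assigning self to a variable of type `other` be safe?"""
        return other.lo <= self.lo and self.hi <= other.hi

w = RangedInt(0, 10)
x = RangedInt(0, 2)
z_type = RangedInt(0, 20)

y = w * x                        # inferred range: 0 .. 20
print(y.lo, y.hi)                # 0 20
print(y.fits_in(z_type))         # True: the assignment z = y checks out
```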

~~~
arh68
> _Can a dependent type system catch all type errors at compile time?_

I'm probably not clever enough, but someone could probably prove that'd be
equivalent to solving the halting problem. It seems impossible.

> _because the value of getKeyboardInput could be anything_

That makes my brain hurt. I think implicitly narrowing the type is
stylistically better. Maybe just adding a dependent type declaration:

    
    
        x = (parseInt(getKey) :: Int(>=0, <=10))
    

Then you type-annotate everything you can, marking every non-annotated
expression as wild:

    
    
        w = ... :: Int(>=0, <=10)
        x = ... :: Int(>=0, <=2)
        z :: Int(>=0, <=20)
        y = w * x                 -- :: Wild
        z = y                     -- compiler should Scream!
    

Asking the compiler to infer arbitrary type declarations seems hard. I think y
probably shouldn't be inferred as :: Int(0<=..<=20) unless there is a rule
somewhere that Int(a<=..<=b) * Int(c<=..<=d) = .. :: Int(f<=..<=g).

The work still needs to be done (no magic bullet here), but we should be able
to use the computer to generate well-defined type combinations.

~~~
leoc
> I'm probably not clever enough, but someone could probably prove that'd be
> equivalent to solving the halting problem. It seems impossible.

IIRC dependently-typed languages dodge this bullet by not being completely
Turing-complete (as 'twere).

~~~
chriswarbo
Not quite. Many dependently-typed languages aren't Turing-complete, but that's
not how they 'dodge this bullet'.

Think about the following Java code:

    
    
        public int getMyInt() {
          return getSomeOtherInt();
        }
    

How hard does Java have to work to figure out whether getMyInt is well-typed?
Does it have to solve the halting problem? No. It just checks the information
that you have given it. If you wrote that getSomeOtherInt has return type
"int" then getMyInt will type-check. If you gave it some other type, it won't
type-check. If you didn't give it a type, it will complain about a syntax
error. At no point will Java try to write your program for you (which _would_
hit against the halting problem). The same is true in dependently-typed
languages, except the types happen to be much stronger. You still have to
write them down, write down values of those types, write conversion functions
when you need to combine different types, etc.

Incidentally, if a language _is_ Turing-complete, it actually becomes _really
easy_ to get a program to type-check; we just write infinite loops everywhere,
which leads to logical fallacies ;) That's why many dependently-typed
languages aren't Turing-complete (although many are; eg. those which
distinguish compile-time terms from run-time terms, like ATS).

~~~
Ralith
You can compromise on turing-completeness, too. Idris, for example, lets the
programmer decide whether a given definition should be required to provably
terminate or not.

------
lolo_
I think an underrated non-standard approach to programming is graphical
programming. Though this approach doesn't seem to have received significant
uptake amongst professional programmers, there is an application called Max [0]
that is popular amongst musicians and artists and is surprisingly powerful and
effective.

There's an interesting article [1] on how Jonny Greenwood of Radiohead uses it
extensively; in it you can see some examples of how it works, with modules
wired together visually.

I think there is a lot of potential for a really nice mix between text-based
programming and graphical programming to work for general programming too.

[0]: [http://cycling74.com/products/max/](http://cycling74.com/products/max/)
[1]: [http://thekingofgear.com/post/25443456600/max-msp](http://thekingofgear.com/post/25443456600/max-msp)

~~~
humanrebar
I've used graphical programming in the past. It sounds amazing in the
abstract but ends up being a mess when implemented. It basically requires a
"sufficiently smart editor", which, even if implemented perfectly, would not
leave a lot of room for a third-party ecosystem to be built around the
language.

There are many solved problems in text-based programming that would need to be
resolved in order for a graphical programming language to be as useful.

How would one post a "snippet" to StackOverflow? How would diffs work?
Consequently, how would source code management work?

~~~
tarblog
The solution that comes to mind is that the graphical elements are backed by
text for these purposes. Then, the graphical interface amounts to an editor
for this meta-language.

~~~
humanrebar
Inevitably, that is what is attempted. But it doesn't solve the use cases I
mentioned.

To take diffs as an example: not only do you need to illustrate the graphical
diff ("I drew a new line from here to here and made this other box green"),
you would also need to illustrate patches to textual details and how they
relate to the graphical code.

Maybe I can imagine a solution that could do that (at great expense). I cannot
imagine the language being friendly enough that third-party compilers, IDEs,
or static-analysis tools would be feasible.

------
bru
Some notes:

- parallel and concurrent are 2 different things

- the 'symbolic languages' definition seems off. Wikipedia puts it right:

> symbolic programming is computer programming in which the program can
> manipulate formulas and program components as data

So it's not "using graphs & such to program"

~~~
seanmcdirmid
Mathematica is a good example of a symbolic language (can't get more symbolic
than term rewriting), but he calls that... knowledge-oriented, or some other
nonsense.

~~~
Orangeair
Mathematica[0] and The Wolfram Language[1] are two different things. The
Wolfram Language is a sort of extension of their Wolfram Alpha service, which
Wolfram itself describes as "Knowledge Based." Mathematica is still just
Mathematica.

[0][http://www.wolfram.com/mathematica/](http://www.wolfram.com/mathematica/)

[1][https://www.wolfram.com/language/](https://www.wolfram.com/language/)

~~~
taliesinb
No.

Mathematica is a commercial piece of desktop software that uses the Wolfram
Language. Just like how RStudio uses R.

There are other product platforms (coming soon) that employ the Wolfram
Language, both in cloud and desktop incarnations:
[http://www.wolframcloud.com/](http://www.wolframcloud.com/)

~~~
seanmcdirmid
I'm not great at following markitecture, but didn't Mathematica exist and have
a language before this language was rebranded as the Wolfram Language? There is
nothing really wrong with this, but it's understandable that people might be
confused about it right now.

~~~
taliesinb
That's right, and what seems to be confusing people is that the language
underlying Mathematica-the-product was always, implicitly, "Mathematica".

We were a one-product company, and it didn't make sense to distinguish
Mathematica-the-product and the language it ran.

The last time we'd tried to branch the underlying language off into its own
thing was in the early nineties, when a certain intern by the name of Brin
was in the middle of refactoring the code before he went off to do other
things :).

My prediction is that the confusion will pass soon once we have these concrete
products actually out in the world.

A new generation of people will be using and seeing the language for the first
time. And they'll be doing stuff that old-school Mathematica-the-product (and
most of its user base) would find quite alien.

Things like trying code out in online sandboxes, coding things in the cloud
IDE (or locally, via Eclipse or "Wolfram Desktop"), deploying public APIs for
other people to use, building websites totally within the language, doing
internet-of-things type stuff, embedding chunks of WL code in Java or Python
or whatever, writing natural language parsers on top of Alpha's technology,
creating data science workflows, making private clouds, designing interactive
visualizations, tweeting code at our executor-bot... and on and on...

------
hexagonc
My first and only encounter with "concatenative languages" was programming the
HP48GX[1] graphing calculator in high school. Thinking back to it, I'm amazed
by what you could do with it. It was very powerful even by today's standards.
Whereas other kids had Game Boys, I had an "HP". I even got in trouble playing
Tetris on it during my German language class. My calculus teacher never knew
that you could do symbolic integrals and derivatives with it (using a free
computer algebra library). Sadly, the only program of note that I wrote for it
was an implementation of the Game of Life[2].

[1]
[http://en.wikipedia.org/wiki/HP-48_series](http://en.wikipedia.org/wiki/HP-48_series)
[2]
[http://en.wikipedia.org/wiki/Conway%27s_Game_of_Life](http://en.wikipedia.org/wiki/Conway%27s_Game_of_Life)

------
untothebreach
I was a little disappointed that Factor[1] didn't get a mention in the
'Concatenative' section. Its stack effect checker takes care of a lot of the
problems he mentions, IMO.

1: factorcode.org

~~~
evincarofautumn
“Stack checking” is definitely the ticket to maintainable stack programs. I
have been working on a statically typed concatenative language called
Kitten[1] off and on for a while, and while the error messages leave something
to be desired, they can at least help you understand how data is flowing
through a program, and of course rule out some bugs.

[1]:
[http://github.com/evincarofautumn/kitten](http://github.com/evincarofautumn/kitten)

~~~
untothebreach
very cool, I will definitely take a look at this

------
milliams
QML [1] is an interesting example of declarative programming. It allows
constraints and relationships to be defined and the runtime will do the rest.
Perhaps it's not as powerful as other languages but in its domain it does very
well.

[1]
[https://en.wikipedia.org/wiki/Qt_Modeling_Language](https://en.wikipedia.org/wiki/Qt_Modeling_Language)

~~~
general_failure
QML is simply brilliant for developing UIs.

------
sirsar
LabVIEW is concurrent by default; control flow is done by linking the outputs
of one function to the inputs of another. This makes writing concurrent loops
ridiculously easy: just put two loops next to each other.

I rarely use it because organization is such a pain, but its "data-flow"
paradigm does simplify a lot of logic.

~~~
cdtwoaway
Happy to see another LV user here.

Organization gets much, much easier once you get the big architecture concepts:
producer/consumer patterns, messenger queues, event structures, the actor
framework. Yes, it can be painful, but the newer functionality (clean-up,
ridiculously easy sub-VI creation, ...) subtly improves it.

Btw: If you are using LabVIEW and have never used Quickdrop, try it. It is the
most underutilized and amazing feature.

------
prezjordan
I wish APL/J/K were on here, but I guess that doesn't really change the way I
think about coding... it just blows my mind.

------
saosebastiao
I'm a huge fan of the declarative programming paradigm, but outside of Regexp
and SQL and a handful of other DSLs, it's dead. Its death should be a case
study in Open Source strategy: It died because it became boring before it
became useful. SQL and Regexp have stuck around because they did something
useful immediately.

I think that any future the declarative paradigm has within general-purpose
languages is the kind applied by compilers. For example, x = (a + b) - a can be
reduced to x = b, or even eliminated altogether, with subsequent in-scope
references to x being replaced with b. Another example is dead code
elimination. These forms of declarative optimization let you use an imperative
or functional language immediately but gently introduce you to declarative
benefits, without all the mind-bending that is necessary to optimize pure
declarative code.

~~~
segmondy
You are very mistaken. Prolog is still very much alive. You can find us in
##prolog on freenode. You can build a web application with Prolog. When the
2048 madness was going on, I implemented it in Prolog in about 200 lines in 2
hours, and I had only about 3 weeks of Prolog under my belt at the time. It's
a very powerful concept. I didn't have to figure out how to implement it; I
just broke 2048 down into rules, declared them, and bam, I had a game.

~~~
singold
How/where do you recommend learning Prolog? The concept looks really
interesting and useful, at least to me.

~~~
mindcrime
There are a lot of good Prolog resources on the web. I gathered up a little
list some time ago. You can find it here:

[http://fogbeam.blogspot.com/2013/05/prolog-im-going-to-learn...](http://fogbeam.blogspot.com/2013/05/prolog-im-going-to-learn-prolog.html)

------
z3phyr
Since 'functional' is not mentioned, I will assume that it is mainstream now!

~~~
nly
Or maybe it just won't 'change the way you think about coding'

~~~
lomnakkus
I think that depends a lot on what you mean by "functional". If "functional"
means "everything is immutable" then I can guarantee that you'll learn a lot
by programming that way. If you mean "first-class functions" then I cannot --
although you still might. You could probably program in Scheme (say) as if it
were C with side effects all over the place, but that wouldn't teach you
anything except a different syntax. OTOH, coding something non-trivial in
Haskell would teach you _a lot_ -- even if you don't end up using/liking it.

(Aside: I was somewhat disappointed by the blogger calling out the "Wolfram
Language" as something special or to be admired. It's a ridiculous ad-hoc
hodgepodge of a _language_. I stress the word _language_.)

------
mjb
Other languages to add to this list would be Dijkstra's Guarded Command
Language, and Promela. Promela is especially interesting because of the
(nondeterministic) execution semantics, which provide an extremely interesting
way to model parallelism. In a similar vein, TLA+ is worth a look.

Both Promela (Spin) and TLA+ have active communities and have found fairly
wide use in industry. They are generally used for model checking, model
extraction by guided abstraction, and development by refinement, but can be
used in a much more ad-hoc way to just experiment with parallel ideas.

------
snorkel
10 Ways Buzzfeed-style Headlines Will Forever Be Annoying

~~~
cshimmin
Honestly, my Facebook feed has trained me to simply never click on any article
that begins with a number.

------
sergiosgc
Where is Aspect Oriented Programming and all the other offspring of the
Inversion of Control pattern (Dependency Injection, Dependency Inversion,
...)?

Is this line of evolution in languages considered dead?

~~~
ExpiredLink
Yes, thank heavens, yes!

~~~
platz
These are more tactical (i.e. low-level) techniques than big strategies that
encompass entire languages.

------
kitd
Pointed out elsewhere, but ANI appears to be dead according to its own
tutorial[1]. However Funnel[2] by Martin Odersky/EPFL does a similar job, with
a more explicit nod to Petri nets which are usually used as the basis for
concurrent systems.

[1]
[https://code.google.com/p/anic/wiki/Tutorial](https://code.google.com/p/anic/wiki/Tutorial)
[2] [http://lampwww.epfl.ch/funnel/](http://lampwww.epfl.ch/funnel/)

------
josephschmoe
Code Search is a better way to do declarative programming for non-optimized
solutions. I've been obsessing over this topic for the last few months. Right
now there are limited versions in a few places: Python's howdoi and Visual
Studio's Code Search.

A true Code Search would work like this:

1. Type your search term into your code in a comment line, e.g. "Bubble sort
StampArray by name".

2. Google/Bing/StackOverflow searches for your string, replacing your terms
with generics: it searches for "Bubble sort [an array of objects] by [string
variable]".

3. It takes the code results and shows them to you, replacing all instances of
[string variable] with getName() and all instances of [Object[]] with
StampArray.

4. You pick your favorite.

5. Your IDE adds the code to a "code search module" which you can edit.

6. Your edits get added to the search database.

The best part? You could even put your Declarative Programming engine -inside-
of the Search just by populating initial search results. What about better
code coming to exist in the future, you say? Well, you don't necessarily have
to keep the same result forever. If it's been deprecated, you can re-do the
search.

------
protomyth
I've been thinking a lot about agent-oriented programming. I had a General
Magic device back in the day and later thought the concept of a Telescript
like language as applied more for code organization than code mobility might
be interesting. I guess APIs won, but I still think there is something there.

~~~
abecedarius
[http://erights.org/](http://erights.org/) was not technically like
Telescript, but aimed at a similar-enough vision you might find it
interesting.
[http://erights.org/elib/capability/ode/index.html](http://erights.org/elib/capability/ode/index.html)
for that vision. The section on mobile almost-code at
[http://erights.org/elib/distrib/pipeline.html](http://erights.org/elib/distrib/pipeline.html)
should help to relate it to Telescript.

------
josephschmoe
Dependent types are a wonderful idea so long as I can do a couple of things
with them:

1. Copy-paste them without complications; i.e. "non-null" requires no code
that relies on specific variables unless there's a logical conflict (which
variable?).

2. If it's a known failure, give me a warning. If it's an unknown failure, let
me choose how to deal with it, again in a neutral fashion: I could simply say
@Notnull<skip> and it would just skip the code if the variable is null.

------
keenerd
Declarative programming is a great one, almost a magical experience.

"It feels like I am sitting at the controls of a quantum computer. I've got
all my qubits (terms) all wired together in some complicated expression and
when power is applied every qubit will instantly collapse out of superposition
and crystallize into a perfect answer."

(From something I've been working on,
[http://kmkeen.com/sat/](http://kmkeen.com/sat/) )

------
Blahah
The concurrent-by-default paradigm looks like it could be really useful in
some cases. Does anyone know of any more widely used languages that support it?

~~~
simias
As I mentioned in my other comment Verilog and VHDL are "concurrent by
default" since that's how hardware works anyway.

If you want to experiment with them you don't need an FPGA, you can just start
with a simulator such as Icarus Verilog[1] and a waveform viewer like
gtkwave[2] and get a feel for the language. There are a bunch of tutorials on
the net.

[1] [http://iverilog.icarus.com/](http://iverilog.icarus.com/) [2]
[http://gtkwave.sourceforge.net/](http://gtkwave.sourceforge.net/)

~~~
GotAnyMegadeth
Verilog is Turing complete and can do standard IO, so you probably don't even
need a simulator/wave viewer.

~~~
simias
Yeah, but using waveforms is half the fun! :)

Also, debugging verilog using only $display doesn't sound very fun...

------
aufreak3
Would be good to add Mozart/Oz.

Dealing with process coordination using the resolution of logical variables
gave me a refreshing new perspective. The finite domain constraint system
design in Oz is an awesome example of this in action.

An interesting tidbit - the Mozart/Oz team invented "pickling" before it
caught on with Python.

------
danielweber
I'm in the midst of something else and can't pull out my C++11 book now, but
doesn't C++ have custom types that would let you declare something like "this
must be a positive integer"?

I might be confusing this with custom literals.

~~~
AnimalMuppet
Well, there are unsigned integers. That means a value can't be less than 0.
But if you have a 32-bit unsigned integer and you decrement 0, you get
0xFFFFFFFF. That may or may not be what you want or expect.

You could also create a class that wraps an integer, has a range, and
maintains that range as a class invariant. It could either clip or throw an
exception when an attempt was made to go out of range. But that's just regular
class stuff, so I don't think it's what you had in mind...

------
josephschmoe
Would be pretty cool to have concurrency by default, plus a lock declaration I
could apply to a particular function to fix any concurrency issues. It would
need a new style of debugger/code view specifically for this purpose, though.

------
JupiterMoon
Isn't "Dependent types" just re-inventing how Fortran handles non-allocatable
array and character variables, i.e. those whose length is declared at compile
time using a parameter?

~~~
patrickmay
Common Lisp also supports specialized type declarations like this. (In Common
Lisp type declarations are optional.)

~~~
nbouscal
Common Lisp does not have dependent types. The example given is unfortunate
because it often leads people to think they already have (and understand)
dependent types, when that isn't the case.

------
kazagistar
If a programming paradigm does not change how you think about coding, it isn't
a programming paradigm. Good article though.

------
cowls
I read it, and how I think about coding remains the same as before I read it.

~~~
mcherm
I don't think the idea was that reading the article would change how you
thought about coding. I believe the point was that trying out these unusual
programming paradigms would change how you thought. For the cases on his list
that I have tried (declarative programming and Forth/Joy) I have found that to
be true.

I would add object oriented programming (eg: smalltalk), macro programming
(eg: lisp) and functional programming (eg: Haskell) to the list of things that
change your thinking, but since most of the article's prospective audience has
already heard of those I was happy that he stuck with paradigms that are LESS
well known.

------
joshlegs
.... did .... did you crosspost this from reddit ???

[http://www.reddit.com/r/programming/comments/22nhb2/six_prog...](http://www.reddit.com/r/programming/comments/22nhb2/six_programming_paradigms_that_will_change_how/)

------
SeanLuke
> If you've used SQL, you've done a form of declarative programming

This is so wrong I don't know where to begin.

~~~
dllthomas
Per Wikipedia, _"Common declarative languages include those of database query
languages (e.g., SQL, XQuery), regular expressions, logic programming,
functional programming, and configuration management systems."_

[http://en.wikipedia.org/wiki/Declarative_programming](http://en.wikipedia.org/wiki/Declarative_programming)

"Wikipedia is mistaken" is totally a defensible position, but it should be
defended, so please find a place to begin.

~~~
SeanLuke
Wikipedia is mistaken.

To me, in a declarative programming language you tell the computer what you
want without exactly telling it how to achieve it. It's the compiler's job to
figure that out. The classic example of a declarative programming language is
Prolog.

In databases there are two traditional approaches to expressing queries: the
relational algebra and the relational calculus. One of these (the relational
calculus) is very clearly declarative: you essentially are saying "give me all
students whose ID numbers are less than 1000 and who took a class from someone
who is no longer a member of the faculty." The interpreter figures out what
relational operations should be performed to do this.

The other option is the relational algebra, which to me is very obviously
_not_ declarative: you are telling the system exactly what to do, and giving
it an implicit ordering (though just like in any procedural language it can
change the ordering if it thinks it's a good idea). Thus in the relational
algebra you'd say "get the table of students. Reduce it to those whose ID
numbers are less than 1000. Join it with those students who took a class. Take
the list of faculty. Reduce it to those who are no longer teaching. Join that
with the previously generated students relation. Project out just the student
names."

The primary language for the declarative relational calculus is Datalog.

The primary language for the (non-declarative) relational algebra is SQL.
Though SQL has a few extra gizmos added to compensate for the fact that it's
less expressive than the relational calculus.

~~~
dllthomas
I would appreciate it if you'd at least add a section to the Wikipedia talk
page with your concerns. I've not decided exactly how I feel on the matter,
but they're certainly not entirely groundless.

