

Ctrl+B For Concurrency: Visual Programming Languages - evdawg
http://hackety.org/2009/03/13/ctrlB.html

======
henning
Plenty of scientists, engineers, etc. use
<http://en.wikipedia.org/wiki/LabVIEW>, and watching someone use it, you
wouldn't believe how easily you can create, say, a complex artificial neural
network with awesome visualization.

So yes, being biased against a programming paradigm like visual/dataflow
programming just because it's organized visually rather than textually is
silly, in the same way that disliking Lisp because of the parens is silly.

~~~
mechanical_fish
Yes, LabVIEW is great until you learn about functional abstraction. ("Gosh,
you mean I can write this code once and then call it multiple times with
different arguments?") At which point you will start pining for the awesome
expressive power of Visual Basic.

Imagine if, every time you wanted to define a function, you had to open up a
PowerPoint document and draw that function's interface using the mouse. Then
you'd have to call your function by embedding its thumbnail into another
PowerPoint document. (The linking is done by embedding a pathname, which you
pick by using a File Open dialog box.) That's the feel of the LabVIEW
programming experience. It takes vast reserves of patience.
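
For contrast, here is what the "write the code once, call it multiple times"
version looks like as plain text -- a trivial Python sketch with made-up names,
just to show the abstraction being given up:

    # Define the routine once; no separate file, no path-based linking.
    def gain(signal, factor):
        return [x * factor for x in signal]

    raw = [0.1, 0.5, 0.9]
    print(gain(raw, 2.0))   # call it with one set of arguments...
    print(gain(raw, 10.0))  # ...and again with another, in the same file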

You might wonder what happens if you copy your program onto another system, or
-- god help you -- revise one of your functions in a way that breaks some
other program which invisibly depends on it. The answer is: you're screwed.
You can't even ask your version control system for help: Your code is in a
proprietary graphical binary format. Better keep those backups current!

As you can probably tell, I'm a veteran of LabVIEW, like many laboratory
scientists. LabVIEW is to experimental science what Excel is to accountancy.
It's a powerful tool in its domain [1], perhaps even an indispensable one,
which tempts amateurs to extend it further and further, until they find
themselves doing things that really _ought_ to be done using a different tool.
One that can be configured using a text editor.

---

[1] "Make the oscilloscope talk to the spectrum analyzer, then sample the
voltage with this interface card and draw a strip chart of the results. You
have twenty minutes."
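
For a sense of what that sort of job looks like when it lives in a text editor
instead, here is a rough Python sketch using PyVISA; the instrument addresses
and SCPI commands are illustrative only and will vary with the hardware:

    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    scope = rm.open_resource("GPIB0::7::INSTR")      # oscilloscope (made-up address)
    analyzer = rm.open_resource("GPIB0::18::INSTR")  # spectrum analyzer (made-up address)

    print(scope.query("*IDN?"))   # sanity check: are we talking to the right box?
    analyzer.write(":INIT:IMM")   # kick off a sweep (command set varies by instrument)

    # Sample a voltage once a second and append it to a strip-chart log.
    with open("stripchart.csv", "a") as log:
        for _ in range(60):
            reading = scope.query(":MEAS:VRMS? CHAN1")
            log.write(f"{time.time()},{reading.strip()}\n")
            time.sleep(1.0)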

~~~
michaelneale
I know nothing about LabVIEW, but I get your Excel analogy. However, whenever
I have seen adequate (but ugly) Excel "systems" replaced by something else, the
result is far from great, and rarely justifies the effort (but I am just one
data point). Not sure if that flows back to the analogy or not, though.

~~~
mechanical_fish
Your example does flow back, which is why I drew the Excel analogy. LabVIEW is
a fascinating piece of tech. I have had many occasions to loathe it -- what
emacs user wouldn't hate a language that can't be edited as text? -- yet I
love it all the same, and when you work in science you can't really resist
using it. It is fast, and effective, and if you carefully avoid the temptation
to use it for _everything_ it is a joy to use.

Part of its secret is that it fits the problem. You're standing in a lab
plugging things into other things using a snarl of BNC cables. You're not
trying to design the correct abstraction. You're trying to solve your problem
by lashing together the tools that you already have, sometimes in a hilarious
fashion. ("We're out of voltmeters, but we do have a $50k sampling
oscilloscope. We can use that!") LabVIEW code often looks like spaghetti, but
that's partly because its use case also looks like spaghetti.

There is more than one kind of power tool. Computer scientists like general-
purpose high-level tools, for good reason. They understand that "the pen is
mightier than the sword", and they spend a lot of time collecting pens. But
sometimes you need to cut something in a hurry, and at those times there's
really no substitute for a well-designed sword. You can use a pen to design a
sword -- perhaps even a _better_ sword -- or to convince the folks with swords
to move somewhere else. But it's a long and unreliable process. And it can be
immensely satisfying to just pick up a sword and stab something!

------
jksmith
As an aside, I like the comment about the developer being unapologetic about
using Delphi for development.

The unfortunate thing about the argument over what makes a programming language
productive (a dense language versus the libraries available to it) is that the
argument is strongly rooted in open source, IMO, where the tools remain mostly
deficient in the productivity department compared to working in a language
environment like Delphi, where millions and millions of lines of code have
held up to commercial rigor.

Delphi is a counterexample showing that no matter what advanced language
features are available to the programmer in another language -- Lisp, Haskell,
Clojure, whatever -- these languages still can't compete under commercial
conditions with Delphi in a native Win32 desktop environment. Delphi is the
hemi engine of the legacy world, and would be a great candidate for
integrating some advanced features to make it a good tool for ramping into a
more advanced functional mindset.

I've written my share of Delphi code, and I've been sick of declaring
variables, sections, typing begin/end, and writing a host of other tiresome
cruft for a long time. But for development speed it really is hard to beat.
Programmers who have been successful in the Windows world expect the total
package - language, libraries, dev environment. Maybe none of the three are
perfect, but together they still make the alternative of just language and
libraries mostly unacceptable under commercial conditions.

------
dave_au
Citeseer is down at the moment so I can't find the paper on the big hurdles
any given visual programming language has to deal with.

I've used LabVIEW a little, wrote the backend code for a startup that was
trying to create an ambitious visual language, and have a friend doing a PhD in
the field (which is where I got the paper reference from).

The biggest problem I've found - and it's on the list of problems from the
paper - is that there's a mental cost associated with the layout and wiring in
2D. It starts off as fairly negligible but eventually becomes a huge monkey on
your back. An auto-layout feature can alleviate that, but then you've got to
keep track of where everything is.

Some people seem to be immune to this, so I guess the proportion of people
affected (along with the average size of a logical unit in the language --
function, module, program, etc.) might in turn affect the potential of a
visual language to gain mindshare.

Talking to a number of people who've used / written / studied them about this
problem over a couple of beers, the best solution we've come up with is to have
a visual language for beginners and a textual dataflow language for the more
advanced users, with the ability to convert between the two for the people
making the transition.
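
To make the "textual dataflow" half of that concrete, here is a minimal Python
sketch of the idea (node and function names are made up): nodes are plain
functions and the wiring is explicit data, so the same graph could in principle
be drawn visually or edited as text.

    def source():   return [1, 2, 3, 4]
    def double(xs): return [x * 2 for x in xs]
    def total(xs):  return sum(xs)

    # graph = {node name: (function, names of upstream nodes)}
    graph = {
        "nums":    (source, []),
        "doubled": (double, ["nums"]),
        "sum":     (total,  ["doubled"]),
    }

    def run(graph):
        results = {}
        def eval_node(name):
            if name not in results:
                fn, deps = graph[name]
                results[name] = fn(*[eval_node(d) for d in deps])
            return results[name]
        for name in graph:
            eval_node(name)
        return results

    print(run(graph)["sum"])  # -> 20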

Thoughts?

------
nihilocrat
Honestly, with such a bird's-eye view of vvvv I'm not entirely convinced it's
a revelation. It obviously is good for rendering pretty pictures, and the ease
of concurrency is a big boon. I'm not convinced, however, that the language
has the flexibility you'd find in a general-purpose language, but I'd like to
be proven wrong. Has anyone written a webapp in it? A network multiplayer
game? I don't mean to denigrate the value of domain-specific languages, but
the code snippets only convince me that it's a cool tool for its intended
applications.

Looking at some of the less-than-elegant "source code" examples on their site,
I can see that there doesn't seem to be anything that magically restricts
people from writing spaghetti (pun intended) code.

------
amichail
Check out my visual programming approach to teaching binary search tree
algorithms:

<http://opsis.sourceforge.net/quals/project.pdf>

------
peregrine
I find this fascinating. I've always been a visual person, and drawing my code
on paper with connections and such has always helped. I will be looking into
this.

