Hacker News

In order to make a serious comparison of textual and visual programming, you'd be better off using a serious system such as Labview for the comparison, not a... toy for children.

Agreed. Simulink is an extremely serious visual programming language that powers much of our modern high technology: spacecraft and the like.

The author's point about assuming reduced complexity makes me think he doesn't understand how complex mathematical function models are, even when the block diagram looks very simple to the untrained observer.

Simulink isn't really a visual programming language? It's a block-diagram tool for simulation and analysis. You link together simulation blocks, and Simulink runs an ODE solver to "solve" for the time-varying output.

Simulink can generate C code for productionization, but to write custom algorithms you typically need S-functions, which are usually written in C, C++ or Fortran.
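A minimal sketch of what "linking blocks and running an ODE solver" amounts to, in plain Python: a step input feeding a first-order lag (gain plus integrator), advanced with fixed-step forward Euler, which is roughly what Simulink's simplest fixed-step solver does. The block names and constants here are illustrative, not taken from any real model.

```python
# Sketch of a block-diagram solver's inner loop: a gain block feeding
# an integrator, solved with fixed-step forward Euler. Constants are
# illustrative, not from any real Simulink model.

def simulate(u=1.0, tau=0.5, dt=0.001, t_end=2.0):
    """First-order low-pass filter: dx/dt = (u - x) / tau."""
    x, t = 0.0, 0.0
    while t < t_end:
        dxdt = (u - x) / tau  # the "wired" diagram collapsed to one expression
        x += dt * dxdt        # integrator block: one forward Euler step
        t += dt
    return x

# After ~4 time constants the output is close to the input step.
print(simulate())
```

The point is that the diagram defines a system of differential equations; the "program" you draw is really the right-hand side handed to a numerical integrator.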

Visual environments let you set up topologies quickly and reliably. It's not really programming as such, though -- more like scaffolding. Any kind of complex logic is still more efficiently achieved in code, because many abstract ideas cannot be efficiently expressed graphically. Visual abstractions are more useful for ideas that can be concretized (typically data-flow type ideas).

Source: used and taught Simulink for many years for control systems engineering.

I think it really depends on how you define a programming language. Simulink can generate a variety of code from models, most commonly C as you mentioned, but Simulink models can also be compiled into an executable using the Real-Time Workshop toolbox, which packages the simulation engine into the model. You're right about needing code scaffolding like S-functions to compile to C, but those physical model simulations are often used directly in the control system program.

While not efficient, thanks to Turing completeness and its analog relatives, anything you can implement in code you could implement in Simulink flow. It took me about an hour to gin up Pong, for example. I think that if it's fair to consider Scratch a visual programming language, then Simulink fits the bill.

I'm not entirely sure about that. Simulink solves an ODE (more or less) system internally. How did you write Pong in it, with interactive controls?

I'm also not sure if Simulink (without S-functions, .m or any external programming language) is Turing complete, and even if it is, Turing-completeness is a very low bar for a programming system.

Not trying to be contrarian, but from my experience, despite its many logic subsystems, Simulink isn't really meant to be a general-purpose programming system so much as a simulation system for time-varying outputs. It's difficult or impossible to write most normal programs in it.

I wrote Pong by encoding the ball position as a continuous complex variable and used a star chart (I-Q) with minimal persistence as the output. The paddle inputs were just variable sliders bound to a limit-threshold equation. Ball motion was implemented as a normal physical system with momentum and boundaries.
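A hedged sketch of that approach in Python (not the original Simulink model): the ball is a single complex position plus a complex velocity, and walls reflect the motion by negating one component. Paddle logic is omitted and all constants are invented.

```python
# Sketch of the "ball as a complex variable" idea described above.
# Walls reflect the ball by negating the real or imaginary velocity
# component; constants and the missing paddle logic are illustrative.

def step(pos, vel, dt=0.01, width=1.0, height=1.0):
    """Advance the ball one step; reflect off walls, clamping position."""
    pos += vel * dt
    if not (0.0 <= pos.real <= width):    # left/right wall (or paddle plane)
        vel = complex(-vel.real, vel.imag)
        pos = complex(min(max(pos.real, 0.0), width), pos.imag)
    if not (0.0 <= pos.imag <= height):   # top/bottom wall
        vel = complex(vel.real, -vel.imag)
        pos = complex(pos.real, min(max(pos.imag, 0.0), height))
    return pos, vel

pos, vel = complex(0.5, 0.5), complex(0.9, 0.4)
for _ in range(1000):
    pos, vel = step(pos, vel)
print(pos, vel)
```

Reflection preserves speed (only the sign of one component flips), which is what makes the "normal physical system of momentum and boundaries" framing work.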

Simulink models are Turing complete. You can set up a discrete-clock simulation (very common with the DSP toolbox) and implement flip-flops and Boolean logic. They are also, for lack of a better term, Shannon complete: they have all the analog components needed to satisfy general function computability. They also have flow control outside of those constructs, and full memory storage. If you were infinitely bored and long-lived, you could write Windows in Simulink flow logic and run the modeler to make a VM.
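To make the flip-flop argument concrete, here is a rough Python analogue of a unit-delay block under a discrete clock: each step latches its input and emits the previously stored bit, and chaining such blocks gives registers and sequential logic. This is an illustration of the idea, not Simulink code.

```python
# Sketch of the Turing-completeness argument: Boolean blocks plus a
# clocked unit delay give you flip-flops, hence registers and memory.

def d_flip_flop():
    """Return a step function: q(t+1) = d(t). State lives in the closure."""
    state = {"q": 0}
    def step(d):
        q = state["q"]   # emit the stored bit
        state["q"] = d   # latch the input on this clock edge
        return q
    return step

# Two flip-flops chained make a 2-bit shift register: memory from logic.
ff1, ff2 = d_flip_flop(), d_flip_flop()
outputs = []
for bit in [1, 0, 1, 1, 0, 0]:
    outputs.append(ff2(ff1(bit)))
print(outputs)  # the input stream, delayed by two clock ticks
```

Once you have delayed state plus Boolean combination, the usual construction of counters, state machines, and ultimately arbitrary computation follows.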

I agree with you here. Simulink absolutely isn't meant to be a general purpose programming language. It's an extremely domain-specific programming language, but one that in its specific domain offers considerable advantages over textual programming. I don't want to advocate writing software in Simulink because it's a bad idea, and I only do it myself in very limited cases because I have a hobby of analog computing and Simulink is very good for that at a certain level of abstraction.

Mostly, I just take umbrage with the article and the author's reasoning. Simulink is a counterpoint to some of his arguments, but those counterpoints were made well elsewhere in the thread, so I was just providing an example.

Sorry if I am starting to drone on or come across hostile.

There are people who swear by Labview, especially those more familiar with electronics than code. There are very good researchers I know of in soft condensed matter physics who know they can't code but feel confident using Labview.

One of the interesting things I found was that the 2-dimensional layout helped a lot in remembering where stuff was: this was especially useful in larger programs.

I believe Labview barely scratches the surface of what is possible.

Having spent a lot of time using Labview, I would agree that it can be an amazingly useful tool (highly performant code that easily interfaces with data acquisition devices). That being said, LabView shows the limitations of visual coding:

* Sharp learning curve
* Difficult to find help or code snippets online
* Lack of decent version control
* Finding the function you want is hard (requires navigating several submenus)
* All of the icon symbols look the same

Also, while LabView offers a great platform for an object-oriented coding style, I feel like the 2D layout always results in a messy sprawl rather than layers of abstraction. This could be because LabView is used by people with less traditional software background, or it may be that making new functions is a bit of a pain in LV.

Some of your complaints come from the fact that LabVIEW is pretty niche. It would need at least an order of magnitude more developers to see numbers like C/C++. That's why you're not finding much online help.

There is also the fact that LabVIEW is old. Like REALLY old. LabVIEW came out over 30 years ago, and as such it has a lot of legacy cruft, along with lacking some more modern amenities such as source control that scales well. LabVIEW was also designed by a hardware-first company and has always been geared towards building out that hardware ecosystem.

Finally, yeah, poor programming practices exist in all languages; in visual languages they make programs hard to read. Sometimes this forces you to be clever and reduce the flow complexity by increasing the logic or mathematical complexity, which is its own can of programming worms we could debate for hours.

Fair points. A lot of the issues with visual programming are related to the supporting tools -- something the author mentions. That being said, text input easily supports search, pattern matching and code sharing, so what are the benefits of visual coding that justify building all these tools?

In my experience it is harder to implement good programming practices in a visual programming environment. For example, abstracting, consolidating and refactoring visual code is much more challenging because of the sprawl of wires...

Code sharing is actually really good in visual languages if done right, because you are forced to have well-defined inputs and outputs for your nested structure. Search is getting better and is not bad in Simulink. Pattern matching would be an interesting research project. Those big rat's nests of wires typically come about because the people using visual languages (especially Simulink and LabVIEW) are only programming as a secondary or tertiary job: it's more important to get something working out the door than to do it well, properly and maintainably. Besides, this is hardly limited to visual languages; I'm sure nearly everyone here has gone back to some code they wrote in the past and found it nearly indecipherable.

The main benefit of visual languages is that they let domain experts work in ways that are natural to the problem and closer to what they are used to. Some problems, such as physical systems, can be represented very easily visually but are a big pain in code. Data flow and timing diagrams lend themselves very well to visual descriptions and can help prevent race conditions. WYSIWYG editors open up software domains to huge swaths of people who go on to produce some incredible work. Yeah, ultimately people run up against the limitations of these programming environments, and then you either transition to a less friendly but more scalable language, end up with a spaghetti mess, or give up.

I think the biggest thing text-based code development has going for it is that text is inherently "open" and simple to share, so tools get built for it, while most visual languages tend to be proprietary, niche, and locked down. Maybe one day we will see a solid FOSS visual programming language/environment that hits above its weight and can drop into other languages specifically for handling certain types of problems. I think that would be ideal, but articles like this one, which come close to straw-man arguments, don't help get anyone excited to work on it.

I was very impressed by Mark Elendt's talk at CppCon about the Houdini system. It contains a very sophisticated visual programming "language".

My view on this is that text is one of the best media we have for representing precise, information-dense data.

What I got from the Houdini presentation was that, when someone is trained to produce precise, information-rich data using other media (e.g. artists), then other representations can be just as effective.

Or an enterprise platform with actual enterprise customers: https://www.outsystems.com

Or their main competitor (with an online IDE): https://www.mendix.com/ which was sold to Siemens earlier this year for $730M

It's also a long-term maintenance challenge, especially when you have complex models.

Great for prototyping stuff fast, but hard to maintain in production. It was designed for people who aren't programmers to be rapidly productive especially when interfacing with hardware (in a lab environment). However, debugging complex topologies can be a real challenge.

We have Labview models that are being rewritten in conventional programming languages due to aforementioned challenges.

It ultimately has to be, if it's to be maintained. LabVIEW is fine for basic prototyping; the problem is that EEs tend to think they've created the Mona Lisa of program design when they've built a barely functional prototype, and then they distribute it as a finished product.

I find LabView absolutely horrible once you reach a certain level of complexity. We are converting most LabView code to C# and Measurement Studio and it's so much easier to deal with.

LabVIEW is a... toy for children.

People use what works for them. For prototyping it works fine. Not my cup of tea. I'm not a systems programmer, I only write tools in Python, Bash, and still awk. Since I work in a primarily *nix-based environment, I use mainly Kate as a poor-man's IDE and the shell, usually Bash.

I've got friends telling me I should switch over to VS now that it runs on Linux, but I'm too stuck in my ways to change, and I like my workflow minimalist as possible. Most of the time, I rapidly prototype something on a development server and then once I know the idea is feasible, I'll move over to Kate and write it more cleanly, since Kate has Konsole built in. I can make changes and run them immediately.

Don't be rude.

LabVIEW is used in production in a lot of serious environments.

I've been in those environments. I've had to try to repair horrific "code" written in it. I've participated in "code reviews" where the straightness of the lines was carefully assessed in the massive mess of spaghetti code they had created and thought was good code.

LabVIEW is a toy for EEs to write prototypes in. That would have been okay if it had stayed there -- but the problem is that people are actually distributing "applications" using it, and trying to maintain them is nearly impossible.

I still don't get why you call it a toy, or just for prototypes. Maybe think somewhat broader and not just about the bad experience(s) you had with it. For the things it's good at, it just works, and it's pretty hard to find alternatives (well, Measurement Studio is OK, but for simple things it's usually still a bit more work than Labview). The hard part is figuring out what Labview is good at and not making the mistake of trying to use it for everything. Sounds like that's where most of your bad experiences come from.

Here's an example of a place where Labview just shines: I needed something to plot 'rolling' analog and digital signals, i.e. basically provide a visualization of the inputs of a combined analog/digital input card, with the ability to pause the thing, each line in a different color, data cursors, etc. The data is acquired by C++ code, but since Labview has this stuff built in, getting the whole thing up and running is just a matter of setting up a communication protocol between the C++ part and Labview. I just went for TCP/IP and got the thing up and running in a couple of hours. It has been used like that for years now. Not exactly a prototype, nor a toy. It just does this one thing, it's all we needed, and it does it extremely well.
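As a rough illustration of that kind of acquisition-to-display link (the frame layout below is invented, not the poster's actual protocol): fixed-size binary frames over a TCP byte stream, with a receive loop that accumulates bytes until a full frame arrives.

```python
# Sketch of a simple sample-streaming protocol between an acquisition
# process and a plotting process. The frame layout (one double timestamp
# plus four float channels) is invented for illustration.
import socket
import struct

FRAME = struct.Struct("<d4f")  # little-endian: timestamp + 4 channels

def send_frame(sock, t, channels):
    sock.sendall(FRAME.pack(t, *channels))

def recv_frame(sock):
    buf = b""
    while len(buf) < FRAME.size:          # TCP is a byte stream: loop
        chunk = sock.recv(FRAME.size - len(buf))
        if not chunk:
            raise ConnectionError("peer closed")
        buf += chunk
    t, *channels = FRAME.unpack(buf)
    return t, channels

# Loopback demo: a connected socket pair stands in for C++ <-> Labview.
tx, rx = socket.socketpair()
send_frame(tx, 0.001, [1.0, 0.5, -0.5, 0.0])
t, channels = recv_frame(rx)
print(t, channels)
```

Fixed-size frames keep the receiver trivial (no delimiter parsing), which is why this style of ad hoc protocol is often enough for lab tooling.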

I call it a toy because that's what it is. It's used almost exclusively to bypass doing things the right way. As you highlight quite well with your example.

Well, enlighten us then. What is the right way according to you? In case of my example: what is a better way than getting things done in a few hours, with no bugs whatsoever, with all functionality needed, and just working? I'm actually truly curious as to what would be better and more right.

To begin with, the LabVIEW runtime is riddled with bugs. Every application I've ever seen written in LabVIEW is expected by its creators and its users to crash randomly and continually for no explainable reason.

Yes, drawing out some things in a visual IDE and having it work is nice. The problem is that eventually LabVIEW is going to update their runtime and it won't work anymore. Now what?

Take the time and do it right up front, and then when it needs to be updated people aren't cursing "whoever decided to do this in LabVIEW years ago" as they often do in these situations.

do it right up front,

Ok, but again what do you suggest is 'right' then, any example? Because as laid out, for me, it is right, since we have zero problems.

Every application I've ever seen written under labview is expected by its creators and its users to crash randomly and continually for no explainable reason.

Hmm, strange. Sounds to me like those creators must be doing something wrong. I mean, we've been running a couple of Labview applications for 10+ years and I honestly think none of them ever crashed (where 'crash' means suddenly stops working without apparent reason because of a bug in Labview itself, not coming to a halt due to programmer error). We also went through, I don't know, 3 or 4 updates without much problem. Again, I think it just comes down to what I said first: the hard part might be figuring out how to do Labview right, I guess.

Or the Labview runtime is abysmal and broken, and you simply haven't tripped over any of its thousands of broken cases in the minimal use you've made of it.

I say again; Don't be rude.

Especially don't be mean about other people's environments for the sake of being mean about their environments. Which is what you just did.

It's not "rude", and it's not "mean" to describe something accurately. Using those adjectives is simply projecting your emotions onto a technical discussion.

LabVIEW is a cash cow that NI pushes on schools and junior EEs to make them think they're software developers; it lets them write horrible cruft that is generally impossible to maintain. In almost every case I've seen, it's actually easier to rewrite the whole mess from scratch than to try to decipher what someone did a decade ago as a hack that got turned into something people depend on.

You're clearly one of NI's rabid fans; you believe what you believe, but if you take criticism of a really poor tool personally, well, you know the rest.

I can create things in LabVIEW if I have to, I've done it and I've debugged and rewritten other people's abominations in it as well. The concept of visual programming languages is badly flawed for a variety of reasons and LabVIEW portrays each of them thoroughly.

What's "the right way"? Spending an extra few days on making your own visualization code, throwing away flexibility in the process?

Time spent doing it right up front pays off when someone has to maintain it.
