From my perspective, this is the problem of this decade in HW and SW engineering. Around 2007, Intel hit the wall with single-threaded CPU scaling: performance went from doubling every 2 years to a few percent of improvement per year. We are at the beginning of this paradigm shift to massively multi-core CPUs, and both the tools and the theory are still in their infancy. In HW there are many promising advances being explored, such as GPUs, Intel's Xeon Phi, new FPGAs, and projects like Parallella.
The software side also requires new tools to drive these new technologies. I think traditional threading will be viewed as a stopgap hack and will be replaced by some form of functional, flow-based, and/or reactive programming models.
A few years ago, I had to help our EEs write some testing software in LabVIEW. I was blown away by how elegantly it handles concurrency and fault tolerance to bad input data. It took no extra design to utilize multiprocessing and multithreading hardware, and the synchronous model in our program eliminated the deadlock and race condition problems that come up when using asynchronous threads.
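To illustrate the model (a toy sketch in Python, not how LabVIEW actually works internally): a node fires only once all of its inputs are ready, so execution order is fixed by the wiring of the graph rather than by thread timing.

```python
# Toy synchronous dataflow evaluator (a sketch, not LabVIEW's actual
# engine): each node fires only once all of its inputs are available,
# so execution order is fixed by the wiring of the graph, not by
# thread timing -- there is nothing left to race on.
from graphlib import TopologicalSorter

# graph: node name -> (list of input nodes, function of those inputs)
graph = {
    "a":   ([],         lambda: 2),
    "b":   ([],         lambda: 3),
    "sum": (["a", "b"], lambda a, b: a + b),
    "out": (["sum"],    lambda s: s * 10),
}

def run(graph):
    deps = {node: set(inputs) for node, (inputs, _) in graph.items()}
    values = {}
    # Visiting nodes in topological order guarantees inputs are ready.
    for node in TopologicalSorter(deps).static_order():
        inputs, fn = graph[node]
        values[node] = fn(*(values[i] for i in inputs))
    return values

print(run(graph)["out"])  # 50
```

Note that independent nodes like "a" and "b" share no state, so a real runtime can run them on separate cores with no extra design effort, which is exactly what made LabVIEW feel effortless.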
Another project with great potential is NoFlo. What SW solutions have others run into in this field?
Don't get me wrong, dataflow programming has its own gotchas (explicitly guaranteeing proper ordering can be tricky), and it's very hard for people from a traditional background to switch over (they're always reaching for the for-loop that isn't there). But it's quite clear to me that I can throw together a complex multimedia system with, say, camera input, some computer vision, effects, and video playback in about 3 hours, whereas it would take somebody starting from scratch in any other language at least twice that time, if not 10 times as long. Moreover, after they've finished, they enter a much longer period of intensive debugging, whereas the dataflow systems I've built have been much closer to bug-free on the first draft.
Whenever I introduce this stuff to engineers, it blows their minds. I'm beginning to feel like it's a matter of serious importance to spread the word far and wide about alternative programming paradigms that make many of the problems of this decade magically disappear and make programming more efficient and easier to learn.
Shoot me an email if you're interested in more thoughts / pointers on this topic – skiptracer at gmail.
Somebody might argue that they could be just as fast with Processing or openFrameworks, but anecdotally, developing in those environments is still 2x slower, and bug fixing goes on forever.
Author of NoFlo here. You're right in that Flow-Based Programming requires a bit of a context switch when compared to traditional programming.
In general things are stateless, and information sharing happens only through information packets (which in NoFlo can be anything you can use as a value in JS... objects, arrays, functions, numbers, etc).
For synchronization we have quite a lot of tools in NoFlo. Information packets can be annotated with metadata ("groups"), which can be used to merge data together, for sorting packets, to synchronize flows, and for routing things to different parts of the graph.
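Roughly, the grouping idea works like this (sketched here in plain Python for brevity, not NoFlo's actual JS API): packets carry metadata, and a downstream node can merge related packets back together purely by that tag.

```python
# Rough sketch (plain Python, not NoFlo's actual JS API) of the
# "group" idea: packets are annotated with metadata, and a downstream
# node merges related packets back together purely by that tag.
from collections import defaultdict

packets = [
    {"group": "user/1", "data": "name=alice"},
    {"group": "user/2", "data": "name=bob"},
    {"group": "user/1", "data": "age=30"},
]

def merge_by_group(packets):
    merged = defaultdict(list)
    for packet in packets:
        merged[packet["group"]].append(packet["data"])
    return dict(merged)

print(merge_by_group(packets))
# {'user/1': ['name=alice', 'age=30'], 'user/2': ['name=bob']}
```

The same tag can just as well drive routing (send each group to a different part of the graph) or synchronization (wait until every member of a group has arrived).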
I just launched the new NoFlo website at http://noflojs.org yesterday. Now the next big task is to write more documentation and examples there on how to do different things. And of course there is Paul Morrison's book on FBP that talks a lot about things like synchronization, looping, and deadlocks.
In general FBP does provide a lot more power to the developer. Being able to see visually where and how things in your software are connected makes it a lot easier to understand the system, and to find potential flaws in your logic.
Do you know of anybody using NoFlo in production? Would be good to put that on your site if you do.
Good information on FBP can be found here: http://www.jpaulmorrison.com/fbp/
It's also interesting to see the relationship between process calculi based dataflow and actor model based agent implementations: http://en.wikipedia.org/wiki/Actor_model_and_process_calculi
I do hope that new developments in pull-based FRP and push-based implementations like Rx lead to some elegant push/pull combination sometime.
A question for me is why we write things like web servers in a programming language at all. After using dataflow systems, I believe there are much better abstractions possible than Rails-style web frameworks.
Now, in specific domains (industrial and control systems, signal processing, even designing GUIs themselves), click-and-drag has worked. It lets subject-matter experts who aren't programmers get stuff done.
But for generic stuff it hasn't worked too well. UML was one such push. Managers were just going to draw a diagram on the screen, click, "Generate Code" and bam, no need for silly code monkeys anymore. Well it hasn't quite worked that way. Now the "Generate Code" button just does an SMTP send to a .ch or .in domain some place.
There is another such system, quite exotic, called DRAKON.
It comes from the Soviet space shuttle program. Yes, they had a successful space shuttle once that flew unmanned and even landed itself (before US space shuttles could do that). Some of the people working on it wanted to bring programming down to non-programmer engineers, and they created that system. There has been some renewed interest in it lately as well, but I can't say it exactly took over the world.
The key is that the dataflow environment specifies a strict interface between "objects" and an overall execution pattern, but you can dig into the stuff you're clicking on and change/optimize the underlying code.
Recently Cycling '74 (who make Max) added a new lower-level dataflow language that maps to C code or alternately GLSL shaders, which makes it super easy for people who don't know how to program to write efficient, auto-optimized audio and graphics routines. Another system along those lines is Faust: http://faust.grame.fr/.
For a more open-ended system that combines dataflow ideas with traditional programming concepts, a time-line editor, and in-text GUI controls, check out Field: http://www.openendedgroup.com/field/.
DRAKON looks like a naive low-level attempt at sticking imperative routines into flowcharts. Diagrammed UML systems sound awful; you probably lose all the power of object-oriented programming by transposing it into a GUI. Signal chain diagram languages like MATLAB's Simulink, Analog Devices' Signal Chain Designer, or Cypress' PSoC Designer are usually very basic and nothing like what a proper dataflow system could be.
I realize this isn't an easy problem to deal with, but surely there has to be a way to bring the advantages of creative development environments like Max to traditional programming, e.g. a GUI for Ruby.
Focusing on the big picture and larger concepts of programming instead of getting lost in syntax would be very welcome for beginners like me.
Do you have examples of products you have built with one of these multimedia coordination languages?
I am familiar with flow-based and rule-based programming, but in my experience these were very high-level abstraction systems which were then used to solve the specialized problems they were well suited for. While you could use them for any kind of programming (they were Turing complete), it wasn't fun, or easy, or made any sense.
Indeed they are typically aimed at specific domains, but I think this is entirely natural for something that's a layer of abstraction above a high-level programming language.
One issue with the systems I'm talking about specifically is that they run in an environment that is a quite heavy app in itself, so building a standalone application or using the environment as part of another application does not work well; you end up with a lot of extra bulk. In order for this model to take off for consumer apps, you would need OS-level support for apps to make use of a single interpreter instance running in the background, or the interpreters would have to be heavily optimized.
But that's not the point. The point is that there are other programming paradigms than interpreted vs. compiled, static vs. dynamic, imperative vs. functional text-based programming languages. The systems I mentioned are for realtime multimedia installations, performances, music-making, etc., but similar paradigms could be useful in other domains.
Also, the audio devices for Ableton Live were prototyped in Max/MSP: http://www.webcitation.org/5uKcsulCc
Could you describe this in more detail? I think it would be interesting. How does LabVIEW do those things and why did you find it elegant?
Currently other solutions are either Matlab (which sounds soul-destroying) or Python (which is still nontrivial).
My experience was very different from yours. I couldn't believe how simple it was to add an additional test in parallel. It was just a copy/paste, update the functions, test parameters, and I/O. If only writing multithreaded code in C++ or Java was that simple. After my experience, I could see how LV could be a good fit for computer vision apps running on highly parallel architectures.
It's too bad there are not more modern, open dataflow languages out there. Massively complex modern CPUs wouldn't exist without tools like Verilog. As software grows in complexity, these tools may also offer some solutions on the software side.
Lightweight processes, per-process heaps, fault isolation, the scheduler implementation, iolists.
Looking at it after years of dealing with other VMs, I find myself nodding my head quite a bit, saying "Yup, they got this right; aha, got this right too," and so on.
Besides Elixir there is also LFE (a Lisp-flavored Erlang), and Joxa, another Lisp-like language for BEAM.
Other systems usually copy or get inspired by the Erlang/BEAM VM, but often they pick up only the concurrency model. The real money is in the fault tolerance. The two are tied together somewhat, but they are different things.
As distributed systems grow larger and larger, it is nice if they scale or can parallelize their concurrent parts, but they are not going to be practical if they fall over whenever something fails. And the failure modes of distributed systems are orders of magnitude worse than those of single-node systems.
But it is also hard to sell. Telling developers "look, this system will isolate faults, and you can restart part of it and upgrade it while it runs" is not going to inspire them unless they are experienced and have been woken up at 3am by a pager because a large system was taken offline by a minor fault that wasn't properly isolated. Many are more impressed by marketing and small speed benchmarks.
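The restart idea is easy to sketch, even if a toy can't capture what makes Erlang's version real. A crude single-threaded imitation in Python (Erlang does this with isolated lightweight processes and per-process heaps, which this doesn't have):

```python
# Crude single-threaded sketch of the supervisor idea: run a worker,
# and if it crashes, restart it instead of taking the whole system
# down. Erlang does this with isolated lightweight processes and
# per-process heaps, which this toy version does not capture.
def supervise(worker, max_restarts=3):
    restarts = 0
    while True:
        try:
            return worker()
        except Exception as exc:
            restarts += 1
            if restarts > max_restarts:
                raise  # give up: escalate the failure upward
            print(f"worker crashed ({exc}); restart #{restarts}")

attempts = []

def flaky():
    # Fails twice (a "minor fault"), then recovers after restarts.
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("minor fault")
    return "ok"

print(supervise(flaky))  # ok
```

The interesting part in a real system is the escalation: if the worker keeps failing, the failure propagates to the supervisor's own supervisor, so a minor fault stays contained instead of taking the whole node down.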
Good luck with that. You might as well attempt to predict the weather for the next decade.
As for the content, it's equally flawed. It's not just one device anymore, and probably not going to be a PC. Well, obviously. And frameworks are apparently a house of cards and things might break down horribly; a bizarre statement if there ever was one, as frameworks empower individual programmers and small upstart teams to do amazing things which would simply be impossible without them.
Here's my prediction: Everything is changing, and everything will continue to change. If you're a programmer, you have to change and keep on top of things. It doesn't matter if you're just starting out or if you're like my genius ex-boss and are currently fixing up a software product at the spry age of 80.
I can do that, with just as much accuracy as the author is striving for. It will be hot in the summer, and it will be cold in the winter. There will be storms in between. Some people may experience snow, while others will get rain instead, depending on your location.
I'm not entirely with him on the evils of frameworks. Software is built on software the way knowledge is built on knowledge. I'm happy to stand on the shoulders of giants. It means I don't have to rewrite what's already been well written (and well tested). There are tradeoffs, sure, and it's really important to understand the hows and whys of the frameworks you select (this is where open source really matters). Still, I think narrowing the problem scope so you can "execute the program you're writing inside your head", and inside the ambitious timeline of your startup, is the great gift of modern computing, not its curse.
Frameworks are the reason I can go and build a top class product in 6 months all on my own. If we didn't have them, each new product would require an army of programmers re-inventing wheels, and only big corporations with lots of money could afford to even attempt it.
Frameworks rock. Abstraction also rocks. Whenever I need something more than once, I make it abstract. I automate it, abstract it, make it re-usable. And 99% of the time - I am not kidding, it's really that high - I will re-use that piece of software, re-run that script, etc.
A senseless article, all in all.
Second, they provide a perverse incentive to force every new application feature into the structure of the framework you decided to use at the beginning of the project.
Frameworks are like debt. They give us a head start at the cost of paying more down the road. Sometimes that head start is worth the higher overall cost and sometimes it's not.
I think libraries are a lot less problematic and still provide most of the benefits of frameworks.
That's part of the future.
I'd love to be able to experiment with both the physical and digital worlds at the same time, cutting out my hands and eyes as the interface with the computer.
I believe it will eventually happen. I'm not exactly sure of the current state of the art, but I've seen research replacing lost limbs with robotic arms that are controlled solely by thoughts. I've also seen images induced into one's vision by stimulation of the brain.
Our brain evolved with I/O devices meant to match our physical environment. As far as I know, we don't have any I/O device meant to convey ideas in themselves. We need to serialize our thoughts into words, images and movements to get them deserialized into other's minds. All this process is lossy and has a limited bandwidth.
Anyway, I'm not talking about things I really know, so I won't elaborate further into this. I try to limit my opinions on HN to stuff that I actually know. You can't fool anyone around here; next thing I know, a brain researcher will come and break all my ideas apart in the comments.
That's meant to exclude telepathy stuff.
Most programmers will develop smart agents that will manipulate a decentralized and unique semantic data source. All of this is quite obvious.
That said, I still see a lot of potential for this sort of programming. One of my side projects is an attempt to bring intentional programming to embedded hardware design.
Have you seen Wolfram Alpha in action?
I was able to achieve automated programs for printing text, simple math operations, string manipulation, and conditionals. After that, programming the fitness methods started getting complicated.
Er...I thought one of the underlying parts of OOP was creating a domain for data, such that derivable attributes and relationships were encapsulated in the data model? The way that R treats data as a first-class citizen is nice for some statistical modeling, but doesn't seem robust enough for all the other ways that we need to organize and munge data.
Increasingly, modern system design tends to place an emphasis on bare, immutable data, whereas OO tends to work with encapsulated, mutable state. If the only thing you're using objects for is namespacing and polymorphism, then there's not much point to designing software within the OO paradigm.
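The contrast in miniature (Python used purely for illustration; the account example is made up): an object encapsulating mutable state versus bare, immutable data plus a pure function that returns a new value.

```python
# Contrast in miniature: an object encapsulating mutable state
# vs. bare immutable data plus a pure function returning a new value.
from dataclasses import dataclass, replace

class Account:                      # OO style: hidden, mutable state
    def __init__(self, balance):
        self._balance = balance
    def deposit(self, amount):
        self._balance += amount     # mutates in place

@dataclass(frozen=True)
class AccountData:                  # data style: bare and immutable
    balance: int

def deposit(acct, amount):
    # Returns a new value; the old one is untouched and safe to share.
    return replace(acct, balance=acct.balance + amount)

a = Account(100)
a.deposit(50)
b0 = AccountData(100)
b1 = deposit(b0, 50)
print(a._balance, b0.balance, b1.balance)  # 150 100 150
```

With the immutable version, any concurrent reader holding `b0` still sees a consistent value after the "deposit", which is exactly why modern system design leans that way.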
I think lambdas more than immutability help one write clean abstractions without traditional OO-objects. See the "Lambda the Ultimate" series of papers for details.
Thanks for contributing to my knowledge!
The problem is the languages and paradigms used, not the framework or how many levels of abstraction you applied to the problem at hand.
The way I see it, writing programs by manipulating data excessively will lead us nowhere but complexity, which unfortunately is introduced by the program itself. Mainstream languages such as C++, Java, C#, etc. should not be taught in schools as if they are the ultimate solution and FP is something impractical; that is a two-faced claim, given that these languages are nowadays being patched with ideas taken from FP.
Another one, talking about how fast IT changes at every chance while sticking with ancient programming languages. When someone points out the dilemma, they claim there are not enough developers for, say, Haskell, Clojure, or Go. Maybe you should fix the education system, morons, instead of building ever more complex frameworks and platforms.
Of course there are particular areas, such as simulations, modelling time-dependent large data sets, embedded development, etc., where some languages will be best suited while others will be overkill or just not fast or viable. Of course, I am not blindly saying "Death to imperative languages" :P However, they shouldn't dominate.
As a final comment: the future of programming is already here. The question is, are we ready for the future?
Then a lot of stuff gets layered on top of this. And sure, languages /do/ jit, but these are generally hothouse flowers, surrounded by an infrastructure provided by C, C++ and assembly. Few systems are native boot without involving a bunch of C.
I'm happy if people are satisfied to work in the layers above all this stuff. Frankly, not many folks (as a percentage of the programming population) can do good kernel level work. But don't pretend that it doesn't exist, or that it is somehow morally inferior to hacking away in Haskell.
Maybe this will change in thirty years; I think that's the time scale required to make a fundamental change in the way we program modern systems.
I'd /love/ to see a native Erlang system, soup to nuts. But there's little economic incentive to make one, given that the lower layers are actually doing a pretty decent job.
Isn’t that part of the problem? There is no technical reason we couldn’t have a language that offered the same fine control and hardware integration as C, compiled to native executable code in a similar way, but was both safer and more expressive. There is no advantage in having an awkward syntax for specifying types or in making all pointers nullable.
Mainstream industrial languages today are a triumph of good enough, and they continue to dominate primarily because of momentum and the size of the surrounding ecosystem rather than technical merit in the language itself. Unfortunately, this creates a vicious circle that reinforces the status quo, and the few organisations with sufficient resources to break that cycle have limited economic incentive to do so.
I can see some better version of the .NET Micro Framework, improved by the open source community(1) and benefiting from the MS ecosystem and tools, becoming fit for some large part of embedded systems work.
I can also see this getting adopted, since it lets embedded developers in small companies, who have some power over which tools to use, learn another skill that can improve their employment opportunities.
(1) Improved speed could be achieved through the Cito compiler project, which might be good enough for many projects. And the language could be made hard real-time using a specific implementation of reference counting, although at some speed/memory cost.
These languages have /great/ technical merit. They offer safety (in various forms) and other interesting technologies that C definitely lacks. They were lauded by academics and industry pundits. So why didn't they take the industry by storm?
Perceived technical merit is a terrible way to choose a language.
Pascal was widely regarded as a great language, a wonderful model, and it was widely used in the 80s by various large companies. Today it is mostly dead. I believe this is because Pascal only did an adequate job of expressing stuff at the hardware and kernel level, and that C was better. Certainly nearly everyone at Apple that I worked with breathed a sigh of relief when it became obvious that it was okay to write C instead of Pascal for new projects. For the most part we'd been writing C for years anyway, just in Pascal. About the only thing that people missed were nested procedures (whereupon, C++).
Your new "adult" language is going to need a set of very compelling offerings over and above "well, it's safer" in order to succeed.
Take a look at things people are doing /to/ C in order to be better:
- "analyze" builds that do control graph analysis and find bugs (not just ones endemic to C, but actual logic bugs, too)
- declarative sugar that helps tools to reason about what things like drivers are trying to do
- control extensions (commonly seen as macros providing 'foreach' like support)
- ways for tools to enforce local conventions (without spending tons of manpower on parsers and so forth)
Come up with a language as good as C at low-level programming, that has great debugger support, offers easy tool plugins, and that has interoperability with the gazillions of libraries already available [take a page from C#'s great interop story here], and you might have something. Go "academic" and just say "this is good for you, use it instead," and the working programmers will see nothing in it for them and ignore you, just as they've ignored or abandoned dozens of other offerings in the last 30 years.
Some possible reasons, based on my limited knowledge of those languages:
Eiffel — Emphasis on simplicity over performance optimisation; emphasis on OO programming style; legal issues around various parts of the ecosystem in the early days
Oberon — Limitations of basic type system, such as a lack of enumeration types and the way coercion of numerical types worked until recent versions
D — Many of the same major strengths and weaknesses as the more established C++; two rival “standard” libraries for a long time
Of course. You can throw in “it’s easier to write” and “it’s more powerful” and you still only have a small part of the big picture, because in reality so much depends on the surrounding ecosystem: development tools, libraries, and so on. However, there is no reason we couldn’t have a language that was superior to C in both safety and expressive power, remained compatible with calling to/from C functions at ABI level for library compatibility and ease of porting, and used a clean grammar to help tool developers.
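The ABI-compatibility point is worth underlining because it already works today: higher-level languages reach existing C libraries through the C calling convention all the time. A small illustration using Python's ctypes to call libc's `strlen` (assumes a Unix-like system; a hypothetical better-than-C language would do this natively rather than through a foreign-function layer):

```python
# ABI-level interop as it already exists today: Python's ctypes
# calling libc's strlen directly through the C calling convention.
# CDLL(None) exposes the symbols already linked into the running
# process, which on Unix-like systems includes libc.
import ctypes

libc = ctypes.CDLL(None)
libc.strlen.argtypes = [ctypes.c_char_p]  # declare the C signature
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # 5
```

A C-replacement that kept this calling convention would inherit the gazillions of existing libraries on day one, which is exactly the interop story being asked for.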
While I don’t disagree with any of your examples, I’m not sure they really tell us anything useful. The absence of other things that people might do could be because they aren’t particularly valuable or it could be because they are valuable but also prohibitively difficult or expensive to achieve starting from C as the foundation.
As UNIX systems started to spread through the industry and take market share from mainframes, developers also wanted to have those utilities on their home systems. This led to C spreading outside its natural environment (UNIX) and "infecting" non-UNIX systems.
As a Pascal refugee, I only touched C when required to do so. Even with its quirks, I find C++ a more welcoming place, thanks to its stronger type system and better abstractions over C.
Like any systems programming language, C will only get replaced if the operating systems vendors force developers to use something else.
This is what Microsoft is doing by transitioning Visual C++ to a pure C++ compiler.
UNIX vendors will never do it, because C is synonymous with UNIX. Although even C compilers tend to be written in C++ nowadays.
Who knows what other OS vendors might still be relevant.
I see where you are coming from, but I don't see how assembly and a layer of C drivers are going to disappear. That is the whole idea behind having layers of abstraction. It is what lets you plug in a mouse and have it work, and then plug in another mouse made by another company and have it also work. There is a lot of firmware, assembly, kernel drivers, and kernel syscalls involved, but it all supports you moving your hand and a pointer moving on the screen correspondingly. We take that for granted, but it is built on an existing infrastructure. It took years, and it wasn't always that way; I remember the older DOS days of having to fish for mouse drivers.
Same with Erlang. Erlang itself is built with C. The BEAM VM is mostly C (maybe with some assembly). It runs on standard hardware with a few standard operating systems.
Now, if we dream of the future, I can see architectures that do something like you suggest. There is one, for instance: Erlang on Xen. The idea is that Erlang runs on a very small hardware footprint, without an OS (since Erlang's standard library pretty much provides a large number of OS-like features). Now, they picked Xen, and you could say "ha, that's cheating; there is also Dom0 running!" But it isn't practical otherwise. Maybe they could have picked a particular motherboard and CPU combination, but then it would be hard to reproduce and test, and lots of things would have to be rewritten.
Take a look anyway, I think you'll be surprised:
Oh, systems programming does exist, and I can understand the role of C. What I am saying is, even if we had a better alternative to C, people would still stick with C.
You are right that there is no economic incentive, but that means the reason itself is mostly economic, not technical.
Of course LOC != 'contributed value'. Much as people don't assemble in the thousands to watch a jukebox progress through a playlist but do for the experience created by a skilled DJ, the last 5% is the most significant 5% in programming (and perhaps in all creative endeavors).
I see very few domain problems where the vast majority of the work hasn't been done for you and you can't just glue Legos together. Network stack? Most languages have an httplib, or you can get libftp or some such from a FOSS repository.
The only real problem is figuring out which API is easier to use, since there are usually competing choices for a lot of these drop-in solutions. Qt or Boost? Django or Bottle? Backbone or Angular?
And besides, someone still has to program these libraries - it isn't turtles all the way down.
Are you insinuating a DJ is not a musician?
That was my initial experiment with self-programming AI, although ultimately, the fitness methods were starting to grow in complexity themselves.
You can see a small version of that in a language like Prolog, where a common pattern is to generalize the problem, because more general problems tend to be easier to solve and reason about (often because they already come with easy-to-discover base cases and steps for a recursive approach).
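Prolog is the natural home for this, but the pattern shows up anywhere recursion does. The classic tiny example, written here in Python: reversing a list is easier to define once you generalize it to "reverse a list onto an accumulator", which has an immediate base case.

```python
# Generalize "reverse a list" to "reverse a list onto an accumulator":
# the more general problem has a trivial base case and an obvious
# recursive step, and plain reverse falls out as a special case.
def rev_onto(xs, acc):
    if not xs:                      # base case: nothing left to move
        return acc
    return rev_onto(xs[1:], [xs[0]] + acc)

def reverse(xs):
    return rev_onto(xs, [])         # the original problem, specialized

print(reverse([1, 2, 3]))  # [3, 2, 1]
```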
This may be true, but in the real world, data and behavior are still intimately tied together. Data-oriented (i.e. functional) programming is going to enhance OO programming, not replace it entirely.
The "real world" is a pure function of time -- therefore functional programming is the "right way" to model it.
Examining our mental model of the "real world" is not going to give us good insight about which programming tools will model it best.
I suppose building basic tools as Apache Thrift services would allow one the ease of prototyping ideas in Python before building a performant system in something like Java or C++.
BTW, Python describes itself as object oriented, so it illustrates my point that OO and functional are not at odds.
I assume it's the libraries available for Python (Pandas, NumPy, Blaze) that are the basis for this quote. Is it also the case for Clojure (as compared to other FP languages)?
My modern computer with an SSD feels subjectively faster/snappier to me than any computer I've owned in the past.
And those devices are packing more punch for their size; just look at the Raspberry Pi and its competitors.
It starts a level below the problems that might be the next things to tackle, and focuses on shiny objects that have already gotten enough traction to be considered concerns of the past 3-5 years.
Short-sighted. Boring. Couldn't dance to it.
1-3 years are feasibly predictable
3-7 are 50/50 hunches (20% CI)
8-9 you might as well be talking jet packs
10+ You aren't talking about saving or controlling the world? Refactor!
10 years from now, we blow IP up. we napster algorithmic experiences because you can't patent wiping your ass
10 years from now, we make devices cheaper than water from scrap plastic
10 years from now, any human can talk to any other human anytime, anywhere
10 years from now, my meta-data is meaningless and we defund NSA because they are deaf dumb and useless
10 years from now, we eat the rich and feed the poor
Why bother looking 10 years ahead without setting some real goals or at least looking at things that might actually drive the innovation in the next 10 years vs what is already well-planned?
We're all going to be 10 years closer to death and you are believing we'll have "smart dust"? A Roomba in every house!
Do you know how long it's been since I first heard that there would be fucking smart dust? You mean we need to lay a powder of infrastructure down to detect what distributed, distant sensors could tell you? Looks like a whale just barfed in the ocean and we got a ton of our dust back online... water's still wet AND salty there! Dial back the Antarctica dust belcher for a minute to balance out the South Pacific by next year... stupid whales, when will you learn?
He should have stopped at sensors. We'll have 1984. Smart dust? We'll be crotch-deep in gray goo.
Whales are like, "IIIIIIII WIIIIIIILLLLLLL DEEEEEEEEEESSSSSSSSTRRRRRRRRRROOOOOOOOYYYYYYYYYYYYYY!!!!"