The Future of Programming (oreilly.com)
131 points by christianbryant 270 days ago | comments


slacka 270 days ago | link

> "For the programmer, that means we must grapple with problems such as concurrency, locking, asynchronicity,.."

From my perspective, this is the problem of this decade in HW and SW engineering. Around 2007, Intel hit the wall with single-threaded CPU scaling: it has gone from doubling every 2 years to a few % increase per year. We are at the beginning of this paradigm shift to massively multi-core CPUs. Both the tools and the theory are still in their infancy. In HW there are many promising advances being explored, such as GPUs, Intel Phi, new FPGAs, and projects like Parallella.

The software side also requires new tools to drive these new technologies. I think traditional threading will be viewed as a stopgap hack and will be replaced by some form of functional, flow-based, and/or reactive programming models.

A few years ago, I had to help our EEs write some testing software in LabVIEW. I was blown away by how elegantly it handles concurrency and tolerates bad input data. It took no extra design to utilize multiprocessing and multithreading hardware. The synchronous model in our program eliminated the deadlock and race-condition problems that come up when using asynchronous threads.

Another project with great potential is NoFlo. What SW solutions have others run into in this field?

-----

msutherl 270 days ago | link

I've been using multi-media 'coordination languages' like Max/MSP/Jitter, Pure Data, SuperCollider, Touch Designer, and vvvv for years after I got hooked on the model from my first synthesizer (the Nord Micro Modular). When I had to learn traditional programming in school (Java at first), it felt hopelessly barbaric to me. Imperative text-based languages are incredibly time consuming, error-prone, and difficult to use in comparison.

Don't get me wrong, dataflow programming has its own gotchas – explicitly guaranteeing proper order can be tricky – and it's very hard for people from a traditional background to switch over (they're always reaching for the for-loop that isn't there). But it's quite clear to me that I can throw together a complex multimedia system with, say, camera input, some computer vision, effects, and video playback in about 3 hours, whereas it would take somebody starting from scratch in any other language at least twice that time, if not 10 times[1]. Moreover, after they've finished, they enter a much longer period of intensive debugging, whereas dataflow systems I've built have been much closer to bug-free on the first draft.

Whenever I introduce this stuff to engineers, it blows their minds. I'm beginning to feel like it's a matter of serious importance to spread the word far and wide about alternative programming paradigms that make many of the problems of this decade magically disappear and make programming more efficient and easier to learn.

Shoot me an email if you're interested in more thoughts / pointers on this topic – skiptracer at gmail.

[1] Somebody might be able to argue that they could be just as fast with Processing or openFrameworks, but anecdotally developing in those environments is still 2x slower and bug fixing goes on forever.

-----

bergie 270 days ago | link

Don't get me wrong, dataflow programming has its own gotchas – explicitly guaranteeing proper order can be tricky – and it's very hard for people from a traditional background to switch over

Author of NoFlo here. You're right in that Flow-Based Programming requires a bit of a context switch when compared to traditional programming.

In general, things are stateless, and information sharing happens only through information packets (which in NoFlo can be anything you can use as a value in JS: objects, arrays, functions, numbers, etc.).

For synchronization we have quite a lot of tools in NoFlo. Information packets can be annotated with metadata ("groups"), which can be used to merge data, sort packets, synchronize flows, and route things to different parts of the graph.
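
To make that concrete, here is a minimal sketch of the model in plain Python. This is illustrative only (not NoFlo's actual API, and the component/port names are made up): stateless components communicate solely through information packets carrying group metadata.

    from queue import Queue
    from threading import Thread

    class Packet:
        """An information packet: a value plus 'group' metadata."""
        def __init__(self, data, groups=()):
            self.data = data
            self.groups = list(groups)   # used for routing/merging/sync

    def upcase(inport, outport):
        # A stateless component: it transforms packets and shares nothing else.
        while True:
            ip = inport.get()
            if ip is None:               # end-of-stream marker
                outport.put(None)
                return
            outport.put(Packet(ip.data.upper(), ip.groups))

    # Wire a tiny graph: source -> upcase -> sink
    a, b = Queue(), Queue()
    for word in ("hello", "noflo"):
        a.put(Packet(word, groups=["demo"]))
    a.put(None)

    Thread(target=upcase, args=(a, b)).start()
    while True:
        ip = b.get()
        if ip is None:
            break
        print(ip.groups, ip.data)        # ['demo'] HELLO, ['demo'] NOFLO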

I just launched the new NoFlo website at http://noflojs.org yesterday. Now the next big task is to write more documentation and examples there on how to do different things. And of course there is Paul Morrison's book on FBP that talks a lot about things like synchronization, looping, and deadlocks.

In general FBP does provide a lot more power to the developer. Being able to see visually where and how things in your software are connected makes it a lot easier to understand the system, and to find potential flaws in your logic.

-----

msutherl 270 days ago | link

Listen to this guy. I've known about FBP for a while, but haven't delved into it beyond hanging out on the mailing list briefly[1]. It seems like there are many great concepts there that go beyond the stuff I use, and I bet there are many more to discover in that space, if only more people were doing research in the area.

Do you know of anybody using NoFlo in production? Would be good to put that on your site if you do.

[1] https://groups.google.com/forum/#!forum/flow-based-programmi...

-----

Paradigma11 269 days ago | link

You might want to look at Rx (Reactive Extensions) for .NET/Mono. I have been using Rx in combination with F# for a while and really like it so far. http://msdn.microsoft.com/en-us/data/gg577609.aspx I do not understand why MSFT emphasizes the LINQ/monad aspect of Rx so much and doesn't advertise its push-based dataflow aspect. People would have a much easier time getting additional information and perspectives.

Good information on FBP can be found here: http://www.jpaulmorrison.com/fbp/

It's also interesting to see the relationship between process-calculus-based dataflow and actor-model-based agent implementations: http://en.wikipedia.org/wiki/Actor_model_and_process_calculi I do hope that new developments in pull-based FRP and push-based implementations like Rx lead to some elegant push/pull combination sometime.
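
For anyone who hasn't seen the push-based side, here is a hand-rolled sketch of the idea in Python. This is not the Rx API, just the shape of it: producers push values through a pipeline of subscribers instead of consumers pulling them.

    class Observable:
        def __init__(self):
            self._subscribers = []

        def subscribe(self, on_next):
            self._subscribers.append(on_next)

        def push(self, value):
            # Push-based: the source drives its subscribers.
            for on_next in self._subscribers:
                on_next(value)

        def map(self, f):
            out = Observable()
            self.subscribe(lambda v: out.push(f(v)))
            return out

        def filter(self, pred):
            out = Observable()
            self.subscribe(lambda v: out.push(v) if pred(v) else None)
            return out

    clicks = Observable()
    clicks.filter(lambda x: x % 2 == 0).map(lambda x: x * 10).subscribe(print)
    for i in range(5):
        clicks.push(i)   # prints 0, 20, 40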

-----

msutherl 270 days ago | link

There's something I've ignored here, which is that most of these systems sit at a higher level of abstraction than text-based languages. Modules (called "objects") in Max, for instance, are written in C. Max is more analogous to Unix than to an actual programming language (by its author's own admission).

A question for me is why we write things like web servers in a programming language at all. After using dataflow systems, I believe there are much better abstractions possible than Rails-style web frameworks.

-----

rdtsc 270 days ago | link

These ideas are not new. And _general_ click-and-drag programming has been "just around the corner" since GUIs became popular.

Now, in specific domains (industrial & control systems, signal processing, even designing GUIs themselves), click-and-drag has worked. It lets non-programmers who are subject matter experts get stuff done.

But for generic stuff it hasn't worked too well. UML was one such push. Managers were just going to draw a diagram on the screen, click "Generate Code" and bam, no need for silly code monkeys anymore. Well, it hasn't quite worked that way. Now the "Generate Code" button just does an SMTP send to a .ch or .in domain someplace.

There is another such system, quite exotic, called DRAKON.

https://en.wikipedia.org/wiki/DRAKON

It comes from the Soviet space shuttle program. Yes, they had a successful space shuttle once that flew unmanned and even landed itself (before US space shuttles could do that). Some of the people working on it wanted to bring programming down to non-programming engineers, and created that system. There has been some interest in it lately as well, but I can't say it exactly took over the world.

-----

msutherl 270 days ago | link

In my casual experience with such systems, they are extremely limited compared to something like Max, which is in turn extremely limited and badly designed.

Max is not just 'click and drag' – you can do quite complex configuration, cross-modulation of signals and parameters, etc. plus general purpose programming. You can also script things with JavaScript or Lua and write new routines ("objects") in C, Java, languages that compile to Java, and JavaScript.

The key is that the dataflow environment specifies a strict interface between "objects" and an overall execution pattern, but you can dig into the stuff you're clicking on and change/optimize the underlying code.

Recently Cycling '74 (who make Max) added a new lower-level dataflow language that maps to C code or, alternatively, GLSL shaders, which makes it super easy for people who don't know how to program to write efficient, auto-optimized audio and graphics routines. Another system along those lines is Faust: http://faust.grame.fr/.

For a more open-ended system that combines dataflow ideas with traditional programming concepts, a time-line editor, and in-text GUI controls, check out Field: http://www.openendedgroup.com/field/.

DRAKON looks like a naive low-level attempt at sticking imperative routines into flowcharts. Diagrammed UML systems sound awful – you probably lose all the power of object-oriented programming by transposing it into a GUI. Signal chain diagram languages like MATLAB's SIMULINK[1], Analog Devices' Signal Chain Designer[2], or Cypress' PSoC Designer[3] are usually very basic and nothing like what a proper dataflow system could be.

[1] http://www.mathworks.com/products/simulink/ [2] http://www.analog.com/en/content/Signal_Chain_Designer/fca.h... [3] http://www.cypress.com/?id=2492&source=productshome

-----

nikster 270 days ago | link

Same for most user interfaces. I am writing client-side iOS apps, and frankly, doing most of that stuff in Obj-C, or any text-based language, seems to make very little sense. It's a tool that seems ill suited to the task.

-----

thirdsun 270 days ago | link

Can't agree enough. As someone who is currently taking his first steps with serious web frameworks like Rails, but has much more experience in Max/MSP/Jitter and FileMaker, I keep wondering why there isn't a more visual approach to traditional programming.

I realize this isn't an easy problem, but certainly there has to be a way to bring the advantages of creative development environments like Max to traditional programming, e.g. a GUI for Ruby.

Focusing on the big picture and the larger concepts of programming, instead of getting lost in syntax, would be very welcome for beginners like me.

-----

nikster 270 days ago | link

Alternative programming paradigms - what the original article _should have been_. Absolutely exciting and needed.

Do you have examples of products you have built with one of these multi media coordination languages?

I am familiar with flow-based and rule-based programming, but in my experience these were very high-level abstraction systems used to solve the specialized problems they were well suited for. While you could use them for any kind of programming - they were Turing complete - it wasn't fun, or easy, or made any sense.

-----

msutherl 270 days ago | link

I've used (and continue to use) them exclusively for one-off interactive installations and research prototypes (I don't work in the 'commercial' sector), but I believe a number of people have apps made with Max in the App Store. Max is not purpose built for creating applications, but you can bundle a "standalone" which comes with your 'patch' and a copy of the 'interpreter'.

http://cycling74.com/2012/04/19/get-your-max-standalone-on-a...

Indeed they are typically aimed at specific domains, but I think this is entirely natural for something that's a layer of abstraction above a high-level programming language.

One issue with the systems I'm talking about specifically is that they run in an environment that is quite a heavy app in itself, so building an application or using the environment as part of another application does not work well – you end up with a lot of extra bulk. In order for this model to take off for consumer apps, you would need OS-level support so apps could share a single interpreter instance running in the background, or the interpreters would have to be heavily optimized.

But that's not the point – the point is that there are other programming paradigms than interpreted vs. compiled, static vs. dynamic, imperative vs. functional text-based programming languages. The systems I mentioned are for realtime multi-media installations, performances, music-making, etc., but similar paradigms could be useful in other domains.

-----

nikatwork 270 days ago | link

I've built a number of audio toys in Max, e.g. a basic granular synth and a sampler. The build process was very different from imperative programming; it took a while to wrap my head around. Very fun to play with though.

Also, the audio devices for Ableton Live were prototyped in Max/MSP: http://www.webcitation.org/5uKcsulCc

-----

dkersten 270 days ago | link

I did some work in Max4Live two years ago, and while I'd been into dataflow programming (visual and textual) for a number of years prior, it really changed my perspective on what I want from a language. Max/MSP-style languages (I've played with SynthMaker and a few others too) have a few major advantages over traditional text-based languages, especially for exploratory programming or prototyping. The main problem cited against these languages is that code quickly devolves into a mess of lines - literal spaghetti code - but with proper software engineering principles this hasn't been a problem for me; it's just that most users are not trained software engineers. One thing that really jumped out at me is how not having to name things until you're good and ready (because you can simply connect stuff visually without naming or labeling) means you can focus on designing the code and experimenting, rather than trying to come up with names as most text-based languages force you to. Visual debugging in Max/MSP was also quite pleasant.

-----

ilovecomputers 270 days ago | link

Just when you think you've found all the environments for creative coding, you discover new ones and wonder who are these people outside of my clique who use them?

-----

msutherl 270 days ago | link

Those are just some of the big ones! The list goes on: Polycode, Field, Overtone, ChucK, LuaAV, Plask, Lubyk, and many more.

-----

ilovecomputers 269 days ago | link

Polycode looks remarkable! It's like Unity3D for creative coders.

-----

gruseom 270 days ago | link

A few years ago, I had to help our EEs write some testing software in LabVIEW. I was blown away by how elegantly it handles concurrency and tolerates bad input data.

Could you describe this in more detail? I think it would be interesting. How does LabVIEW do those things and why did you find it elegant?

-----

TheLegace 270 days ago | link

I honestly had to use LabVIEW to generate a heart pulse (a simulation for a pacemaker). God, I hated it so much; so much so as to make one of my passwords "labviewsux". I wish it would die, but that's just my opinion.

-----

toyg 270 days ago | link

Eh, not much love lost for LabVIEW here either. It put me off visual programming for good.

-----

slacka 270 days ago | link

Yes, it's true LabVIEW has its own issues. It's a memory hog, and the UI is cluttered and doesn't scale well for large projects. I don't think LV will be THE model for future programming, but I do think its synchronous dataflow programming model could be part of the solution to the concurrency issues languages face today in the massively parallel domain.

My experience was very different from yours. I couldn't believe how simple it was to add an additional test in parallel: just copy/paste, then update the functions, test parameters, and I/O. If only writing multithreaded code in C++ or Java were that simple. After my experience, I could see how LV could be a good fit for computer vision apps running on highly parallel architectures.
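
As a rough text-based analogy (Python, with made-up test names): when each test is an independent node with its own inputs, adding one more in parallel really is just another entry in the list.

    from concurrent.futures import ThreadPoolExecutor

    def run_test(name, params):
        # Stand-in for driving instruments / DUT I/O.
        return name, all(p > 0 for p in params)

    tests = [
        ("voltage_sweep", [1, 2, 3]),
        ("current_limit", [4, 5, 6]),
        ("thermal_check", [7, 8, 9]),   # the "copy/paste" addition
    ]

    with ThreadPoolExecutor() as pool:
        for name, passed in pool.map(lambda t: run_test(*t), tests):
            print(name, "PASS" if passed else "FAIL")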

It's too bad there aren't more modern, open dataflow languages out there. Massively complex modern CPUs wouldn't exist without tools like Verilog. As software grows in complexity, these tools may also offer some solutions on the software side.

-----

greenmountin 269 days ago | link

You may not like LabVIEW, but it is extremely useful in some areas. For lab scientists who need to iterate quickly between taking useful data and running the experiment, it's nice to just copy and paste instead of naming new variables, etc.

Currently other solutions are either Matlab (which sounds soul-destroying) or Python (which is still nontrivial).

-----

andyl 270 days ago | link

Erlang / Elixir

-----

rdtsc 270 days ago | link

No doubt. BEAM VM is a beautiful thing.

Lightweight processes, per process heap, fault isolation, scheduler implementation, iolists.

Looking at it after years of dealing with other VMs, I find myself nodding my head quite a bit, saying "Yup, they got this right; aha, got this right too," and so on.

Besides Elixir there is also LFE (Lisp Flavored Erlang), and Joxa, another Lisp-like language for the BEAM.

Other systems often copy or take inspiration from Erlang and the BEAM VM, but they usually pick up only the concurrency model. The real money is in fault tolerance. The two are somewhat tied together, but they are different.

As distributed systems grow larger and larger, it is nice if they scale or can parallelize their concurrent parts, but they are not going to be practical if they can't survive failure. And the failure modes of distributed systems are orders of magnitude worse than those of single-node systems.

But it is also hard to sell. Telling developers "look, this system will isolate faults, and you can restart part of it and upgrade it while it runs" is not going to get them inspired unless they are experienced and have been woken up at 3am by a pager because a large system was taken offline by a minor fault that wasn't properly isolated. Many are more impressed by marketing and small speed benchmarks.
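
For the curious, a toy sketch of the supervision idea, using OS processes in Python to fake per-process heaps and fault isolation. This is nothing like the real BEAM internals, just the shape of "let it crash, then restart":

    import multiprocessing as mp
    import time

    def worker():
        time.sleep(0.1)
        raise RuntimeError("minor fault")   # simulated crash

    def supervisor(max_restarts=3):
        # The fault stays inside the worker process; the supervisor restarts it.
        for attempt in range(1, max_restarts + 1):
            p = mp.Process(target=worker)
            p.start()
            p.join()
            if p.exitcode == 0:
                return
            print("worker died (exit %s), restart %d" % (p.exitcode, attempt))
        print("restart limit reached, escalating")

    if __name__ == "__main__":
        supervisor()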

-----

pron 270 days ago | link

Definitely. Also, for Scala/Java there's Akka[1], for Clojure/ClojureScript there's core.async[2], and for Clojure/Java there's Quasar/Pulsar[3].

[1] http://akka.io/

[2] https://github.com/clojure/core.async

[3] http://puniverse.github.io/pulsar/

-----

yareally 270 days ago | link

Also Scala Async[1], modeled after C#'s async/await, though it still has a few "gotchas" in it.

[1] https://github.com/scala/async

-----

toddmorey 270 days ago | link

> On frameworks: "Why should we use computers like this, simultaneously building a house of cards and confining computing power to that which the programmer can fit in their head? Is there a way to hit reset on this view of software?"

I'm not entirely with him on the evils of frameworks. Software is built on software the way knowledge is built on knowledge. I'm happy to stand on the shoulders of giants. It means I don't have to rewrite what's already been well written (and well tested). There are tradeoffs, sure, and it's really important to understand the hows and whys of the frameworks you select (this is where open source really matters). Still, I think narrowing the problem scope so you can "execute the program you’re writing inside your head"—and inside the ambitious timeline of your startup—is the great gift of modern computing, not its curse.

-----

nikster 270 days ago | link

You're too kind.

Frameworks are the reason I can go and build a top class product in 6 months all on my own. If we didn't have them, each new product would require an army of programmers re-inventing wheels, and only big corporations with lots of money could afford to even attempt it.

Frameworks rock. Abstraction also rocks. Whenever I need something more than once, I make it abstract. I automate it, abstract it, make it re-usable. And 99% of the time - I am not kidding, it's really that high - I will re-use that piece of software, re-run that script, etc.

A senseless article, all in all.

-----

fauigerzigerk 270 days ago | link

Frameworks have a lot of issues. First of all, they tend to exclude each other, hence preventing you from reusing code that might be a better fit for a particular problem.

Second, they provide a perverse incentive to force every new application feature into the structure of the framework you decided to use at the beginning of the project.

Frameworks are like debt. They give us a head start at the cost of paying more down the road. Sometimes that head start is worth the higher overall cost and sometimes it's not.

I think libraries are a lot less problematic and still provide most of the benefits of frameworks.

-----

nikster 270 days ago | link

This article is based on a flawed premise: "The goal is to be able to describe the essential skills that programmers need for the coming decade."

Good luck with that. You might as well attempt to predict the weather for the next decade.

As for the content, it's equally flawed. It's not just one device anymore, and probably not going to be a PC. Well - obviously. And frameworks are apparently a house of cards and things might break down horribly - a bizarre statement if there ever was one, as frameworks empower individual programmers and small upstart teams to do amazing things that would simply be impossible without them.

Here's my prediction: Everything is changing, and everything will continue to change. If you're a programmer, you have to change and keep on top of things. It doesn't matter if you're just starting out or if you're like my genius ex-boss and are currently fixing up a software product at the spry age of 80.

-----

freehunter 269 days ago | link

>You might as well attempt to predict the weather for the next decade.

I can do that, with just as much accuracy as the author is striving for. It will be hot in the summer, and it will be cold in the winter. There will be storms in between. Some people may experience snow, while others will get rain instead, depending on your location.

-----

sarreph 270 days ago | link

I'm most looking forward to typing my program requirements into the STACK-OVERFLOW-PARSING-INTERPRETING-COMPILING-MACHINE and having it gobble up all the nuggets of information into a fully-formed program.

That's part of the future.

-----

AYBABTME 270 days ago | link

Looking forward to the day I'll put on my mind-reading headband and go for a ride while I program in thought.

-----

tunesmith 270 days ago | link

Not sure why this was downvoted... I've often wondered what it would take to develop a programming language one could create in while physically mobile. I do some of my best programming/thinking while walking around, since it's conducive to inspiration, but then I have to wait until I get home behind my keyboard to try and implement something.

-----

AYBABTME 270 days ago | link

Yup, I was serious. Maybe downvoters thought I was being sarcastic.

I'd love to be able to experience both the physical and digital worlds at the same time, dropping my hands/eyes as the interface to the computer.

I believe it will eventually happen. I'm not exactly sure of the current state of the art, but I've seen research replacing lost limbs with robotic arms controlled solely by thought. I've also seen images induced into one's vision by stimulation of the brain.

Our brains evolved with I/O devices meant to match our physical environment. As far as I know[1], we don't have any I/O device meant to convey ideas in themselves. We need to serialize our thoughts into words, images and movements to get them deserialized into other minds. This whole process is lossy and has limited bandwidth.

Anyway, I'm not talking about things I really know, so I won't elaborate further. I try to limit my opinions on HN to stuff I actually know. You can't fool anyone around here; next thing I know, a brain researcher will come and pick all my ideas apart in the comments.

[1]That's meant to exclude telepathy stuff

-----

bennyg 270 days ago | link

I also would like to see stuff like this. I hope it evolves to the point where I can see the whole algorithm in my head (not as code, but as a process, I guess), versus thinking "okay brain, for loop starting at x == 0, while x is less than this array's count."

-----

miguelrochefort 270 days ago | link

The future of computing is design by contract and intentional programming. You define your goal, and that's about it. Most technicalities will be delegated to machines.

Most programmers will develop smart agents that will manipulate a decentralized and unique semantic data source. All of this is quite obvious.

-----

chad_oliver 270 days ago | link

We've been making this prediction for decades, and it still hasn't arrived. I think the problem is that as our software tools become more capable, we tackle larger projects. As time goes on we'll be able to solve a greater number of problems by 'defin[ing] your goal, and that's about it', but we won't be able to solve a larger proportion of problems.

That said, I still see a lot of potential for this sort of programming. One of my side projects is an attempt to bring intentional programming to embedded hardware design.

-----

Sven7 270 days ago | link

>it still hasn't arrived

Have you seen Wolfram Alpha in action?

-----

Peaker 270 days ago | link

For me, Wolfram Alpha has always just worked on the examples given by them, but when I try my own inputs it fails on almost all of them, even though my inputs are just variants of the same queries.

-----

chad_oliver 269 days ago | link

Yeah, that's part of the reason why I said that "we'll be able to solve a greater number of problems by 'defin[ing] your goal, and that's about it'". Progress will happen at a steady rate, but the problem space will expand even faster.

-----

hexagonc 268 days ago | link

I agree with you, but all you've really said is that we will rely on general AI to accomplish our goals. The stupider the AI, the more you have to define yourself. You'll have to define the terms used in the description of your goal as well as terms used by those terms. This sounds like defining requirements. The problem with this is that, lacking general AI, the effort it takes to define your goals precisely will probably be equivalent to writing a program anyway. The only difference between defining requirements and coding is people writing requirements can rely on a lot of background knowledge by the reader. Most of us that have worked in the corporate world know how poor technical requirements can be.

-----

count 270 days ago | link

This sounds suspiciously like PROLOG...

-----

antrix 270 days ago | link

Curious to know: other than SQL, are there any other examples of successful technologies that achieve this intentional, goal-driven programming?

-----

primaryobjects 270 days ago | link

This sounds like genetic algorithm programming (see my other comment below: https://news.ycombinator.com/item?id=6080492). Writing a program consists of defining the end state. The GA then runs through thousands of epochs, getting closer and closer to the end state (as measured by a fitness score for each program), until a solution is found.

I was able to automatically generate programs for printing text, simple math operations, string manipulation, and conditionals. After that, writing the fitness methods started getting complicated.
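
For those who haven't seen one, the fitness-driven loop looks roughly like this (a Python sketch that evolves a target string rather than a program, for brevity):

    import random
    import string

    TARGET = "hello"
    ALPHABET = string.ascii_lowercase

    def fitness(candidate):
        # Score: how many characters already match the desired end state.
        return sum(a == b for a, b in zip(candidate, TARGET))

    def mutate(candidate):
        i = random.randrange(len(candidate))
        return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

    population = ["".join(random.choice(ALPHABET) for _ in TARGET)
                  for _ in range(100)]
    for epoch in range(1000):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            print("solved at epoch", epoch)
            break
        # Keep the fittest half, refill with mutated copies of survivors.
        survivors = population[:50]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(50)]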

-----

dualogy 269 days ago | link

"Answer Set Programming" or more generally, Constraint Programming.

-----

coldcode 270 days ago | link

Looking at all the languages near the top of language popularity chart, I don't think OO is going away. Last I checked Python was still an OO language. If you look at all the jobs on the job boards OO is still the dominant flavor as it has been for 20 years or so (only plain C is the oddball). Will that still be the case 20 years from now?

-----

weavejester 270 days ago | link

I think we're starting to see a shift away from the areas that OOP is traditionally comfortable with.

Increasingly, modern system design tends to place an emphasis on bare, immutable data, whereas OO tends to work with encapsulated, mutable state. If the only thing you're using objects for is namespacing and polymorphism, then there's not much point to designing software within the OO paradigm.

-----

cldr 270 days ago | link

I have noticed this too; all the "functional" languages I've seen (i.e. those with immutable data) encourage working with bare data. Is encapsulation only necessary for mutable data?

-----

weavejester 270 days ago | link

I think immutability removes one of the reasons for encapsulation: controlling state change. With that gone, the disadvantages of encapsulation might outweigh its advantages in a lot of cases.

-----

KMag 269 days ago | link

Also note that many languages that encourage immutable data also have first-class functions with closures, which are also a very powerful tool for encapsulation and abstraction.

I think lambdas more than immutability help one write clean abstractions without traditional OO-objects. See the "Lambda the Ultimate" series of papers for details.
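
A quick Python illustration of that kind of closure-based encapsulation: the state is reachable only through the returned functions, with no class and no exposed mutable field.

    def make_counter():
        count = 0                 # captured by the closures below
        def increment():
            nonlocal count
            count += 1
            return count
        def value():
            return count
        return increment, value

    inc, val = make_counter()
    inc(); inc()
    print(val())   # 2; nothing outside can touch `count` directly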

-----

hexagonc 269 days ago | link

Encapsulation also helps enforce interfaces. Anything outside of an object should only interact with it via an external interface. This allows you to reason about the interaction of objects independently of the implementation of those objects. In theory, this allows you to change the implementation of an object without the interacting objects knowing, thus reducing the overall complexity of the program as well as potential bugs.

-----

thoughtpalette 269 days ago | link

I had to google Polymorphism and Immutable objects.

Thanks for contributing to my knowledge!

-----

dudurocha 270 days ago | link

It's the same as saying that NoSQL will undermine the good old relational DBs.

-----

christianbryant 270 days ago | link

Though many see it as science fiction, quantum "programming" shouldn't be excluded from the list. I realize he aimed to generate a practical discussion around where we are now and what we need to do in the immediate (5 years) future, but just as nanotech really needed to get off the ground in people's heads before it could get a foothold in popular culture as an actual technology, quantum computing is in need of more public analysis and simulation. He might have added a last section there titled "And Beyond..." for topics like this :)

-----

goldfeld 270 days ago | link

Are there technologies aimed at quantum programming available and accessible today, even in a completely experimental state?

-----

mietek 270 days ago | link

Check out Quipper:

http://arxiv.org/pdf/1304.5485v1.pdf

-----

mike_esspe 270 days ago | link

QCL: http://tph.tuwien.ac.at/~oemer/qcl.html

-----

christianbryant 270 days ago | link

Good reference point here for QC simulators/environments under multiple languages, but QCL noted below by Mike is probably the best jump start.

http://www.quantiki.org/wiki/List_of_QC_simulators

-----

ericHosick 270 days ago | link

In the future, programming will not be done through coding but through composition. I'm a bit biased on this prediction.

-----

Goosey 270 days ago | link

It already is though, isn't it? As zanny points out (https://news.ycombinator.com/item?id=6081020) we already make heavy use of libraries and software packages (I don't see much difference between an embedded webserver library and hosting in a standalone webserver). Even discounting components such as the OS I'll blindly assert that nearly every significant program created today has a runtime path that is 95% 3rd party components, if you were somehow able to measure the LOC passed through.

Of course LOC != 'contributed value'. Much as people don't assemble in the thousands to see a jukebox progress through a playlist but do for the experience created by a skilled DJ, the last 5% is the most significant 5% in programming (and perhaps all creative endeavors).

-----

alatkins 270 days ago | link

Kind of like how OO was going to give us massive libraries of off-the-shelf software components with which we could just 'wire up' programs?

-----

zanny 270 days ago | link

You generally can, though. All the web servers, all the frameworks, all the packages. If I wanted to make a graphing calculator in Python, I wouldn't write a reverse Polish calculator or do complex string parsing on user input; I'd use numpy and sympy, with a Qt GUI, probably written in QML.
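
For example, the core "glue" for that calculator is only a few lines (sympy to parse, numpy to evaluate; the Qt GUI would just plot the arrays):

    import numpy as np
    import sympy

    expr = sympy.sympify("sin(x) * exp(-x/5)")            # parse user input
    f = sympy.lambdify(sympy.Symbol("x"), expr, "numpy")  # compile to a numpy function

    xs = np.linspace(0, 20, 500)
    ys = f(xs)          # vectorized evaluation; hand xs/ys to the GUI to plot
    print(ys[:5])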

I see very few domain problems where the vast majority of the work hasn't been done for you, and you just glue Legos together. Network stack? Most languages have an httplib, or you could get libftp or some such from a FOSS repository.

The only real problem is figuring out which API is easier to use, since there are usually competing choices among these drop-in solutions: Qt or Boost? Django or Bottle? Backbone or Angular?

-----

alatkins 270 days ago | link

Yes I agree, but software reuse isn't exactly new. The commenter was obviously referring to functional programming techniques, and I was trying to point out that other silver bullets have been spruiked in the past, only to fall short.

And besides, someone still has to program these libraries - it isn't turtles all the way down.

-----

lifeisstillgood 270 days ago | link

That's the difference between a musician and a DJ

-----

duaneb 269 days ago | link

> That's the difference between a musician and a DJ

Are you insinuating a DJ is not a musician?

-----

bergie 270 days ago | link

Working on that: http://noflojs.org/

-----

Peaker 270 days ago | link

What's the difference?

-----

narzac 270 days ago | link

Well, the article touches on some good points; it made me think...

The problem is the language and paradigms used, not the framework or how many levels you abstracted the problem at hand.

The way I see it, writing programs by excessively manipulating data will lead us nowhere but to complexity, which unfortunately is introduced by the program itself. Mainstream languages such as C++, Java, C#, etc. should not be taught in schools as if they were the ultimate solution and FP were something impractical; that claim is two-faced, given that nowadays these very languages are being patched with ideas stolen from FP.

Another one: proclaiming at every chance how fast IT changes, while sticking with ancient programming languages. And when someone points out the dilemma, claiming there aren't enough developers for, say, Haskell, Clojure, or Go. Maybe you should fix the education system, morons, instead of building ever more complex frameworks and platforms.

Of course there are particular areas (simulations, modelling large time-dependent data sets, embedded development, etc.) where some languages will be best suited while others will be overkill, or just not fast or viable enough. I am not blindly saying "death to imperative languages" :P However, they shouldn't dominate.

As a final comment: the future of programming is already here. The question is whether we are ready for it...

-----

kabdib 270 days ago | link

Whether you like it or not, your world runs on C, C++ and assembly. These languages form the ninety-percent-plus core of modern computing's foundation. Lift the hood of nearly any embedded system, BIOS or OS and you'll find these in heavy use. You'll find C code in the networking layers that let you talk to the world, and in the very light switches that let you go to the bathroom at night. C runs your dishwasher, your car, your elevators, and probably your toothbrush.

Then a lot of stuff gets layered on top of this. And sure, languages /do/ JIT, but these are generally hothouse flowers, surrounded by an infrastructure provided by C, C++ and assembly. Few systems boot natively without involving a bunch of C.

I'm happy if people are satisfied to work in the layers above all this stuff. Frankly, not many folks (as a percentage of the programming population) can do good kernel level work. But don't pretend that it doesn't exist, or that it is somehow morally inferior to hacking away in Haskell.

Maybe this will change in thirty years; I think that's the time scale required to make a fundamental change in the way we program modern systems.

I'd /love/ to see a native Erlang system, soup to nuts. But there's little economic incentive to make one, given that the lower layers are actually doing a pretty decent job.

-----

Chris_Newton 270 days ago | link

Whether you like it or not, your world runs on C, C++ and assembly. These languages form the ninety-percent-plus core of modern computing's foundation.

Isn’t that part of the problem? There is no technical reason we couldn’t have a language that offered the same fine control and hardware integration as C, compiled to native executable code in a similar way, but was both safer and more expressive. There is no advantage in having an awkward syntax for specifying types or in making all pointers nullable.

Mainstream industrial languages today are a triumph of good enough, and they continue to dominate primarily because of momentum and the size of the surrounding ecosystem rather than technical merit in the language itself. Unfortunately, this creates a vicious circle that reinforces the status quo, and the few organisations with sufficient resources to break that cycle have limited economic incentive to do so.

-----

kabdib 270 days ago | link

There have been many contenders, notably Eiffel, Oberon and D. There are many others that I don't immediately remember the names of.

These languages have /great/ technical merit. They offer safety (in various forms) and other interesting technologies that C definitely lacks. They were lauded by academics and industry pundits. So why didn't they take the industry by storm?

Perceived technical merit is a terrible way to choose a language.

Pascal was widely regarded as a great language, a wonderful model, and it was widely used in the 80s by various large companies. Today it is mostly dead. I believe this is because Pascal only did an adequate job of expressing stuff at the hardware and kernel level, and that C was better. Certainly nearly everyone at Apple that I worked with breathed a sigh of relief when it became obvious that it was okay to write C instead of Pascal for new projects. For the most part we'd been writing C for years anyway, just in Pascal. About the only thing that people missed were nested procedures (whereupon, C++).

Your new "adult" language is going to need a set of very compelling offerings over and above "well, it's safer" in order to succeed.

Take a look at things people are doing /to/ C in order to be better:

- "analyze" builds that do control graph analysis and find bugs (not just ones endemic to C, but actual logic bugs, too)

- declarative sugar that helps tools to reason about what things like drivers are trying to do

- control extensions (commonly seen as macros providing 'foreach' like support)

- ways for tools to enforce local conventions (without spending tons of manpower on parsers and so forth)

Come up with a language as good as C at low-level programming, that has great debugger support, offers easy tool plugins, and that has interoperability with the gazillions of libraries already available [take a page from C#'s great interop story here], and you might have something. Go "academic" and just say "this is good for you, use it instead," and the working programmers will see nothing in it for them and ignore you, just as they've ignored or abandoned dozens of other offerings in the last 30 years.

-----

Chris_Newton 270 days ago | link

There have been many contenders, notably Eiffel, Oberon and D. [...] So why didn't they take the industry by storm?

Some possible reasons, based on my limited knowledge of those languages:

Eiffel — Emphasis on simplicity over performance optimisation; emphasis on OO programming style; legal issues around various parts of the ecosystem in the early days

Oberon — Limitations of basic type system, such as a lack of enumeration types and the way coercion of numerical types worked until recent versions

D — Many of the same major strengths and weaknesses as the more established C++; two rival “standard” libraries for a long time

Your new "adult" language is going to need a set of very compelling offerings over and above "well, it's safer" in order to succeed.

Of course. You can throw in “it’s easier to write” and “it’s more powerful” and you still only have a small part of the big picture, because in reality so much depends on the surrounding ecosystem: development tools, libraries, and so on. However, there is no reason we couldn’t have a language that was superior to C in both safety and expressive power, remained compatible with calling to/from C functions at ABI level for library compatibility and ease of porting, and used a clean grammar to help tool developers.

Take a look at things people are doing /to/ C in order to be better:

While I don’t disagree with any of your examples, I’m not sure they really tell us anything useful. The absence of other things that people might do could be because they aren’t particularly valuable or it could be because they are valuable but also prohibitively difficult or expensive to achieve starting from C as the foundation.

-----

pjmlp 270 days ago | link

C's success in the industry is largely tied to UNIX's success.

As UNIX systems started to spread through the industry and take market share from mainframes, developers also wanted to have those utilities on their home systems. This led to C spreading outside its natural environment (UNIX) and "infecting" non-UNIX systems.

As a Pascal refugee, I only touched C when required to do so. Even with its quirks, I find C++ a more welcoming place, thanks to its stronger typing and better abstractions compared to C.

Like any systems programming language, C will only get replaced if the operating systems vendors force developers to use something else.

This is what Microsoft is doing by transitioning Visual C++ to a pure C++ compiler.

UNIX vendors will never do it because C is synonymous with UNIX. Although even C compilers tend to be written in C++ nowadays.

Who knows what other OS vendors might still be relevant.

-----

ippisl 270 days ago | link

At least in embedded systems, this looks like something that can be done the open source way. The success of the Arduino and the mbed in building popular ecosystems has shown us that.

I can see some better version of the .NET Micro Framework, improved by the open source community (1) and benefiting from the MS ecosystem and tools, becoming fit for a large part of embedded systems work.

Also, I can see this getting adopted since it lets embedded developers in small companies, who have some power over which tools to use, learn another skill that can improve their employment opportunities.

(1) Improved speed could be achieved by using the Cito compiler project, which might be good enough for many projects. And making the language hard real-time could be done with a specific implementation of reference counting, although at some speed/memory cost.

-----

rdtsc 270 days ago | link

> I'd /love/ to see a native Erlang system, soup to nuts.

I see where you are coming from, but I don't see how assembly and a layer of C drivers are going to disappear. That is the whole idea behind having layers of abstraction. It is what lets you plug in a mouse and it works, and then another mouse made by another company and it also works. There is a lot of firmware, assembly, kernel drivers, and kernel syscalls involved, but it all supports you moving your hand and a pointer moving on the screen correspondingly. We take that for granted, but it is built on an existing infrastructure. It took years, and it wasn't always like this. I remember older DOS days, having to fish for mouse drivers.

Same with Erlang. Erlang itself is built with C. The BEAM VM is mostly C (maybe with some assembly). It runs on standard hardware with a few standard operating systems.

Now, if we dream of the future, I can see architectures that do something like you suggest. There is one, for instance: Erlang on Xen. The idea is that Erlang runs on a very small hardware footprint, without an OS (since Erlang's standard library already provides a large number of OS-like features). Now, they picked Xen, and you could say "ha, that's cheating, there is also Dom0 running!" But it isn't practical otherwise. Maybe they could have picked a particular motherboard and CPU combination, but then it would be hard to reproduce and test, and lots of things would have to be rewritten.

Take a look anyway, I think you'll be surprised:

http://erlangonxen.org/

-----

narzac 269 days ago | link

You do realize your answer is based on the very problem I pointed out. Let me put it this way: I am saying "we should wear better shoes", and you are saying "even if you don't like them, you are wearing these right now, and they do a decent job". Yes, that is what I am talking about :)

Oh, systems programming does exist, and I understand the role of C. What I am saying is that even if we had a better alternative to C, people would still stick with C.

You are right that there is no economic incentive, but that reason is itself mostly economic, not technical.

-----

puredanger 270 days ago | link

Sounds like a recipe for Strange Loop http://thestrangeloop.com/sesssions !

-----

ShardPhoenix 270 days ago | link

> We’re making faster and more powerful CPUs, but getting the same kind of subjective application performance that we did a decade ago.

My modern computer with an SSD feels subjectively faster/snappier to me than any computer I've owned in the past.

-----

kriro 269 days ago | link

Increasing the problem space (as opposed to constantly shrinking it, which the author suggests as a trend) can actually make things easier to solve.

You can see a small version of that in a language like Prolog, where a common pattern is to generalize the problem, because more generalized problems tend to be easier to solve and reason about (often because they already come with easy-to-discover base cases and steps for a recursive approach).

-----

brianberns 270 days ago | link

> In the mathematical world, data just is, it has no behavior, yet the rigors of C++ or Java require developers to worry about how it is accessed.

This may be true, but in the real world, data and behavior are still intimately tied together. Data-oriented (i.e. functional) programming is going to enhance OO programming, not replace it entirely.

-----

Peaker 270 days ago | link

I could counter your mental model of the real world with:

The "real world" is a pure function of time -- therefore functional programming is the "right way" to model it.

Examining our mental model of the "real world" is not going to give us good insight about which programming tools will model it best.

-----

textminer 270 days ago | link

I've moved recently from building data manipulation and machine learning systems in Python to a stack that's primarily C++, and it's striking how much less nimble I now feel transforming that data or in building machine learning pipelines.

I suppose building basic tools as Apache Thrift services would allow one the ease of prototyping ideas in Python before building a performant system in something like Java or C++.

-----

brianberns 270 days ago | link

I suspect that's probably due to C++ being a lower-level language than Python, not due to any inherent problems with OO programming.

BTW, Python describes itself as object oriented, so it illustrates my point that OO and functional are not at odds.

-----

pjmlp 270 days ago | link

Many developers tend to think OO == Java/C#/C++ as they never learned other OO paradigms.

-----

primaryobjects 270 days ago | link

I'm still hoping for computer programs to be written without humans http://www.primaryobjects.com/CMS/Article149.aspx

That was my initial experiment with self-programming AI, although ultimately, the fitness methods were starting to grow in complexity themselves.

-----

Peaker 270 days ago | link

Defining the spec would be the "programming".

-----

danso 270 days ago | link

> The prevailing form of programming today, object orientation, is generally hostile to data. Its focus on behavior wraps up data in access methods, and wraps up collections of data even more tightly. In the mathematical world, data just is, it has no behavior, yet the rigors of C++ or Java require developers to worry about how it is accessed.

Er...I thought one of the underlying parts of OOP was creating a domain for data, such that derivable attributes and relationships were encapsulated in the data model? The way that R treats data as a first-class citizen is nice for some statistical modeling, but doesn't seem robust enough for all the other ways that we need to organize and munge data.

-----

bsg75 270 days ago | link

> there’s a bias to languages such as Python or Clojure, which make data easier to manipulate

I assume its the libraries available for Python (Pandas, NumPy, Blaze) that are the basis for this quote. Is it also the case for Clojure (as compared to other FP languages)?

-----

skierscott 270 days ago | link

> Look around your home. There are processors and programming in most every electronic device you have

And those devices are packing more punch for their size -- just look at the Raspberry Pi and its competitors.

-----

floor_ 269 days ago | link

Nothing about massively parallel programming. Bummer.

-----

tossmeup 270 days ago | link

This feels like an odd angle to take in concern around the next decade of programming.

It starts a level below the problems that might be the next thing to tackle, and focuses on shiny objects that already have enough traction to be considered a concern of the past 3-5 years.

Short-sighted. Boring. Couldn't dance to it.

1-3 years are feasibly predictable.
3-7 are 50/50 hunches at a 20% CI.
8-9 you might as well be talking jet packs.
10+ and you aren't talking about saving or controlling the world? Refactor!

10 years from now, we blow IP up; we napster algorithmic experiences because you can't patent wiping your ass.
10 years from now, we make devices cheaper than water from scrap plastic.
10 years from now, any human can talk to any other human anytime, anywhere.
10 years from now, my meta-data is meaningless and we defund the NSA because they are deaf, dumb, and useless.
10 years from now, we eat the rich and feed the poor.

Why bother looking 10 years ahead without setting some real goals, or at least looking at things that might actually drive innovation in the next 10 years vs what is already well-planned?

We're all going to be 10 years closer to death and you are believing we'll have "smart dust"? A Roomba in every house!

Do you know how long it's been since I first heard that there would be fucking smart dust? You mean we need to lay a powder of infrastructure down to detect what distributed, distant sensors could tell you? Looks like a whale just barfed in the ocean and we got a ton of our dust back online... water's still wet AND salty there! Dial back the Antarctica dust belcher for a minute to balance out the South Pacific by next year... stupid whales, when will you learn?

He should have stopped at sensors. We'll have 1984. Smart dust? We'll be crotch-deep in gray goo.

Whales are like, "IIIIIIII WIIIIIIILLLLLLL DEEEEEEEEEESSSSSSSSTRRRRRRRRRROOOOOOOOYYYYYYYYYYYYYY!!!!"

-----

celeryreally 270 days ago | link

You lost me somewhere between eating the rich, and whales...

-----

miester_barfie 269 days ago | link

Yes, but it was fun.

-----



