Hacker News
Can Programming Be Liberated, Period? (2008) [pdf] (weizmann.ac.il)
43 points by sktrdie on Dec 26, 2017 | hide | past | favorite | 48 comments



It's always interesting and amusing to read these sorts of articles. This has been a recurring dream and topic of conversation since well before I wrote my first program in 1975. It's easy to understand why. We've built these super fast number crunchers and added thick layers of abstraction that let us do useful things like encoding videos and tweeting. These things seem magically elevated above and beyond the simple operations from which they are assembled, and it's tempting to think we can continue piling up abstractions until we get to "intelligence."

I suspect, though, that the fundamental metaphors of human intelligence, whatever they are, far transcend "processing" the way our number crunching machines work now. That's why analogies to the way we interact with others ("Why can't I just tell the machine what I want?") fall woefully short of providing any insight into how we might program computers in the future. As long as the computers we are programming are processors of the same sort we program now, they will remain far, far too simple to ever understand what we want and act on our desires. To the extent they get part of the way to that ideal it can only be by relying on deeper and deeper layers of software, which at some point has to be written in the old-fashioned way. If that remains true then it would probably be dangerous to allow ourselves to get too far removed from the way things really work. We already know how painful it can be to end up relying on stuff nobody understands any more to provide critical functions.


That's my favorite: "Why can't I just tell the machine what I want?" (Often preceded by "What I want is simple.")

When people tell me something like that, I start asking questions about what they want. It only takes one or two questions to evoke a reply from them of, "Oh, I don't know... maybe it should do this, or maybe it should do that; it kind of depends..."

And therein lies the real problem. When the end user cannot even clearly define (or even decide!) what they want when a certain situation comes up, how could a computer possibly know? Put the user next to a good developer, and I promise the bottleneck will be the user.


Exactly. And this is the core "irreducible" difficulty of programming which visual languages, literate programming, etc. cannot solve. Sure, we can set up a nice set of defaults which bypass a lot of the starting-out work but as soon as you try to actually make something new, you're back to square one: What do you want to achieve? How are you going to do that?


Actually, literate programming can help with this, as it forces you to write what you mean to do (and then you can compare that to what you're actually doing in the code).

Literate programming is not "natural language" programming...


> When the end user cannot even clearly define (or even decide!) what they want when a certain situation comes up, how could a computer possibly know?

Is it too hard to say “No program for you until you make up your mind about what you want”?

---

seanmcdirmid:

> "programming to think"

If the user wants to take this approach, I am more than willing to give them programming manuals and references to other learning resources. I don't want to be involved any further than that with it, though.


> No program for you until you make up your mind about what you want

This is more or less the HN response to questions like this every time they come up.

You can't fully automate programming because it is the very act of communication that defines the program. Without some type of brain-computer interface that could download your mind and "execute" it, you still have to communicate. Programming is the most direct and precise way to do this.


> You can't automate programming

Nobody is talking about automating programming. The point of my previous message is “There's no point in writing programs for users who don't know what they want.” At least not if you're acting in good faith. If you want to manipulate other people, that's a-whole-nother business, but I'm not enough of a psychopath to want to do that kind of thing.

> it is the very act of communication that defines the program

Programs are defined by the syntax and semantics of a programming language.

> Without some type of brain-computer interface that could download your mind and "execute" it, you still have to communicate.

You only need to communicate if you want to transmit an already existing program to someone or something else.


You're both arguing the same point...


Not at all. I emphatically disagree with this:

> it is the very act of communication that defines the program.


  > it is the very act of communication that defines the program

  Programs are defined by the syntax and semantics of a programming language.
In what way is the language of programming not communication?


You don't need to communicate your programs to anyone or anything for them to exist. If I write a program on a piece of paper, and don't give this piece of paper to anyone else, a program still exists. (I might have to transcribe it into a computer if I don't want to run it manually myself, though.)


And written language isn't a form of communication? (In your example, from yourself to a future form of yourself, or from yourself to the computer).


Well, if you take the view that persisting any representation of information from one spacetime event [0] to another is “communication”, then pretty much any physical process is “communication”. But then the term kinda starts to lose its meaning.

[0] https://en.wikipedia.org/wiki/Event_(relativity)


Whatever happened to “programming to think”, is it just “thinking to program” now?


One common interview question I've run across is to write the code for a simple four-function calculator, the kind that you can find for a dollar or two in nearly any convenience store, right next to the pencils and erasers.

Hey, it's all just buttons and a display doing simple things, right? Only it turns out to be a fairly complicated task, with a lot of corner conditions, even when you treat the math library as a given and leave out all the really low-level stuff (button handling, display refresh, etc.).

You're not going to get to a usable calculator by having a casual conversation with a user interface builder.

"Computer, make that button green, um, no ... yellow" is comparatively easy. "Computer, now make that button add the two numbers together" is begging so much of the specification that it's not really programming any more. And if you do go deeper and try to teach the computer about pending operations, number entry and a stack (to handle parenthesis), you're programming again, only with a lousy IDE.


  One common interview question I've run across is to write the code for a simple four-function calculator, the kind that you can find for a dollar or two in nearly any convenience store, right next to the pencils and erasers.

Coincidentally, I've previously posted here a detailed model/explanation of how those work: https://news.ycombinator.com/item?id=9456444

They're usually based on a 4-bit mask-ROM microcontroller running at several kHz. Very "simple" in relative terms to most computers today, and in hindsight I could probably write the code for one quite quickly, but I bet the 99% of programmers who haven't ever really given a thought to how they work in detail would struggle to come up with an accurate description in an interview.

Also worth reading: https://news.ycombinator.com/item?id=6302364


Out of curiosity: in the description you linked, you say it's a two-register model, but then you mention R, O and I and only refer to O as a register. You refer to both R and I as accumulators. I assume at least one of them is also a register, but that would suggest it's a three-register model, no?


I meant two actual registers/accumulators (R and I) which store numbers, and O which only stores the operation being performed.
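For what it's worth, that R/O/I scheme can be sketched as a tiny key-processing loop. Everything beyond the register names R, O, and I is my own speculative simplification (integers only, no error handling):

```python
# Speculative sketch of the R/O/I register model: I accumulates the
# digits being keyed in, R holds the running result, and O remembers
# which operation is pending.

def run(keys):
    R, I, O = 0, 0, None
    for k in keys:
        if k.isdigit():
            I = I * 10 + int(k)              # key a digit into the input register
        elif k in '+-*/':
            R, I, O = _fold(R, I, O), 0, k   # fold I into R, remember the op
        elif k == '=' and O is not None:
            R, I, O = _fold(R, I, O), 0, None
    return R

def _fold(R, I, O):
    if O is None:   # the first number keyed simply becomes R
        return I
    return {'+': R + I, '-': R - I, '*': R * I, '/': R // I}[O]
```

So `run("12+34=")` returns 46, but decimal points, overflow, and the behavior of repeated '=' presses are all left out, which is where the real firmware earns its keep.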


Gotcha, thanks.


I'm about to do a 4-bit, 4-op calculator with only logic gates, for a project. Not that I could do it in an interview!


I don't understand why this is a goal...

Yes, if machines could self-teach the way humans do, they'd be really powerful. However, in many ways programs are better than people, and we try to get human systems to be MORE like software.

Oftentimes you want a system with hard, uncompromising rules, following a strict process without exceptions (e.g. an ATM). Bank tellers try to be non-human and ATM-like.

The real dichotomy is between predefined systems and adaptive systems, and there are times we want each. Also, despite the author's fantasy, I don't see any reason to believe life as a whole would be better if computers did both.


No. The hard part was never programming anyway, but mapping the real world's ever-mutating quirks into regular structures.


Maybe in your corner of the world. In mine, programming in and of itself, i.e. devising mechanically executable (a.k.a. algorithmic) solutions to well-specified problems, is by far the hardest task.


If it's well specified but hard to convert into an executable, it's time you invest in selling a transpiler into your specific domain. However, if the problem is creating the algorithm and not whatever language you're using, then we're saying the same thing.


It's a completely different well-specified problem each time.


Well, polish your resume. Sounds like your job actually could be replaced by a computer.


Thanks for your concern. Of course I'm constantly learning new skills to avoid being left behind. However, present-day artificial intelligence is still very far from being able to replace my job.


I'm not sure if we hugged it to death, or if the website's down for maintenance, but looks like the site is unavailable.

Thankfully there's archive.org snapshots: https://web.archive.org/web/20171226133115/http://www.wisdom...


A lot of apps are just read/write/update operations on a database, plus binding the data to views. I think "programming" such apps could be simplified, e.g. an SQL++ where you make queries by talking to the computer ...


Not a chance. You just described the surface area. The complexity is in the business rules and logic behind those IO interfaces.

We could develop tools that would build those interfaces from words you speak, but until you can speak deterministic logic and rules to the computer to define the behaviors you need, you still need programmers. In fact, I would argue that the programmer/developer is less a person who types code and more a person who helps a user decide the rules of the systems they have requested.


Oh boy. This reminds me of the utterly failed attempts of the last 30 years to eliminate programmers. You know, when companies were telling us that creating business software would be as simple as connecting a few boxes on a flowchart, and their intelligent, snake oil-driven framework would automatically translate that into working software. Even managers would be able to do that, right?!

Funny how we ended up needing more programmers than ever.


The tooling improved, so that the given tasks could be done even by non-programmers. But then actual programmers were able to use the improved tooling to do more complicated tasks, of which there never seems to be a shortage, so that's what the industry moved to.


I wouldn't say utterly failed. The tools never completely replaced programmers, but they allowed programmers to be replaced in certain domains.

The obvious example is Excel. Imagine if there were no Excel. How many more programmers would every enterprise have needed to hire just to reach the current level of productivity? Every bank, hedge fund, and financial institution would have hired a programmer for every one or two analysts. They hire a few now, but nowhere near the level that would have been necessary without Excel.

Another example is LabVIEW. The interface is basically a board where you arrange your flowchart boxes (virtual instruments, in LabVIEW terminology), configure them through the menus, and connect them together with wires to express the flow of data. The mechanical/electrical/... engineer herself designs the programs that read data from sensors, save them, analyze them, and send appropriate instructions to actuators based on them [1]. I have seen entire assembly lines running on LabVIEW. To the best of my knowledge, the entire program was conceived without any input from software engineers. Had LabVIEW not existed, many, many more software engineers would have had to be hired by laboratories, manufacturing facilities, and research institutes.

Not every attempt at eliminating programmers has been successful, but to call all of them snake oil-driven is incorrect. The field of software engineering is young and has not stabilized yet. I would expect that in the coming decades, increasingly more products will be created to replace specific uses of software development. The demand for programmers in those segments would fall. The question is, would the demand in new ventures and markets increase enough to completely neutralize losing those segments? That remains to be seen; with a field as young and rapidly evolving as software engineering, historical trends do not tell us much about the future.

[1] I should perhaps add that I am a mechanical engineer who also codes.


In all fairness, business would be wise to dispense with much of the pointlessly low-level technology it uses. You want to automate a business process? Then formalize the business process itself, not the tool that you think could help you automate it. And do it in a programming language, of course.


High-level programming is only possible once we admit that what we want to do is programming. Too often there's a perception that "configuration" is easier or safer than programming, even when that configuration is Turing-complete.


It's like the web development treadmill. You start out with HTML and some scripting. You build your non-technical users a web site, then an admin page so they can upload pictures and text to their web site. Gradually they request more flexibility and control, until you've ended up reimplementing a crappy version of HTML and some scripting.



Spreadsheets in a nutshell.


Reminds me a little of: http://thecodelesscode.com/case/231


Formalizing the process is a noble goal, but there will always be clients that want your formalized process with "just a few tweaks".

I think we should endeavor to formalize as much as is reasonably possible, pushing the bulk of the developer work to dealing with the special cases. But even that will eventually result in a small formal core surrounded by layers and chunks of customizations.


> Formalizing the process is a noble goal, but there will always be clients that want your formalized process with "just a few tweaks".

The client owns the process - it's their business after all - so they can define it however they wish. I'm not arguing for canned software. I'm arguing for a programming language in which business processes can be defined directly. (I'm aware of the existence of BPMN, ARIS, and other business process modeling tools, but those fall short of being actual programming languages.)


I'm for whatever tools lead to better outcomes (more complete solutions for less cost).

From my experience, however, non-technical decision makers often start from the assumption that the things they want are easy. As such, they tend to be reluctant to invest in new technologies or approaches. In fact, the most common first approach when they become convinced that what they want is actually "big" is to attempt to offshore. In other words, they naively assume that the solution to complexity is more (cheaper) bodies.

So even if a great tool comes along that allows us to meet needs more effectively, it's not guaranteed to be accepted.


Kind of like how factories are built around the equipment, rather than the equipment being adapted to function in a given factory.


It was more than the last 30 years. Eliminating the need for professional programmers was the original design goal of COBOL.


Well there is plenty of "unprofessional" COBOL code out there, so I suppose the language designers sort of met their goal?


And it perfectly illustrates how flawed their goal was in the first place.


Good point!


Sadly, it was a bubble. While the media pushes stories about self-driving trucks and cars, big compute is sucking the air out of the IT job market at an accelerated pace.



