

FORTRAN should virtually eliminate coding and debugging - cavedave
http://www.softwarepreservation.org/projects/FORTRAN/BackusEtAl-Preliminary%20Report-1954.pdf

======
praptak
Well, it did eliminate some kinds of coding (the part where a human produces
machine code from block diagrams) and debugging (the part where you look for
errors in that translation).

~~~
KC8ZKF
Yes, and the paper distinguishes between programming and coding. Step one is
"Analysis and Programming", step two is "Coding." It's step two that FORTRAN
virtually eliminates.

~~~
gruseom
Correct. It's only because of the success of higher-level languages like
FORTRAN that "coding" and "programming" came to mean the same thing. Before
that, programming belonged with analysis, closer to what we would now call
requirements.

We've become inured to silver-bullet bullshit in software, but it's
anachronistic to read this report that way: it was dead right. In fact, by
subsequent standards, their claim was rather modest. The full quote reads:

 _Since FORTRAN should virtually eliminate coding and debugging, it should be
possible to solve problems for less than half the cost that would be required
without such a system._

------
Peaker
Compared to hand-writing machine code, it might even be a reasonable thing to
say. The majority of effort is eliminated.

------
DeepDuh
Being currently involved in GPGPU research, I find these descriptions
reminiscent of the state we are in for GPGPU computing. CUDA, OpenCL and now
OpenACC have been steps towards a higher-level abstraction of stream
computing, and every time a new framework or language bubbles up, its
inventors praise it as the end of coding close to the machine.

------
jhrobert
Half a century later, people are still overly optimistic about software
development. According to recent studies, this over-optimism applies to just
about everything (and society at large is actually in favor of it).

Yet it is particularly visible in computing. Why is that?

Side note: bullshit in marketing brochures is here to stay.

~~~
maclaren
Half a century later, though, the change on offer is no longer the jump from
assembly to an actually useful level of abstraction. The bullshit-o-meter is
also damped when half the document consists of code examples that would be
much more difficult to write in assembly.
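
For a feel of what that gap amounts to, here is a one-line formula in the
spirit of the report's examples, sketched in C as a stand-in for early
FORTRAN (hypothetical function, not taken from the document):

    #include <math.h>

    /* A root of a*x^2 + b*x + c = 0. Translated by hand into 1954-era
     * assembly, this one line becomes dozens of instructions: manual
     * register allocation, operand scaling, and a call sequence for the
     * square-root subroutine, each a chance for a transcription bug.
     * Link with -lm. */
    double quadratic_root(double a, double b, double c)
    {
        return (-b + sqrt(b * b - 4.0 * a * c)) / (2.0 * a);
    }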

------
mrgoldenbrown
If you limit yourself to the types of programs that were created before
FORTRAN existed, then this might be true. But of course, as capability
increased, demand for more complicated programs increased just as fast (or
faster?).

------
ericHosick
This vision of eliminating coding will eventually be realized. It is
inevitable.

However, people will still need to know how to program (leaving AI aside).

~~~
josefonseca
> eliminating coding will eventually be realized

> people will still need to know how to program

I think that by coding, you mean typing? In your example, we won't be
eliminating coding; we'll just be entering code using a different language.

~~~
ericHosick
Not coding, and not a programming language to speak of. For example, when
someone configures their browser to use a proxy server, they are programming
without coding (though they do need to type).

It could be possible to configure the browser verbally to use a proxy server,
and I guess that would be the different language you speak of. However, this
leads to the need for AI or some kind of "intelligent" system, and it is
difficult to make one that isn't domain-specific.

Though such an intelligent system is also inevitable (in my opinion), I think
there is a step between describing software with such a system and writing
code as we do today.

That step would be some kind of domain-agnostic software framework that can be
used to create software without the need to "code out" a solution. The
framework itself would need to be coded, but use of the framework would not.

~~~
josefonseca
Coding is necessary for computing; it's not going away, just as math isn't
going away due to the evolution of computers. Better computers will use math
better, but math is there because it's the basis for what we call "computing."
Coding is the method by which we turn our ideas into practical logic.

If you think typing in long programs is going away, you may be right - the
future may look more like Lego-style programming than the current sea of
logical equations. But in that case I'd say you've coded your message in the
form of Lego blocks; coding is still there in essence.

When you speak instructions into a computer, you've still coded. When you
change your proxy, as in your example, you've coded (in a non-imperative way).

------
koeselitz
Thankfully, they were right - it did. I've known a lot of C programmers, C++
programmers, Python programmers, and Java programmers; only a tiny handful of
them actually knew how to "code" (that is, write machine code). FORTRAN, and
the high-level languages that came after it, really _did_ "virtually eliminate
coding and debugging."

------
lifthrasiir
Bear in mind that the report was written in 1954. The real complexity of
programming, and of computing in general, was not fully understood at that
time. (I'm confident we still don't understand it in its entirety, for that
matter.)

~~~
Edootjuh
What do you mean by that, exactly? Just out of curiosity.

~~~
praptak
Many results about the infeasibility of computer-based solutions to certain
problems were not yet known. For example, the first important NP-hardness
results date from the seventies (Cook's and Karp's papers, 1971-72).

I believe it wasn't even clear that big-O complexity is important in assessing
how effective an algorithm is. This is pretty much obvious to us now
(sometimes too obvious - there are edge cases where the constant factor wins
over the asymptotic complexity).
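
A concrete instance of such an edge case (a well-known one, sketched in C;
the cutoff mentioned in the comment is illustrative):

    #include <stddef.h>

    /* Below a small cutoff, O(n^2) insertion sort beats O(n log n)
     * sorts: it is branch-light and scans memory sequentially. This is
     * why real quicksort/mergesort implementations switch to it for
     * small subarrays; the exact cutoff is tuned per platform. */
    void insertion_sort(int *a, size_t n)
    {
        for (size_t i = 1; i < n; i++) {
            int key = a[i];
            size_t j = i;
            while (j > 0 && a[j - 1] > key) {
                a[j] = a[j - 1];  /* shift larger elements right */
                j--;
            }
            a[j] = key;
        }
    }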

~~~
scott_s
I agree with everything you said, except the parenthetical. I have difficulty
considering matrix multiplication an "edge case," and I think that it's more
common than you imply for us to choose algorithms with a higher asymptotic
bound because of constant factors and architectural effects (mostly caching).
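
To make the caching effect concrete, a textbook sketch in C (illustrative
code, not from the thread): the i-k-j ordering below walks b row-wise with
stride 1 instead of column-wise with stride n, which is dramatically
friendlier to the cache than the naive i-j-k ordering, at the same asymptotic
bound. It is also why tuned O(n^3) routines usually beat asymptotically
faster schemes like Strassen's in practice.

    #include <stddef.h>

    /* c += a * b for n x n row-major matrices; c must be zeroed first.
     * Swapping the j and k loops of the naive i-j-k version makes the
     * innermost access to b sequential - same big O, far fewer cache
     * misses. */
    void matmul_ikj(size_t n, const double *a, const double *b, double *c)
    {
        for (size_t i = 0; i < n; i++)
            for (size_t k = 0; k < n; k++) {
                double aik = a[i * n + k];
                for (size_t j = 0; j < n; j++)
                    c[i * n + j] += aik * b[k * n + j];
            }
    }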

~~~
praptak
Ok, agreed about matrix multiplication and probably a few other problems, but
I disagree about cache.

You cannot really say that cache-aware algorithms have a higher asymptotic
bound. Some of them might happen to have a below-optimal asymptotic bound _in
a cache-unaware memory model_, which rather misses the point of the algorithms
being aware of cache.

~~~
scott_s
_You cannot really say that cache-aware algorithms have a higher asymptotic
bound._

Don't worry, I'm not. I'm saying that sometimes, naive algorithms have better
cache behavior than more complicated algorithms with lower asymptotic bounds.
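
A minimal sketch of that pattern in C (a well-known small-n effect;
illustrative code, not from the thread): on a small sorted array, the naive
O(n) scan often beats O(log n) binary search, because it is one predictable
branch over contiguous memory instead of log n hard-to-predict jumps.

    #include <stddef.h>

    /* Return the index of the first element >= key, or n if none.
     * For small n this linear scan typically outruns binary search:
     * better cache behavior, better branch prediction, worse big O. */
    size_t lower_bound_linear(const int *a, size_t n, int key)
    {
        for (size_t i = 0; i < n; i++)
            if (a[i] >= key)
                return i;
        return n;
    }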

