
Grace Hopper - networked
https://en.wikipedia.org/wiki/Grace_Hopper
======
zackmorris
I worked with COBOL near the end of my last contract and found aspects of it fascinating compared to today's languages. Everything is about structures that map directly to the bits on disk, with fine-grained control over precision and data types. But then the language reads as a series of macros where you don't have to remember the low-level details: do this to this, put this here, if this do that.
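
For a flavor of what I mean, here is a rough analogy in C (only a sketch; the field names and layout are invented, and real COBOL PICTURE clauses don't map one-to-one onto C types): a record is a fixed layout where every field has an explicit width and precision, and the in-memory structure is exactly the on-disk structure.

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Invented record layout, loosely mirroring a COBOL data division:
       every field has an explicit width, and the struct matches the
       bytes on disk with no padding. */
    #pragma pack(push, 1)
    struct customer_record {
        char    name[30];       /* like PIC X(30): 30 characters       */
        char    account[10];    /* like PIC 9(10): 10 digit characters */
        int32_t balance_cents;  /* like PIC S9(7)V99 COMP: fixed-point */
    };
    #pragma pack(pop)

    int main(void) {
        struct customer_record rec;
        memset(&rec, ' ', sizeof rec);
        memcpy(rec.name, "GRACE HOPPER", 12);
        memcpy(rec.account, "0000000042", 10);
        rec.balance_cents = 1234567;  /* $12,345.67 */

        /* The whole record can be written byte-for-byte with fwrite. */
        printf("record size: %zu bytes\n", sizeof rec);
        return 0;
    }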

It's also a terribly difficult language to parse because it was designed for ease of use by humans. This was at a time when programmers were thought of as translators who mainly converted human language to binary. There is an element of independence to COBOL that I wasn't expecting, and I imagine there was some resistance to it when it was first introduced. I didn't know Grace Hopper was behind COBOL until I was actually using it, but I sometimes wonder if the challenges she faced as a woman at that time influenced its design. It's probably the first language to really confront dogma in computer science.

I personally really enjoy learning about languages like Erlang and Lisp, but they have an Achilles' heel that everybody is in denial about: they are very difficult to read once they've been written. So most programmers scrap what they've written and start over. COBOL isn't like that. Even someone unfamiliar with computer science (all those business types who don't have time for this stuff) could read it and get a general understanding of what it's doing. The only other language I've used that surpasses COBOL in readability is HyperTalk, invented by Bill Atkinson in the 1980s. I look at some of the top languages and methodologies today, say Ruby and Angular.js, and I wonder if we're racing down a rabbit hole. Conceptually, what they are trying to do is admirable, but they are starting to feel like dogma to me. How do they help the average person accomplish what they are trying to do? I've only seen the one interview with Grace Hopper, on David Letterman, but I have to wonder: if she were a hacker in today's world, what would she think of the current state of things?

~~~
jerf
It's not dogma anymore. It may have been dogma at the time of COBOL, but now we know plain-English _general purpose_ programming languages are a bad idea, because we've tried any number of times, and in general if you _don't_ produce an atrocity, you're doing well above average. You end up with a language that is still every bit as complex as a normal programming language, _and_ you add the ambiguity of English to the problem. Instead of it being a win, you lose, badly.

You can do better if you rigidly constrain your DSL's domain, but I'd argue
that in many cases you're still looking at "spending" some design juice on a
"plain English" interface and it's not at all clear that you've produced
something that is _improved_ by the "plain English" so much as produced
something that had enough design budget left to absorb the penalty without
being destroyed.

It's 2013 now, not 1970. Any time you're tempted to go "My goodness,
programming would just be so _easy_ if we did X", go out and look. Odds
approach 100% that we've done X, many, many times, and the reason you've never
heard about it is that it didn't work well enough to be talked about. See also
"fully visual programming language" and any number of "business logic"
initiatives over the years.

(To which the typical next cognitive reaction is to believe that it would work
if we just poured more effort into it. That may be true, but it's worth
pointing out you're entering into unprovable territory. And in many cases,
part of the idea really is that X is just so obvious and great that even a
partial implementation ought to show its promise immediately.)

~~~
zackmorris
I see what you are getting at: basing a programming language on one dialect like English is a bad idea.

But what I disagree with is that pure mathematics is more useful than other ways of thinking. These days, I find that having to adapt myself to whatever methodology a language imposes on me is generally more work than the problem I'm trying to solve.

Take for example the current hubbub over promises as a way to reduce callback hell in javascript. What I find almost amusing is that callback hell shouldn't exist in the first place. It was solved trivially thirty years ago with cooperative threads on pre-NeXT Mac OS. It's obvious in hindsight that cooperative threads were a mistake for kernels, but they turn out to work quite well for processes (the people who wrote Go realized this and did one better, adapting some of the functional concepts from Erlang into a cooperative-thread style that can run concurrently because there's little or no shared data). You rarely have to deal with mutexes or atomic operations (except in cases where one of your operations spans a yield). IMHO, yield was originally left out of javascript in browsers for political reasons, because scheduling was already handled by the runtime. Adding it back in is a reluctant admission that it has become a necessity in today's complex GUIs.
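
To make "cooperative" concrete, here is a minimal sketch in C using the POSIX ucontext API (deprecated but still widely available; the task and scheduler here are invented for illustration). The point is that the task runs until it explicitly yields, so nothing else can touch its data in between:

    #include <stdio.h>
    #include <ucontext.h>

    static ucontext_t scheduler_ctx, task_ctx;

    /* A cooperative task: it runs until it *chooses* to yield, so no
       mutexes are needed around data it touches between yields. */
    static void task(void) {
        for (int i = 0; i < 3; i++) {
            printf("task step %d\n", i);
            swapcontext(&task_ctx, &scheduler_ctx);  /* explicit yield */
        }
    }

    int main(void) {
        static char stack[64 * 1024];
        getcontext(&task_ctx);
        task_ctx.uc_stack.ss_sp = stack;
        task_ctx.uc_stack.ss_size = sizeof stack;
        task_ctx.uc_link = &scheduler_ctx;  /* return here if task finishes */
        makecontext(&task_ctx, task, 0);

        for (int i = 0; i < 3; i++)
            swapcontext(&scheduler_ctx, &task_ctx);  /* resume the task */
        return 0;
    }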

Anywhere I look today, I see problems being revisited that were really solved in better ways years ago. A great example of dogma today is templates in c++: such an astonishing amount of boilerplate to do things that are so easy in other languages, or with the finer-grained control of macros. And a lot of the verbosity goes away when you allow variables to infer their type from the return value of a function. Statically typed variables are even a form of dogma. What matters is the underlying piece of data, not the names of the variables that refer to it. We should be dealing with things as a large graph where the data is hashed or referenced somehow if we need a name for it.
Honestly, most of the code I write now doesn't even have variables. I only use them when I have to work around a limitation of the language (for a long time php wouldn't let you use the array access [] operator directly on a function result) or when a piece of code is too large to fit on a line. Variables themselves don't actually exist; they are an abstraction. If C had a way to declare a const and initialize it later, we could emulate many of the advantages of functional programming by using the variable name to refer to the result of a block of code. Another example is blocks in objective-c, which are just nested anonymous functions or lambdas. These should have been part of the c standard from the very beginning, but with nicer syntax (like javascript's). I think they were left out for political reasons, because someone early on recognized that they would lead to callback hell! How messed up is that?

I could go on almost forever on this stuff. Lisp gets around most of these
problems easily, but to me it's the mathematical equivalent of machine code.
Haskell and Scala try to make it more readable but fail due to their obsession
with brief notation. Conceptually none of this stuff is all that complicated.
I remember the first time I saw someone using Excel and when it hit me what
they were actually doing, it shook my world so fundamentally that for a moment
I thought that everything I learned in computer science had been a waste of
time. That feeling never quite left me.

If I had a hope for computer science today, it's that we would take the sheer
simplicity of functional programming (graph-style like Excel) and build layers
above it that extend a programmer's leverage. Here are a few examples off the
top of my head:

* Dealing with everything as an array to provide inherent parallelism in Matlab

* Making concurrent programming look single-threaded by using select in Go

* SQL queries to retrieve data by its relationship, not its address

* Triggering actions to run when something changes like Prolog

* Dealing with atomic messages instead of streams like in Go

* Memory-mapped IO instead of loading files into memory (see the C sketch after this list)

* Using pure sockets/file descriptors and pipes instead of proprietary APIs to access data (also pipes are just double-ended file descriptors if we could truncate from the front, and shouldn't exist as a separate concept)

* Tying objects to their representation, for example changing the visible attribute in javascript and seeing the element appear instead of calling a setVisible() function (in a fundamental way to ease the burden on programmers, not just syntactic sugar)

* Lazy evaluation/referential transparency (treating code like data and vice versa)

* Nested languages like Terra instead of writing glue (calling c from lua the way we used to call asm from c)

* Universal JIT compilers so we can quit arguing about one language being faster than another and focus on abstract concepts

* Integrating genetic programming into languages so we can express a function by its desired input and output instead of having to write every line of code by hand

* Losing the concept of a file altogether and focusing on state, as in CouchDB

* Make all of this readable by treating the language as a view of the code (for example, switching between tabs showing other views like prefix/infix/postfix, or human-readable like HyperTalk)

* Begin thinking of code as pruning a tree instead of manually solving the problem (let the computer explore every permutation of the problem space and return the results for review like solving equations in Mathematica x = yz -> y = x/z -> z = x/y)

* Standardize the way libraries communicate (Python should be able to talk to Wolfram Alpha and feed the results into a shell script then save each line in a SQL database with full unicode support)

* Get rid of drivers and adapt something like HTML for all devices so that they can be queried, controlled and listened to
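
To make the memory-mapped IO bullet concrete, here is a minimal POSIX sketch in C (error handling abbreviated): instead of reading a file into a buffer, you map it and index it like an array while the kernel pages it in on demand.

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(int argc, char **argv) {
        if (argc < 2) return 1;

        int fd = open(argv[1], O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) < 0 || st.st_size == 0) return 1;

        /* Map the file: no read loop, no buffer to manage; the kernel
           pages bytes in on demand as they are touched. */
        const char *data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (data == MAP_FAILED) { perror("mmap"); return 1; }

        long newlines = 0;
        for (off_t i = 0; i < st.st_size; i++)
            if (data[i] == '\n') newlines++;
        printf("%ld lines\n", newlines);

        munmap((void *)data, st.st_size);
        close(fd);
        return 0;
    }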

This got way too long, so I will stop there. It needs to be rewritten, but I don't have time...

~~~
MaxGabriel
I wasn't entirely sure what you were getting at with const variables in C that are initialized later – and this doesn't fit the bill if you were referring to lazy evaluation – but you can do something like this (apparently works for Clang and GCC):

    /* GCC/Clang extension; sqrt needs <math.h>, and this form only
       works inside a function. The block's last expression becomes
       the value of the whole ({ ... }). */
    const int test = ({
        int a = 1 + 1;
        sqrt(a);
    });

~~~
zackmorris
Oh wow I didn't know about that notation! For anyone stumbling onto this, I
wanted to elaborate on what I was getting at with regard to functional
programming in c. When I learned Scheme back in the 90s, they had us do a lot
of expressions like this:

    (* 1 (+ 2 3))

=> 5

But I always thought it was a kind of obfuscation. They didn't mention the "define" keyword, or that we could break things up into multiple lines, until later in the class than I would have liked. They wanted us to think in terms of transformations on the data instead of sequential operations on it. So we were writing these enormous functions enclosed in a single set of parentheses that were terribly hard to decipher. The expression above can instead be written as:

    (define a (+ 2 3))
    (* 1 a)

=> 5

So I always wanted something similar in c: being able to write code in a functional manner without having to write these unreadable one-line functions. I just wanted to store the result of a computation in a temporary const variable to break things up a little. I also didn't want intermediate variables polluting my scope. For example:

    const int test;
    {
        const int a = 1+1;
        test = sqrt(a);
    }

=> "Read-only variable is not assignable"

But your notation makes this possible (slightly embellished to show the power
of it):

    const int someOtherResult; // 41

    const int test = ({
        const int a = 1+1;
        sqrt(a) + someOtherResult;
    });

=> test = 42

Now we can break up our statements, and as long as we use only const variables and don't use any global variables, we are programming in a functional manner, for the most part. Cool! Unit tests become trivial to write, static analysis finds most of our bugs, and we don't have to use as many templates...

Then if we add lambdas/closures from C++11, we can pass around code as
variables and we are getting most of the benefit of pure functional languages
without being so restricted in syntax. The main drawback is that we lose lazy
evaluation, but in the real world that hasn't hindered me except when I was
dealing with databases. Supposedly java can retrieve database results lazily, but I don't quite know how that works. Perhaps something like it could be adapted for c.
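
C proper has no closures, but plain function pointers already get you part of the way to passing code around as a value. A trivial sketch (names invented):

    #include <stdio.h>

    /* A function pointer plays the role of a (capture-free) lambda. */
    typedef int (*int_fn)(int);

    static int square(int x) { return x * x; }

    /* Higher-order function: takes code as an argument. */
    static int apply_twice(int_fn f, int x) {
        return f(f(x));
    }

    int main(void) {
        const int result = apply_twice(square, 3);  /* (3^2)^2 = 81 */
        printf("%d\n", result);
        return 0;
    }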

P.S. Does anyone know the name for the "({" notation? I couldn't find it anywhere. Thanks!

~~~
MaxGabriel
I don't know the name; I found it through issue 105 of iOS Dev Weekly. Here's the link to the article: [http://cocoa-dom.tumblr.com/post/56517731293/new-thing-i-do-in-code](http://cocoa-dom.tumblr.com/post/56517731293/new-thing-i-do-in-code)

------
engineer40
I got to meet Grace Hopper many years ago. She gave me a nanosecond after her
talk. This was in the 1980s, probably while she was working for DEC. It made
quite an impression on me, and it's now framed on the wall. I wish I'd gotten
her autograph!

She was one of the original hackers. I don't think the fact that she was a woman was ever relevant to her, and by the force of her personality and her efforts, she made it irrelevant to everyone else. For instance, I keep forgetting it. Not that she was female, but that it was unusual that she was female. E.g., she wasn't a "Pioneer of Female Programmers"; she was simply a "Computer Pioneer".

I don't know the specific barriers she had to overcome, but I know they must have been significant-- for much of her career, people didn't really understand what computers even _were_. Even as late as the 1980s, talking heads on TV would often say things that implied they thought computers were intelligent, or other equally silly notions... because computers were so new that society at large had never been exposed to them. Now take that lack of understanding in the general population and project it onto the Navy, which, like most militaries, is generally lower-tech and more conservative. Of course she was in the research area, but there was certainly a huge amount of misunderstanding to deal with among people who had simply never been educated about computers. So she gave these talks, and continued giving them after she left the Navy and started working for DEC. She made it her mission to teach people about this new technology. I can't imagine her refusing to speak at a conference simply because there were more men than women-- I imagine that at most of the conferences she attended, she was the only female speaker.

PS- I think her handing out the nanoseconds was brilliant as a teaching aid. She took an abstract concept and made it physical. How rare is that these days? Plus, if you can give someone something, they're much more likely to remember what you're trying to teach them. Even if it is a bit of old telephone wire... that mundane piece of plastic and copper became imbued with the story she told.

Quite an impression!

~~~
baldfat
I also met Grace Hopper, and I too received a nanosecond in the late 1980s :) I was very impressed!

------
tshtf
Her interview on David Letterman in 1986 is quite impressive:
[https://www.youtube.com/watch?v=1-vcErOPofQ](https://www.youtube.com/watch?v=1-vcErOPofQ)

~~~
networked
Great find. I really like this video of her explaining nano- and microseconds
in a lecture:
[https://www.youtube.com/watch?v=JEpsKnWZrJ8](https://www.youtube.com/watch?v=JEpsKnWZrJ8).

------
psycr
She sounds totally and completely amazing.

      The most important thing I've accomplished, other than building the compiler, is
      training young people. They come to me, you know, and say, "Do you think we can do
      this?" I say, "Try it." And I back 'em up. They need that. I keep track of them as
      they get older and I stir 'em up at intervals so they don't forget to take chances.

------
chrissnell
The most fascinating thing (to me) about Admiral Hopper is her military service. As an Army officer, I find it amazing that there was once an era when so many very bright technical minds were serving actively in the military. These days, most of the military's technical achievements are made by contractors working for the DoD, most of whom have never actually served. Admiral Hopper not only served during the WW2 era (when almost every able American adult was doing the same), she served for many years afterward.

~~~
justin66
> As an Army officer, I find it amazing that there was once an era when so
> many very bright technical minds were serving actively in the military.
> These days, most of the military's technical achievements are made by
> contractors working for the DoD, most of whom have never actually served.

Not coincidentally, she started her service during an era when war
profiteering was still considered an unethical thing.

~~~
cafard
"during an era when war profiteering was still considered an unethical thing"

In a time imagined by Plutarch and Livy, but never recorded by the clearer-eyed historians?

------
danso
It always amuses me when, in the "Why aren't there more female coders?" debate, people try to push the notion that women aren't genetically suited for programming. Programming isn't the Navy SEALs, where no woman has yet been admitted into the club for a variety of reasons... women were among programming's _pioneers_.

~~~
nawitus
Your argument is not sound. If women are less "genetically" suited for
programming, then that fact doesn't imply that there are no women programmers
or that women couldn't have been programming's pioneers.

~~~
danso
You're not characterizing the debate correctly. The argument has primarily
been over whether pressures in society and in industry have been unfavorable
to women. Those who don't think this is the case sometimes argue, "Perhaps the
fault is within women themselves".

If that were the case, then we would indeed expect fewer women in tech. Just as, if we eliminated gender divisions from Olympic sports, there would be very few women competing at all, and pretty much _none_ on the medal stands (except in sports that are mostly female-only).

But that is not the case with computing. You have women among the elite of the field, and not just elite, but pioneers. And they did this at a time, mind you, when women's equality was much, much less accepted than it is now.

~~~
nawitus
>You're not characterizing the debate correctly.

I did not attempt to. I clarified that your argument wasn't sound. My comment was not about the "general discourse".

>You have women among the elite of their male peers and not just elite, but
pioneers.

Eh. If women are not "genetically suited" for programming, then there can
still be women among the "elite programmers and pioneers".

By the way, the fact that many women were pioneers in programming was not related to ability; it seems to have been mostly an economic matter, as keypunch operators were often female.

~~~
moocowduckquack
_If women are not "genetically suited" for programming, then there can still
be women among the "elite programmers and pioneers"._

Statistically, it becomes much less likely.

 _the fact that many women were pioneers in programming was not related to
ability, it seems to be mostly an economical decision_

Tell that to Ada.

~~~
nawitus
>Statistically, it becomes much less likely.

Yes, I agree with that claim.

>Tell that to Ada.

Why? I said "mostly", referring to the era of modern computing. In addition, Ada Lovelace wasn't a pioneer in programming; she was confused about the subject.

------
dnautics
Grace Hopper used to live two blocks from me, but I never met her; I was too young when she passed away. She has always been an inspiration, though. My dad was in the Navy, and although he couldn't code, he was in charge of a team coding a digital inventory system, so he used to talk a lot about her. There's now a park named after her in what used to be an awkward traffic triangle in front of the building where she lived; when I returned to the DC area for a spell, I wrote some code in that park in her memory.

One time I was at a party, and someone asked, "What useful things have women ever invented?" Without a second thought my answers were: "Kevlar. Compiling languages."

------
rdtsc
Alex Martelli gave a very interesting talk about her at PyCon two years ago.

She is the one who is quoted as the originator of "It is better to ask forgiveness than ask permission". The first part of the talk kind of explains why she coined that phrase.

[http://pyvideo.org/video/650/permission-or-forgiveness](http://pyvideo.org/video/650/permission-or-forgiveness)

------
ahmett
Every year there's the Grace Hopper Celebration for women in computing.
[http://gracehopper.org/](http://gracehopper.org/)

------
nonchalance
No discussion of female computer scientists is complete without a shoutout to
Hedy Lamarr
([https://en.wikipedia.org/wiki/Hedy_Lamarr](https://en.wikipedia.org/wiki/Hedy_Lamarr))

~~~
msgilligan
How was Hedy Lamarr a computer scientist? She was an intelligent and multi-
talented woman, and yes she co-authored a patent for a frequency-hopping
torpedo controlled by a piano roll. However, I don't think this qualifies her
as a computer scientist. Clearly, no list would be complete without Grace
Hopper, but putting Lamarr on the list, unless there's something I'm missing,
would detract from Hopper's accomplishments.

------
patdennis
I'm trying to identify the man at the far left in this picture of Grace Hopper. Does anyone have any idea?

[https://upload.wikimedia.org/wikipedia/commons/3/37/Grace_Ho...](https://upload.wikimedia.org/wikipedia/commons/3/37/Grace_Hopper_and_UNIVAC.jpg)

~~~
qbrass
[http://americanhistory.si.edu/cobol/getting-cobol-to-run](http://americanhistory.si.edu/cobol/getting-cobol-to-run)

There are no names given, but the people in the picture are most likely the programmers who wrote the first COBOL compiler for the Univac.

------
everyone
[http://www.smbc-comics.com/index.php?db=comics&id=2516](http://www.smbc-comics.com/index.php?db=comics&id=2516)

------
sirmarksalot
I want to point her out every time somebody starts arguing about Ada Lovelace. Whether or not Lovelace was the first programmer (depending on how you define programming) almost doesn't matter when you've got a perfectly good role model who was one of the _best_ programmers who ever lived.

