Grace Hopper (wikipedia.org)
190 points by networked on Aug 31, 2013 | 54 comments



I worked with COBOL near the end of my last contract and found aspects of it fascinating compared to today's languages. Everything is about structures that map directly to the bits on disk, with fine-grained control over precision and data types. But then the language reads as a series of macros where you don't have to remember the low-level details: do this to this, put this here, if this, do that.
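
To give C programmers a rough sense of it, I describe a COBOL record as something like a packed struct that you read straight off disk. This is only a sketch and the field names are invented (real copybooks also pin down decimal precision, which C can't really express), but the flavor is similar:

    /* Rough C analogy of a fixed COBOL record layout: the struct matches
       the bytes on disk field for field, so a record is read directly into it.
       Field names and sizes are invented for illustration. */
    #include <stdio.h>
    #include <stdint.h>

    #pragma pack(push, 1)               /* no padding: layout == bytes on disk */
    struct account_record {
        char    account_id[10];         /* roughly PIC X(10) */
        char    name[30];               /* roughly PIC X(30) */
        int32_t balance_cents;          /* binary field, like COMP */
    };
    #pragma pack(pop)

    int main(void) {
        struct account_record rec;
        FILE *f = fopen("accounts.dat", "rb");   /* placeholder file name */
        if (!f) { perror("fopen"); return 1; }

        while (fread(&rec, sizeof rec, 1, f) == 1)
            printf("%.10s %.30s %d\n",
                   rec.account_id, rec.name, (int)rec.balance_cents);

        fclose(f);
        return 0;
    }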

It's also a terribly difficult language to parse because it was designed for ease of use by humans. This was at a time when programmers were thought of as translators that mainly converted human language to binary. There is an element of independence to COBOL that I wasn't expecting, and I imagine there was some resistance to it when it was first introduced. I didn't know Grace Hopper was behind COBOL until I was actually using it, but I wonder sometimes if the challenges she faced as a woman at that time influenced its design. It's probably the first language to really confront dogma in computer science.

I personally really enjoy learning about languages like Erlang and Lisp, but they have an Achilles' heel that everybody is in real denial about: they are very difficult to read once they've been written. So most programmers scrap what they've written and start over. COBOL isn't like that. Even someone unfamiliar with computer science (all those business types who don't have time for this stuff) could read it and get a general understanding of what it's doing. The only other language I've used that surpasses COBOL in readability is HyperTalk, invented by Bill Atkinson in the 1980s. I look at some of the top languages and methodologies today, say Ruby and Angular.js, and I wonder if we're racing down a rabbit hole. Conceptually what they are trying to do is admirable, but they are starting to feel like dogma to me. How do they help the average person accomplish what they are trying to do? I've only seen the one interview with Grace Hopper on David Letterman, but I have to wonder, if she were a hacker in today's world, what she would think of the current state of things.


It's not dogma anymore. It may have been dogma at the time of COBOL, but now we know that plain-English general-purpose programming languages are a bad idea, because we've tried any number of times, and in general if you don't produce an atrocity, you're doing well above average. You end up with a language that is still every bit as complex as a normal programming language, and you add the ambiguity of English to the problem. Instead of it being a win, you lose, badly.

You can do better if you rigidly constrain your DSL's domain, but I'd argue that in many cases you're still looking at "spending" some design juice on a "plain English" interface, and it's not at all clear that you've produced something that is improved by the "plain English" so much as produced something that had enough design budget left to absorb the penalty without being destroyed.

It's 2013 now, not 1970. Any time you're tempted to go "My goodness, programming would just be so easy if we did X", go out and look. Odds approach 100% that we've done X, many, many times, and the reason you've never heard about it is that it didn't work well enough to be talked about. See also "fully visual programming language" and any number of "business logic" initiatives over the years.

(To which the typical next cognitive reaction is to believe that it would work if we just poured more effort into it. That may be true, but it's worth pointing out you're entering into unprovable territory. And in many cases, part of the idea really is that X is just so obvious and great that even a partial implementation ought to show its promise immediately.)


I see what you are getting at: basing a programming language on one dialect like English is a bad idea.

But what I disagree with is the idea that pure mathematics is more useful than other ways of thinking. These days, I find that having to adapt myself to whatever methodology a language imposes on me is generally more work than the problem I'm trying to solve.

Take, for example, the current hubbub over promises to try to reduce callback hell in JavaScript. What I find almost amusing is that callback hell shouldn't exist in the first place. It was solved trivially thirty years ago with cooperative threads on pre-NeXT Mac OS. It's obvious in hindsight that cooperative threads were a mistake for kernels, but they turn out to work quite well for processes (the people who wrote Go realized this and did one better, adapting some of the functional concepts from Erlang into a cooperative-thread style that can run concurrently because there's little or no shared data). You rarely have to deal with mutexes or atomic operations (except in cases where one of your operations spans a yield). IMHO they left yield out of JavaScript in browsers originally for political reasons, because it was already handled by the runtime. Adding it back in is a reluctant admission that it has become a necessity in today's complex GUIs.
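
For anyone who hasn't seen what that looks like in practice, here's a minimal sketch of a cooperative yield in plain C using POSIX ucontext (names like task_yield are invented for the example, and ucontext is old and deprecated in newer POSIX, but it shows the shape: nothing preempts you, control only moves at the explicit yield points):

    /* Minimal cooperative "thread" sketch using POSIX ucontext.
       The task only gives up control at its explicit yield points. */
    #include <stdio.h>
    #include <ucontext.h>

    static ucontext_t main_ctx, task_ctx;

    static void task_yield(void) {          /* hand control back to main */
        swapcontext(&task_ctx, &main_ctx);
    }

    static void task(void) {
        for (int i = 0; i < 3; i++) {
            printf("task step %d\n", i);
            task_yield();                   /* nothing runs us over in between */
        }
    }

    int main(void) {
        static char stack[64 * 1024];

        getcontext(&task_ctx);
        task_ctx.uc_stack.ss_sp = stack;
        task_ctx.uc_stack.ss_size = sizeof stack;
        task_ctx.uc_link = &main_ctx;       /* where to go when task() returns */
        makecontext(&task_ctx, task, 0);

        for (int i = 0; i < 3; i++) {
            printf("main resumes task\n");
            swapcontext(&main_ctx, &task_ctx);  /* run until the next yield */
        }
        return 0;
    }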

Anywhere I look today, I see these problems being revisited when they were really solved in better ways years ago. A great example of dogma today is templates in C++: such an astonishing amount of boilerplate to do things that are so easy in other languages, or with more fine-grained control via macros. A lot of the verbosity goes away when you allow variables to infer their type from the return value of a function.

Statically typed variables are even a form of dogma. What matters is the underlying piece of data, not the name of the variable that refers to it. We should be dealing with things as a large graph where the data is hashed or referenced somehow if we need a name for it. Honestly, most of the code I write now doesn't even have variables. I only use them when I have to work around a limitation of the language (for a long time PHP wouldn't let you use the array access [] operator on a function result) or when a piece of code is too large to fit on a line. Variables themselves don't actually exist; they are an abstraction. If C had a way to declare a const and initialize it later, we could emulate many of the advantages of functional programming by using the variable name to refer to the result of a block of code.

Another example is blocks in Objective-C, which are just nested anonymous functions or lambdas. These should have been part of the C standard from the very beginning, but with nicer syntax (like JavaScript's). I think they were left out for political reasons, because someone early on recognized that they would lead to callback hell! How messed up is that?
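
For what it's worth, clang already exposes blocks in plain C behind a flag. A tiny sketch (this assumes clang with -fblocks; on Linux you also need the BlocksRuntime library):

    /* Closures ("blocks") in plain C via the clang Blocks extension.
       Compile with: clang -fblocks blocks.c  (add -lBlocksRuntime on Linux) */
    #include <stdio.h>

    int main(void) {
        int base = 40;

        /* '^' introduces a block; 'base' is captured by value. */
        int (^add)(int) = ^(int x) { return base + x; };

        printf("%d\n", add(2));   /* prints 42 */
        return 0;
    }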

I could go on almost forever on this stuff. Lisp gets around most of these problems easily, but to me it's the mathematical equivalent of machine code. Haskell and Scala try to make it more readable but fail due to their obsession with brief notation. Conceptually none of this stuff is all that complicated. I remember the first time I saw someone using Excel and when it hit me what they were actually doing, it shook my world so fundamentally that for a moment I thought that everything I learned in computer science had been a waste of time. That feeling never quite left me.

If I had a hope for computer science today, it's that we would take the sheer simplicity of functional programming (graph-style like Excel) and build layers above it that extend a programmer's leverage. Here are a few examples off the top of my head:

* Dealing with everything as an array to provide inherent parallelism, like in MATLAB

* Making concurrent programming look single-threaded by using select in Go

* SQL queries to retrieve data by its relationship, not its address

* Triggering actions to run when something changes like Prolog

* Dealing with atomic messages instead of streams like in Go

* Memory-mapped IO instead of loading files into memory (a small sketch of this one follows the list)

* Using pure sockets/file descriptors and pipes instead of proprietary APIs to access data (also pipes are just double-ended file descriptors if we could truncate from the front, and shouldn't exist as a separate concept)

* Tying objects to their representation, for example changing the visible attribute in JavaScript and seeing the element appear instead of calling a setVisible() function (in a fundamental way to ease the burden on programmers, not just syntactic sugar)

* Lazy evaluation/referential transparency (treating code like data and vice versa)

* Nested languages like Terra instead of writing glue (calling C from Lua the way we used to call asm from C)

* Universal JIT compilers so we can quit arguing about one language being faster than another and focus on abstract concepts

* Integrating genetic programming into languages so we can express a function by its desired input and output instead of having to write every line of code by hand

* Losing the concept of a file altogether and focusing on state, like CouchDB

* Make all of this readable by treating the language as a view of the code (for example switching to tabs showing other views like prefix/infix/postfix or human-readable like HyperTalk)

* Begin thinking of code as pruning a tree instead of manually solving the problem (let the computer explore every permutation of the problem space and return the results for review like solving equations in Mathematica x = yz -> y = x/z -> z = x/y)

* Standardize the way libraries communicate (Python should be able to talk to Wolfram Alpha and feed the results into a shell script, then save each line in a SQL database with full Unicode support)

* Get rid of drivers and adapt something like HTML for all devices so that they can be queried, controlled and listened to
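
To make the memory-mapped IO bullet concrete, here's a small sketch in C using POSIX mmap (the file name is just a placeholder): the file shows up as a plain array instead of something you read() into a buffer.

    /* Sketch of memory-mapped IO: map a file and treat it like an array. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void) {
        int fd = open("data.bin", O_RDONLY);      /* placeholder file name */
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        fstat(fd, &st);

        /* The kernel pages the file in on demand; no explicit read loop. */
        const unsigned char *bytes =
            mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
        if (bytes == MAP_FAILED) { perror("mmap"); return 1; }

        unsigned long sum = 0;
        for (off_t i = 0; i < st.st_size; i++)
            sum += bytes[i];                      /* plain array access */
        printf("checksum: %lu\n", sum);

        munmap((void *)bytes, st.st_size);
        close(fd);
        return 0;
    }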

This got way too long so I will stop there. It needs to be rewritten, but I don't have time.


After years of reading HN, it was your comment that compelled me to register an account.

Thanks for the great comment, sharing with my coworkers and friends.


I wasn't entirely sure what you were getting at with const variables in C that are initialized later – and this doesn't fit the bill if you were referring to lazy evaluation – but you can do something like this (apparently works for Clang and GCC):

    #include <math.h>   /* needed for sqrt() */
    const int test = ({
        int a = 1+1;
        sqrt(a);
    });


Oh wow, I didn't know about that notation! For anyone stumbling onto this, I wanted to elaborate on what I was getting at with regard to functional programming in C. When I learned Scheme back in the 90s, they had us do a lot of expressions like this:

    (* 1 (+ 2 3))
    => 5

But I always thought it was a kind of obfuscation. They didn't mention the "define" keyword, or that we could break things up onto multiple lines, until later in the class than I would have liked. They wanted us to think in terms of transformations on the data instead of sequential operations on it. So we were writing these enormous functions enclosed in a single set of parentheses that were terribly hard to decipher. This can be written as:

    (define a (+ 2 3))
    (* 1 a)
    => 5

So I always wanted something similar in C: being able to write code in a functional manner without having to have these unreadable one-line functions. I just wanted to store the result of a computation in a temporary const variable to break things up a little. I also didn't want intermediate variables polluting my scope. For example:

    const int test;
    {
        const int a = 1+1;
        test = sqrt(a);
    }

=> "Read-only variable is not assignable"

But your notation makes this possible (slightly embellished to show the power of it):

    const int someOtherResult = 41;

    const int test = ({
        const int a = 1+1;
        sqrt(a) + someOtherResult;
    });

=> test = 42

Now we can break up our statements, and as long as we use only const variables and don't use any global variables, we are programming in a functional manner, for the most part. Cool! Unit tests become trivial to write, static analysis finds most of our bugs, and we don't have to use as many templates.

Then if we add lambdas/closures from C++11, we can pass around code as variables and get most of the benefit of pure functional languages without being so restricted in syntax. The main drawback is that we lose lazy evaluation, but in the real world that hasn't hindered me except when I was dealing with databases. Supposedly Java can retrieve database results lazily, but I don't know quite how that works. Perhaps something like it could be adapted for C.

P.S. Does someone know the name for the "({" notation? I couldn't find it anywhere. Thanks!


I don't know the name; I found it through issue 105 of iOS Dev Weekly. Here's the link to the article: http://cocoa-dom.tumblr.com/post/56517731293/new-thing-i-do-...


You guys are having a phenomenally good discussion. This thread is fantastic and what I love to come to Hacker News and see. Right on, maximum A+.

Just wanted to squeeze that in edgewise.


I've been looking into natural language programming, and I recognize some things along those lines have been done. Most of them don't go the full length, though, as far as I know, and have not applied real natural language processing (which we're only getting slightly good at in recent years) to try to semantically understand the English in order to compile it to real code.

The only such system I know of is Inform 7 [0][1], which is not really general purpose (though it seems to be quite flexible). There's also LiveCode [2], which is general purpose, but they did not seem to do any semantic parsing beyond some built-in fixed grammar rules, and the language is not exactly "natural" (correct me if I'm wrong, I haven't tried it yet).

I also know of a few projects that manage to do this well for constrained domains. For instance, these guys [3] were able to take input parser specifications written in English and automatically create the input parser from its specification. But constraining the domain is what makes this work, and is not the way towards general purpose natural language programming.

Would you be so kind as to enumerate some of the work done for plain-English general purpose programming? I've gone out and looked myself, but I have not found all that much.

[0] http://www.spagmag.org/archives/backissues/SPAG44

[1] http://inform7.com/learn/papers/

[2] http://www.kickstarter.com/projects/1755283828/open-source-e...

[3] http://web.mit.edu/newsoffice/2013/writing-programs-using-or...


Well, it was the first real programming language. I can picture people thinking "since machine code is so ugly for humans to read and write, let's come up with something that any human can read or write with extreme ease." They weren't even thinking about "elegance" or "beauty" of code; I'd be surprised if those concepts existed that early on.

We later realized it was a bad idea once we actually had some decent alternatives. But at the time I'm sure people were overjoyed that they could write in plain, clear English instead of digits or assembly.


COBOL, originally, probably wasn't thought of as a general purpose language. It was more of a business DSL (hence its name). It was about easily processing business data in standard formats, reading and writing files, and producing reports. And for those things, it's actually pretty good. Where it became a "bad idea" was when it was pressed into service beyond that scope.


I think you need to watch Bret Victor's talk at DBX.

http://vimeo.com/71278954


Yeah everyone's seen that talk. Any technology can look cool for 5 minutes. He really neglects to observe that all of the ideas he presented were thoroughly explored during their time -- and repeatedly in the decades since. The conclusion continues to be that such systems are inferior to our current ways. That may be because nobody has pushed hard enough, but it's not because those systems, as implemented, were good enough if only someone would just pay attention.


I've always wondered what people mean when they say Lisp & co are hard to read. I'd think having great powers of abstraction would produce code that reads more easily.

Of course you have to figure out what the abstractions do, but is that really worse than having to reverse engineer low-level non-abstractions on the fly even if they're in a "human readable" language?


This depends on the kind of abstraction. The closer programming is to math, the less likely it is that anyone other than mathematicians can read it, because they're not familiar with the abstractions you're using and they're hard to learn. (And even simple mathematical abstractions are hard because there's no domain-specific content to them; it just looks like abstract nonsense.)

It's also possible to use abstraction to get closer to how your customers think, but only if you use abstractions that they already know from the target domain.


Having had to deal with a little COBOL now and then, I wonder whether English words are really what makes for readability. I am reminded of something that Jacques Barzun wrote long ago about English, that it is the density of the thought, not the length of the words, that makes for difficulty in reading.


I did something like that once for a truly awful machine-language-like system I had to work with at a bank:

  8607,,,1
  11547,15475,22002,22002
  1316,1629,1,1649
  3001,1316,3,30078
  11528,22052,22002,22002
  9301,0,1528,1528
  31568,10001,800,107
  8560,,,1568
  8550,210,,
  3001,,,
  3100,1316,3,30089
  11547,15475,22002,22002
  3001,1316,3,30089
  3001,1317,3,10000
  8400,,,
  8550,700,801,
  3001,,,
  9301,0,522,522
  3000,1284,3,10001
  8500,,3,
  8500,,5,
  1547,,1,-2301
Through the translator, the result looked like this: https://github.com/jloughry/BANCStar/blob/master/README.md

The arrows in the margin duplicated the handwritten notation developed by programmers to deal with the numeric machine code. If a colour printer had been available at the time, it would have duplicated the highlighter pens those programmers used too; instead I used boldface, italics, underlining, and condensed characters to accomplish the same thing.


That's classic! You should make a post out of that link—it's great HN material.


Done. We'll see what kind of shreds the HN community tears it into. This ought to be fun.

https://news.ycombinator.com/item?id=6311717


To enlarge on your (really good) point...

We also have the capacity in English to substantially reduce the overhead of thought density. We do this by using symbols or signifiers (words or phrases) which refer to oft-repeated concepts, allowing us to build more complex meanings without having to consciously process every part. This idea of density of thought can be expressed more specifically as both uniqueness and complexity of thought.

Conceptually, programming is communication, and all communication is essentially semantic. Though I'm not a programmer, I reckon the broader conceptual difficulties with programming are in the need for context and the necessity for specificity, neither of which, by the way, English is that brilliant at (just more flexible). It is instructive in the context of this discussion to remember that semantics rely as much on the experience and understanding of the listener as on the speaker/writer.


I got to meet Grace Hopper many years ago. She gave me a nanosecond after her talk. This was in the 1980s, probably while she was working for DEC. It made quite an impression on me, and it's now framed on the wall. I wish I'd gotten her autograph!

She was one of the original hackers. I don't think the fact that she was a woman was ever relevant to her, and by the force of her personality and her efforts, she made it irrelevant to everyone else. For instance, I keep forgetting it. Not that she was female, but that it was unusual that she was female. E.g., she wasn't a "Pioneer of Female Programmers"; she was simply a "Computer Pioneer".

I don't know the specific barriers she had to overcome, but I know they must have been significant: for much of her career, people didn't really understand what computers even were. Even as late as the 1980s, talking heads on TV would often say things that implied they thought computers were intelligent, or other equally silly perspectives, because computers were so new that society at large had never been exposed to them. Now take that lack of understanding in the general population and project it onto the Navy, which, as a military organization, is generally lower-tech and more conservative. Of course she was in the research area, but there were certainly huge amounts of misunderstanding she had to deal with in people who simply had never been educated about computers. So she gave these talks, which continued after she left the Navy and started working for DEC. She made it her mission to teach people about this new technology. I can't imagine her refusing to speak at a conference simply because there were more men than women; I imagine at most of the conferences she went to, she was the only female speaker.

PS- I think her handing out the nanoseconds was brilliant as a teaching aid. She took an abstract concept and made it physical. How rare is that these days? Plus, if you can give someone something, they're much more likely to remember what you're trying to teach them. Even if it is a bit of old telephone wire... that mundane piece of plastic and copper became imbued with the story she told.

Quite an impression!


I also met Grace Hopper, and I too received a nanosecond in the late 1980s :) I was very impressed!


> She was one of the original hackers. I don't think the fact that she was a woman was ever relevant to her, and by the force of her personality and her efforts, she made it irrelevant to everyone else. For instance, I keep forgetting it. Not that she was female, but that it was unusual that she was female. E.g., she wasn't a "Pioneer of Female Programmers"; she was simply a "Computer Pioneer".

Thank you! I recently stumbled on the Grace Hopper conference[1]. Initially I didn't know what it was about and thought maybe it was about compilers or whatever. I was a bit disappointed to learn that it was a conference for women. I mean, of all the awesome things she has done, we're celebrating the simple fact that she's a woman.

[1]: http://en.wikipedia.org/wiki/Grace_Hopper_Celebration_of_Wom...


JGC followed up with a downloadable nanosecond.

http://blog.jgc.org/2012/10/a-downloadable-nanosecond.html

This is also my favorite Grace Hopper bit.


Her interview on David Letterman in 1986 is quite impressive: https://www.youtube.com/watch?v=1-vcErOPofQ


Great find. I really like this video of her explaining nano- and microseconds in a lecture: https://www.youtube.com/watch?v=JEpsKnWZrJ8.


She sounds totally and completely amazing.

  The most important thing I've accomplished, other than building the compiler, is
  training young people. They come to me, you know, and say, "Do you think we can do
  this?" I say, "Try it." And I back 'em up. They need that. I keep track of them as
  they get older and I stir 'em up at intervals so they don't forget to take chances.


The most fascinating thing (to me) about Admiral Hopper is her military service. As an Army officer, I find it amazing that there was once an era when you had so many very bright technical minds serving actively in the military. These days, most military technical achievements are being made by contractors working for the DoD, most of whom have never actually served. Admiral Hopper not only served during the WW2 era (when almost every able American adult was doing the same), she served for many years afterward.


Remember that they drafted everyone in World War 2. If you look at the pay scales for officers, you'll see some really weird things, like a Colonel with two years TIS (time in service). This is because they needed industry leaders and engineers to be in relevant spots in the chain of command; you wouldn't have a mechanical engineer who manages hundreds of technicians working as a sergeant.

Having all of this brainpower meant that the Army and Navy could actually do scholarly stuff. These days, it's all outsourced; it's considered crazy when an officer comes up with some innovation.


> As an Army officer, I find it amazing that there was once an era when you had so many very bright technical minds serving actively in the military. These days, most military technical achievements are being made by contractors working for the DoD, most of whom have never actually served.

Not coincidentally, she started her service during an era when war profiteering was still considered an unethical thing.


"during an era when war profiteering was still considered an unethical thing"

In a time imagined by Plutarch and Livy, but never recorded by the clearer-eyed historians?


It always amuses me when, in the "Why aren't there more female coders?" debate, people try to push the notion that women aren't genetically suited for programming. Programming isn't the Navy SEALs, where no woman has yet been admitted into the club for any variety of reasons... women were among programming's pioneers.


Women aren't Navy SEALs for "any variety of reasons"? No. Women aren't Navy SEALs because it's against the law. http://www.navy.com/careers/special-operations/seals.html

"By law, only men are eligible to apply for the SEAL program"


Your argument is not sound. If women are less "genetically" suited for programming, then that fact doesn't imply that there are no women programmers or that women couldn't have been programming's pioneers.


You're not characterizing the debate correctly. The argument has primarily been over whether pressures in society and in industry have been unfavorable to women. Those who don't think this is the case sometimes argue, "Perhaps the fault is within women themselves".

If that were the case, then we would indeed expect fewer women in tech. Just as if we eliminated gender from Olympic sports, there would be very few women at all competing, and pretty much none on the medal stands (except in sports that are mostly female only).

But that is not the case with computing. You have women among the elite of their male peers and not just elite, but pioneers. And they did this during a time, mind you, when women's equality was much, much less of a belief than it is now.


>You're not characterizing the debate correctly.

I did not attempt to. I clarified that your argument wasn't sound. My comment was not about the "general discourse".

>You have women among the elite of their male peers and not just elite, but pioneers.

Eh. If women are not "genetically suited" for programming, then there can still be women among the "elite programmers and pioneers".

By the way, the fact that many women were pioneers in programming was not related to ability; it seems to have been mostly an economic decision, as keypunch operators were often female.


If women are not "genetically suited" for programming, then there can still be women among the "elite programmers and pioneers".

Statistically, it becomes much less likely.

the fact that many women were pioneers in programming was not related to ability; it seems to have been mostly an economic decision

Tell that to Ada.


>Statistically, it becomes much less likely.

Yes, I agree with that claim.

>Tell that to Ada.

Why? I said "mostly", referring to the era of modern computing. In addition, Ada Lovelace wasn't a pioneer in programming; she was confused about the subject.


One of my biggest fears is that we could actually make that into truth. Fashion is powerful, and it's easy to Google the problems that women with the P word before their names can have with dating.

And it wouldn't be without other costs (as if the vision of a society where the imagined genetic differences became true weren't bad enough): increasing the variation between the sexes would mean far more complicated genetic code. That would almost certainly have hard-to-predict, weird costs.

So far it doesn't seem to be happening, so I'll put that together with Peter Watts' fear that sentience is a side-product that will get optimized out, but still, it's a fear.


P word? Which one?


PhD.


How often is that "genetically unsuited" theory really pushed, though? I think it might be more a myth than reality ("people always try to argue with genetics").


I've seen it here, for example from half a dozen people just the other day, appearing as "but who says that the current representation of women in programming isn't the exact level it should be based on biological interest and ability? Where's the evidence that there is a societal pressure pushing women away and not just innate aversion to the field?"


I'll have to take your word for it - I missed that, and HN search doesn't turn up anything (looking for the phrase "innate aversion", sorted by date: https://www.hnsearch.com/search#request/all&q=innate+aversio... )


I paraphrased from memory, sorry - I was thinking of this thread https://news.ycombinator.com/item?id=6254218


Grace Hopper used to live two blocks from me, but I never met her; I was too young when she passed away. She has always been an inspiration, though. Since my dad was in the Navy and, although he couldn't code, was in charge of a team coding a digital inventory system, he used to talk a lot about her. There's now a park named after her in what used to be an awkward traffic triangle in front of the building where she lived; when I returned to the DC area for a spell, I wrote some code in that park in her memory.

One time I was at a party and someone asked, "What useful things have women ever invented?" Without a second thought, my answers were: "Kevlar. Compiling languages."


Alex Martelli gave a very interesting talk about her at PyCon two years ago.

She is the one who is quoted as the originator of "It is better to ask forgiveness than ask permission." The first part of the talk kind of explains why she coined that phrase.

http://pyvideo.org/video/650/permission-or-forgiveness


Every year there are Grace Hopper Celebration days for women in computing. http://gracehopper.org/


No discussion of female computer scientists is complete without a shoutout to Hedy Lamarr (https://en.wikipedia.org/wiki/Hedy_Lamarr)


How was Hedy Lamarr a computer scientist? She was an intelligent and multi-talented woman, and yes she co-authored a patent for a frequency-hopping torpedo controlled by a piano roll. However, I don't think this qualifies her as a computer scientist. Clearly, no list would be complete without Grace Hopper, but putting Lamarr on the list, unless there's something I'm missing, would detract from Hopper's accomplishments.


I'm trying to identify the guy to the far left on this picture of Grace Hopper. Does anyone have any idea?

https://upload.wikimedia.org/wikipedia/commons/3/37/Grace_Ho...


http://americanhistory.si.edu/cobol/getting-cobol-to-run

No names are given, but the people in the picture are most likely the programmers who wrote the first COBOL compiler for the Univac.



I want to point her out every time somebody starts arguing about Ada Lovelace. Whether or not she was the first programmer (depending on how you define programming) almost doesn't matter when you've got a perfectly good role model who was one of the best programmers who've ever lived.



