Today's Google Doodle: Grace Hopper (google.com)
156 points by sheetjs 1446 days ago | 72 comments

Grace Hopper on Nanoseconds: http://www.youtube.com/watch?v=JEpsKnWZrJ8
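For context, the prop Hopper hands out in that video is a wire about 30 cm (just under a foot) long: the distance light travels in one nanosecond. A quick sketch of the arithmetic:

```python
# Grace Hopper's "nanosecond": how far light travels in a billionth of a second.
SPEED_OF_LIGHT = 299_792_458  # metres per second (exact, by definition)

distance_m = SPEED_OF_LIGHT * 1e-9  # metres per nanosecond

print(f"{distance_m * 100:.1f} cm")         # ~30.0 cm
print(f"{distance_m / 0.0254:.1f} inches")  # ~11.8 inches
```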

Her appearance on Letterman was impressive. Relevant part: https://www.youtube.com/watch?v=1-vcErOPofQ#t=304

Wow... amazing person!

Yes! I particularly liked:

"So you worked on the first ever computer in the U.S.?"


"And how did you know so much about computers back then?"

"I didn't. It was the first one."

Hands up, who was expecting some kind of UNIVAC emulator in JavaScript?

Still, a nice bit of pixel art.

I think they found a nice balance between honoring her life and creating something that is pleasing to look at.

Ha, glad they included the "bug".

Thanks, I missed that

My first C++ book, 'A Computer Science Tapestry', had a page dedicated to her that has stuck with me since I read it in high school.

Grace Hopper "was a proponent of innovative thinking and kept a clock on her desk that ran counterclockwise to show that things could be done differently. Although very proud of her career in the Navy, Hopper had little tolerance for bureaucracies, saying:

It’s better to show that something can be done and apologize for not asking permission, than to try to persuade the powers that be at the beginning.”

The reason I became a programmer. Grace -> COBOL -> My father -> Me. In the basement, pecking away at a tiny green monochrome screen and loading 8" floppy disks on a Cromemco System Three, COBOL was the first language I tried to learn, at the wee age of 12. Although, honestly, I don't think I ever grokked it as much as the simpler language put forth in the little white book on my father's bookshelf by the fellows K&R.

Slightly off topic, but has anyone read "Grace Hopper and the Invention of the Information Age"[1]? It's been on my wishlist for ages, but I've not got round to buying/reading it yet.

[1] http://www.amazon.co.uk/Invention-Information-Lemelson-Studi...

Hot-linking to the gif omits the caption:

"Be a maker, a creator, an innovator. Get started now with an Hour of Code."


The mother of all Grace Hopper biographies (very long):


She came from the era when most computer programmers were female. Until the late 1940s, the word "computer" meant a clerk who did long calculations by hand or with an adding machine, mainly for mathematical tables or military applications. Some of these women transitioned into programming jobs when the first programmable digital computers appeared in the late 1940s. Programming then meant either rewiring switchboards or punched cards/tape.

Then the men realized there was money in it...

If you just subtract the birth year from the current year, you would have a birthday every day.

Nope, you'd just get back the number of years you've lived, as of now (± 6 months).

Nothing about this being your birthday or not. Of course every passing year the number would increase by one.

That's true, and I'm sure they're well aware of the maths, but doing it this way makes it really simple for people who aren't programmers to understand what is going on.

It also demonstrates how easy COBOL is, once you move past the stigma of it not being the flavour of the month.

As an educated guess, I'd say it's also the longest-serving programming language still in use today, but I'm sure somebody with a weaving loom would argue that.

Having recently met a soon-to-be retiring COBOL programmer, he talked about how much money there is in it because the work force is aging out and there aren't any people to replace them. It got me thinking that ultimately it was the culture of COBOL that killed it.

C was first and foremost designed to be a very portable language (for the early 70s). I think its endurance is largely attributable to this fact: C was one of the first languages that was relatively easy to configure and deploy widely throughout the world. It's difficult to imagine anything like Unix happening without C.

But COBOL, on the other hand, is written essentially in a vacuum. It's written in relatively closed environments, for closed systems, for closed purposes. You don't share COBOL code because, well, for the most part there isn't anything worth sharing; it's all too purpose-specific.

And I think we see this in other places. Arduino is not the best, or cheapest, MCU hardware. But it's the easiest for which users can share code with each other. The "sketch" mentality for Arduino code has led to an explosion of copy-pasta examples across the net, and they are easy to integrate into one's own sketches on one's own hardware. Compare that to the TI MSP430 (which I personally think is every bit as good as the Arduino Uno, at a quarter of the price), pre-Energia, and it's a night and day difference.

Did the Web take off because it was fundamentally great, or was it because you could "view source" on any page? I spent hours in high school looking at other people's HTML and JS (but not CSS! It didn't exist yet!) on their sites. Saving their sites, hacking at the code (hacking as in "what one does with a machete"). Compare that to AOL and its keyword system.

So perhaps the lesson is: if your system doesn't encourage remixing, it's ultimately doomed to failure.

if your system doesn't encourage remixing, it's ultimately doomed to failure.

However, never forget that the market for IT can stay irrational longer than you can stay patient. COBOL is more than fifty years old. People are still emailing Word files to each other. And the most popular gaming systems are still closed consoles.

Fortran is a couple of years older. (And very actively used and developed.)

I knew there was something I was forgetting. You're right, and Fortran is a fantastic language for its uses, like COBOL.

More like not flavor of the past few decades :P

Anyway, having coded in COBOL for a bank, in my opinion it's an awful, wordy, clumsy language. It fully deserves its bad reputation.

Grace Hopper was an interesting person regardless.

It does that too. I wish I had the time to learn COBOL. It really is the dogma of modern programming languages.

Really? I didn't know it was Cobol. Looked a bit SQL'ly to me :)

But yeah, it does look easy to read

COBOL predates SQL by about 15 years. In the 60s and early 70s, there was a school of thought that programming languages could be made more like written languages, and that this would make it easier for the programmer to reason about programs, especially in the business context, where companies starving for programmers saw the university system as focusing too narrowly on academic theory. We got BASIC in the interim between COBOL and SQL as well. COBOL, BASIC, and SQL were all designed with non-Computer Scientists in mind, for "practical" applications.

> more like written languages, and that this would make it easier for the programmer to reason about programs,

It's also said that it was intended to help non-programmers:

1. Help managers better understand what the heck their employees were doing.

2. Enable other non-technical stakeholders to participate in reviewing the business logic.

As you can imagine that didn't actually help very much.

(I've heard people express similar skepticism about natural language testing frameworks, although I don't know if that's fair or not.)

At the very least, I think it's an unnecessary context-switch. Why introduce yet another language in your toolchain? The boss isn't going to be able to understand the natural language tests, because s/he will be introducing their own assumptions to the ambiguous nature of human language. Like how non-programmers tend to use "or" to mean "exclusive-or". Or the plethora of issues that come up when assuming N-based indexing when you actually have (1 - N)-based indexing (for values of N that are either 0 or 1).

And, personally, the wordiness just gets in the way, at least for me. Extra words just mean I can make more mistakes. I truly despise these languages that seem forgiving yet have more rules than more concise, orderly ones.
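The indexing ambiguity mentioned above is easy to demonstrate with a toy sketch (Python, which is 0-based):

```python
items = ["a", "b", "c", "d"]

# A non-programmer asking for "item 3" almost always means the third
# item, i.e. "c" -- which in a 0-based language lives at index 2.
third_item = items[3 - 1]
print(third_item)  # c

# Translating "item 3" literally into a 0-based index grabs the
# fourth item instead -- the classic off-by-one.
print(items[3])  # d
```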

Yes, that is certainly the case. Sure, you make the programming language easier to understand for someone in their first week of learning it, but overall it ultimately slows everything down. That's probably why the concept has never really taken off. "The proof is in the pudding".

That said, I sometimes wonder what programming in such a verbose language with a sufficiently intelligent IntelliSense system would be like. I accidentally started a small, thumbnail project in VB.NET the other day, because I had just re-installed Visual Studio and had forgotten that it defaults project templates to VB on first-run. Instead of starting over, I thought I'd use it as a chance to check in on VB and see what it has become.

Some things really pissed me off. Block delineation with Begin/End is a pain in the ass. Generic parameter lists being grouped in round parens was extremely confusing. But otherwise the extra wordiness was almost completely mitigated by the editor. It was a very odd experience.

I don't like the Python style of significant whitespace, because I think formatting of code should be the job of the editor, and the compiler or interpreter shouldn't care about layout. However, I've been very curious to see new experiments in making code more readable. LightTable (http://www.lighttable.com/) looks very interesting, with its ability to (supposedly) group sections of code in arbitrary locations on the screen. I haven't yet figured out how to make it do that. I started using it on Windows and Linux for a Node.js project I'm working on, and it was definitely an interesting concept, but I couldn't seem to get it to work just right, or anywhere near what the demo suggested would be possible. That said, it was still quite easy on the eyes, so I find myself still using it even if it isn't working 100% yet.

Racket has some interesting things going for it with code flow visualizations. The UI stuff is kind of fiddly on Linux, and the utility of reference arrows seems dubious at best, at least in their current implementation. But the macro expander is pretty amazing.

Ultimately, I think it's going to take a bit of new editor design and a bit of new language design.

Well, it's only COBOL-esque pseudocode to give the flavor; it doesn't have all the required divisions.

SUBTRACT BirthYear FROM CurrentYear GIVING Age

Isn't that off-by-one, for half the year (on average)?

If you're born on January 1, you get the right answer all year long. If you're born on December 31, you get a new value on New Year's day and it'll be wrong all year.

EDIT: After re-reading your comment: yes, it's off-by-one for someone in the population for half the year on average.

Yes, it is.

I would guess not, seeing as it's BirthYear and not BirthDate
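The off-by-one discussed above is easy to see in a short sketch (Python standing in for the COBOL snippet; the birth dates are illustrative):

```python
from datetime import date

def age_by_year_subtraction(birth: date, today: date) -> int:
    # What SUBTRACT BirthYear FROM CurrentYear GIVING Age computes.
    return today.year - birth.year

def exact_age(birth: date, today: date) -> int:
    # Subtract one if this year's birthday hasn't happened yet.
    had_birthday = (today.month, today.day) >= (birth.month, birth.day)
    return today.year - birth.year - (0 if had_birthday else 1)

today = date(2013, 12, 9)          # the day the doodle ran
born_jan_1 = date(1906, 1, 1)      # year subtraction is right all year
born_dec_31 = date(1906, 12, 31)   # wrong for all but one day of the year

print(age_by_year_subtraction(born_jan_1, today), exact_age(born_jan_1, today))    # 107 107
print(age_by_year_subtraction(born_dec_31, today), exact_age(born_dec_31, today))  # 107 106
```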

In the vein of the Letterman clip. I also saw this talk she gave: http://www.youtube.com/watch?v=9Ra2kt1Mpg8

mother of debugging. should have worked on that.

Is the bug in that code snippet intentional?

Cute, but please fix your massive privacy and data violations before pissing about with "Doodles".

I will not be fooled into trusting Google without massive transparency changes to how they operate.

Your whole comment is off topic for this thread. There are numerous other threads on HN dealing with NSA snooping that you can troll in.

Drawing attention to massive privacy violations is not irrelevant whenever Google are mentioned. We should not be forgetting who we're dealing with.

If you were thinking of visiting North Korea for a holiday would it be "off topic" to mention human rights issues?

Nobody give this person any more coffee

> Nobody give this person any more coffee

.. not quite a human rights violation, but man, that's pretty harsh.

Actually, I believe that Google isn't responsible for all of the doodles; they outsource a lot of these, including the interactive ones.

Outsourcing isn't relevant. The issue is that Google use such methods to put a public friendly face on their service, while providing the NSA direct access to all their data.

What should be on the Google homepage:


An apology for the recent revelations
How we plan to regain your trust

Google does not "provide" NSA direct access, the NSA cut into the dark fiber they run underground between their data centers without Google's knowledge.

A just-created drive-by account doesn't inspire confidence that you have contrary evidence.

I appreciate that you don't forget that trolls need to eat too.

>Google does not "provide" NSA direct access, the NSA cut into the dark fiber they run underground between their data centers without Google's knowledge.

Sure they did....

The PRISM revelations prove you incorrect: http://www.theguardian.com/world/2013/jun/06/us-tech-giants-...

No "dark fiber" necessary, Google were complicit with the NSA.

Very irritating when Google apologizers aren't even aware of recent news.

Where does it say in the leaks that Google was providing the access?

That the NSA had access wasn't because Google was providing it. The NSA got access through some shady practices. They have access to all the files because they can listen in on all the traffic, not because Google gave them permission.

Very irritating when you claim something with a source when the source counters your own point:

> "If they are doing this, they are doing it without our knowledge," one said.

It's totally relevant. You implied that Google is wasting their time with doodles rather than these privacy issues. If they're not creating said doodles themselves, then they're not wasting their time.

You literally created your account 12 minutes ago just to bash Google?

You haven't actually read my comment properly; probably a Google shill.

The longevity of an account has no bearing on the validity of one's comments.

Yes, I read your comment properly. And yes, the longevity of your account matters if you're just giving us the usual tin foil hat spew.

As cromwellian also said, Google didn't provide direct access, the NSA cut the fiber. Sure, Google aren't the best company and have had their privacy issues before, but the Government is much worse.

It wasn't the NSA who did it. It was the GCHQ in Britain, who later shared data with the NSA.

Ah, Google, still using GIFs in 2013... can't use APNG because it's "not invented here".

Google didn't invent the GIF.

Only supported in Firefox and old Opera. http://caniuse.com/apng

Why should they use APNG? It's an unnecessary complication with no benefits for the aesthetics they are going for.

This doodle as GIF: 97 864 bytes
Converted into APNG: 77 472 bytes

Browser share that supports APNG: ~25%

Browser share that supports GIF: ~99.9%
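For what it's worth, the byte counts quoted above work out to roughly a fifth off:

```python
gif_bytes = 97_864   # doodle as GIF (figures quoted above)
apng_bytes = 77_472  # same animation converted to APNG

saving = 1 - apng_bytes / gif_bytes
print(f"{saving:.1%} smaller")  # 20.8% smaller
```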

Google is not just posting doodles on the front page. Last time I heard, they also have a browser. So they can shift those percentages, and by helping APNG, they can help themselves: their front page will load faster. It's a win-win.

Chrome probably should support APNG. But I'm not sure it would be wise to use APNG even if Chrome did support it.

This site doesn't seem to have any problems serving better-looking APNGs for Firefox users, and plain old GIFs to all others: http://www.rw-designer.com/cursor-set/icq-flower

Google's browser already supports WebM, which should compress better than GIF even in lossless mode.

Can you convert this doodle GIF into lossless WebM? It would be interesting to see the numbers. I assume lossy compression would look bad, probably not worth it.

Lossy wouldn't look bad, it's just not a direct comparison :) Here are 3 files: a lossless WebP and two lossy ones with max and min quality settings https://www.dropbox.com/sh/q1jqoru5ljtq9on/P2ELh8NkA6?lst The lossless one is slightly bigger than the GIF which I did not expect. The high-quality lossy one is obviously overkill and is huge! The low-quality lossy one is smaller but looks like crap.

  GIF:     81 kB
  WebP:    82 kB
  -q 0:    59 kB
  -q 100: 259 kB

Yeah, that's what I would expect from WebP, as I played with gif2webp before, but I thought you said WebM, as in proper video?

Well... I don't know how to make a lossless WebM video. And according to the bottom of this page https://developers.google.com/speed/webp/faq it would probably be bigger anyway.

Really? That's it? Like a 20% reduction in filesize? Considering how archaic the GIF format is, I'm actually really disappointed.

No wonder nobody supports APNG - the only real benefit is animated alpha-blending.

Yeah, it's old, but it's still one of the LZ types. ZIP is archaic too, but you can only improve it so much, even with very modern methods.

But GIF was?
