Hacker News
Tony Brooker, creator of Mark 1 Autocode, has died (nytimes.com)
433 points by gumby 7 months ago | hide | past | favorite | 77 comments

Wikipedia says of Autocode,

> The first autocode and its compiler were developed by Alick Glennie in 1952 for the Mark 1 computer at the University of Manchester and is considered by some to be the first compiled programming language.

which seemingly contradicts the implication in the NYT article that Brooker was the principal author of Autocode.

It seems Brooker and Glennie were coworkers at Turing's lab. Anyone know more?

EDIT: The Glennie claim is sourced to a 1976 paper coauthored by Donald Knuth, where Knuth says

> The unsung hero of this development was Alick E. Glennie of Fort Halstead, the Royal Armaments Research Establishment. We may justly say "unsung" because it is very difficult to deduce from the published literature that Glennie introduced this system.


There's no contradiction here, because Glennie's first attempt at autocode isn't referred to as "Mark 1 Autocode" (despite running on the Mark 1). Apparently it wasn't an influence on later developments. Wikipedia is pretty clear about this. https://en.wikipedia.org/wiki/Autocode

Yeah, "autocode" is just an old timey term for "compiled language". Like how "condenser" means "capacitor".

From Google:

"Previous to Brooker's work, Alick Glennie who was an MOS external user of the machine had developed his personal automatic coding scheme in the summer of 1952. Glennie's system, which he called autocode, was very machine-dependent and was not released to other users." -- Early computing in Britain, 1948-1958, pg 257


global _start

section .text

_start:
  mov rax, 1        ; write(
  mov rdi, 1        ;   STDOUT,
  mov rsi, msg      ;   msg,
  mov rdx, len      ;   sizeof(msg)
  syscall           ; );

  mov rax, 60       ; exit(
  mov rdi, 0        ;   EXIT_SUCCESS
  syscall           ; );

section .rodata

  msg: db "RIP Tony, Father of Assembly Language", 10
  len: equ $ - msg%

Always stayed away from Intel assembly, but thought I'd have a go at running this. The first issue was that this is, I think, Intel (NASM) format, whereas I was using GNU as, so I had to make a few changes:

  .global _start

  _start:
    mov $1,%rax        # write(
    mov $1,%rdi        #   STDOUT,
    mov $msg,%rsi      #   msg,
    mov $len,%rdx      #   sizeof(msg)
    syscall            # );

    mov $60,%rax       # exit(
    mov $0,%rdi        #   EXIT_SUCCESS
    syscall            # );

    msg: .ascii "RIP Tony, Father of Assembly Language\n"
    len = . - msg
Then you can run with:

  gcc -c src.s && ld src.o && ./a.out

I think you can use ".intel_syntax noprefix" with GNU as to assemble Intel assembly.

Any chance you can elaborate the reasons for staying away from Intel syntax?

Not OP, but I think it is like indentation styles: you learn it one way, then everything else feels weird and ugly.

Mainly because the assembler I knew I had on my computer was the GNU assembler and it uses that syntax. I did quickly try the .intel_syntax directive but it was just easier to change it around as the code was so short.

Most of my assembler experience is with 6502 or ARM.

Thanks for correcting code :3

A couple of online places to try this and play around with this code. The "%" after msg on the last line seems to cause an error, at least when I tried it with my version of nasm, so I removed it in the links below.

A small website with a few dozen compilers/assemblers, just press F8 or click "Run it" to try the sample above:


Hundreds of compilers and assemblers are offered online and configurable with tio.run in a bit of a more complex website, just hit the play button:


The title is wrong: it was Autocode (a precursor of high-level languages), not assembly, and Brooker, not Booker.

"One programmer was Vera Hewison, whom he married in 1957. (She died in 2018.) Another was Mary Lee Woods, whose son, Tim Berners-Lee, would go on to invent the World Wide Web."

The industry really was (still is I guess) very small.

Still so crazy that our field is so young that people like this are only now passing away.

Indeed. When I was a young CPU designer (working on walk-in, refrigerated, mainframes implemented in 100K-series ECL) several of the old bulls I worked with had been CPU designers in the days of vacuum tubes. I loved hearing the stories of the early days, and considered it a privilege to learn the craft from them.

Einstein was still alive when my mom was a teenager.

To be fair, 94 is quite an old age.

99.9% of tech was developed within about the last 3-4 generations, no?

Maybe arguably not true for scientific progress.

Modern technology is mainly the result of chemistry and materials sciences. Key events are the discovery of nitrocellulose in the 1840s and the introduction of the Bessemer process of steelmaking in the 1850s.

Nitrocellulose starts the development of guncotton, high explosives, celluloid, photographic film, and all sorts of modern plastics. Cheap steel enables engines, steel hulled ships, ICE vehicles, chemical plants, oil refineries, electrical machinery, etc.

Further progress in chemical engineering results in fuels, pesticides, fertilizers, pharmaceuticals, solid state devices, integrated circuits, lasers, etc.

Well I mean, yea... Modern tech is the result of scientific progress...

Depends on what you define as "tech", no?

Indeed: ships, metallurgy, chemistry...they had lengthy births.

Chemistry is a science. Chemical engineering is the application of that science (to create technologies).

Chemical engineering is mostly about 'plumbing' - pipes, tanks, valves, etc. Fluid dynamics. It's much closer to physics than chemistry most of the time from what I understand.

Chemistry doesn't have the same stereotypical relationship to engineering that physics does.

(This comment is based on what these terms mean in the UK at an undergraduate degree level. I'm an electronics engineer and I know a couple people who studied chemical engineering. If the terms mean something different elsewhere in the world I'd love to know :) )

If you saw how an industrial chemist works you’d see it more akin to engineering than science.

By value add, I mean. But I don't know what's ambiguous about what is and isn't tech?

tech is anything made, invented, not natural. So, the stirrup, eyeglasses, agriculture, our control of fire.

Yea. Anyways, what's so controversial about my comment you responded to?

That's a case that might be argued, though pinning down both what technology is and quantifying "innovations" (or innovation) is ... notoriously prickly.

My working definition borrows from John Stuart Mill, who identifies technology as the study of means (to an end or goal), while science is the study of causes (how or why things happen, to which I'd add a general notion of "structured knowledge").

There are other forms of knowledge, an interesting topic itself, but I'll skip that.

There's a tremendous set of ancient technologies, which can get expansive depending on your views. Everything from speech to simple machines, textiles, agriculture, medicine, metallurgy and mining, and ancient chemistry and alchemy.

What changed starting, arguably, at some point between roughly 1620 (publication year of Francis Bacon's Novum Organum) and about 1800 (patent expiry on James Watt's enhancements to the Newcomen steam engine) was a change in attitudes to both science and technology (or the practical arts, as they were then called), due to numerous factors. Much of that owed to the availability of better and more abundant (at least in the short term) fuels: coal, oil, and natural gas, and the capabilities afforded by those, especially in metallurgy (greater strength, purity, specifically-tuned characteristics, and of course, abundance), as well as in the understanding of natural phenomena: optics, thermodynamics, elements, electricity, and later radioactivity, affording more capabilities.

There was still a huge amount of pre-industrial, non-industrial technology, much of it originating in China and documented spectacularly in Joseph Needham's Science and Civilisation in China, a 30+ volume opus begun in the 1950s and still in development.

Many studies of technology look at patent filings, which is at best a rather poor metric -- one that's in many ways a bureaucratic, commercial, and ontological artefact. Looking at the costs and derived value might be of more use. I've been looking into an ontology of technological mechanisms, for which I've generally settled on about nine factors (discussed in other comments on HN, as well as elsewhere) which I've found useful.

Much of what is commonly called "technology" today falls into only a very narrow region of that. And much of what is considered economic growth can be traced very specifically to the increased energy available per capita in productive use.

There's also a pronounced set of diminishing returns to increased innovation and R&D, generally. Suggesting an other-than-bottomless reservoir of potential from which to draw.

Side point, but I found this article to be very well-written and a rather pleasant read.

Seems appropriate to me that he died in the city of Hexham.

    LD r0,$TONY

R.I.P. Tony Brooker

mov ax,0x4c00

int 0x21

I think there is a typo here where Kathleen Booth is supposed to be. You know, the inventor of assembly language.

The submitter made an honest mistake. Please don't snark on HN. It breaks the site guidelines: https://news.ycombinator.com/newsguidelines.html.

Autocode isn't assembly. It's sort of a proto-Fortran. The title should probably be changed.

Also, his name is Tony Brooker, not Booker.

Indeed, the NYT article title: "Tony Brooker, Pioneer of Computer Programming, Dies at 94"

I thought Lady Ada was the first programmer?

"WE! DO NOT! TALK ABOUT! THE ORANGUTAN!" https://imgur.com/gallery/lnOAS

You can be a pioneer without being the strict first in a field.

I was completely unfamiliar with the subject matter but even before I saw the comments I was convinced that people are going to say that what he did is not assembly.

Lol, what else is HN for but for being specific to the point of being contrary on technical matters? : P

> Also, his name is Tony Brooker, not Booker.

I made the submission on my MacBook Air and yes, the r key is wonky. Still, typo is my responsibility.

For sure, no worries. I'm dyslexic and guarantee you I've made worse mistakes for various reasons. : )

It's "Brooker". Someone please correct the title.

Typo in the title. Should be "Brooker".

Can we get a black bar for this guy?

Secular amen. (Not sure what opcode that is. I'll have to look at the datasheet.)

RIP bro. :'(

> Not sure what opcode that is...




BR R14 / SVC 3



Please don't post unsubstantive and/or flamebait comments here.

Apologies dang! I guess I observed it as relatively substantive in the given context. But I did study ancient history, so probably have a warped perception of domain-specific quips.

This is something that bothers me a lot. I have traveled to the UK a lot and oh, what a fine nation! I love being in the UK. Good weather, amazing people, great food and drinks!

The literary culture is still very strong in the UK. But what happened to its science and technology landscape? Merely a century ago, it was at the forefront of science and technology. Where did the UK lose its steam? Anyone with historical insights into the UK care to shed more light on this?

Good weather!? Good?! Weather!? Where were you in the UK? Gibraltar?

Why do you think the weather in the UK isn’t good? Because it rains sometimes? Why’s that ‘bad’? There’s nothing wrong with rain. What do you want instead? Just bland boring sunshine all the time? How dull.

The rain and lack of vitamin D suck. I don't mind the dull sunshine if it means everyone is a little less horribly depressed.

Well, it sort of depends. Comparatively speaking, the UK still gets less sunshine than lots of places on earth, but compared to 20-30 years ago, the UK (at least in the southern part, like London) now has really hot summers and decent weather.

Climate Change.

Why does rain suck? It doesn’t stop you doing anything except maybe parties outside.

Certainly here in eastern Australia at the moment, the lack of rain in this early summer is quite depressing. (The widespread bushfires and smoke is likely to be with us for the next few months)

Perhaps, the poster is from Greenland? Then, it would be relatively good weather.

A good British summer really is something special - but if you're looking for "consistent" weather, the UK isn't for you.

It’s very British to moan about the weather but I personally love it.

I don't think it lost its steam. WW2 gave the USA a huge advantage (Europe had to be rebuilt) and the EU is still trying to catch up technologically.

One question: where are you from? I have heard a lot about the UK but never about "great food". :D

A British friend of mine always took exception to the stereotype of Britain having bad food; his line was that they had the best food in Europe, giving as an example all the imported Indian dishes.

Yes, when I lived there, about a decade ago, curry had just again been voted favourite homemade dish.

Though, if you did your research, you could find really good, authentic, British cuisine (eg: St. John's).

OTOH, how much of a colonial, shared history do you have to have to accept Indian cuisine as "traditional" in the UK?

That probably WAS true back in the day. But then again so did NYC. It was still a center of upscale dining but there were essentially no interesting mid-level options.

The Internet changed everything. Someone starts a food trend in Austin, and two weeks later it's on the menus in L.A.

It happened during the 1990s according to https://www.cntraveler.com/stories/2016-04-22/how-london-bec..., though when I visited in 1998 it was still notorious in Americans' minds for plain, boring food.

London has great food, like most megacities. But although the quality of food across the UK has improved considerably since the mid 90s, I wouldn’t qualify the general level as “great”, especially compared to their southern neighbor...

Well, the British are not good at two things: optimism and marketing. It is just not their way to go and shout to the world how great they are. DeepMind, ARM, ImgTech, Icera, etc.

Although one may argue they are no longer "British", given that none of them are British-owned any more.

RIP Mr. Brooker.

Will HN host a black bar on its header for a day to honour him?

Aside: I understand that the decision to give someone a black bar is a subjective one, but I’m curious as to the criteria. Is it specifically people who have contributed in an important way to the computing community? The scientific community? Society at large? Does the number of upvotes on the “has died” article matter? Would love for ‘dang to weigh in.


Thanks for the link. I didn't mean to undermine the story here; it's just that the frustration has been building all day, and this is finally one I really want/need to read, and it's blocked again.

The NY Times has a monthly threshold. Their obit has some nice social info the Guardian's does not, and vice versa. If you can use incognito mode to get round the paywall, the NYT article is worth reading. Tim Berners-Lee's mother worked with Brooker!

Bad freakin' week for famous people. It's almost as if more people die in winter, and this is the leading edge of the people who could reach a global population.
