Here's an article with some more details about how she came to invent assembly language. This is a "necessity is the mother of invention" tale that I always find fascinating, starting with "Andrew Booth, whom Kathleen would eventually marry, had previously done X-ray crystallography research at the University of Birmingham and that included doing a lot of computations. This started him down the path of building computing machines to make the work easier.":
If I were him, I'd want to know whether, in the end, I'd saved people more time overall. I don't expect any of my personal projects to produce significant benefits compared to the time taken.
For me, coding in assembly language in the C64 and Amiga demo scene in the '80s is the major reason that I went on to study Computer Science and have ended up working as a system developer for the past 30+ years. One of my fondest memories from my developer experience is writing the first Amiga Demo Creator entirely in assembly language back in May 1987:
My contact with assembly was during the 90s. I wanted to do cracks and keygens like the ones I saw from cool groups like The Phrozen Crew and similar. I learned to use SoftIce, IDA, W32dasm and later OllyDBG to step through assembler code. I started doing simple JZ/JNZ cracks but eventually progressed to reverse engineering keygens and implementing them in little C programs. My top crack was of a niche program that was distributed on 3.5-inch diskettes and checked for sectors marked as bad (B in ScanDisk); if the disk didn't have those bad sectors, it failed the copy verification.
We had a one-semester assembler class at my uni. While most of the other students didn't get why we were studying assembler, I was ecstatic.
As a teenager I fell into writing a bunch of assembly language for the H8 microcontroller inside the Lego Mindstorms RCX (via https://en.wikipedia.org/wiki/BrickOS). I guess that was an era when it was much easier to beat the compiler: hand-written assembly yielded about a 10x speedup for the task I needed done, and it was a great learning experience.
When I look at modern 64 bit assembly instructions, it looks completely incomprehensible, so I'm glad I got to have that relatively gentle introduction on an 8-bit platform.
Dr. Booth (her late husband) was one of my engineering professors at the University of Victoria back in the early '90s. It's amazing that modern CPUs still use the multiplication circuits he created (along with his wife). He was also one of several people who can claim to have created the first spinning magnetic storage!
I am very glad to hear that you like it, and I am delighted to see it doing so well. I think it's only the 2nd or 3rd obit I've done for the Reg and I did not get a style guide; I just read some previous ones.
It was fascinating to write this. I did a lot of reading and research, and the more I learned, the more interesting and impressive her and her husband's work became to me.
In the end my editor was really chasing me to file my piece, so I am happy that it proved worth the time investment. He too is surprised how well it's done. It may be my single most-tweeted story yet, in a year full-time at El Reg.
FWIW, some background context: Kathleen Booth got her BSc from the same college I got mine. J D Bernal's mentor, W L Bragg, was the son of perhaps the most famous graduate of my secondary school, King William's College: W H Bragg. They are tenuous connections but both pleased me.
I think everyone on here appreciates obits for tech pioneers, particularly those that are less well known. Frankly, there aren't a lot of websites, if any, that publish obituaries for anyone other than people like Steve Jobs. It's educational and interesting to learn that some piece of tech was due to someone we've pretty much never heard of.
Sadly, there are too many to write. We discuss them when they happen but all too often it's someone who was hugely important in one particular tiny community or specialism or something, and who the rest of the world has mostly never heard of.
If there isn't a writer who knows that area or that field, then it's hard to cover and explain adequately. A few have come up in my time on the Reg who are worthy of coverage but none of the available team really has relevant expertise. :-(
If you're talking about the tongue-in-cheek tone, it can't die soon enough as far as I'm concerned. Even as a proud Brit (from the North, so that may be a factor), I find the tongue-in-cheek style and excessive puns nauseating when I'm just trying to scan articles to catch up on news, to the point that I keep telling myself I need to unsubscribe from their newsletter. The only reason I'm still subscribed at all is that I know of no good equivalent for UK-specific tech news.
Then again, the writing style is not the worst part of The Register - it's the comments section that's overrun with boomers and their predictably bitter takes when it comes to news involving Big Tech, Silicon Valley or pretty much anything that doesn't have the effect of transporting them back to their nostalgic heyday.
The Register might not be to your taste, but some of us find it a refreshingly irreverent and amusing read. Anyway they are getting some stick for becoming too corporate and bland at the moment which just goes to show how difficult it is to make people happy.
If you're interested, you can find a link below to Kathleen Booth's original report from 1947 titled "Coding for A.R.C.", authored with her later husband, Andrew Booth[0]. It details how coding was done for that machine, with a very early form of symbolic instructions that closely resembles what we know as assembly language today.
You can see at the end of the report the mapping between machine codes and symbols.
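To make that concrete: such a mapping is essentially just a lookup table from symbolic operation names to numeric codes. A rough Python sketch, with mnemonics and opcodes invented for illustration rather than taken from the actual report:

    # Hypothetical symbol-to-machine-code table, purely illustrative;
    # the real A.R.C. report defines its own symbols and codes.
    OPCODES = {
        "ADD": 0b0001,  # add a memory operand to the accumulator
        "SUB": 0b0010,  # subtract a memory operand from the accumulator
        "STO": 0b0011,  # store the accumulator to memory
        "JMP": 0b0100,  # unconditional jump
    }

    # With only the table, translation is still a human activity:
    # the programmer looks each symbol up by hand and writes the bits down.
    print(bin(OPCODES["STO"]))  # 0b11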
That being said, there doesn't seem to be any indication of an actual assembler being built, i.e. a program that reads symbolic instructions and produces machine code. AFAIK, the first assembler was built by David Wheeler for the EDSAC.
Indeed, the IEEE credits David Wheeler with the creation of the first assembly language, for which they gave him the Computer Pioneer Award in 1985[1]:
> Wheeler’s “initial orders” allowed Edsac instructions to be provided in a simple language rather than by writing binary numbers, and made it possible for non-specialists to begin to write programs. This was the first “assembly language” and was the direct precursor of every modern programming language, all of which derive from the desire to let programmers write instructions in a legible form that can then be translated into computer-readable binary.
Back in the day, the work you might dream of doing on a computer might be as simple as a matrix multiplication, which, while difficult to implement, is certainly not impossible for a beginner; we did exactly that for our first machine language MP in Computer Engineering 190 my freshman year of college. The use cases for computers were just far simpler back then!
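For reference, the computation itself is just three nested loops; here's the plain high-level version in Python (a modern convenience, obviously nothing like writing it out by hand in machine language):

    # Straightforward triple-loop matrix multiplication: C = A x B.
    # Trivial in a high-level language; spelling these loops out in
    # raw machine code is what made it a good first exercise.
    def matmul(A, B):
        n, m, p = len(A), len(B), len(B[0])
        assert all(len(row) == m for row in A), "inner dimensions must match"
        C = [[0] * p for _ in range(n)]
        for i in range(n):
            for j in range(p):
                for k in range(m):
                    C[i][j] += A[i][k] * B[k][j]
        return C

    print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))  # [[19, 22], [43, 50]]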
Having read the materials, it seems misleading to call it the invention of assembly language. More like the invention of machine programming, i.e. what the paper calls "coding of problems", i.e. turning a problem into machine code.
The mapping of machine code to symbols seems completely ancillary, and as a side comment says, without the demonstration of an assembler and an intent to translate in the other direction, it is premature to say any language has been invented, vs. notation for writing a paper.
The hackaday reference below says she built an assembler. Given how little we know of her achievements, and how little credit she gets for them, I would err on the side of crediting her rather than going with the other interpretation. But that's just my opinion :)
That's a great find, it should be required reading for every intro to architecture and assembly course! I stumbled for a second over 'serial vs. parallel' but then realized it just meant a parallel data bus.
Thanks. My copy was actually the copy that they sent to J.D. Bernal, and was signed over to him "with the compliments of the authors". My father was another PhD student of Bernal's, and became Professor of Crystallography at Birkbeck later. The diagrams are scanned in from the original, but I re-typed the text for web presentation. It is astonishing how much of Computer Science was completely understood by the end of the war.
I don't think so. I couldn't tell you the inventor of the transistor or the operating system, but I'm familiar with Grace Hopper and Ada Lovelace. The important people in a field's history are more or less remembered for reasons that may be arbitrary, like a catchy name or a funny story.
Christ. I thought I knew a reasonable amount about this. I've read biographies of Turing and Von Neumann and multiple popular histories of the birth of computing and computer science.
Having programmed a fair amount in assembly and occasionally in raw machine code, I would estimate at least a 10x productivity increase when an assembler is available. I would not contest if somebody suggested it's more like 100x. A massive milestone in the history of computing, without a doubt.
One of the most amazing things to me, and a big part of my fascination with computers, was that you could actually talk to the people who invented them, unlike steam engines or cars, where the original inventors are long gone.
I sadly hadn't heard of her, and her Wikipedia page has no mention of any state awards such as an OBE or MBE (maybe she might have refused them) and she likely died in the window where there was no 100th birthday card from the Queen/King.
Richard Hamming called what we now know as assembler “automatic programming”:
> I give you a story from my own private life. Early on it became evident to me that Bell Laboratories was not going to give me the conventional acre of programming people to program computing machines in absolute binary. It was clear they weren't going to. But that was the way everybody did it. I could go to the West Coast and get a job with the airplane companies without any trouble, but the exciting people were at Bell Labs and the fellows out there in the airplane companies were not. I thought for a long while about, "Did I want to go or not?" and I wondered how I could get the best of two possible worlds. I finally said to myself, "Hamming, you think the machines can do practically everything. Why can't you make them write programs?" What appeared at first to me as a defect forced me into automatic programming very early. What appears to be a fault, often, by a change of viewpoint, turns out to be one of the greatest assets you can have. But you are not likely to think that when you first look the thing and say, "Gee, I'm never going to get enough programmers, so how can I ever do any great programming?"
I posted the quote and I didn't downvote your post. I was, however, confused how you got either question from what I posted: Hamming codes are not mentioned, and neither is the idea that Hamming created the assembler. I mean, it seems like deliberate trolling.
The takeaway from the quote posted, to me, was the fact that at one time assembler was a huge innovation. Today it is seen as so low-level that no one would ever bother with it. But from Hamming's perspective it was a huge boon to productivity.
I wrote my first "big" program in assembly language (IBM/360): it was a program to print parse trees for arbitrary grammars. This was around 1970. The clever bit was printing in a way that let me tape the printout pages side by side if the tree branches exceeded the 128 (or was it 256?) character limit of the printer.
The IBM printers were fun: you could get them to play simple tunes by printing repeating characters on a line.
Very likely your IBM/360 was connected to one of their 1403 Model N1 line printers with 132 columns. The lid would automatically rise up when it was hungry for a new box of fan-folded paper (probably with horizontal green and white stripes, three lines each). To tape your final pages edge to edge, you'd have had to first remove the perforated strips on the left and right edges that had the tractor-feed holes. Fun times.
Assembler, then early neural nets, and then more neural nets: I find careers from that era quite a bit wilder than today's. Kind of like the FORTRAN team using ad-hoc Markov chains for register allocation (IIRC), just out of curiosity.
I've got a soft spot for assembly language. I learned it at age 11 or 12 to program my TI-83, and it gave me a huge head start on my computer engineering education. It was challenging, yet it also made the details of what the machine was doing very explicit. Higher level languages, like C and Java, seemed impenetrable to me before learning assembly, but it opened the door to me learning them in high school.
I didn't know I had Kathleen Booth to thank for laying the groundwork.
I cut my Assembly teeth on S/360. In those days IBM source code was widely available and provided an excellent practice example for learning Assembler and the finer points of various instructions.
We should not forget that macro processing vastly expanded the power of Assembly language.
There is an analogy with DCF, IBM's Document Composition Facility. The first markup language was GML, which was implemented as a set of DCF macros.
GML evolved into BookMaster, which underpinned IBM becoming the world's second most prolific publisher.
I wrote XPSProf, a DCF-based markup language used by Xerox Canada for writing the manuals for its IBM-compatible software offerings.
I had just started learning assembly in earnest a couple weeks ago :(
(but mostly to "catch up" on an era I never got to dabble in, namely the old home computers like the C64 and ZX Spectrum, but also intrigued by the new Apple Silicon chips and their ARM stuff)
(Typo. 'Assembly', the "contracted notation", is a language, made to render human-readable the pieces of binary information that the 'assembler', an executing program, assembles.)
For that matter, language in use has a "descriptive" side overwhelmingly larger than the "prescriptive" side.
Now, the "assembler" is that thing which takes program components - say, a loop, an array of data and a subroutine - and concatenates them, e.g. fixing addressing (see Wilkes, Wheeler and Gill 1951) - you have to compute, for specification e.g. in the loop, at which address the data and the subroutine will start. The result is hence an "assembly".
Assembling was a clerical work. And famously (or notably), when Donald Gilles wrote an automated assembler, John von Neumann protested that "to have computers perform clerical work is a waste of precious resources".
But since at some point you got a piece of software to assemble the chunks, you can then think of adding an interpreter to translate machine operation mnemonics into machine code. So its syntax becomes "the language for the assembler", and "assembly language" would be "the language to obtain an assembly". Both expressions are weak, and I do not know historical details in their use. Some adopted the use of 'assembly' for the language to reserve 'assembler' for the processing program.
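To illustrate the two jobs described above - stitching pieces together at concrete addresses, and translating mnemonics - here is a toy two-pass assembler sketch in Python. The instruction set, encoding and label syntax are all invented for the example:

    # Toy two-pass assembler for an invented instruction set (one word per
    # instruction, opcode in the high bits, operand address in the low bits).
    OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "JUMP": 0x4, "HALT": 0xF}

    def assemble(lines):
        # Pass 1: walk the source and record the address of every label.
        symbols, addr = {}, 0
        for line in lines:
            line = line.strip()
            if line.endswith(":"):
                symbols[line[:-1]] = addr
            elif line:
                addr += 1
        # Pass 2: emit one machine word per instruction, substituting
        # resolved label addresses for symbolic operands.
        words = []
        for line in lines:
            line = line.strip()
            if not line or line.endswith(":"):
                continue
            parts = line.split()
            op, arg = parts[0], parts[1] if len(parts) > 1 else "0"
            operand = symbols[arg] if arg in symbols else int(arg)
            words.append((OPCODES[op] << 8) | operand)
        return words

    program = [
        "loop:",
        "LOAD 10",
        "ADD 11",
        "STORE 12",
        "JUMP loop",  # symbolic address, fixed up using pass 1's table
        "HALT",
    ]
    print([hex(w) for w in assemble(program)])
    # ['0x10a', '0x20b', '0x30c', '0x400', '0xf00']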
Thanks for noting this. We are so accustomed to reading comments as of a point in time, but their meaning is also impacted by when they are made. In this case, a future historian may draw a different conclusion had they not seen your clarifying post. Whereas now it confirms that the community viewed it as a good thing, and HN also agreed steadfastly.
Honestly, at this point it should, as it is widely used. Maybe a page where he can see all the mentions so he can check them if he has time, not something disruptive like an email notification for every mention, obviously.
I would just like to note that I collect RSS feeds from a large number of sources, and none of them contained this non-trivial piece of news (I just did a search on the database).
I just checked the websites of the NYT and the Guardian - nothing.
It is like in typical treatment of History: progress is at or beyond the margin.
It is extremely sad how the science and tech media largely ignore the demise of computer scientists and engineers who contributed a lot to the field. Most of them also face a lack of recognition outside their immediate co-workers during their lifetimes. The overshadowing of Dennis Ritchie's death is the most flagrant example that comes to mind immediately.
I know that many great scientists and engineers have unfortunately been left in the depths of history, but I also believe that the people who contributed to the foundations of our modern digital world should be known by outsiders as well.
In late 2011 my teacher was giving a motivational speech about Steve Jobs, and I was the only one to recognize Jobs from an old photo the teacher showed us. So even the more public-facing figures are not widely recognized for their past work.
I also feel Ritchie's death was overshadowed by Jobs's death, despite his life having had a more fundamental impact on this industry.
I learned of her death when The Telegraph published her obituary last Tuesday, but it’s paywalled. I haven’t seen her death mentioned elsewhere either.
It was mentioned on Reddit on August 18, 2022. How that person came to this conclusion is unknown; there is no source. Notably, this is more than a month before her death. Strange.
I only knew about her passing because I happened to open the Wikipedia page this morning, and I have not seen a single news article about her passing. Even then, the Wikipedia page only says "Died August 14, 2022 (aged 100)", and the most recent information the article contains is "In May 2019, Booth was reported to be living in retirement, at the age of 97."
Computing was originally seen as clerical work, and it was one of the few acceptable career paths for women even before the social upheaval of the '60s.
What changed was when home computers became widely available, causing society as a whole to redefine how they saw computing. Marketing campaign after marketing campaign promoted computers as a boys' hobby, with consumer software being dominated by games centered around playing sports and shooting things, and nearly all ads for those games showed them being played by boys. Rather than a clerical field, computers were now the latest expensive toys for boys. Computing was presented as "the cool new thing" to boys, and since marketing campaigns targeted boys pretty much exclusively, it gave girls the impression computing was a boys' club where girls weren't welcome. And since girls didn't feel welcome, they got out. The tipping point of this was in 1984, when the number of Computer Science degrees awarded to women peaked and then sharply fell off; it was so bad, the number didn't flatline until the very late '00s.
> As electrical computers became more available, human computers, especially women, were drafted as some of the first computer programmers.[47] Because the six people responsible for setting up problems on the ENIAC (the first general-purpose electronic digital computer built at the University of Pennsylvania during World War II) were drafted from a corps of human computers, the world's first professional computer programmers were women, namely: Kay McNulty, Betty Snyder, Marlyn Wescoff, Ruth Lichterman, Betty Jean Jennings, and Fran Bilas.[48]
Not only was it seen as clerical; the women who did the work were collectively known as "computers". These computers filled vital roles during both world wars, from computing the trajectories of shells fired from cannons in the first, to working with the mathematicians of Bletchley Park, in England, on breaking the German Enigma machine in the second. Even as late as the 1960s, women, primarily black women, were employed by NASA to perform a variety of complex and utterly essential calculations that predicted where the rockets would go during the manned space program. It's really only been within the last 20 years or so that they have achieved the recognition and acclaim they most assuredly earned.
* Breadwinning is associated with the male gender role, so men face more social pressures to be breadwinners than women do. (The proportion of men who e.g. stay at home to parent is much smaller than women, and it's likely fewer women are willing to date a man with that goal, compared to the number of men willing to date a woman with that goal.)
* Once IT became a big industry with lots of good paying jobs, it became extremely attractive to breadwinners.
Then why aren't single women dominant in tech? You seem to be making these assumptions:
1) women are uniformly incentivized against making the most money for their time possible
2) women have children
3) women are married
4) … to men
Etc.
Given that women were very prevalent in tech until about the advent of the home computer, maybe it's the fact that home computers were marketed only to and for boys, and that the computer was installed in the boy's room?
> women are uniformly incentivized against making the most money for their time possible
What makes you assume men and women are working the same amount of time? Labor statistics show that men work more overtime and women are more likely to work part time.
Which fits the hypothesis that men are socially judged more by their breadwinning and women are socially judged more by things like having the flexible time to be a good mom.
> … to men
I'm talking about statistical trends; nobody said all women are married to men, but lesbians are a tiny minority compared to straight women. Even if all lesbians adopted masculine gender roles, that wouldn't change the fact that the majority of women follow female gender roles.
The passive voice is undermining your point, here.
Patriarchal ideology associates breadwinning with the male gender and men, fearful in light of the aforementioned social pressures, are uniquely hostile to women who, under patriarchy, our society has agreed to side-line as a class.
Then it seems the main enforcers of "patriarchal ideology" are those straight women who prefer to date men who are successful breadwinners over men who aim to be stay at home dads.
Women aren't obligated to have sex with anyone in particular, and it's extremely worrying how prevalent this opposition to romantic freedom has become in online discourse.
It's worthwhile to explore where these social pressures come from. Parents, teachers, peers, religious authorities, bosses, and advertising come to mind. I wonder where you'd rank the influence of each of these? For reference, don't hesitate to look into bell hooks' "The Will to Change"[0]; as feminist texts go, it's extremely sympathetic to men. I think you'll find a lot to agree with, right off the bat.
> Women aren't obligated to have sex with anyone in particular
Who said women (or men) are? But who people choose to date is obviously a very strong source of societal pressure. Have you not heard of "sexual selection"?
If men refused to date women who wear pants, that would be strong pressure on women to wear dresses instead.
If women refuse to date stay at home dads, that's strong pressure on men to be breadwinners instead.
It is only "male dominated" in the West. Look in the East (both Asia and former USSR) and you'll see far more females working in the computing/electronics industry.
I think in times past, the achievements of women in STEM were downplayed and disproportionately overshadowed by men of the same era. You have to assume that most reporters and journalists covering tech would have been male as well.
> Between pioneers like Ada Lovelace, Grace Hopper and Kathleen Booth, it's still weird to me that IT has ended up becoming a male dominated industry.
The issue is segmenting "IT" off as some special industry, separate from any other skilled trade that relies more on knowledge than on tools - compared to, say... car stuff.
When you have to clarify if someone meant someone is a rapist or just someone who TK when the word "asshole" is used in policy circles, you begin to understand why some things went the way they did.
I ended up enrolling in a PhD to get out of my hometown because they actually did a good job of making things like STEM admissions more gender equal -- but they did it with don't ask, don't tell in effect, when you'd be told sarcastically "maybe you should go smoke some weed" by folks who didn't seem to grok -- you're a social worker who can send people off to prison, how dare you just casually tell people who the system has failed to just go take a risk whose costs won't fall on you.
(But I wasn't mature enough to articulate it that way, so I drank a lot and made a lot of rude posts on the internet)
Anyways, it is ABSOLUTELY weird, and tied in with the fact that post WWII many GI bill benefits were not evenly distributed... and so on and so forth.
I'm not going to rehash the American civil rights movement in the HN comments section at shortly after midnight on what will be, later... devil's night.
PS: As always, I'm a guy, so take what I say on these matters with a grain of salt.
It's a solitary activity and it seems men are more prone to living single, solitary lifestyles and see no decline in lifestyle moving into IT. Conversely, women tend to be hardwired to be more social and gravitate toward more social jobs like nursing.
The real question is why more men don't enter nursing since it would allow them the social lifestyle that they don't have and won't get going into IT. Both fields pay about the same starting salary too.
> It's a solitary activity and it seems men are more prone to living single, solitary lifestyles
Software development is not a solitary activity, it is a highly social and collaborative activity, and besides that, introversion is no less prevalent in women than in men.
Do you believe Kathleen wouldn’t want to be considered in this way and didn’t regret the current state of tech for women? I sort of suspect she would be honored by the pantheon inclusion and strongly agree with the sentiment. Respecting someone who has passed isn’t only done by recounting their biography.
Wait, have you actually managed to convince yourself that your comment was responsible for that? Despite the fact that your comment was quickly downvoted below the parent's comment?
Edit: seems pretty doubtful that your comment had a negative impact on the parent given that your own comment was overall downvoted. Personally, I skimmed the parent comment, then read yours, rolled my eyes to the back of my skull, then upvoted the parent.
It's a type of vice signaling: appealing to their peers by making fun of virtue signaling. It's being contrary on purpose to fit into their own echo chamber's culture.
The fact is that demonstrating one’s values is a human thing.* These people aren’t really against “virtue signaling”**…that’s a rhetorical sleight of hand to avoid saying what’s often more unpopular—that they don’t like the virtue being signaled.
* of course there are always going to be people who are over the top and annoying about it.
** we know this because of how selectively the term is applied.
Medicine is not at all female dominated, just certain categories in it like nursing and pharma sales reps. If you move up the ladder of money and influence, you will find that the majority of doctors, administrators, CEOs etc. are very much male.
Breaking down just doctors, the profession is 2/3 male, and within it women lean towards specialties like gynecology, pediatrics and palliative medicine while the most prestigious ones like various kinds of surgery and radiology are male dominated.
1. these days
2. nursing may be female-dominated, but the gender ratio of doctors (for whom the industry is significantly more profitable) is still heavily skewed towards men, is it not?
At what point in time do you think computing became more expensive than it was before? If anything, the cost of computing has been strictly monotonically decreasing since the very first programmable computer.
Indeed. My personal pet theory is that this combination (expensive + unprofitable) might have been why so many early pioneers of computing were female in the first place.
> My personal pet theory is that this combination (expensive + unprofitable) might have been why so many early pioneers of computing were female in the first place.
That doesn't make any sense. Just 10 minutes before you've claimed that computing became male-dominated because it was expensive:
> That the field became male-dominated once computing became simultaneously expensive and highly profitable should be entirely unsurprising.
and now you claim that it was female-dominated at first because it was expensive?
https://hackaday.com/2018/08/21/kathleen-booth-assembling-ea...