We had a one-semester assembler class at my uni. While most of the other students didn't get why we were studying assembler, I was ecstatic.
Nothing more adrenaline-pumping than poring over disassembly printouts and cracking tutorials in the middle of the night to make NOP sleds!
When I look at modern 64-bit assembly instructions, it looks completely incomprehensible, so I'm glad I got to have that relatively gentle introduction on an 8-bit platform.
and so on...
I am very glad to hear that you like it, and I am delighted to see it doing so well. I think it's only the 2nd or 3rd obit I've done for the Reg and I did not get a style guide; I just read some previous ones.
It was fascinating to write this. I did a lot of reading and research, and the more I learned, the more interesting and impressive her and her husband's work became to me.
In the end my editor was really chasing me to file my piece, so I am happy that it proved worth the time investment. He too is surprised how well it's done. It may be my single most-tweeted story yet, in a year full-time at El Reg.
FWIW, some background context: Kathleen Booth got her BSc from the same college I got mine. J D Bernal's mentor, W L Bragg, was the son of perhaps the most famous graduate of my secondary school, King William's College: W H Bragg. They are tenuous connections but both pleased me.
If there isn't a writer who knows that area or that field, then it's hard to cover and explain adequately. A few have come up in my time on the Reg who are worthy of coverage but none of the available team really has relevant expertise. :-(
Then again, the writing style is not the worst part of The Register - it's the comments section that's overrun with boomers and their predictably bitter takes when it comes to news involving Big Tech, Silicon Valley or pretty much anything that doesn't have the effect of transporting them back to their nostalgic heyday.
That being said, there doesn't seem to be any indication of her building an actual assembler, i.e. a program that reads symbolic instructions and produces machine code. AFAIK, the first assembler was built by David Wheeler for the EDSAC.
> Wheeler’s “initial orders” allowed Edsac instructions to be provided in a simple language rather than by writing binary numbers, and made it possible for non-specialists to begin to write programs. This was the first “assembly language” and was the direct precursor of every modern programming language, all of which derive from the desire to let programmers write instructions in a legible form that can then be translated into computer-readable binary.
The mapping of machine code to symbols seems completely ancillary, and as a side comment says, without the demonstration of an assembler and an intent to translate in the other direction, it is premature to say any language has been invented, vs. notation for writing a paper.
(Assembly and poetry were two classes I never got around to in undergrad).
But I've never heard this story before.
Having programmed a fair amount in assembly and occasionally in raw machine code, I would estimate at least a 10x productivity increase when an assembler is available. I would not contest if somebody suggested it's more like 100x. A massive milestone in the history of computing, without a doubt.
Wow, a long life. I didn't know of you until now. RIP.
> I give you a story from my own private life. Early on it became evident to me that Bell Laboratories was not going to give me the conventional acre of programming people to program computing machines in absolute binary. It was clear they weren't going to. But that was the way everybody did it. I could go to the West Coast and get a job with the airplane companies without any trouble, but the exciting people were at Bell Labs and the fellows out there in the airplane companies were not. I thought for a long while about, "Did I want to go or not?" and I wondered how I could get the best of two possible worlds. I finally said to myself, "Hamming, you think the machines can do practically everything. Why can't you make them write programs?" What appeared at first to me as a defect forced me into automatic programming very early. What appears to be a fault, often, by a change of viewpoint, turns out to be one of the greatest assets you can have. But you are not likely to think that when you first look at the thing and say, "Gee, I'm never going to get enough programmers, so how can I ever do any great programming?"
If it's the former are you saying he claimed to have invented assembly?
It's an interesting speech and I'm glad you posted it but I'm struggling to understand the connection to Booth
I really wish I could have received an answer.
I don't follow why asking the question was considered wrong.
Really - if someone who downvoted this - or understands why it was downvoted could explain I'd really appreciate it.
The takeaway from the quote posted, to me, was that at one time assembler was a huge innovation. Today it is seen as so low-level that no one would ever bother with it. But from Hamming's perspective it was a huge boon to productivity.
The IBM printers were fun: you could get them to play simple tunes by printing repeating characters on a line.
; pour one out for the homie
xor rax, rax    ; x ^ x = 0: clear rax
xor rbx, rcx    ; classic three-XOR swap of rbx and rcx
xor rcx, rbx
xor rbx, rcx
I didn't know I had Kathleen Booth to thank for laying the groundwork.
We should not forget that macro processing vastly expanded the power of Assembly language.
There is an analogy with DCF, IBM's Document Composition Facility. The first markup language was GML, which was a set of DCF macros.
GML evolved into BookMaster, which underpinned IBM's becoming the world's second most prolific publisher.
I wrote XPSProf which was a DCF ML used by Xerox Canada for writing manuals for its IBM compatible software offerings.
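The macro idea that links assembler macros and GML/DCF markup can be sketched as a tiny text expander: a name maps to a canned body whose placeholders are filled from the call's arguments. The macro names and bodies below are invented for illustration, not taken from any real assembler or DCF library:

```python
# Minimal macro-expansion sketch: each macro name maps to a body whose
# {0}, {1}, ... placeholders are filled from the call's arguments.
MACROS = {
    # hypothetical assembler-style macro: swap two registers via xor
    "SWAP": "xor {0}, {1}\nxor {1}, {0}\nxor {0}, {1}",
    # hypothetical GML-style markup macro
    "TITLE": ":h1.{0}",
}

def expand(line):
    # Treat the first word as a possible macro name; commas are separators.
    name, *args = line.replace(",", " ").split()
    if name in MACROS:
        return MACROS[name].format(*args)
    return line  # ordinary lines pass through unchanged

print(expand("SWAP rbx rcx"))
print(expand("TITLE Introduction"))
```

One macro call expanding into several instructions (or several formatting commands) is exactly the leverage the comment above describes.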
(but mostly to "catch up" on an era I never got to dabble in, namely the old home computers like the C64 and ZX Spectrum, but also intrigued by the new Apple Silicon chips and their ARM stuff)
The difference between raw machine opcodes and nicely formatted asm with whitespace/columns is quite a leap, if you've ever programmed both ways.
For that matter, language in use has a "descriptive" side overwhelmingly larger than the "prescriptive" side.
Now, the "assembler" is that thing which takes program components - say, a loop, an array of data and a subroutine - and concatenates them, e.g. fixing addressing (see Wilkes, Wheeler and Gill 1951) - you have to compute, for specification e.g. in the loop, at which address the data and the subroutine will start. The result is hence an "assembly".
Assembling was clerical work. And famously (or notably), when Donald Gillies wrote an automated assembler, John von Neumann protested that "to have computers perform clerical work is a waste of precious resources".
But since at some point you got a piece of software to assemble the chunks, you can then think of adding an interpreter to translate machine operation mnemonics into machine code. So its syntax becomes "the language for the assembler", and "assembly language" would be "the language to obtain an assembly". Both expressions are weak, and I do not know historical details in their use. Some adopted the use of 'assembly' for the language to reserve 'assembler' for the processing program.
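The two roles described above, laying out program chunks at computed addresses and then translating mnemonics into machine operations, can be sketched as a toy two-pass assembler. The ISA, opcode values, and encoding here are entirely invented for illustration:

```python
# Toy ISA: each instruction encodes as (opcode << 8) | operand.
# These mnemonics and opcode values are made up for this sketch.
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "JUMP": 0x3, "HALT": 0xF}

def assemble(chunks):
    # chunks: list of (label, [(mnemonic, operand-or-label-or-None), ...])
    addresses, pc = {}, 0
    for label, instrs in chunks:       # pass 1: lay out chunks, record starts
        addresses[label] = pc
        pc += len(instrs)
    code = []
    for _, instrs in chunks:           # pass 2: translate mnemonics,
        for mnemonic, operand in instrs:  # patching in the computed addresses
            value = addresses[operand] if operand in addresses else (operand or 0)
            code.append((OPCODES[mnemonic] << 8) | value)
    return code

# e.g. a loop, a data chunk, and a subroutine, concatenated into one "assembly"
program = [
    ("loop", [("LOAD", "data"), ("ADD", "data"), ("JUMP", "sub")]),
    ("data", [("HALT", None)]),
    ("sub",  [("HALT", None)]),
]
print([hex(word) for word in assemble(program)])
```

Pass 1 is the "assembly" sense of the word (fixing where each chunk lands); pass 2 is the later "language" sense (mnemonics translated to opcodes), which matches the historical distinction drawn above.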
Hey @dang, care to do it?
I just checked on the websites of the NYT, on the Guardian - nothing.
It is like in typical treatment of History: progress is at or beyond the margin.
I know that many great scientists and engineers have unfortunately been left in the depths of history, but I also believe that the people who contributed to the foundations of our modern digital world should be known by outsiders as well.
I also feel Ritchie's death was overshadowed by Jobs's death, despite his life having a more fundamental impact on this industry.
I only knew about her passing because I happened to open the Wikipedia page this morning, and I have not seen a single news article about her passing. Even then, the Wikipedia page only says "Died August 14, 2022 (aged 100)", and the most recent information the article contains is "In May 2019, Booth was reported to be living in retirement, at the age of 97."
-- x86 stuff --
What changed was when home computers became widely available, causing society as a whole to redefine how they saw computing. Marketing campaign after marketing campaign promoted computers as a boys' hobby, with consumer software being dominated by games centered around playing sports and shooting things, and nearly all ads for those games showed them being played by boys. Rather than a clerical field, computers were now the latest expensive toys for boys. Computing was presented as "the cool new thing" to boys, and since marketing campaigns targeted boys pretty much exclusively, it gave girls the impression computing was a boys' club where girls weren't welcome. And since girls didn't feel welcome, they got out. The tipping point of this was in 1984, when the number of Computer Science degrees awarded to women peaked and then sharply fell off; it was so bad, the number didn't flatline until the very late '00s.
There have been a few articles on the phenomenon:
Because it largely was. What people originally called "computers" were actually people doing manual calculations:
During the world wars in the 20th century, most of those "computers" were women since men were more directly involved in the war:
> As electrical computers became more available, human computers, especially women, were drafted as some of the first computer programmers. Because the six people responsible for setting up problems on the ENIAC (the first general-purpose electronic digital computer built at the University of Pennsylvania during World War II) were drafted from a corps of human computers, the world's first professional computer programmers were women, namely: Kay McNulty, Betty Snyder, Marlyn Wescoff, Ruth Lichterman, Betty Jean Jennings, and Fran Bilas.
I don't think so, at least in the early years of personal computing; look at these 80s computer ads and see how many females are in them:
WWII might have had some effect on what young men were doing when those two got their start.
* Once IT became a big industry with lots of good paying jobs, it became extremely attractive to breadwinners.
1) women are uniformly incentivized against making the most money for their time possible
2) women have children
3) women are married
4) … to men
Given that women were very prevalent in tech until about the advent of the home computer, maybe it was the fact that home computers were marketed to and for boys, and the computer was installed in the boy's room?
What makes you assume men and women are working the same amount of time? Labor statistics show that men work more overtime and women are more likely to work part time.
Which fits the hypothesis that men are socially judged more by their breadwinning and women are socially judged more by things like having the flexible time to be a good mom.
> … to men
I'm talking about statistical trends; nobody said all women are married to men, but lesbians are a tiny minority compared to straight women. Even if all lesbians adopted masculine gender roles, that wouldn't change the fact that the majority of women follow female gender roles.
Because single women are equally subject to workplace hostility: men "don't discriminate" when we discriminate.
The passive voice is undermining your point, here.
Patriarchal ideology associates breadwinning with the male gender and men, fearful in light of the aforementioned social pressures, are uniquely hostile to women who, under patriarchy, our society has agreed to side-line as a class.
It's worthwhile to explore from where these social pressures arrive. Parents, teachers, peers, religious authorities, bosses, and advertising come to mind. I wonder where you'd rank the influence of all these? For reference, don't hesitate to look into bell hooks "The Will to Change;" as feminist texts go, it's extremely sympathetic to men. I think you'll find a lot to agree with, right off the bat.
Of course not, don't be ridiculous. That doesn't change the effect of their preferences, though.
And, of course men's preferences also have an effect on women. There's no need to be defensive (or aggressive) here.
> Parents, teachers
Which gender are those mostly? Who is buying girls Barbie dolls and boys Action Man?
Who said women (or men) are? But who people choose to date is obviously a very strong source of societal pressure. Have you not heard of "sexual selection"?
If men refused to date women who wear pants, that would be strong pressure on women to wear dresses instead.
If women refuse to date stay at home dads, that's strong pressure on men to be breadwinners instead.
The issue is segmenting "IT" as some special industry separate from any other skilled trade that relies more on knowledge than tools than say... car stuff.
When you have to clarify if someone meant someone is a rapist or just someone who TK when the word "asshole" is used in policy circles, you begin to understand why some things went the way they did.
I ended up enrolling in a PhD to get out of my hometown because they actually did a good job of making things like STEM admissions more gender equal -- but they did it with don't ask, don't tell in effect, when you'd be told sarcastically "maybe you should go smoke some weed" by folks who didn't seem to grok -- you're a social worker who can send people off to prison, how dare you just casually tell people who the system has failed to just go take a risk whose costs won't fall on you.
(But I wasn't mature enough to articulate it that way, so I drank a lot and made a lot of rude posts on the internet)
Anyways, it is ABSOLUTELY weird, and tied in with the fact that post WWII many GI bill benefits were not evenly distributed... and so on and so forth.
I'm not going to rehash the American civil rights movement in the HN comments section at shortly after midnight on what will be, later... devil's night.
PS: As always, I'm a guy, so take what I say on these matters with a grain of salt.
The real question is why more men don't enter nursing since it would allow them the social lifestyle that they don't have and won't get going into IT. Both fields pay about the same starting salary too.
Software development is not a solitary activity, it is a highly social and collaborative activity, and besides that, introversion is no less prevalent in women than in men.
Breaking down just doctors, the profession is 2/3 male, and within it women lean towards specialties like gynecology, pediatrics and palliative medicine while the most prestigious ones like various kinds of surgery and radiology are male dominated.
Nursing (which is a huge industry of course) is the notable exception within medicine. But it’s still subordinate.
At what point in time do you think computing became more expensive than it was before? If anything, the cost of computing has been strictly monotonically decreasing since the very first programmable computer.
That doesn't make any sense. Just 10 minutes before you've claimed that computing became male-dominated because it was expensive:
> That the field became male-dominated once computing became simultaneously expensive and highly profitable should be entirely unsurprising.
and now you claim that it was female-dominated at first because it was expensive?