
Could Bill Gates write code? - cavedave
http://www.theregister.co.uk/2001/05/15/could_bill_gates_write_code/
======
Hoff
Programmers working with modern compilers and tools, in comparatively
resource-rich environments, often haven't had to bum instructions and bum
bytes.

In (most) modern environments, the optimizations are toward maintenance,
support, and speed of programming, not toward memory.

All sorts of bizarre constructs can be commonplace when you are looking to
stuff an application into a tiny ROM or tiny RAM; into the 32 KB of main
memory that was commonplace on a number of systems years ago, for instance.
This includes using instructions as data storage for constants, modifying the
return pointer in the call stack, using instruction side-effects, and so on.
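
For the curious, here's a minimal sketch of the "instructions as data" idea.
It's a toy decoder with a few assumed Z80-style opcode values, not a real
emulator: the point is that the same bytes decode differently depending on
where you enter them, so an instruction's operand field can double as hidden
code.

```python
# Toy illustration of hiding code inside an instruction's operand.
# 0x01 is "LD BC,nn" (3 bytes), so entering at offset 0 swallows the next
# two bytes as a 16-bit operand, while entering at offset 1 executes those
# same two bytes as instructions.

CODE = bytes([
    0x01,   # LD BC,nn  -- entry point A starts here
    0x3C,   # entry point B: INC A   (hidden in the operand of LD BC,nn)
    0xC9,   # entry point B: RET     (also hidden in the operand)
])

# lengths and names of the few opcodes this sketch knows about
LENGTHS = {0x01: 3, 0x3C: 1, 0xC9: 1}
NAMES = {0x01: "LD BC,nn", 0x3C: "INC A", 0xC9: "RET"}

def trace(entry):
    """Decode instructions starting at `entry` until RET or end of code."""
    pc, seen = entry, []
    while pc < len(CODE):
        op = CODE[pc]
        seen.append(NAMES[op])
        if op == 0xC9:          # RET ends the trace
            break
        pc += LENGTHS[op]
    return seen

print(trace(0))  # ['LD BC,nn']
print(trace(1))  # ['INC A', 'RET']
```

Three bytes of memory, two different instruction streams, no jump needed to
skip over the "data".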

When you're at the edge of needing a bigger (and then far more expensive) ROM
or falling out of available memory or needing to switch over to overlaid
segments or other run-time hackery, getting a dozen bytes back somewhere could
be a big deal.

Some of these coding techniques can still be appropriate in, and still show up
in, very tight loops and embedded applications, but they're far less common now.

And thankfully, cheap "large" physical memory and virtual memory means we
seldom have to deal with anything like TKB and its overlays:

<http://wjh.conflux.net:16080/RSTS-V9/documents/DEC/AA-5072C-TC_RSTS-E_Task_Builder_Reference_Manual.pdf>

~~~
kabdib
I recently had to do this for an embedded processor. 1920 bytes to fit some
controller code in. It was a ton of fun, to the point where one of my cow-
orkers found himself compelled to help out. I think we're at around 20 bytes
free right now.

"I managed to save three bytes here."

"My God, you're a sick person. I love it."

Then I went back to my "day job" data mining a couple dozen terabytes of stuff
in a database.

I love working with computers.

~~~
BoppreH
This is called "code golf". You try to get the final program with the minimum
of "shots" (bytes, lines, tokens or characters). It's amusingly addictive.

And I really hope you misplaced that hyphen in "cow-orkers".

~~~
Uhhrrr
Scott Adams popularized this construction, which suggests that one's
co-workers are likely to ork cows, but he didn't originate it:
<http://en.wikipedia.org/wiki/Scott_Adams#Coined_phrases>

EDIT: As far as I know, the definition of "ork" as a verb is left to the
imagination.

------
cpr
Bill Gates wrote code very well, and was an exceedingly good programmer.

I know, because I used to sit with him late at night at the Harvard CRCT
PDP-10 consoles (graduate research center in computing technology), ribbing
him about hacking on such silly hobbyist computers as 8008's and 8080's. (He
was working on his 8008 assembler/linker/simulator which he used to write the
Altair Basic before he ever saw the hardware. Worked the first time he tried
it on the real thing.)

He and I also shared the same fate as undergrads (I was '76, he was '77): we
knew enough CS that the undergrad courses at the time (fairly underdeveloped)
were too mickey mouse, so we took only grad CS courses (which were good even
for their time). And he did well in those courses.

So his brilliance and his skill aren't in question.

Nor are his drive and competitiveness--those were obvious even back then. He was
a serious player in the Currier House poker (bridge?) tournaments that would
go on for days and involve many $K pots. (Way over my head.)

------
atakan_gurkan
This reminds me of the following piece from
<http://www.sorehands.com/humor/real5.htm>

"Allegedly, one Real Programmer managed to tuck a pattern-matching program
into a few hundred bytes of unused memory in a Voyager spacecraft that
searched for, located, and photographed a new moon of Jupiter."

BTW, I would be very grateful if someone could verify this.

~~~
ColinWright
It appears that the story is first mentioned in a letter by Ed Post of
Tektronix to the editor of Datamation. A transcript can be found here:

<http://www.ee.ryerson.ca/~elf/hack/realmen.html>

Every other instance of the quotation seems to reference that letter (if
anything), and the letter appears to contain no further references.

------
skrebbel
Nice story. People still do this stuff, by the way, for example in 4k
democoding. See how these guys put an entire world (+ music) in 4 kilobytes:

<http://www.pouet.net/prod.php?which=50063> ('download' for the Windows
executable)

------
sayemm
Gates was an extremely talented programmer. There's a great chapter on him in
"Programmers at Work":
<http://www.amazon.com/Programmers-Work-Interviews-Computer-Industry/dp/1556152116/ref=sr_1_1?ie=UTF8&qid=1307462942&sr=8-1>

Joel Spolsky also mentions his talents in some of his old posts as well.

~~~
arethuza
The BillG Review article:

<http://www.joelonsoftware.com/items/2006/06/16.html>

"...a person who came along from my team whose whole job during the meeting
was to keep an accurate count of how many times Bill said the F word. The
lower the f***-count, the better."

~~~
sayemm
Thanks, that's exactly the post I was thinking of.

------
bartl
>if that bit of code was small enough (ie one or two bytes) you could simply
encode those one or two bytes inside a two or three byte instruction thus
saving the three-byte instruction needed to jump over it.

The Z80 (though not the original 8080) had a relative jump, which took only
2 bytes. It had a more limited jump range, but that was definitely wide enough
to jump over 3 bytes.

>All that... for three bytes.

No, two.

~~~
ColinWright
If I recall correctly, the opcode for the absolute jump was C3, and the
conditional versions were C2, CA, _etc._ The destination was two bytes,
16 bits, little endian.

The JR - jump relative unconditional - was hex 18 followed by the signed
number of bytes to jump, calculated from the byte _after_ the JR instruction,
since the program counter has already advanced past it by the time the jump
executes.

From memory - I could be wrong, but it's close enough.
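
Assuming those opcode values, a quick sketch of the two encodings and the byte
savings (the displacement arithmetic is my reconstruction, counted from the
byte after the 2-byte JR, as described above):

```python
# Sketch of Z80 jump encodings (assumed values: JP nn = 0xC3, JR e = 0x18).

def jp_absolute(target):
    """3-byte absolute jump: C3, low byte, high byte (little endian)."""
    return bytes([0xC3, target & 0xFF, (target >> 8) & 0xFF])

def jr_relative(pc, target):
    """2-byte relative jump: 18, signed displacement from the byte after JR."""
    disp = target - (pc + 2)            # PC is already past the 2-byte JR
    assert -128 <= disp <= 127, "target out of JR range"
    return bytes([0x18, disp & 0xFF])

# Skipping over 3 embedded data bytes: JR at 0x0100 jumping to 0x0105.
print(jr_relative(0x0100, 0x0105).hex())  # '1803'
print(len(jp_absolute(0x0105)))           # 3 bytes
print(len(jr_relative(0x0100, 0x0105)))   # 2 bytes -- one byte saved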

------
jgrahamc
It's a pity that people think these sorts of tricks are amazing in any way.
This is the sort of stuff you have to do when you have a very small amount of
available memory and need to make something as powerful as possible.

You are likely to still see the same sort of stuff going on when working with
microcontrollers. Now, if you want to read about something really cool, read
about drum memory and optimizing code so that instructions are ready for
execution when the drum they are on has rotated to the right spot:
<http://www.columbia.edu/cu/computinghistory/650.html>

~~~
k33n
The only reason you're able to make a statement like "This is the sort of
stuff you have to do when you have a very small amount of available memory..."
is because people like Gates came up with it first. It's only common knowledge
now because the trailblazers made it so.

The real pity is not appreciating the incredible creativity of those who laid
the groundwork for everything we take for granted today.

~~~
jgrahamc
That's not correct.

I have been programming for a very long time and have been involved in all
sorts of nasty tricks to fit things in memory (self modifying code, code that
uses subroutines from the OS to save having duplicates in its own base, code
that relocates itself while running, storing tiny amounts of code in 'free
space' inside the BIOS, temporarily storing code in screen memory because
there's nowhere else to go and you hope the user won't notice the funny image
on screen, etc.)
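
To illustrate just one of those tricks, here's a toy self-modifying program on
a made-up two-byte-per-instruction VM (nothing like the real machines above,
purely a sketch): the code rewrites one of its own operand bytes instead of
keeping a separate data cell.

```python
def run(program):
    """Tiny toy VM: each instruction is 2 bytes (opcode, operand).
    0 = HALT, 1 = ADD immediate to acc, 2 = POKE addr (write acc into
    program memory, i.e. the program modifies itself)."""
    mem = bytearray(program)
    pc = acc = 0
    while True:
        op, arg = mem[pc], mem[pc + 1]
        pc += 2
        if op == 0:
            return acc
        elif op == 1:
            acc += arg
        elif op == 2:
            mem[arg] = acc          # the program rewrites its own bytes

# ADD 5; POKE 5 (overwrite the operand of the next ADD with acc);
# ADD ? (now 5); HALT
prog = bytes([1, 5,   2, 5,   1, 0,   0, 0])
print(run(prog))  # 10
```

The second ADD's operand starts out as 0 and is patched at run time, which is
the spirit (if not remotely the scale) of the tricks described above.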

------
Jun8
I found the proposition absurd: Why would you suspect Gates to be less of a
coder than, say, Page, Brin, or Zuckerberg?

~~~
jcampbell1
Someone is doing technical historical research by disassembling a piece of
software written by Bill Gates. They found the software used numerous clever
hacks to stay small. What proposition are you referring to?

~~~
Jun8
From the article:

"'Could Bill Gates Write Code?' Or was he merely the luckiest man alive,..."

Maybe I'm wrong, but I found the title kind of linkbait-ish, trying to ride
the prevalent anti-Gates sentiment. I'm no Gates fan, but to doubt his coding
skills or technical prowess shows you're either ignorant or just young enough
not to know. I use and like a Radio Shack Model 100
(<http://en.wikipedia.org/wiki/TRS-80_Model_100>): a dinosaur, but the battery
life on that thing is awesome, and it works with standard AA batteries, too.
It is generally agreed to be the last piece of hardware with Gates' code on it.

I, too, thought he was just a corporate raider (like our managers, sigh), but
I was corrected many times by friends who used to work at Microsoft Research,
who said his assessment and knowledge of new technology was spot on.

~~~
MatthewPhillips
I think I still have one of those... now I really want to dust it off and play
around with it.

~~~
Jun8
I thought it was junk but then found a vibrant community of users online. Now
I use it for word processing; it's great for focusing, like WriteRoom on the
Mac. I also get some interesting looks. "Retro all the way, baby" :-)

------
yc_peter
This doesn't mean Bill could write code. This means either Bill Gates, Paul
Allen, or both could code.

I'm inclined to say Bill could code, just because of his background and
upbringing, but being a developer on a joint project doesn't say much.

------
brudgers
The more relevant conclusion is that Gates/Allen could hack.

------
J3L2404
Apparently this sort of thing was relatively common. In the late 60's my aunt
worked for MetLife as a programmer and would routinely hide data in execution
code to save space. When she became pregnant they put her on 'leave' but
refused to reinstate her after my cousin was born. Unfortunately for them only
she knew about the hidden data and they were forced to hire her back as a
consultant, for much more money, when things inevitably broke.

~~~
gigamonkey
Another similar story, from my interview with Guy Steele in Coders at Work:

This may seem like a terrible waste of my effort, but one of the most
satisfying moments in my career was when I realized that I had found a way to
shave one word off an 11-word program that Gosper had written. It was at the
expense of a very small amount of execution time, measured in fractions of a
machine cycle, but I actually found a way to shorten his code by 1 word and it
had only taken me 20 years to do it.

Seibel: So 20 years later you said, “Hey Bill, guess what?”

Steele: It wasn’t that I spent 20 years doing it, but suddenly after 20 years
I came back and looked at it again and suddenly had an insight I hadn’t had
before: I realized that by changing one of the op codes, it would also be a
floating point constant close enough to what I wanted, so I could use the
instruction both as an instruction and as a floating point constant.

Seibel: That’s straight out of “The Story of Mel, a Real Programmer.”

Steele: Yeah, exactly. It was one of those things. And, no, I wouldn’t want to
do it in real life, but it was the only time I’d managed to reduce some of
Gosper’s code. It felt like a real victory. And it was a beautiful piece of
code. It was a recursive subroutine for computing sines and cosines.

So that’s the kind of thing we worried about back then.
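
Steele's pun, one word serving as both an instruction and a floating point
constant, can be illustrated very loosely (with modern IEEE-754 rather than
the 36-bit PDP word format he was actually using) like this:

```python
# The same bit pattern read two ways: as an integer "opcode word" and as a
# float. This is the kind of pun that lets one word do double duty.
import struct

word = 0x3F800000                     # some 32-bit pattern
as_float = struct.unpack(">f", word.to_bytes(4, "big"))[0]
print(hex(word), "->", as_float)      # same bits, read as 1.0
```

If the bit pattern of a needed instruction happened to be "close enough" to
the constant you wanted, you got both for the price of one word.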

~~~
ArbitraryLimits
Not really related, but Gosper's algorithm
(<http://en.wikipedia.org/wiki/Gospers_algorithm>) is the thing I've learned
in the last few years that blew my mind the hardest. It's how Maple or
Mathematica can reduce your crazy sums to a single formula. Earlier work was
apparently pioneered by a nun:
<http://en.wikipedia.org/wiki/Mary_Celine_Fasenmyer>

------
kahawe
I find this a bit of a moot point. Could Bill Gates really code? Yeah,
probably. Does it really matter? Not as much as we like to believe.

Right time, right place, right people on the team and whatever other
circumstances you might want to describe as "luck" play a much bigger role
than people like to admit because it would diminish their own accomplishments.

Also, his and Microsoft's success has less to do with how much or how well Mr
Gates could code; just think of Steve Jobs and many others. Good techs don't
necessarily make successful CEOs: quite the opposite is just as plausible, if
not likely, considering how techs typically detest politics.

~~~
pasbesoin
I think coding at this level -- successfully; well -- reflects a kind of
relentlessness that was essential.

Gates had other qualities that contributed to Microsoft's success: The desire
and ability to focus on getting people to do things and on business, etc.

But his coding work demonstrates a focus, an intensity and persistence, that
was instrumental.

I'm not particularly fond of some of his and Microsoft's business techniques.
But he damned well executed them, relentlessly. As I see it (from afar), he
was, and is, never one for half measures -- not with regard to his true
interests.

~~~
kahawe
And he came at the right time to the right place - with an operating system
for a new platform that would prove to be more widespread than all platforms
before.

------
chrisjsmith
I think the example provided was valid but it's bad, unintuitive code. Sort of
showboating.

~~~
ColinWright
You're missing the point. Such "tricks" were essential to fit the required
code into the necessary space. The same thing still happens (although rarely)
when fitting required functionality into limited devices such as FPGAs and
PICs.

You're not aiming for readable, maintainable code. You're trying to get the
cheapest device, and then squeezing the essential into what little space you
get. Such "tricks" as jumping into the middle of instructions are unavoidable.

~~~
chrisjsmith
They had enough spare bytes to not have to pull that trick. It'd have been
better to clean up some algorithms somewhere or reuse some code.

Prime example: COS/SIN are the same operation with a 90-degree phase shift.
I've seen BASIC implementations with two separate implementations...

Note: I've worked with very memory constrained systems in assembly before
(actually hand assembled code on paper as well).

~~~
ColinWright

      > They had enough spare bytes to not have to pull
      > that trick. It'd have been better to clean up some
      > algorithms somewhere or reuse some code.
    

I assume from your clear and unequivocal statement that you have first hand
knowledge of that. I'll bow to your better information.

    
    
      > Note: I've worked with very memory constrained
      > systems in assembly before
    

As have I.

    
    
      > (actually hand assembled code on paper as well).
    

That's how I started, although largely I ended up writing directly in machine
code since it was quicker, and after I found my third bug in the assembler I
had occasional access to I gave up on writing mnemonics at all. It was only
much later when I had other people to communicate with that I went back to
writing assembly.

And I remember writing code that really, really needed to do things like
jumping into the middle of instructions.

~~~
chrisjsmith
Sorry, I should always "cite my sources":
<http://web.archive.org/web/20011211233332/www.rjh.org.uk/altair/4k/index2.html>

Total assembled output was 3826 bytes, so they had a few bytes left over,
including the RAM used to switch in the tape loader.

~~~
ColinWright
Thank you - useful. Perhaps they really didn't need to use that specific trick
on that specific occasion. It might be interesting for someone to comb through
and find out whether they used it lots of times, or just a few. Perhaps they
simply got into the mindset and used it because they could, in anticipation of
needing it.

Certainly that sort of thing is easier to do the first time round, rather than
having to go round again to find bytes when you discover later that you need
them. It becomes a habit, much like these days it's a habit to lay out code
clearly, name variables carefully, and comment tricky code.

~~~
chrisjsmith
A possibility and a fair one! I might write something to scan through looking
for jumps that jump inside opcodes (added to list of projects which I will do
one day).

------
pascal_cuoq
Yes, but does Altair BASIC still work on the Pentium 4 and its trace cache?
I'm afraid this trick may not have been any more future-proof than self-
modifying code.

Still, nice.

