Steve Wozniak's floating point routines for the 6502 (tinyletter.com)
502 points by jamesbowman 12 days ago | 170 comments





Woz was scary good at both hw and sw (author of Apple BASIC). If you have time, go through his various Apple documents online.

I believe he was never the same after his small plane crashed. (Back then, wealthy people used small planes like ATVs, and had about the same safety record.)

I worked with another hw engineer of similar ability back in the 80s.

Mind-blowing watching him just pick up a pen and draw out a complete board schematic at lunch, then getting the working board back from mfg. the following week and launching it on the Space Shuttle.


A 50-hour pilot in a Bonanza A36TC is insane; I had no idea Woz landed off-airport. I do checkouts in Bonanzas, and the more typical A36TC/TN pilot has 500 hours' experience or more.

(Some of us still use small planes like ATVs -- but if the description of Woz' experience in the incident is accurate, this is like taking a Lamborghini out with a fresh driving permit)

[1] https://www.cultofmac.com/465778/today-in-apple-history-stev...


> With Woz, who had only flown for 50 hours at the time, at the controls, the plane climbed too abruptly. Then the aircraft stalled and careened through two fences into the car park of a skating rink. Woz later said he thought Candi might have accidentally leaned on the controls.

How could leaning on the yoke (or any other control) cause a stall? She'd have to pull back on it to force a climb like that. I'm kind of amazed he was able to fly a plane like that at 50 hours. A turbo-charged engine requires an extra certification step right? And the private pilot license minimum is 40 hours.


> I'm kind of amazed he was able to fly a plane like that at 50 hours. A turbo-charged engine requires an extra certification step right?

The wikipedia page for the Beechcraft Bonanza says: "The NTSB investigation revealed Wozniak did not have a "high performance" endorsement (making him legally unqualified to operate the airplane)" and links to https://web.archive.org/web/20121019022620/http://www.ntsb.g...


Minimum flight time to get your cert is 40 hours. You train in a fixed-gear plane; the wheels don't come up.

To fly this he would need a complex sign-off for the retractable gear/variable-pitch prop, and a high-performance sign-off for the over-200hp engine.

It’s a lot of airplane for that much experience, but it only takes 5 hours or so to get those sign-offs from your instructor. Bonanzas have a reputation -now- that keeps people away from them until they are very comfortable with flying a complex and fast airplane.


I read a description of why aircraft like the Bonanza A36TC are more unforgiving. I think he said that because of higher power, higher speeds, and airfoils to match, these aircraft can rapidly leave the stable flight envelope. It takes fast reflexes and knowing what you are doing to get them back. And because they are fast, they'll fall out of the sky very quickly in a stall.

Fast reflexes come from being young, or from being older and using reflexes you picked up when you were young. Knowing what you are doing comes from knuckling under to someone who does. It's why these aircraft regularly kill middle-aged doctors. They have neither the reflexes nor the tendency to obey.

There is a reason the air force trains 20-year-olds to fly jet fighters.


> It takes fast reflexes ...

No, most Bonanza owners are mid-career doctors, dentists and lawyers. Never seen a teen fly one.

Also, you generally have a second or two at least to deal with issues. Hence the old saw about "winding your watch before reacting to an airliner emergency."

> There is a reason the air force trains 20-year-olds to fly jet fighters.

I talked to an ex-Air Force, now civilian flight school owner about that.

He said the problem with 30 year-old military aviation students is that they talk back when fed b.s., unlike kids. So it's a general discipline issue, not an age one.


It's somewhat common for plane sellers, like Cirrus, to let buyers purchase the plane and then do their primary training in it, picking up all the necessary endorsements in the process.

One of my flight instructors had a friend/business associate who bought a Kodiak @ Oshkosh after only having flown 182s. The insurance requirements were insane, of course. I also know a guy with 70 hours who bought a Lancair with a 330hp engine. In flying, it often seems to be a case of: if you can afford it, you can do it.

I was in a Mooney @ 120 hours. I just did a lot of extra training--EMT, Tailwheel endorsement, Instrument & commercial rating in it when I had the hours.


I wonder why rich people have such a high urge to fly small airplanes. It seems to expose you to a lot of risk, which you could easily avoid (I’ve lived a considerable number of years without facing a single situation where I was likely to fly an airplane).

Airplanes are a great hobby for people who need to get out of the office, love traveling, and need a hobby that has a lot of little detailed things to stay on top of. Flying real IMC to minimums in a small plane is like solving a real-time fluid dynamics/trig problem, where if you fail, you die.

I guess there are people who do it for the danger, but the beauty of it is that you can get really down in the details and master things so it's not that risky. Once you really master the game, your biggest danger is yourself: get-there-itis, sloppy preflights, pushing things a little at a time, getting away with them, then pushing more, and so on.

I have done nothing in my life that I loved as much as flying. Mankind has been staring up into the sky for eons wishing we could fly. Now we can. Who wouldn't want to be part of that?


Once you really master the game, your biggest danger is yourself: get-there-itis, sloppy preflights, pushing things a little at a time, getting away with them, then pushing more, and so on.

Normalization of deviance, for people interested in the literature.


This is a big killer, more so than most pilots realize. The reason it's so deadly is that it turns up in a bunch of crashes where it's not identified. That VFR pilot who was scud-running and ended up in some cumulo-granite? He probably did it and got away with it several times before the crash happened.

I don't fly anymore, but when I did? I used to get all the aviation safety magazines and read them cover-to-cover. I also knew some flight instructor instructors, people known for teaching safety in the industry.

A surprising number of well-trained pilots get in perfectly good airplanes and fly them until they run out of gas. Just because you pushed it those last dozen times doesn't mean you're going to get away with it this time.

EDIT: One of the more unusual crashes I will mention since this is HN happened out in the mountain states. A couple of thousand hour+ MEII pilots get into a multi-million dollar brand new jet. It's CAVU -- clear skies, visibility unlimited. They then proceed to fly the plane into the side of a mountain -- all the time trying to figure out the new switches and displays in the cockpit. I still think of that one when designing UX. Aviation is really complex, detailed, fun, and full of math like programming. But it is also a life-and-death endeavor if you take it too lightly. If you're safety paranoid, take up boating. It's much more forgiving than aviation.


John Denver ran out of gas while flying over a mountain pass. Exactly at the top - dropped and hit the tip of the peak while fumbling for the fuel switch.

Another good one. From what I've heard, he was a good pilot. The switch was mounted in an awkward and non-standard position. (Behind the pilot's seat, I think?)

Aside from the downdraft issue, running out of gas at altitude in VMC can be very interesting, but it doesn't have to be fatal. Even in heavily-forested areas, in a small plane you can put out the barn doors and decrease your horizontal speed quite a bit, especially with a headwind.

We had a guy in Virginia in the 90s, I think. He was a student-ish pilot in a 150 that ended up in IMC (fog) in the heavily-wooded mountains while running out of gas.

He slowed the plane down as much as he could, somehow kept the wings level, and ended up on somebody's back deck. Walked away. That's not guaranteed, of course, but in general planes are crash-rated based on flying directly into something. It's the failure to maintain control that kills many times more than the crash itself. Heck, they used to have shows where people crashed planes on purpose.


No. He crashed into Monterey Bay. He likely ran one tank dry then sent the plane into a dive while he tried to move the fuel selector that was located over and behind his shoulder.

https://app.ntsb.gov/pdfgenerator/ReportGeneratorFile.ashx?E...


If you have a free afternoon I would recommend booking a "discovery flight" at a flight school. You go up, with a certified instructor, and learn the fundamental controls in a relatively safe environment.

It can be an interesting experience, even if you have no desire to become a licensed pilot.


I checked that and around here (in Ireland) it's about €200 for a short flight. How is that in the US?

I think mine was $50 a few years ago. They're usually hoping you like it enough to take lessons where they will recoup the cost.

I've done a couple ULM (ultralight) discovery flights for 50€ here (Spain) near Madrid

It's very freeing. My grandparents would fly to the Bahamas for dinner on a whim from central Florida, for instance.

I think it's "humans have the drive to do exciting things" regardless of financial status. Add money, and now they can do more expensive exciting things ... like flying small fast planes instead of small fast(-ish) cars.

Planes are moderately expensive and very fun

There is no other way to buy yourself past traffic.

I read The Dog Stars after somebody here recommended it to me. Ever since then I often think about flying a small plane. If I hear one buzzing overhead I can't help myself - I have to watch it fly away.

In case you are interested, that book is a post-apocalyptic novel and the protagonist has a great relationship with his dog. Usually having a dog in the story is all it takes to keep me interested, but I loved everything about this book.

When I turned 16 and started driving, my life changed overnight in a very good way. I imagine learning to fly would give me that feeling again. I could go where I want and not be limited by going where the roads are. It's very appealing.


It's mildly dangerous, but far less dangerous than speeding (I'm guilty of driving recklessly tuned/modded cars in my youth), and a lot more interesting. There is also the satisfaction of operating a complicated machine (to which, I guess, a lot of HN'ers will relate).

I can hardly wait for my electric VTOL PAV.


For some people, risk isn't something to be avoided at all costs. It's a challenge to overcome. It's the same principle when you watch an amazing ski jump, musician, or gymnastics routine--taming the chaos.

This is a fundamental drive in many people that calls on something innate.


I don’t buy that such a grandiose thing is the reason rich people can’t stop buying planes. You can buy a guitar for $100 on Craigslist and then play it in your backyard with a low risk of blowing up.

Most people don't decide what they're interested in. Certain things grip your imagination.

I tried to get into guitar, but after I found skydiving I sold my guitar for my initial training and have been jumping ever since (20 years). That led me to wanting to drive airplanes when I could afford it.

Flying touches my imagination somehow, and I was always interested in it. Being able to leave my local airport and bust out of the clouds hundreds of miles away safely, despite the weather and lack of vision, is rewarding.


It's fun!

Woz didn’t create “Apple Basic”. He created the first version of BASIC on the Apple // - Integer Basic. But Microsoft wrote the next version AppleSoft Basic.

Sorry to sperg, but "Integer BASIC" was almost certainly a retronym, a term which didn't exist prior to AppleSoft. Originally it was probably called Apple BASIC, just like every other [manufacturer] BASIC (at least colloquially).

From the best I can tell, it was always referred to as Integer Basic and never “Apple Basic”.

https://en.wikipedia.org/wiki/Integer_BASIC

I do know that it was referred to that as far back as 1980. I’ve never seen a reference to it as “Apple Basic” and I had an Apple //e in 1986.


AppleSoft (MS) BASIC came out very early in Apple II lifecycle, and prior to that there would have been no reason to call the original BASIC anything other than BASIC. Afterwards, of course, all the documentation and commands were updated.

Woz originally called it "Game BASIC", some history at this page: http://woz.org/letters/apple-basic/


Lots of people were scary good at these things early on. Not everyone was as good as Woz, of course.

When you can fit the entire architecture into your head, all the way down to the chips and board layout, you'll find that you're capable of quite a lot on that platform.

A human with a skillet that has scaled up as much as computing hardware has scaled up is going to be virtually impossible to find, if not actually impossible.

Systems today are just too complex for one person to understand at the breadth and depth that Woz understood the earliest Apple hardware.


One of the things that I liked about the Apple II was that it was possible for one person to completely understand the whole system. I don't know that I'm capable of understanding my computer's keyboard now.

Woz did a Gnomedex talk in 2004 describing the power of exactly that: having the whole system in your head and being able to optimize because of that connectedness in one mind.

A keyboard is actually a good place to learn some hardware stuff. You can (relatively) easily build a keyboard from scratch and program the chip (excluding USB, which is another can of worms). There are lots of tutorials and documentation online to get you started.

Did early 8-bitters actually have a keyboard controller, or was it shift registers hooked directly to the CPU or something?

Tried googling and didn't turn up much.


Here is how the C64 keyboard works in great detail: http://www.c64os.com/post?p=45

Keyboard interaction on the Apple II is entirely CPU-dependent, memory-mapped, and works like this: when a key is pressed, the ASCII code of the character is put at $C000, with the high bit set. (A consequence of this is that there is no way for the computer to know that you pushed the shift key, for instance, or to tell the difference between typing Return or typing Control-M.)

Your program is responsible for checking $C000 from time to time. When the high bit is set, you read the character and access memory location $C010, which clears the $C000 high bit.

Interrupts are not used at all, despite the 6502 supporting them, probably because disk and cassette IO is similarly done under CPU control with tight timing tolerances, and getting an interrupt in the middle of writing data would wreak havoc.

Details on page 16 of http://www.classiccmp.org/cini/pdf/Apple/Apple%20II%20Refere...
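The polling scheme described above can be modeled with a toy Python sketch. The $C000 (key latch) and $C010 (strobe) addresses come from the comment; the class and everything else here is invented scaffolding, not real Apple II firmware:

```python
# Toy model of the Apple II memory-mapped keyboard (illustrative only).
# The "hardware" latches a key's ASCII code at $C000 with the high bit
# set; any access to $C010 clears that high bit (the strobe).

KBD, KBDSTRB = 0xC000, 0xC010

class AppleIIKeyboard:
    def __init__(self):
        self.latch = 0

    def press(self, ch):
        self.latch = ord(ch) | 0x80     # ASCII code with high bit set

    def read(self, addr):
        if addr == KBD:
            return self.latch
        if addr == KBDSTRB:             # touching the strobe clears bit 7
            self.latch &= 0x7F
            return 0
        return 0

kb = AppleIIKeyboard()
kb.press("A")
v = kb.read(KBD)
if v & 0x80:                            # poll: high bit means a key is waiting
    ch = chr(v & 0x7F)
    kb.read(KBDSTRB)                    # acknowledge so we don't read it twice
print(ch, hex(kb.read(KBD)))  # A 0x41
```

The strobe-to-clear design is cheap in hardware: one latch, one flip-flop for the high bit, and no interrupt logic at all.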

An exception is the "Reset" key which is hard-wired to the reset pin of the 6502. On early Apple II models, pushing "Reset", which is located just above Return, had disastrous consequences. People used to put a washer under the Reset key to make it harder to push by accident. Magazines published a simple hardware mod that required Control to be held at the same time as Reset, and later versions of the Apple II came with the mod built-in.

The Apple IIe keyboard adds two "apple" keys, one on either side of the space bar, which are dealt with in a completely different way: they are mapped to the same locations as the two joystick/paddle buttons. This was useful for playing games which required repetitive button-mashing, as it is easier to tap a key quickly than to push a joystick button quickly. Since I was used to controlling the joystick with one hand and pushing a key on the keyboard with the other hand, I had no problem adapting to early Macintosh software that required shift-click, option-click and command-click to overcome the limitation of a one-button mouse.

"The Macintosh mouse has four buttons, it's just that three of them are on the keyboard."


Not sure about keyboard but video on the ZX81 was achieved by way of a shift-register and the CPU was clocked short of 4MHz just so that it could keep this fed at the right rate. Check out the 8-bit Guy's video on the topic: https://www.youtube.com/watch?v=1Jr7Q1yJOUM

Each button has a unique code. It's packetized as USB or Bluetooth and sent to your PC. That's it.

Can you describe to me the architecture and components of a typical keyboard USB interface chip, both internal to the chip and external?


The whole point of this thread is discussing how engineers used to keep the entire design of the system in their heads, even the CPU. Being able to point to a repository of tech specs really isn't the same thing.

"A human with a skillet" -- oh, I'm guessing skillset is what you meant to say.

I was feeling old and thinking I was ignorant of some new slang there for a few seconds.

:-D


Well, there are YouTube tutorials on how to solder SMD parts in a skillet, so you were not that far off ;-)

That's pretty funny, because doing some sort of electronic wizardry with a stove and a cast iron skillet is one of the other things that crossed my mind in that brief moment of confusion.

And it is totally something I could see Woz doing. :-D


'skillet' is some new slang, but it refers to your homeslice. This is just a typo.

> Woz was scary good at both hw and sw

You damn-near gave me a heart attack then! :D

I thought he had died for a moment, Phew!


He had a head injury as mentioned above. Hence the past tense regarding his exceptional engineering ability.

The plane crash happened before I was born; as I understand it, he made a reasonable recovery?

I'm trying to view your comment in perspective. Aren't there a lot of Chinese folks with similar skills, nowadays? I mean, there are a lot of cheap devices on the Chinese market, with an overall complexity greater than Apple ][. Yes, these systems are mostly glued together from SoC devices, but the Apple ][ was also built around a (then) sophisticated building block, 6502.

There are probably many more people in the industry today who are stronger in a single domain but far fewer who have expertise in multiple domains. Woz has serious skills in software, digital and analog circuits. You might say 'so what's the big deal? Just get three people to replicate the skillset'... this works for some things but not for others. Without the cross-domain expertise, there are things that even a small team of three just won't see / think to try that a single person would. So while your team of three was still trying to understand the problem, Woz would likely have designed a more elegant solution using a smaller/cheaper BoM than they would eventually come up with. (Woz wasn't just extremely competent in multiple domains, he also worked very fast)

I don't think many people realize that Woz's dad was an electrical engineer at Lockheed who would bring home computer manuals that Woz would read as a kid.

This seems to be a fairly effective way to create world-changing computer engineers.


My boss's son is like that. He has brought him into the office since he was a kid, he's 22 now and content to go home and read PLC manuals and such. He doesn't have any social skills and doesn't really interact with anyone except 2-3 people from work, it's kind of sad.

Breeding?

Regression tends to the mean. Imagine what Apple might have been if Woz's dad had been in the factory!

'Tends', not 'is guaranteed to go'. Sometimes kids are much smarter than their parents.

Regression to the mean works equally in both directions, yes. A tall kid likely has shorter parents (though still above-average), and so on.

Steve Jobs' father was a machinist for a firm that made lasers.

That mac is awesome -Look in here.. with your one remaining good eye...

Jobs would have found a similarly capable engineer. Apple wouldn't be much different.

Imagine where Woz's life might have taken him had his dad worked elsewhere...


The Apple I was a hobby project developed before Woz met Jobs, if I remember right.

If you gave computer manuals to kids today, they'd cry and beg for their phones back

Please don't take HN threads further into flamewar. Especially not generational flamewar, which is particularly contentless.

https://news.ycombinator.com/newsguidelines.html


When I was a kid, some of my friends would whine whenever their parents took away their gameboy. But I didn't have a gameboy, so I never had the opportunity to whine about it being taken.

What's your point? Most kids back then would have cried and begged for their comics back, or for the television to be switched back on.

I'm disagreeing with OP's suggestion that giving computer manuals to kids is a "fairly effective way to create world-changing computer engineers."

You could give computer manuals, but you have to:

a) praise achievement not intelligence

and

b) Create an interest first. Maybe make it seem like magic or show something that you can do with it.


And I think it depends on the kid, and the fact that smart phones exists now makes little difference.

Don't give them a phone in the first place

It's mostly terms of use and legal disclaimers these days /s

Applies to college age kids as well.

Yes, those lazy millennials with their instant grahams and their books of faces

Steve Wozniak is one of the few people of note in the tech industry who, the more I learn about them, seems more and more worthy of my respect. When I hear others speak of his sense of humour, how he stuck by his friends even when those friends clearly took advantage of him, the work and exemplary engineering skill he so clearly poured into his projects, his financial success, and then, after having enjoyed that success, his leaving that successful business behind to improve things for the future and invest in educating children, I feel a sense of loss. Steve Wozniak just seems like the kind of person one would be lucky to have as a friend, someone one could learn so much from. It's rare to feel that way about people one doesn't know first hand, and while it does feel like missing out, it's nice to have a prominent example of such a person, a financially successful person, within the tech industry, often a dark place indeed. A reminder that one doesn't have to be soulless in order to succeed in the valley, just skilled and, hopefully, lucky. "So shines a good deed in a weary world."

Maybe not in the same category (i.e. electronic design), but the other guy that I admire the hell out of is Adam Savage. The guy is just bloody brilliant at D-I-Y and, like Woz, a down-to-earth nice guy. I've had the great pleasure of meeting both men, and they dispelled the myth about meeting your heroes.

Aside from being a saint, I think Woz singlehandedly jumpstarted the computer industry by 1-2 years with his genius early designs.

It would be 2017 now if it wasn't for him!


I think that is a bit of an exaggeration, inasmuch as there were plenty of other designs under way, with at least Commodore and Tandy hitting the market at the same time as Apple reached any kind of volume, and a multitude of small-volume designs around. If Apple hadn't been there, people would have just bought one of the others.

E.g., Commodore's Chuck Peddle was part of actually designing the 6502 with Bill Mensch, and went on to lead work on the PET, which reached the market at about the same time, and Commodore machines outsold pre-Mac Apple machines by a substantial factor from the very beginning.

The early Apple machines were too expensive to get the same reach, and never penetrated many markets at all (I'd never seen one in person growing up in Norway until the Mac - it simply wasn't a thing there, same as in many other European countries).

Woz's designs may well have been influential in some parts, but Peddle, for example, spoke to Woz and Jobs when peddling (sorry) the 6502 to them, and was subsequently very dismissive of them. The Commodore team explicitly went a different direction because Peddle saw Apple's approach as too hacky; he decided to build his own machine instead of considering licensing theirs, and so he's one of the few people who have designed a computer from the CPU itself on up.

While his assessment of Apple will have to stand as Peddle's personal opinion, the point is that the Commodore and Tandy/RadioShack machines would likely have hit the market at the same time irrespective of Apple, and in many markets they - especially Commodore - totally trounced Apple in volumes anyway. So while I know Apple was influential to some, I really don't see taking any of those three out of the equation changing things that much.

In fact, to me, my primary interest in the early Apple machines is that they're like a window into an alternate past that's just totally foreign to me, because early Apple was simply not a part of my childhood (my friends all had Commodore, Spectrum, Atari or Amstrad machines, with some poor soul stuck with a Tandy; the mix varied greatly by country, but Apple was very rarely seen in Europe, with strong sales in only a handful of countries), and it's fascinating.


You could be right.

I base my estimates on several very admiring write-ups on Woz's genius early designs.

But of course those kinds of stories travel better than "his designs were half decent and did the job well enough to capture an empty market", and winners always write history, etc.


To be clear, I don't have a problem with people seeing his designs as amazing. I just think there were multiple amazing people racing to bring out home computers at the same time, and so while each of them did amazing things with what they had available, we'd have done OK with any one of them.

My wife follows him on 4square, now Swarm. I wish he would eat at better restaurants so we might enjoy his company in the world longer.

See also his (probably better known) 16-bit VM, Sweet 16, published in Byte, November 1977:

https://archive.org/details/byte-magazine-1977-11/page/n147

http://amigan.1emu.net/kolsen/programming/sweet16.html
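For a sense of what Sweet 16 is, here is a toy interpreter in its flavour: sixteen 16-bit registers with R0 as the accumulator. The mnemonics (SET, LD, ADD) follow the published ones, but the encoding and everything else in this sketch is invented for illustration, not Woz's actual byte format:

```python
# Toy Sweet 16-flavoured VM (illustrative only, not Woz's encoding).

def run(program):
    r = [0] * 16                        # R0..R15, 16-bit registers
    pc = 0
    while pc < len(program):
        op, *args = program[pc]
        pc += 1
        if op == "SET":                 # SET Rn, const : load a constant into Rn
            r[args[0]] = args[1] & 0xFFFF
        elif op == "LD":                # LD Rn : R0 = Rn
            r[0] = r[args[0]]
        elif op == "ADD":               # ADD Rn : R0 += Rn (R0 is the accumulator)
            r[0] = (r[0] + r[args[0]]) & 0xFFFF
    return r

regs = run([
    ("SET", 1, 0x1234),
    ("SET", 2, 0x0042),
    ("LD", 1),
    ("ADD", 2),
])
print(hex(regs[0]))  # 0x1276
```

The appeal of the real thing was density: interpreted 16-bit ops cost a fraction of the code size of equivalent native 6502 sequences, at the price of speed.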


Woz says the article you linked in Byte is a followup to the OP. Good find.

What a great read. Thanks for that.

I jumped into reading this note without looking at the url or author, and near the end I was wondering who wrote it -- then I saw the reference to the SPI driver, which was a dead giveaway for James Bowman. I am continuously impressed with his work in the embedded scene after discovering the Gameduino. It feels like even though there are thousands of voices on the web, I actually only care about what perhaps 50 people are writing.

I have never heard of James Bowman before. I’ll look into his writing some more.

Would you mind pointing me to some more of these interesting writers?


> They took the limitations of their computers as a challenge, sat down, and made these tiny machines do impressive things.

Poets knew from long ago that using pentameter and constraints fosters creativity. Here we see the same principle applied in computing.


This is what I love about the C256 project (https://c256foenix.com/). From an interview with her:

"By limiting its resources, future developers will have to be clever to find new ways to create amazing things that early on we thought we could never do. Memory is cheap, so I could easily spend the same amount of money on a single chip that could have given me the chance to fill the memory space with RAM. But instead, I am choosing to use many chips with much lower capacity. It’s more limiting, but that’s the point."


If you haven't, order a Rev C board and contribute!

https://c256foenix.com/product/c256-foenix-rev-c-bare-board-...


> Within the Sonnet’s scanty plot of ground; Pleased if some Souls (for such there needs must be) Who have felt the weight of too much liberty, Should find brief solace there, as I have found.

https://www.poetryfoundation.org/poems/52299/nuns-fret-not-a...


Q: Does the creation of Design admit constraint?

Design depends largely on constraints.

Q: What constraints?

The sum of all constraints. Here is one of the few effective keys to the Design problem: the ability of the Designer to recognize as many of the constraints as possible; his willingness and enthusiasm for working within these constraints. Constraints of price, of size, of strength, of balance, of surface, of time, and so forth. Each problem has its own peculiar list.

- Charles Eames


The craft/art still exists in the demoscene

On the other hand, the first time I did 6502 assembly language programming, I was shocked.

It was so hard to do anything.

For instance, it did not have an add instruction.

You had to clear carry, then add with carry.

That said, I can see how having a microprocessor was amazingly awesome to a hardware designer like Woz, who could think fluidly at the transistor level.


Floating point can be tricky.

I competed in the first couple years of Sparkfun's autonomous vehicle competition. My robot had a keypad where you could enter GPS coordinates for waypoints. The microcontroller I was using had 32-bit soft floating point routines in its standard library, but I had to code my own string-to-float routine. (I think only float-to-string was provided, but not the inverse.)

Every year, the robot worked in Oklahoma but in Colorado would make a wild turn and head the wrong way. The morning of the last time I competed I realized the problem was in my string to floating point conversion code. I had made a programming assumption which was mathematically incorrect. It happened to work in Oklahoma because the fractional part of the GPS coordinates at home were in a range that didn't trigger the bug.

I also realized it meant the conversion algorithm was more subtle than I had assumed, and there was no way I was going to be able to figure out the correct algorithm in the field, under time pressure. In those years I either didn't yet have a smartphone, or I looked and couldn't find code on the web for my microcontroller.

(And I couldn't just drop in some C code - it was a Parallax Propeller and my robot's code was in the Spin language. Quirks of the chip made it unusually hard to port C to it, and so C compilers for the Propeller were still what I would consider "experimental" - or were when I started the project.)

The incorrect floating point code had been one of the first things I'd written - it'd been responsible for my robot crashing every year, including that morning. (I'd blamed hardware, upgraded the GPS unit, improved sensors, etc.)

One year later to the day, this was all on my mind again because I couldn't make it to the competition that year, and I had a shower thought. I had had a KNOWN GOOD floating point conversion routine on my hard drive and running inside my microcontroller that whole time!! I was using a Propeller GPS library. GPS data comes in NMEA strings which are... strings! By necessity, there was a private routine in the GPS object which had to be doing the conversion. I went and looked - yes, all I would have had to do was change this private routine to "public" and I could have called it.
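The comment never says what the incorrect assumption was, but one classic string-to-float mistake that only bites for certain fractional digits (which fits the "works in Oklahoma, fails in Colorado" symptom) is scaling the fraction by its numeric length instead of its string length, silently dropping leading zeros. A purely hypothetical Python sketch of that bug:

```python
# Hypothetical illustration of a leading-zero string-to-float bug;
# not the author's actual code.

def str_to_float_buggy(s):
    whole, frac = s.split(".")
    f = int(frac)
    # BUG: len(str(f)) discards leading zeros in the fractional part,
    # so the scale factor is wrong whenever the fraction starts with 0.
    return int(whole) + f / 10 ** len(str(f))

def str_to_float_fixed(s):
    whole, frac = s.split(".")
    # Correct: scale by the number of fractional *characters*.
    return int(whole) + int(frac) / 10 ** len(frac)

print(str_to_float_buggy("35.4210"))  # fraction has no leading zero: correct
print(str_to_float_buggy("40.0421"))  # leading zero: off by a factor of 10
print(str_to_float_fixed("40.0421"))
```

A coordinate whose fractional digits never start with zero hides the bug completely, which is exactly the kind of location-dependent failure described above.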


Having written a lot of 8-bit assembler for the 6800 microprocessor targeting 2K EPROMs, I can attest it is a very different skill from writing modern software.

Would love to hear more about this - have you written on your experiences?

No, and sadly I've lost the code I wrote, I don't know what happened to it.

Anyhow, programming an 8-bit 6800 for an embedded controller means keeping track of 6 or more things going on at the same time, while modern software focuses on one thing at a time and isolates it from everything else.

Those things include the contents of each register, the stack level, the size of the code, interrupts going off, polling that has to be done, counting cycles, etc. Organizing code into functions is a luxury not usually affordable.

It sounds tedious, but it was fun. I attached my own hard disk drive to my LSI-11, meaning I had to build an interface board and write a device driver. To figure out how to write the device driver, I dumped the floppy driver code. It was a marvel of tight engineering. The bootstrap loader was executing code only an instruction or two behind it getting loaded. I was just in awe.


If you think about it, it's pretty crazy. Register allocation using graph colouring is an NP-complete problem. Writing assembler by hand, one is doing this constantly in one's head, and often with better results than a compiler would do.
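To make the graph-colouring picture concrete, here's a toy sketch (Python, purely illustrative): variables are nodes, an edge means two variables are live at the same time, and a colour is a register. Greedy colouring is the usual compiler heuristic; finding an optimal colouring is the NP-complete part that hand-coders approximate by feel.

```python
# Minimal greedy graph-colouring sketch of register allocation.
# "interference" maps each variable to the set of variables that are
# live at the same time and therefore cannot share a register.

def greedy_colour(interference):
    colours = {}
    for var in sorted(interference):          # deterministic visit order
        taken = {colours[n] for n in interference[var] if n in colours}
        c = 0
        while c in taken:                     # pick the lowest free "register"
            c += 1
        colours[var] = c
    return colours

# Hypothetical interference graph for four variables:
graph = {
    "a": {"b", "c"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"c"},
}
regs = greedy_colour(graph)
# a, b, c mutually interfere, so they need three registers;
# d only clashes with c and can reuse one of them.
assert len(set(regs.values())) == 3
```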

Here's a piece of assembler I wrote for the LSI-11:

https://github.com/DigitalMars/Empire-for-PDP-11/blob/master...


You might be interested in lft's talk[1] - "Poems For Bugs" - about why people are still writing very impressive demos for the C64 and other highly limited platforms. Instead of thinking about programming as functions and objects that divide the problem into simple, mostly-independent modules, it's more like trying to write a highly constrained poem of asm instructions that fit together in incredibly clever ways[2].

[1] https://www.linusakesson.net/programming/poems-for-bugs/inde...

[2] e.g. https://www.linusakesson.net/scene/a-mind-is-born/index.php

edit:

I love the Zachtronics games "Shenzhen I/O" (easier) and "TIS-100" (MUCH harder/esoteric). They are a great (and fun) way to try asm/microcontroller programming.


Just for fun, here's my IEEE 754 implementation of double precision floating point for the 8088:

https://github.com/DigitalMars/dmc/blob/master/src/CORE16/DO...
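If you want to poke at the bit layout such an implementation has to produce, here's a small Python sketch (not related to the linked 8088 code) that unpacks a double into the sign / biased-exponent / mantissa fields that IEEE 754 binary64 defines:

```python
# Unpack an IEEE 754 double into its three bit fields.
import struct

def fields(x):
    (bits,) = struct.unpack(">Q", struct.pack(">d", x))
    sign = bits >> 63
    exponent = (bits >> 52) & 0x7FF          # 11-bit biased exponent
    mantissa = bits & ((1 << 52) - 1)        # 52-bit fraction
    return sign, exponent, mantissa

# 1.0 is stored as biased exponent 1023 with an all-zero fraction
# (the leading 1 bit of the significand is implicit).
assert fields(1.0) == (0, 1023, 0)
# -2.0 flips the sign bit and bumps the exponent by one.
assert fields(-2.0) == (1, 1024, 0)
```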


I learned to program on a 6502 when I was younger - it's a little depressing that computers went and got so fast that there's no real justification for doing this sort of thing any more in almost any domain (except, I guess, if you manage to land a position developing embedded I2C drivers).

Join the ultra-low-power crowd! There are lots of fun things to make while trying to squeeze it in a couple of KB and a few kHz.

Yes but there are new primitive domains to master, like driving a motor with an arduino, or programming FPGAs.

I beg to differ; shader / GPU programming has been a haven for exactly 'that sort of thing'. Particularly on home consoles, where the hardware doesn't change year on year but the need to improve is still present.

You still need to know a bit of assembly to take advantage of SIMD (even when using compiler intrinsics). See for instance https://github.com/lemire/simdjson

As a senior Java / CPP developer, reading what Wozniak had written, it might as well have been some incantation in a forgotten language.

I really wish I had done more embedded and assembly at school. I can't understand it or appreciate it.

Maybe SW and HW shouldn't mix, bar a few visionaries who bridge both worlds?


Why shouldn't they mix? I now mostly work on high-level code (C++, Python, Delphi/FreePascal/JavaScript, etc.). But every once in a while a fun project comes in. For example, I had no problem creating a 4-quadrant, torque-driven, 3-phase AC motor controller using an AT90USB1286 microcontroller from Atmel. Granted, I did not use assembly for programming, as plain C was good enough. Or a police-light flashing-pattern controller on a PIC32MX795. Fun fun fun ;)

> As a senior Java / CPP developer

This makes me cringe. A "senior" engineer is defined by whatever role a company slots you in. This is compounded by adding programming languages to the mix.

> I really wish I had done more embedded and assembly at school. I can't understand it or appreciate it.

There is no hardware involved here, per se. What is there is the 6502 instruction set, plus binary floating point calculations. But the fact it is on the 6502 is only interesting because of the constraints.

If you are interested, pick up some tutorials on x86 assembly (or ARM, or whatever you have emulators or real hardware for) and go to town. Actually creating full programs (with syscalls) is a bit involved, but a small ASM function inside C++ (since you mentioned C++) should be perfectly doable (example: https://github.com/diasurgical/devilution/blob/master/Source... – you'll notice the Diablo game was not an embedded system either)


On Linux, a full asm program with syscalls isn't too bad.

Here's one example:

https://jameshfisher.com/2018/03/10/linux-assembly-hello-wor...


A 16-bit realmode DOS Hello World is also very simple, being less than 2 dozen bytes (8 bytes of actual machine instructions, and 15 bytes of message...) and something you can enter into DEBUG and run immediately if you have a 32-bit Windows system; on a 64-bit one, DOSBox or similar emulators will work well too:

    mov ah, 9
    mov dx, 108
    int 21
    ret
    db "Hello world!" 0D 0A "$"

And if you don't have a 32-bit Windows system, you could always just run Windows 95 in your browser

https://win95.ajf.me/win95.html


Why bother with Windows in that case? Just emulate DOS directly.

Not as cool though!

You can have fun writing simple programs too:

https://github.com/skx/math-compiler/


Why should SW and HW not mix?

Well, my day-to-day job is writing applications that will be managed at runtime by a JVM on a Linux distro which is in turn virtualised.

I am so far away from the bits...


Hardware smells of rosin core and ozone. Software smells like someone hasn't showered or changed clothes in two weeks. Not a good mix.

Play the Zachtronics games if you want a taste of it.

I did assembly on the 6502, the 6809 family, and the 68000 for quite a few years. While I did enjoy TIS-100 and other Zachtronics titles, those are actually quite a bit more challenging than real assembly programming. In real assembly you're pretty much never limited to 1 or 2 bytes of RAM to work with. Zachtronics games are more like puzzle games than a real taste of assembly. For a real taste of assembly, write a little game in real assembly, or maybe calculate Fibonacci numbers in 6809 assembly, or maybe play with Chip-8.

I'd say grab a free emulator for an 8-bit machine, like Fuse http://fuse-emulator.sourceforge.net/ or VICE http://vice-emu.sourceforge.net/ .

> Maybe SW and HW shouldn't mix

All of us who work in embedded software would disagree!


How curious that there's a reference to Jef Raskin on the facing page (p.205). Raskin, of course, was a visionary at Apple who would go on to advocate for a version of the Macintosh quite different from what Steve Jobs envisioned, which later appeared as the Canon Cat (a standalone system, but also an Apple ][ expansion card: https://en.wikipedia.org/wiki/Canon_Cat )

And yes, Wozniak is one of the great engineers of our time. I cut my teeth on an Apple ][, and it was both amazing and mysterious to me then, as a child. Now, with 40 years of professional programming experience, the mystery is largely gone but my appreciation for the elegance of his design is only amplified.

On the off-chance he'll see this:

Mr. Wozniak, thank you for creating something that has given me a career and a lifetime of joy.


The 8051 platform can present even more challenges for assembly language code developers than the 6502. It has been over 35 years since I wrote any 6502 code, and over 30 years since I wrote any 8051 code, but I remember the biggest issue on 8051 was the lack of RAM. You get a total of 384 bytes of RAM (256 bytes of normal RAM and 128 bytes of "zero page" RAM). When I needed to write floating point math functions for the 8051, I realized that fixed point decimal math would meet my requirements. I took advantage of the large ROM space to store a bunch of constants (powers of 10) and the code I wrote operated on decimal ASCII strings. Because of the limited memory, all of my operators would destroy the second argument and replace it with the result.

Code is available upon request.
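As a rough illustration (Python here for readability, not the original 8051 assembly), the fixed-point decimal, destroy-the-second-argument convention on ASCII digit strings looks something like this:

```python
# Sketch of the memory-saving convention described above: fixed-point
# decimal operands kept as equal-length ASCII digit strings (the decimal
# point's position is implied by convention, not stored), with the
# operator overwriting its second argument with the result.

def add_ascii(a, b):
    digits = list(b)                 # "b" is destroyed and reused
    carry = 0
    for i in range(len(a) - 1, -1, -1):
        s = int(a[i]) + int(digits[i]) + carry
        digits[i] = str(s % 10)
        carry = s // 10
    # Any carry out of the top digit is dropped (fixed width wraps).
    return "".join(digits)           # result replaces the second operand

# 12.34 + 01.99, with two implied decimal places, gives 14.33:
assert add_ascii("1234", "0199") == "1433"
```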


Yeah, Intel is an odd bunch. If I had to program an 8-bit machine again, I think the 6809 is still my favorite.

The 6809 was a great 8-bit microprocessor. Any CPU with a SEX instruction is okay in my book.

Totally. That one is beautiful, fast, fun, etc...

I keep a Color Computer 3 because it has that excellent chip in it.


The title says "Wozniac" but it really should be with a final "k", as in "Wozniak".

As, curiously, it is spelt everywhere in the article itself except the title.

I was always very impressed by the floating point routines in BBC Basic (also on the 6502). Anyone know who wrote those, and if they were derived from this work by Rankin and Wozniak?

BBC Basic is the best all-round Basic of its era that I know of, with rich control flow features, proper recursion and built-in assembler. But I don't know how much was original work from Acorn and how much was borrowed from elsewhere. Are there any other 8-bit Basics that are especially good?


Sophie Wilson apparently did most of the BBC Basic implementation, no idea if that also included the floating point routines.

https://en.wikipedia.org/wiki/Sophie_Wilson


He was the second author on the paper. No doubt, Woz has a well-deserved reputation for brilliance, but I wonder about the first author's contributions too. Anyone familiar with Roy Rankin of Stanford?

https://archive.org/details/dr_dobbs_journal_vol_01/page/n20...


Roy contributed the log/exp routines. The code in the Apple II ROM is missing the log/exp routines, and its copyright just credits Woz.

For the two people who might not have seen it yet, here's that fabulous joint interview with Steve Jobs and Bill Gates, where they recount the story of the (missing) floating point routines in Woz's BASIC for the Apple, and how Microsoft came to the rescue.

https://www.youtube.com/watch?v=-LUGU0xprUo#t=14m20s

The whole interview is fantastic.



He also wrote a nifty article back in '81 about calculating e to 116,000 decimal places.

https://downloads.reactivemicro.com/Users/Grant_Stockley/App...


"They presume a four-byte floating point operand consisting of a one-byte exponent ranging from -218 through +127..." Is this a mistake on the magazine?

There are a couple of famous biographies of Steve Jobs. Are there similarly good books to read about Steve Wozniak?


I kept going until about a quarter of the way in, but I just couldn't finish it. It is told very matter-of-factly, which, I'm afraid, isn't conducive to keeping my interest. YMMV, of course.

I need to confess: I am absolutely petrified by floating point. Is there a good resource I can learn it from?

Woz ur the man.

While I think Woz sets a good example, they're human - and I'm sure they've done and/or said things that are less than one might expect and which one probably wouldn't approve of or condone - or perhaps which one might even condemn. I'd caution against canonising people - it sets too high a standard; which no one real can meet.

Perhaps out of place but why are you consistently calling Woz "they"? I'm not a native speaker and it sounds very weird.

As I'm not aware of what pronouns they personally prefer, I fall back to using the singular, gender-neutral, "they/them/their". It's a fairly new usage in colloquial English but I find it to be common enough so as to not impair understanding.

Completely off topic, but is this where we're at now? Calling someone « they », and thus creating confusion, just in case he doesn't belong to the 99.9% of the population who is just fine with his DNA, and even though he never expressed any kind of opinion leading you to think he isn't fine with his gender?

There is no implication that "he isn't fine with his gender". The implication is that the speaker doesn't know, and it doesn't matter enough to find out or make a judgement call.

Personally I still use he/she most of the time unless I have a reason not to, but I also try to get into the habit of assuming less, not least because online in particular it is increasingly noticeable how often we assume someone's gender even in situations we have no reason to.

It's not surprising that some choose to carry that over into discussing people whose preference they don't know, when it doesn't otherwise matter.

One reason to do so may be to explicitly normalise the usage so it doesn't imply a value judgement e.g based on appearance.


While I'm sympathetic to this point of view, it seems pretty clear to me that Woz self-identifies as male. He has quite a visible profile and has never presented as anything other than masculine (with the exception maybe of the odd flourish on dancing with the stars), so to select such a pronoun in this case I believe is disregarding how he would wish to be identified.

Using a gender neutral pronoun isn't disregarding a person's gender identity, any more than referring to them as a "person" is.

I actually really don't mind that much; I believe we can use language in whatever way suits us to express our thoughts and beliefs. GP wasn't being prescriptive upon other people, and I respect that. I am all for addressing people by their favoured pronoun.

« One reason to do so may be to explicitly normalise the usage so it doesn't imply a value judgement »

That's a scary sentence. So gender is now a « judgment »? Like, even though Steve Wozniak has all the physical characteristics of a male and has always been called « mister », using « he » (which is the grammatically correct way of designating a male human in the English language by default) is now the expression of a value judgement... Seeing this kind of mental twist in a scientific forum (which believes there is something called nature, with laws and hard facts) makes me really scared.


It looks as though you're just looking for an excuse to go on a pre-packaged rant about gender and sex. I find it scary that an innocuous use of singular "they" (which has been in use as long as English has existed) would provoke such a thing.

I'm not a native English speaker, so it's possible that I overestimate the change that using "they" as a default pronoun implies for the English language.

I'm extremely sensitive to ideology wanting to change a language to better reflect its theory. It's the first step towards totalitarianism, and it's almost always done with perfectly good intentions (until we start to see the damaging side effects).


Singular 'they' has been in use for hundreds of years. See e.g. this article that I linked to elsewhere in the thread: https://stroppyeditor.wordpress.com/2015/04/21/everything-yo...

Note that many of the historical examples involve contexts where gender would not even have been in doubt (e.g. all students at Cambridge would have been men).


Thanks for the link. So am I dreaming when I think the use of this term has been (re)appearing recently, following the whole gender-neutral pronoun discussion? Or has it always been this common and I just didn't happen to notice it?

I think you're conflating the issue of gender neutrality with more recent discussions of non-binary and trans gender identities. People have always needed gender neutral pronouns to say things like "Everyone has their flaws", or to talk about individuals whose gender is unknown or irrelevant. The only question has been whether it's 'he/him/his' or 'they/them/theirs'. If you speak a dialect of English where 'they' can function as a gender-neutral singular pronoun, then there's really no reason at all not to use that pronoun to refer to Steve Wozniak.

The use of singular 'they' has been common for (at least) several decades, long before any substantial number of people were worried about trans issues.


Can't speak for everyone in the UK, but at least in the area I live we use 'they/they're' a lot, and I don't believe it's a recent thing. We use he/she more, but using they isn't a rare thing.

You completely failed to understand what I was saying, which is that currently, when you say "they", some portion tends to jump to the conclusion that you're trying to make a point about the person you're talking about.

But some people use "they" just as a default when they're not sure. Using it as a default all of the time then serves to remove the consideration of whether or not it is done to try to send some sort of signal about the subject.

As a case in point, you here keep talking about Woz, but there is no reason to assume that using "they" was meant to imply anything about Woz at all. As such, you're demonstrating exactly why, to some, using "they" in a bid to normalize it matters.

As for getting scared about it, it's just a pronoun. If that "scares" you, then that is an indication to me of why it is important.


I don't think you'll end up speaking a very nice language if you start taking into consideration what every single word's implied meaning could be, both when it's used and when it's not, and you start using the most neutral generic term all the time for anything related to any topic that may potentially offend 0.001% of the population.

Remember, one of the goals of a language is also to convey meaning, and if possible in the most precise way (which is already hard enough even without taking moral issues into consideration).


Nobody is asking you to take into consideration what every single word's implied meaning could be.

The only question here is whether using "they" instead of "he" in this case implies something about Woz rather than is just a way for the speaker to be gender neutral in how they are writing about it.

It is already increasingly common to use singular "they" when we don't know the gender - I did it above, when writing about the person who originally used "they" about Woz. The only question then is whether there's any issue with doing so when their gender is relatively well known.

Since singular "they" is already common in English, the only reason why there might be an issue with it in the case of Woz would be because someone concerns themselves with the signalling effect.

We also never use language in the most precise way possible. That is a total strawman. Pretty much everything we write is full of ambiguity, sometimes because it is easier, but often also very much intentionally, e.g. to signal that certain details (such as the gender of the person you're talking about) are irrelevant to what you're saying. And sometimes because it is political, such as signalling that gender does not matter.


Singular 'they' is pretty well established usage by now, regardless of gender considerations. Even people who complain about it still use it: https://stroppyeditor.wordpress.com/2015/04/21/everything-yo...

That is a great treatment. Thanks for the link.

As a sci-fi buff, I expected the new gender neutral pronouns to be something like zey, zem, zir, or whatever. Reusing "they" is unfortunate. And bad grammar.

"They" can be plural as well, which is why it's a bit jarring.

So can "you", and speaking English as a first language all my life I still find it jarring. It's a constant irritation that colloqs such as ye, or yous are considered vulgar.

It is definitely not common when talking about a specific person

Native speaker here--it is extremely weird usage. I struggled to understand who/how many the comment was referring to and had to read it several times.

they -> people like woz

Except for the Woz


