
Apollo 11 Guidance Computer vs. USB-C Chargers - indy
https://forrestheller.com/Apollo-11-Computer-vs-USB-C-chargers.html
======
xp84
I'd also like to see an article examining these computing-power attributes on
the piece-of-garbage embedded computer on the front of gas pumps, where the
lag registering each keypress is like 800ms and every screen redraw takes
about 2000ms. It takes me an extra 60 seconds to get gas now thanks to how
slow these things are, and it's nearly universal regardless of whether they
appear to be 20-year-old or brand-new devices.

> Car Wash Y/N? (wait) > Loyalty (wait) > Fuel Rewards (wait) > Alt ID (wait)
> 0005550000 (wait for the numbers to appear on screen)

I bet they run Java.

~~~
lunias
You forgot: Would you like a receipt? (wait)

It makes no sense at all. As soon as you input payment it should output gas.
There's really no need for buttons or a screen at all (well, maybe except for
entering a zip). If they really have to show all that stuff, it can come after
I have my gas.

Note: languages aren't slow, it's the programs that are slow.

~~~
folli
As a non-US citizen: What's the deal with entering ZIP codes in some gas
stations in the US? Tax reasons?

~~~
kemotep
From what I understand it is for fraud protection. If you steal a credit card
you likely do not know what the billing zip code is.

According to this blog[0] they also collect the data for "marketing purposes".

[0]:[https://blogs.creditcards.com/2014/05/zip-codes-gas-station-...](https://blogs.creditcards.com/2014/05/zip-codes-gas-station-pay-pump-fraud.php)

~~~
Sanzig
It's also extremely annoying when driving in the States as a Canadian, since
our postal codes don't follow the same format. Canadian postal codes are
six-character alphanumeric strings with alternating letters and numbers, so
you can't enter them on the keypad.

The trick, which isn't obvious, is that you're supposed to drop the letters
from your postal code, leaving three digits, and then append two zeros to the
end. For example: K9Z 2P7 -> 92700.
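
In code, the trick is just this (a minimal Python sketch of the conversion described above; the function name is mine):

    def canadian_postal_to_pump_zip(postal_code):
        # Keep only the digits of the Canadian postal code, then pad to five digits.
        digits = "".join(c for c in postal_code if c.isdigit())  # "K9Z 2P7" -> "927"
        return (digits + "00")[:5]                               # -> "92700"

    print(canadian_postal_to_pump_zip("K9Z 2P7"))  # 92700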

In Canada we don't have to deal with this, since pretty much all our pumps
have supported EMV chip cards for years.

------
markshiz
Yes, and unlike the AGC's software, the firmware for the USB-C charger wasn't
woven by hand into the plug (core rope memory). They did that for reliability.
It's amazing the craftsmanship that went into so much of that rocket, from
people of all walks of life.

[http://www.righto.com/2019/07/software-woven-into-wire-core-...](http://www.righto.com/2019/07/software-woven-into-wire-core-rope-and.html)

Here's another great article on the hand-welds on the F-1. In only a short
amount of time we've outsourced so many of these production tasks to other
software/machines. But it's really amazing to contemplate and appreciate what
a work of human hands Apollo was.

[https://www.wired.co.uk/article/f-1-moon-rocket](https://www.wired.co.uk/article/f-1-moon-rocket)

It's hard for me to grasp it, but I can't help but think it beautiful when I
meditate on all the _people_ involved in the moon landing, each person playing
a small part in a very complex symphony.

~~~
sizzzzlerz
There is a scene in From the Earth to the Moon that takes place after the
deaths of the Apollo 1 crew. Frank Borman is testifying before a congressional
panel and, in talking about Ed White, he said: "At the plant, Ed saw a group of
men off to the side so he went over to talk to them, which never happens. It
turned out, they were the men who made the tools that built the spacecraft
that will take us to the moon." It wasn't just the scientists, engineers, and
technicians. Everyone who had a role in the program felt a sense of
responsibility and pride in what they did to take these astronauts to the moon
and bring them back. It was an amazing time, considering all that was
happening at the same time. I feel very fortunate to have been able to witness
the whole thing as a kid.

------
crmrc114
This is awesome! The media loves to say your pocketwatch is more powerful than
X from $date. This is a fun technical rundown.

I think at some point we have to start talking about how far from optimized we
are.
[https://en.wikipedia.org/wiki/MenuetOS](https://en.wikipedia.org/wiki/MenuetOS)
comes to mind when talking about what can be done in fasm. Are we ever going
to get compilers to the point where we can squash things down to that size and
efficiency? Is this yet another thing that AI can promise to solve someday?

~~~
bloopernova
Speaking of fun technical rundowns, I've idly wondered:

- Approximately which year's average desktop PC is equivalent to a current
Raspberry Pi?

- What year's _worldwide_ computing power is equivalent to a current
Raspberry Pi?

I'm sure folks can come up with other comparisons to make. I know it's not
very useful to the world, but it's fun to think of just how much astonishing
compute power we have these days.

~~~
ghaff
For the first question, probably somewhere in the mid-2000s. Maybe early Core
microarchitecture.

~~~
hnuser123456
I'll give you a mid-2000s Celeron, before Core, except four of them because
the Pi is quad-core, and with a few times more RAM, since you can get Pis with
up to 4GB now. I had a 1.8GHz single-core Celeron with 512MB of RAM in 2005,
upgraded to 3.0GHz with 768MB by 2006, then leaped to a Q6600/4GB in 2007. Not
sure how my Radeon 9200 and later X1600 would've compared to the Pi GPU
though.

This guy puts the Pi 4's GPU at 8 GFLOPS:

[https://www.raspberrypi.org/forums/viewtopic.php?t=244519](https://www.raspberrypi.org/forums/viewtopic.php?t=244519)

Looks like the X1600 was around 6 GFLOPS:

[https://en.wikipedia.org/wiki/List_of_AMD_graphics_processin...](https://en.wikipedia.org/wiki/List_of_AMD_graphics_processing_units#Radeon_X1000_series)

~~~
numpad0
You could do dual-socket on Socket 370; don't know about quads though.

------
perl4ever
I note that the Google charger has much lower specs than the others. This is
what I would expect from them. I worked for a company that was excited about
getting them as a customer for our data processing services. We thought (at
least I assume management had high hopes), "Wow, Google, they must have tons
of data and they will pay us tons of money to deal with it." Nope: they
optimized and filtered it to the point where they weren't sending us even 1%
of what other customers were, since they were being charged based on volume.

~~~
zaat
One is a phone charger, the second is a power bank, the third is a charger for
two laptops, the fourth is a spacecraft navigation system.

Now, if you calculate clock speed per watt, you'll see that the Google charger
easily beats the Huawei and the Anker.

------
gok
Yeah modern microcontrollers are probably less efficiently used than they
could be, but the AGC essentially crashed twice during the lunar landing
because it didn't have enough compute available.

~~~
dave_blob
The real fun with the AGC on landing happened on Apollo 14, with a flipped bit
in the abort flag. Percussive maintenance temporarily flipped the bit back,
giving enough time for ground control to develop a hack. Didn't they just
ignore the warnings on Apollo 11? And I think it was non-synced timing
references, rather than strictly resource availability, that got them on 11.

~~~
jvm_
The docking radar (the find-your-way-back-to-the-mothership radar) was still
turned on, so the CPU was receiving that data and having to process it.

The CPU was throwing an "I'm overloaded" signal, but dealing with it, since
the process priorities ranked the docking radar below the landing sequence.

So the CPU was complaining but still doing its job.

All this is from memory :P

~~~
dave_blob
Yes, just looked it up.

I was (partially) wrong about the un-synced frequency though. The synchros
for the docking radar ran from a different power circuit than the AGC, and
whilst they were frequency-locked they weren't in phase, resulting in timings
getting thrown off and the priority issues.

Edit: Superb deep dive on the AGC in Ars from last week:
[https://arstechnica.com/science/2020/01/a-deep-dive-into-the...](https://arstechnica.com/science/2020/01/a-deep-dive-into-the-apollo-guidance-computer-and-the-hack-that-saved-apollo-14/)

------
microtherion
Sure, you could fly to the moon with 4 USB-C chargers. But how would you
charge your phone en route?

~~~
jniedrauer
5 USB-C chargers

------
jedberg
And despite the fact that a wall charger has more compute power than the
Saturn V, we can't actually go to the moon today.

We've actually lost the ability to go. No one understands how the engines on
the Saturn V work anymore. It would take years to tear them apart and analyze
them, and most of the original engineers are dead.

And the guidance systems would have to be built back up from scratch.

Kind of sad if you think about it.

~~~
xkemp
We could easily go to the moon. It's just a completely useless exercise to do
so, unless you can find a way to taunt China into another round of the space
race.

The scientific case for the moon was always weak, and most anything they could
come up with has been done. Just look at the useless lets-grow-salad-in-space-
oh-look-its-just-salad timesinks the ISS has been busying itself with.

So, in a way we've become much smarter. And SpaceX, a commercial outfit with
solid financials, a good track record on safety, and rather unlimited
ambition, is probably the most exciting thing happening since the first moon
landing.

~~~
op00to
> useless lets-grow-salad-in-space-oh-look-its-just-salad timesinks

This is called basic research. It's geared towards building greater knowledge
of a study area without specific concerns towards application. That's how
science works. We don't just fund the stuff that's immediately profitable
(though things are shifting that way).

------
overgard
This is brilliant and hilarious, but, as a person who cares about this, I
honestly have no idea what to do. Fighting software waste feels like getting
in a fistfight with the ocean or something. I could write everything in C or
assembly, and I don't even think that notion is _crazy_, but my clients want
JavaScript. Not that it's even JavaScript's fault, or any particular
language's fault, I just don't know how you convince anyone that maaayybee we
could actually take advantage of the hardware in front of us instead of just
adding extra waste every 18 months.

~~~
dave_blob
As we're in a closed energy loop, we're going to have to address this sooner
rather than later.

Every starry-eyed AI/ML piece has me thinking: yes, but where's the power
going to come from? And where's the power on top to encrypt all this?

And quantum computing? Super-cooling at scale? Aye, right, pal.

The elegance of the AGC and its approach to computing could provide a good
starting point for a way out of our mess.

To quote from a 2004 paper by Don Eyles, one of the AGC programmers:

"When Hal Laning designed the Executive and Waitlist system in the mid 1960's,
he made it up from whole cloth with no examples to guide him. The design is
still valid today. The allocation of functions among a sensible number of
asynchronous processes, under control of a rate- and priority-driven
preemptive executive, still represents the state of the art in real-time GN&C
computers for spacecraft."

[https://www.doneyles.com/LM/Tales.html](https://www.doneyles.com/LM/Tales.html)
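
As an illustration of the idea Eyles describes, here is a minimal Python sketch of a priority-driven executive. The task names, priorities, and timings are made up for the example and are not taken from the actual AGC software; it only shows the shape of "highest-priority ready job runs first, lower-priority work gets deferred when the cycle runs out of time":

    import heapq

    class Executive:
        # Toy priority-driven executive, not the AGC's Executive itself.
        def __init__(self, cycle_budget_ms):
            self.cycle_budget_ms = cycle_budget_ms
            self.ready = []  # min-heap of (-priority, cost_ms, name)

        def schedule(self, priority, cost_ms, name):
            heapq.heappush(self.ready, (-priority, cost_ms, name))

        def run_cycle(self):
            remaining = self.cycle_budget_ms
            while self.ready:
                neg_prio, cost_ms, name = heapq.heappop(self.ready)
                if cost_ms > remaining:
                    # Out of time: defer the job and raise an overload alarm,
                    # loosely like the program alarms during the Apollo 11 landing.
                    heapq.heappush(self.ready, (neg_prio, cost_ms, name))
                    print(f"ALARM: overloaded, {len(self.ready)} job(s) deferred")
                    return
                remaining -= cost_ms
                print(f"ran {name} (priority {-neg_prio}, {cost_ms} ms)")

    ex = Executive(cycle_budget_ms=20)
    ex.schedule(30, 12, "landing guidance")    # highest priority, runs first
    ex.schedule(20, 5, "display update")
    ex.schedule(10, 8, "docking radar read")   # lowest priority, gets deferred
    ex.run_cycle()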

------
sansnomme
One of the most important things left out of most computing narratives is that
the early space programs had a tremendous amount of stuff pre-computed, i.e.
cached and memoized through the hard work of mathematicians and physicists
back on Earth. It's not that these days we don't have to run those numbers,
just that our computers have taken over most of what used to be done by hand.
Noting the amount of human computing manpower involved, in conjunction with
the guidance system, would give a much clearer picture.

~~~
rawoke083600
Interesting... I wonder what part of those pre-computed numbers is actually
safer now, compared with relying on dynamic inputs from sensors (which, as we
have seen, can break, ice over, be installed the wrong way around, etc.).

------
avianes
It would be interesting to also compare the power consumption of the embedded
ARM processor in USB-C chargers with the power consumption of the AGC (the
Apollo 11 guidance computer).

------
ryanmercer
Wow. I'm a vintage computer enthusiast, specifically Atari, and my beloved
Atari 800XL computers only have a 1.79MHz clock speed.

I'll never cease to be amazed by technology. I was born in 1985 and our first
computer in the house (aside from an Atari 2600 and NES) was a machine in
1995, although my computer usage began in 1990 with Apple II variants. My
first portable (Game Boy aside) ran Windows CE around 1998ish and my first
remotely 'smart' phone was a Treo 650 about 14 years ago. I still get plenty
of value out of 8-bit Atari computers, yet the USB charger sitting on my desk,
which I'll discard without a thought if it breaks, has a clock speed 5.5x that
of my beloved Atari 800XL computers. The 800XL did come with 64K of memory
from the factory, so at least there is that, ha.

Just looking at how far the technology has come from 1979 (800XL release date)
to 2020 is mind-boggling. Had you described an Android or iOS phone to me in
1995 I'd have asked "are you writing a science fiction novel?" and now I have
5 of them I use on a daily basis (multiple accounts on a freemium game) and 2
that I carry on my person. It's crazy and wonderful and terrifying and even a
little unbelievable.

------
dreamlayers
I find it a bit scary that a microcontroller controls charger voltage. There
might be a risk of power glitches or bugs causing output voltages that destroy
devices.

~~~
afraca
I remember the guy who reviewed a lot of USB-C cables for compliance (NathanK)
had a video on how non-compliant cables can wreck devices:
[https://www.youtube.com/watch?v=SjeZB12985c](https://www.youtube.com/watch?v=SjeZB12985c)

Maybe that's an example of what you mean?

~~~
jaclaz
The spreadsheet by NathanK is here:

[https://docs.google.com/spreadsheets/d/1vnpEXfo2HCGADdd9G2x9...](https://docs.google.com/spreadsheets/d/1vnpEXfo2HCGADdd9G2x9dMDWqENiY2kgBJUu29f_TX8/pubhtml)

------
informatimago
And that's nothing. Wait for AI in power sockets! For now, they only have
simple processors in USB-C chargers, but soon there will be more intelligence
in there than you can imagine! :-)

~~~
jonplackett
I think this is the thing we'll be looking back on the same way.

AI in EVERYTHING, because it'll be so cheap in 20 years.

Like maybe those useless automatic taps or hand dryers will finally actually
know if hands are under them properly, because they'll have the equivalent of
a modern-day supercomputer's worth of AI power trying to figure it out.

~~~
kragen
Maybe. 20 years ago Moore's Law was in effect. We've become accustomed to
"it'll be so cheap in 20 years" because, from 1947 to 2010, transistors
dropped in price by a factor of two every 2 years, while getting faster
starting in 1975 due to Dennard scaling. Roughly, for ten thousand transistors
or so, the prices (in inflation-adjusted US dollars) and response times varied
very roughly as follows:

1950: $1'048'576 (1 μs)

1960: $32'768 (100 ns)

1970: $1024 (100 ns)

1980: $32 (20 ns)

1990: $1 (3 ns)

2000: $0.033 (0.5 ns)

2010: $0.001 (0.2 ns)

2020: $0.0005 (0.1 ns)

This is pretty inaccurate, but close enough to explain my point, which is that
most humans alive today have lived their entire lives, and their parents'
entire lives, in this Moore's-Law regime, and it's coming to an end. But their
cultural expectations, formed by three human generations of Moore's Law, have
not yet adjusted; that will take another generation.

But who knows? Maybe some other similar exponential economic trend will come
along and make AI cheap. But it isn't going to happen just by virtue of
Moore's Law the way it would have in the 1980s or 1990s.
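
For what it's worth, the two rough endpoints above land almost exactly on one halving every two years. A quick sanity check in Python, using only the figures from the list:

    import math

    # Rough endpoints from the list above: price of ~10,000 transistors.
    price_1950 = 1_048_576   # dollars
    price_2010 = 0.001       # dollars
    years = 2010 - 1950

    halvings = math.log2(price_1950 / price_2010)   # ~30 halvings
    print(f"{halvings:.1f} halvings over {years} years -> "
          f"one halving every {years / halvings:.2f} years")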

~~~
ghaff
GPUs, FPGAs, TPUs, etc.--combined with open source and public clouds that
abstract a lot of the complexity--have helped mitigate the impact of the
slowdown of CMOS process scaling especially for the workloads that really need
the performance. But it's reasonable to wonder what happens when we start to
run out of various "hacks" and performance stops getting cheaper and more
power efficient.

Big implications that I'm not sure the tech industry fully appreciates.

------
aasasd
Forget the AGC: the Cypress CYPD4225's clock rate is higher than that of the
PlayStation 1 (48 MHz vs 33 MHz). However, I don't see graphics specs here, so
there's room for growth.

~~~
NegativeLatency
Might be hard to compare, since the PS1 had a custom CPU architecture designed
specifically for 3D graphics.
[https://en.m.wikipedia.org/wiki/PlayStation_technical_specif...](https://en.m.wikipedia.org/wiki/PlayStation_technical_specifications)

------
knolax
> The CYPD4225 is definitely not rated for space. I have no idea if it would
> work in space.

There lies the rub. Even today, computers rated for space are relatively
underpowered compared to consumer hardware. Comparing the microcontrollers
used in USB-C wall warts to the AGC is like saying your car has a higher top
speed than a tank. You wouldn't be wrong, but you'd also be purposely ignoring
some key differences in design goals.

------
cnst
This just made me realise that these chargers are probably not running on Free
Software.

Is it possible to update their firmware? I.e., is there an equivalent of an
"update firmware" button anywhere? If so, Richard Stallman would not approve
of using these non-free chargers. We should not even mention them, lest anyone
think it's acceptable to use them for anything.

~~~
pinewurst
Neither are pacemakers. God forbid RMS ever needs an arrhythmia corrected...

~~~
cnst
But is updating the firmware a normal function of pacemakers?

He draws the distinction at the update-firmware level: if the microwave has no
such functionality, then he does not consider it a computer. Look for
"microwave" at
[http://stallman.org/stallman-computing.html](http://stallman.org/stallman-computing.html):

> As for microwave ovens and other appliances, if updating software is not a
> normal part of use of the device, then it is not a computer. In that case, I
> think the user need not take cognizance of whether the device contains a
> processor and software, or is built some other way. However, if it has an
> "update firmware" button, that means installing different software is a
> normal part of use, so it is a computer.

~~~
Spooky23
I think so. A relative has a similar device with remote monitoring, and it had
to be re-swizzled when they switched services.

------
Andrew_nenakhov
Whenever one of my developers tells me that my phone is too slow when their
app lags on my Android 4.1 phone, I tell them the specs of the Apollo 11
computer and scold them for being unable to deliver a smoothly scrolling list
of messages on a freaking supercomputer in my palm.

~~~
masklinn
Yes because the AGC was well known for its smoothly scrolling message lists on
its large high-density display.

~~~
Dylan16807
Are you actually trying to argue that smooth scrolling is hard, and
specifically because of the display size?

On the CPU side, you can figure out how to position a line of text in less
than 100 instructions if you store it right. But even if you use 10 million
instructions to lay out a handful of lines, you'll never have to hitch.

On the GPU side, the pre-retina A5 had a fill rate of 2 billion pixels per
second, and a brand-new high-DPI phone has 100-200 million pixels per second
to draw at 60fps, let alone comparing like to like. A high-end Qualcomm chip
from 2013-2014 would match that fill rate, and a low-end phone would have half
the power and half the pixels.

We are in a wonderful age of overpowered GPUs for 2D work. There are no
excuses for dropped frames.
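
Rough arithmetic behind those figures; the panel resolutions below are assumed examples picked for illustration, not measured numbers:

    fps = 60
    a5_fill_rate = 2_000_000_000          # pixels/second, figure quoted above

    panels = {
        "1080 x 1920 phone": 1080 * 1920,
        "1080 x 2340 phone": 1080 * 2340,
    }

    for name, pixels in panels.items():
        per_second = pixels * fps          # pixels/second to fully redraw every frame
        print(f"{name}: {per_second / 1e6:.0f} M px/s at {fps} fps, "
              f"{a5_fill_rate / per_second:.0f}x headroom at the quoted A5 fill rate")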

~~~
masklinn
> Are you actually trying to argue that smooth scrolling is hard, and
> specifically because of the display size?

What I'm arguing is that the problems have little to do with one another, and
I'm guessing so do the means. That's like saying building a viaduct is easy
because the pyramids exist.

~~~
Dylan16807
It's all about quickly responding to input.

Because it's the CPU stuff that's the problem here, and that's not
_fundamentally_ different. The display differences are a very separate thing
and not the cause of the problem.

------
kazinator
But, on its way to the Moon, how would that Cortex-M0 fare against cosmic
radiation? One bad bit stops the show.

The smaller your transistors, the more susceptible they are to impact from a
particle.

~~~
Dylan16807
That particular chip, probably badly. But you can buy a rad-hardened M0 with
similar specs:
[https://www.voragotech.com/products/va10820](https://www.voragotech.com/products/va10820)
The price is under a thousand dollars, too.

NASA's also working on a hardened A53 at ~800MHz, and you can get hardened
POWER chips at similar frequencies.

------
ngcc_hk
Working a bit with FPGAs now, I wonder whether even something like a light
switch built with an FPGA can count as a computer for open-source coverage.

------
duxup
I admit I didn't understand much of that but enjoyed reading it nonetheless.

I wonder about the hardware selected for, say, a charger: is the power (CPU,
memory) just a result of that chip being the most cost-effective choice, even
though it's far more than a USB charger needs?

~~~
cnst
I'd guess at some point you simply can't buy anything cheaper than 10MHz?

I'm updating my laptop; it'd probably be fully sufficient to have just 16GB of
DDR4 (I have just 4GB right now, and it ain't that bad), but I can get a 32GB
stick for only about 110 USD, so I might as well go for 32GB to max out the
slot and not have to worry about it later on. If the price for 32GB sticks was
more like $500, I'd probably not bother (although it'd arguably still be a
good investment in our profession, it's just a little harder to justify when
you know the price will come down relatively soon and you don't quite need
that much RAM in the immediate future anyway).

~~~
duxup
That's kinda what I'm assuming. For a chip maker, one chip that covers X use
cases might be more powerful than most of them need, but it covers all those
use cases.

------
janpot
A row comparing the manufacturing cost per unit would make this table
complete.

------
pkaye
I seem to remember there were additional computers and a whole group of
engineers and scientists on the ground to support the smaller AGC in flight.
Kind of like cloud computing.

------
ngcc_hk
Is the AGC more a microcontroller than a computer? Did its programmers think
more in terms of circuits (parallelism, device control, etc.) than of
"computing"?

------
phillipseamore
Reading this was a lot of fun. Great effort!

------
mianos
The ARM M0 thumb software divide takes 45 cycles or less. It is still faster
than the AGC in terms of clock cycles.

------
goodguy1234
I cannot wait for the day my charger comes with CUDA cores or tensor cores and
my on-charger AI modulates my power.

------
urda
It's wild to see how far our technology has come. I enjoyed the retrospective
offered here.

------
guggle
TIL there are CPUs in USB chargers. Ain't it crazy?

------
chironjit
Super interesting article!

------
gowld
> Program Storage Space

Solar System

