Hacker News
Harvard Mark I (wikipedia.org)
44 points by angrygoat on Aug 8, 2019 | hide | past | favorite | 23 comments

A fun thing about this computer is that you see it (for free) every time you're in the Science Center at Harvard. You don't have to go to some special museum.

But it wasn't actually a computer. It was technically just a calculator. The way it was designed meant that it was not the Turing-complete and/or von Neumann-type machine that we now describe as a computer. It did some things very, very well, but was not a general-purpose machine capable of running any program asked of it.

You may know this, but for others: in the 1940s the word 'computer' was more commonly associated with a human performing computations (aka calculations) [0]

0: https://en.wikipedia.org/wiki/Computer_(job_description)#War...

Well, it was still more than a calculator, since it was programmable: a calculator can add or subtract, but can't be instructed to do one and then the other without human intervention.

Yes and no. It is a programmable calculator, but it cannot run an unending program. It doesn't have the "go to X" functionality to bounce back and forth along a long tape of instructions. It can run a sequence of tasks, much like a player piano can, but eventually gets to the end of the program and is therefore limited. A proper computer can calculate anything computable. It might take trillions of years to complete some tasks, but a proper "computer" can run programs with no defined end. Calculators can only run sequences.
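The distinction the commenter is drawing can be sketched in code. Everything below is an invented toy model, not the Mark I's actual instruction set: a player-piano-style sequencer makes one pass and halts, while a machine with a conditional backward jump ("go to X") can loop indefinitely.

```python
# Toy illustration only -- all names and instructions are invented.

def run_sequence(steps, acc=0):
    """Like a programmable calculator: one pass over the tape, then stop."""
    for step in steps:
        acc = step(acc)
    return acc

def run_with_jumps(program, acc=0, fuel=1000):
    """A machine whose instructions may jump backward can loop."""
    pc = 0  # program counter
    while pc < len(program) and fuel > 0:
        op, arg = program[pc]
        if op == "add":
            acc += arg
            pc += 1
        elif op == "jump_if_less":  # conditional backward jump: "go to X"
            target, bound = arg
            pc = target if acc < bound else pc + 1
        fuel -= 1  # safety cap, since such programs need not terminate
    return acc

# One pass over three steps: 1 + 2 + 3
print(run_sequence([lambda a: a + 1, lambda a: a + 2, lambda a: a + 3]))  # 6

# Keeps jumping back to instruction 0 until acc reaches 10
print(run_with_jumps([("add", 1), ("jump_if_less", (0, 10))]))  # 10
```

The `fuel` cap hints at the flip side: once you allow backward jumps, you can no longer tell in general whether a program will ever finish.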

That does not compute.

If WW2 had lasted another five years, I'm curious what else would have been accomplished in that timeframe. They were all just starting to perfect some advanced technology.

The motivation from people dying and patriotic goals, combined with budget (though a constrained one) and every person in the country attempting to contribute to the effort, seems like a good recipe for new technology development. Especially for finding the stuff that will actually do something useful today and not in 10 years, since it was already needed yesterday.

Obviously the costs of the war outweigh the value from this, but it's interesting to think about.

If John von Neumann had only lived another ten years, I'm curious what else would've been accomplished!

There are other alternative paths of technology that we could've gone down. A Turing machine is isomorphic to the lambda calculus and possibly isomorphic to many other machines.

Like how JavaScript is an arbitrary way to do frontend programming, it's possible the current computing paradigm is also arbitrary.

Not just possibly. Here's a non-exhaustive list of models of computation that are all equivalent:

+ Turing machines

+ Lambda calculus

+ General recursive functions

+ Mu-recursive functions

+ SKI calculus

+ Pi calculus

+ Register machines

+ The semantics of essentially every mainstream general programming language

This remarkable profusion of equivalences is the main reason why the Church-Turing Thesis (an informal statement that our intuitive notion of computation coincides with Turing machines and/or the lambda calculus) is commonly accepted.
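As a small taste of one model on that list, the SKI combinator calculus can be encoded directly as curried Python functions. This is just an illustrative sketch of the two combinators S and K, not an evaluator for the calculus:

```python
# SKI combinator calculus, encoded as curried Python lambdas (a sketch).
# S x y z = x z (y z);  K x y = x.

S = lambda x: lambda y: lambda z: x(z)(y(z))
K = lambda x: lambda y: x

# The identity combinator need not be primitive: I = S K K, because
# S K K a  ->  K a (K a)  ->  a.
I = S(K)(K)

print(I(42))       # 42
print(I("hello"))  # hello
```

That just these two combinators suffice to express any computable function is one of the equivalences the Church-Turing Thesis rests on.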

“it's possible the current computing paradigm is also arbitrary.”

I think that’s true. Makes me wonder how things would look if you redesigned everything with today’s knowledge and didn’t have to worry about existing infrastructure.

Not just that. Arbitrary choices are made at every level of computing, from the logic below the Turing machine all the way up to JavaScript for browsers.

Imagine if they decided to use 3 voltages to represent numbers. Instead of 3.5 volts for 1 and 0 volts for 0, what if we had a trinary representation: 5 volts for 2, 3.5 volts for 1, and 0 volts for 0.
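One concrete consequence of a three-level scheme like that is digit density: the same integer needs fewer ternary digits than binary ones. A minimal sketch (ignoring voltages, just comparing digit strings):

```python
# Sketch: the same integer written with two digit values (binary)
# versus three digit values (the trinary scheme described above).

def digits(n, base):
    """Digits of a non-negative integer in the given base, most significant first."""
    if n == 0:
        return [0]
    out = []
    while n > 0:
        out.append(n % base)
        n //= base
    return out[::-1]

n = 100
print(digits(n, 2))  # [1, 1, 0, 0, 1, 0, 0] -- seven binary digits
print(digits(n, 3))  # [1, 0, 2, 0, 1] -- only five ternary digits
```

In general a number needs about log2(3) ≈ 1.58 times fewer trits than bits, which is part of why ternary machines looked attractive on paper.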

Or what if analog computing had taken off instead? Instead of trying to make everything digital, we'd have built machines that are functions of analog voltage waveforms and poured all of our abilities into reducing noise instead of upping clock speed.

Is the technology we use now the result of arbitrary choices made out of pure luck, or would all civilizations, given enough time, develop digital computing due to inherent limitations of the physical world?

No need to imagine. Ternary computers have been built. They do work, but are less practical than binary.

The first modern, electronic ternary computer was Setun, built in 1958 in the Soviet Union. It is claimed that it had notable advantages over the binary computers which eventually replaced it, such as lower electricity consumption and lower production cost. But as far as I understand it, the Setun actually used 2 bits to record 1 trit, according to the Google-Translated English of the Russian Wikipedia article (https://ru.m.wikipedia.org/wiki/Сетунь_(компьютер)):

"Based on the Gutenmacher binary ferritodiode cell, which is an electromagnetic proximity switch based on transformer-type magnetic amplifiers, N. P. Brusentsov developed a ternary ferritodiode cell [1][2], which operated in a two-bit ternary code, i.e. one trit was recorded in two binary digits, the fourth state of two binary digits was not used. ... Two-bit binary encoded ternary digits (2-Bit BinaryCodedTernary, 2B BCT representation, "two-wire") using all 4 out of 4 possible codes (two out of 4 codes encode the same ternary digit out of 3)."
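The 2B BCT idea in that quote can be sketched as a lookup table: each trit is stored in two binary digits, and since two bits have four states but a trit only three, one state is either unused or made redundant. The specific code assignment below is hypothetical, chosen only for illustration; the quote doesn't specify Setun's actual mapping.

```python
# Hypothetical sketch of 2-bit binary-coded ternary (2B BCT):
# one trit per pair of bits. The mapping here is invented for
# illustration; two of the four codes collapse to the same trit.

BITS_TO_TRIT = {
    (0, 0): 0,
    (0, 1): 1,
    (1, 0): 2,
    (1, 1): 2,  # redundant code for the same trit
}

TRIT_TO_BITS = {0: (0, 0), 1: (0, 1), 2: (1, 0)}  # canonical encoding

def encode(trits):
    """Encode a word of trits as a list of 2-bit pairs."""
    return [TRIT_TO_BITS[t] for t in trits]

def decode(bit_pairs):
    """Decode 2-bit pairs back into trits."""
    return [BITS_TO_TRIT[p] for p in bit_pairs]

word = [1, 0, 2]             # a three-trit word
print(encode(word))          # [(0, 1), (0, 0), (1, 0)]
print(decode(encode(word)))  # [1, 0, 2]
```

Note the overhead this implies: storing a trit in two bits wastes capacity, since two bits could hold four values, which supports the grandparent's point that the machine was ternary on top of binary hardware.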

Maybe the memory used actual trits, while the control logic used bits underneath?

From https://en.wikipedia.org/wiki/Ternary_computer it's not clear to me if there ever were computers that used real ternary logic, instead of just ternary memory and/or ternary logic implemented on top of binary logic.

Yes, I know. I'm saying: what if we had put all our resources into this or some other form of computing? They have analog computers made out of gears as well.

Question is: are they really less practical, or have we just put a lot of effort into binary, and with the same effort ternary would be better?

The beauty of digital is that you avoid the “linear” portion of transistors and always run them saturated. A ternary machine would require running the transistors in the linear region, which ultra-small transistors are terrible at.

There are alternative designs using +/0/- saturated signaling but those have higher voltages and larger transistor counts.

I’ve played around with linear designs which can indeed use fewer parts but you need high quality transistors and they’d still be slow. https://hackaday.io/project/3628-trinity/log/11995-2-transis...

I'm not a semiconductor engineer, but it's hard to see how ternary logic could work reliably with modern low-voltage 7nm process chips.

There's no clear benefit either.

The transistor, of course, arrived in 1947, so that five-year addition wouldn't have pulled it forward much. But I think a lot could have been done with WW2 budgets + desperation to force the arrival of the microprocessor forward from 1971, perhaps by 10-15 years.

TIL "patch" and "loop" weren't always metaphorical -- they were done physically, with punched paper tape!

Also, that has to be the classiest enclosure ever made. Check out some of the designer's other work: https://en.wikipedia.org/wiki/Norman_Bel_Geddes

> After two rejections,[4] he was shown a demonstration set that Charles Babbage’s son had given to Harvard University 70 years earlier. This led him to study Babbage and to add references of the Analytical Engine to his proposal

It's amazing how rediscovering old ideas can create scientific/technological revolutions.

This was in the lobby of the old Aiken Computing Center during my undergrad years, but was moved out when they built the new "cheese wedge" Gates/Ballmer building on the same spot.

Always entertaining to walk by and see the "bugs" (moths) in the (then already defunct) relays...
