Hacker News
Alan Turing wanted to base early computer memory on gin (theregister.co.uk)
67 points by JonnieCache 1572 days ago | 11 comments

Here are Wilkes' words from 1967. The gin part is interesting, but more interesting is the use of programming techniques based on knowing the type of memory in use to time instructions correctly.

"In ultrasonic memories, it was customary to store up to 32 words end to end in the same delay line. The pulse rate was fairly high, but people were much worried about the time spent in waiting for the right word to come around. Most delay line computers were, therefore, designed so that, with the exercise of cunning, the programmer could place his instructions and numbers in the memory in such a way that the waiting time was minimized. Turing himself was a pioneer in this type of logical design. Similar methods were later applied to computers which used a magnetic drum as their memory and, altogether, the subject of optimum coding, as it was called, was a flourishing one. I felt that this kind of human ingenuity was misplaced as a long-term investment, since sooner or later we would have truly random-access memories. We therefore did not have anything to do with optimum coding in Cambridge.

Although a mathematician, Turing took quite an interest in the engineering side of computer design. There was some discussion in 1947 as to whether a cheaper substance than mercury could not be found for use as an ultrasonic delay medium. Turing's contribution to this discussion was to advocate the use of gin, which he said contained alcohol and water in just the right proportions to give a zero temperature coefficient of propagation velocity at room temperature"
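The placement trick Wilkes describes can be sketched in a few lines of Python. This is a toy model (32 words circulating in a line, one word passing the read point per cycle, made-up execution times), not the parameters of any real machine:

```python
# Toy model of "optimum coding" on a 32-word delay line / drum track.
N = 32  # words circulating end to end in the line

def run(placement, exec_times):
    """Total time to execute instructions stored at the given positions.
    The line advances one word per cycle; fetching the word at position
    q when the line is at position h costs (q - h) % N cycles of waiting."""
    t, head = 0, 0
    for pos, ex in zip(placement, exec_times):
        wait = (pos - head) % N       # time spent waiting for the word to come around
        t += wait + ex
        head = (pos + ex) % N         # where the line has rotated to when we finish
    return t

exec_times = [3, 5, 2, 7, 4, 6, 1, 2]  # invented per-instruction timings

# Naive layout: consecutive addresses 0, 1, 2, ...
naive = list(range(len(exec_times)))

# Optimum coding: place each instruction at the position that will be
# passing the read point the moment the previous one finishes, so the
# waiting time is zero.
optimum, head = [], 0
for ex in exec_times:
    optimum.append(head)
    head = (head + ex) % N

print(run(naive, exec_times), run(optimum, exec_times))
```

With the naive layout almost every fetch just misses its word and waits nearly a full revolution; with the "cunning" layout the total time collapses to the bare sum of the execution times.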

Reminds me of The Story Of Mel (http://www.catb.org/jargon/html/story-of-mel.html)

Yes, this sort of programming was pretty common. Today we mostly don't have to think about it, but we do something very similar when we want lots of performance: we make sure that our instructions and data are in the CPU cache.
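The modern analogue is easy to demonstrate. A rough sketch (assumes NumPy; array size and timings are arbitrary): summing the same array along its contiguous axis versus against it, where the only difference is how the traversal order interacts with memory layout and the cache:

```python
import numpy as np
import timeit

a = np.zeros((2048, 2048))   # C order: each row is contiguous in memory

def row_major():
    # Walks memory sequentially -> cache friendly
    return sum(a[i].sum() for i in range(a.shape[0]))

def col_major():
    # Strides a full row's worth of bytes per element -> cache hostile
    return sum(a[:, j].sum() for j in range(a.shape[1]))

print("row-major:", timeit.timeit(row_major, number=3))
print("col-major:", timeit.timeit(col_major, number=3))
```

Same result, same number of additions; the row-major version is typically noticeably faster, for essentially the same reason the drum programmers laid their words out around the track.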

Yup. GPU pipelines are another very common example, with the notable difference that pretty much every GPU programmer will optimise for the render pipeline, which is not the case for the CPU.

> EDSAC - widely accepted as the first proper "stored program" computer

This is debatable at the very least, since the Manchester Baby was completed a year earlier[0] and the Manchester Mark 1[1] was operational before EDSAC ran its first program.

[0] https://en.wikipedia.org/wiki/Manchester_Small-Scale_Experim...

[1] https://en.wikipedia.org/wiki/Manchester_Mark_1

That raised my eyebrow as well. The 65th anniversary of the Baby was just a week ago, so it was fresh in my mind. It was experimental, but the follow-on Mark 1 does seem to have beaten the EDSAC by about a month. I love reading about those old machines, so much happening so fast.

Posting this here in the hope that I'll see some kind of TTL+booze monstrosity on hackaday in the near future.

You want some tonic with your memory storage?

Gives a whole new meaning to the phrase "drink to forget"!

Funny. I use gin to remove memories.

Bender doesn't usually drink gin but would agree in principle...
