Hacker News: The Water Computer (2016) (inquisition.ca)
61 points by tejohnso 3 months ago | 26 comments



What is referred to as a "water transistor" in the article is in fact a standard industrial part, known as a hydraulic relay:

https://www.bermad.com.au/category/hydraulic-relays/

Instead of the BJT that is shown in the article, these act more like MOSFETs; and in fact, the "water analogy of electricity" translates quite well, since the "gate" has no current flow besides the tiny amount of "leakage" past the seals, and the volume of the actuating chamber and its return spring acts like the gate capacitance.
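The MOSFET analogy can be sketched as a toy model: the gate port only fills a small actuating chamber (the "gate capacitance") against a return spring, and once chamber pressure exceeds the spring preload the main valve opens. All names and numbers below are illustrative, not from the article or the Bermad parts:

```python
# Toy model of a hydraulic relay behaving like an idealized MOSFET:
# the "gate" fills an actuating chamber (gate capacitance) and, once
# pressure beats the spring preload, the main flow path conducts.

class HydraulicRelay:
    def __init__(self, threshold=1.0, leakage=0.01):
        self.threshold = threshold   # spring preload the gate must overcome
        self.leakage = leakage       # tiny seepage past the seals per step
        self.chamber = 0.0           # "gate capacitance" charge (pressure)

    def apply_gate(self, pressure):
        """Pressurize the actuating chamber; no steady gate flow needed."""
        self.chamber = pressure

    def step(self):
        """Chamber slowly bleeds down, like gate charge leaking away."""
        self.chamber = max(0.0, self.chamber - self.leakage)

    def conducts(self):
        """Main valve is open only while chamber pressure beats the spring."""
        return self.chamber >= self.threshold

relay = HydraulicRelay()
relay.apply_gate(1.5)
print(relay.conducts())  # True: valve open
for _ in range(200):     # let seal leakage drain the chamber
    relay.step()
print(relay.conducts())  # False: the spring has reclosed the valve
```

The leakage term is also why the "high-Z" state mentioned below only holds a level for a while, just like dynamic logic.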

This is just to clear the OUT hose, so it will be ready for the next cycle. If this wasn't done, the WT could get stuck at "Logical One" in some circumstances.

...and that corresponds to the "high-Z" state, which along with the "capacitance" of the circuit would actually be usefully analogous to dynamic CMOS.

One more remark from someone who has worked with hydraulics before: the valve drawn in the diagram is known as a spool valve, and usually, when diverting fluid flow, the chambers are arranged so that they don't create any net force on the valve spool --- see how pressure on the "VCC" or "GND" ports would act on the land and tend to push the valve? In practice a "balanced spool" design is used, where the pressure of the controlled fluid is balanced on both ends:

http://www.modernhydraulics.net/images/hydes/uploads/2011/01...


I wonder if one day we'll ~program other than linear topologies. Something more like chemical reactions in 3D. Or maybe that would lead to biology.


Remember that the complexity of programming is built on the simplicity (structured, independent, repeatable actions) of the processor/language/OS.

Mechanical (esp 3D) and Chemical systems are so complex that modeling the results of highly simplified systems is often almost impossible, results are nonlinear and mixed with "unrelated" inputs and prior state, and thus results are "chaotic" and unpredictable. That's not usually what you want to base a complex system on.

Now if you can simplify operations like DNA... you may have a chance, but it's still quite difficult.


Understood, but complexity ~just needs a proper abstraction (see quantum mechanics) to make it reasonable.


I'd like to agree, but when I look at quantum computers all the problems are in scalability, error correction, temporal stability... etc. Those have to do with many of the same problems that plague mechanical and chemical systems: non-local coupling, non-linear mixing, inherent thermal noise.


You know more than I do it seems, I might be naive. Although, transistors were feeble at first.. who knows.


The beautiful thing about electronics (and hydraulics) is their inherent one-dimensionality (although you can create higher-dimensional interconnections). Electrical components in particular can be spectacularly linear (capacitors and log-lin diodes) over 4-8 orders of magnitude.

The great thing about transistors is their localized non-linearity (gain)... which was used to create simplified digital systems from analog circuits. This allowed fan-out and noise tolerance... which allowed more complex abstractions, processors and OSes to be built on top of them.


I guess even if we try chemo-spatial things.. we'll end up reusing linear network topology knowledge. But then.. I see radial wave local interactions and diffusion laws.. Just a thought.


There is also the field of non-moving-parts fluidic devices[0], which did find some industrial applications and might even be in use today. Unfortunately, because they have no moving parts there's not much to see when they are in operation. They are interesting, though, because they can operate at kilohertz frequencies, can be made quite reliable (no moving parts), and can operate in conditions electric circuits cannot. Because operation depends on shape rather than material properties, fluidic devices can work in harsh conditions like inside nuclear reactors, in molten steel, or in intense electromagnetic fields. Having no moving parts also makes them fairly insensitive to high accelerations and shocks, which is useful for things like missile guidance systems[2] and tank gun stabilization systems[3]. One very interesting demonstrated use of fluidics was an aircraft autopilot built with very few moving parts[1]. They found their biggest use for a period in industrial control systems, controlling things like factory equipment.

[0] http://miriam-english.org/files/fluidics/FluidControlDevices...
[1] https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/197300...
[2] https://apps.dtic.mil/dtic/tr/fulltext/u2/a204334.pdf
[3] https://archive.org/details/DTIC_ADA038394


Puts me in mind of MONIAC, which used tanks, floats and fluid valves to model the UK economy. It was created by an NZ economics student in the 50s, part time in his garage, from war-surplus bits. At this point it sounds far too Heath Robinson to be real, but it was, and several were sold worldwide. It could be adjusted to run within 2% of reality - to the surprise of its creator.

It was the inspiration for Terry Pratchett's Making Money, just without the magic ability to control the economy. :)

https://en.wikipedia.org/wiki/MONIAC_Computer


To add on to this, the creator of MONIAC - Bill Phillips - is easily the most badass economist/mathematician to have lived in the 20th century.

He worked as a crocodile hunter and cinema manager in Australia and New Zealand and later moved to China during the Sino-Japanese War, but had to escape via the Soviet Union during the height of the Stalinist purges. Yet immediately after that he joined the RAF and was posted to Singapore, which was then invaded by the Japanese, and he was sent to a POW camp after being caught trying to escape via Indonesia. During his time in the POW camp he learnt Chinese, jerry-rigged a secret water boiler for tea, and built a secret radio. Following WW2 he was made an OBE for his exploits in Southeast Asia.

As if that wasn’t enough, he went back to school to study maths and economics at LSE and ended up developing the eponymous Phillips Curve that is the backbone of macroeconomics (and his coauthor won a Nobel for it, since Phillips died before the recognition).


My friend Jem Finer was actively researching building a water computer.

In the end he found it too difficult ("It would have necessitated a 60 foot high water tower to provide the necessary pressure, and a far greater budget.")

Instead he built one using ball bearings:

http://www.supercomputer.org.uk/ Located at Trinity Buoy Wharf, London E14 0JW

http://supercomputer.org.uk/photographs.html http://supercomputer.org.uk/faq.html


60 ft of head is only ~26 psi, which, unless you need that pressure at a ridiculously high flow rate, is easily achievable with a relatively inexpensive pump.
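For anyone who wants to check the arithmetic, this is just hydrostatic pressure, p = ρgh, with standard constants (the values below are mine, not from the thread):

```python
# Sanity check on the "60 ft of head ~= 26 psi" claim, via p = rho * g * h.
rho = 1000.0          # density of water, kg/m^3
g = 9.80665           # standard gravity, m/s^2
h = 60 * 0.3048       # 60 ft of head, converted to metres

pascals = rho * g * h
psi = pascals / 6894.757  # 1 psi = 6894.757 Pa
print(round(psi, 1))      # -> 26.0
```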


I think he wanted it to be powered only with gravity (for artistic reasons). Also I seem to remember him telling me that it was just far too complicated and fragile. The thing he did build is sitting on the dock by the Thames and it's pretty sturdy.


Interesting application to theoretical mathematics: Terence Tao thinks it may be possible to use a "water computer" to prove finite-time blow-up of the Navier-Stokes equations.

https://terrytao.wordpress.com/2014/02/04/finite-time-blowup...


This is a common idea that has some history and recent practical implementations, ex: https://www.niklasroy.com/workshop/184/PneumaticComputing

I think his OR gate is overthinking the issue. The only thing needed is a bunch of one-way valves on each of the inputs, thus creating a sort of "wired OR": any input can create pressure on the output, but that pressure can't dissipate back to the unpressurized inputs.

By passing this "valve OR" through a single amplifier element set up as an inverter you get NOR gates, which can have a large number of inputs and can be further used to synthesise any circuit.
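The "synthesise any circuit" step can be sketched in a few lines; the gate names and decompositions below are the standard NOR-universality constructions, not anything from the linked page:

```python
# Sketch of the "NOR is universal" point: a "valve OR" feeding one
# inverting amplifier gives NOR, and NOR alone can build everything else.

def nor(*inputs):
    return not any(inputs)            # valve OR, then the inverter stage

def not_(a):    return nor(a)                     # one-input NOR is NOT
def or_(a, b):  return nor(nor(a, b))             # NOR, then invert back
def and_(a, b): return nor(nor(a), nor(b))        # De Morgan's law
def xor_(a, b): return and_(or_(a, b), not_(and_(a, b)))

# Truth-table check that the derived gates behave as expected:
for a in (False, True):
    for b in (False, True):
        assert or_(a, b) == (a or b)
        assert and_(a, b) == (a and b)
        assert xor_(a, b) == (a != b)
```

Note that `nor` takes any number of inputs, matching the point about many-input valve ORs.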


Entire computers built with alternative technologies are a neat project and fun to look at, but I disagree that they make a computer any easier to understand, since they are still overwhelmingly complex. I agree that they make a computer more approachable, by demonstrating that there is no "black magic" involved. I write this comment having recently taught a course on computer architecture to 11-year-olds. In my mind, hierarchy makes a computer easier to understand, and a physical incarnation makes it hard to hide the detail compared to a more abstract representation.

The approach I took in my course, which seemed to work, was bottom up:

* Information

* Bits: answers to yes/no questions

* Binary Numbers and Arithmetic

* Logic and gates: AND/OR/NOT

* Combinatorial logic: XOR, Half-adder, Full-adder, 4-bit adder

* Sequential Logic: RS-Latch, Master-Slave Register, 4-bit register

* A counter (=register+adder)

* Mux/Demux, Encoder/Decoder

* Memory (=registers+decoder+mux)

* Arithmetic Logic Unit (=functions+mux)

* Input/Output

* Program Counter, Instruction Register

* Fetch/Decode/Execute cycle

* Control

* A complete computer using gates

* Machine Code (write and run a program on our gate level computer)

* Assembly Code

The entire course was run using SimcirJS [1].

I think hierarchy is key, as each level of the hierarchy can be presented as a relatively small and simple to understand circuit. For example, the bulk of the gates in the final computer are in the memory, but the memory is presented as a single schematic symbol with the complexity hidden. The child understands what is inside the memory symbol but doesn't have to think about it while comprehending the computer as a whole.

It worked to present the entire computer in symbolic form (in SimcirJS), then demonstrate how we could use different easily understandable technologies to build a simple gate: electricity, mechanics, hydraulics, ... The students were able to see that once we had a basic gate in a technology of choice, we could just "turn the handle" to produce a complete computer by mirroring the easy to understand symbolic form and that there was no "magic" involved.
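The "turn the handle" idea mirrors directly into software as well. Here is a minimal sketch (my own naming, not the course's materials) of the same hierarchy from the outline: gates, half adder, full adder, then a 4-bit ripple-carry adder, where each layer only touches the layer below it:

```python
# Bottom-up hierarchy: gates -> half adder -> full adder -> 4-bit adder.
# Each layer uses only the layer below, which is the "hide the detail" point.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    return XOR(a, b), AND(a, b)          # (sum, carry)

def full_adder(a, b, cin):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, OR(c1, c2)                # (sum, carry-out)

def add4(a, b):
    """Add two 4-bit numbers (ints 0..15), ripple-carry style."""
    carry, result = 0, 0
    for i in range(4):
        s, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
        result |= s << i
    return result, carry                 # (4-bit sum, overflow carry)

print(add4(9, 5))   # -> (14, 0)
print(add4(9, 9))   # -> (2, 1): 9 + 9 = 18 = 16 + 2, so the carry pops out
```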

[1] https://kazuhikoarase.github.io/simcirjs/


Sounds a little like nand2tetris: https://www.nand2tetris.org/


Another link for people interested in this sort of thing:

https://eater.net/8bit


Really fascinating ideas! I would think that one of the difficulties would be dealing with expansion of the plumbing. Anything using flexible hoses will have to deal with this issue.

It might be easier to solve with small 3-D printed digital parts (AND, NAND, etc) that could be fitted together with rigid PVC off a plumbing manifold.

This might allow you to make smaller parts also.

I think you’d also need some sort of hydraulic expansion tank that would act like capacitors - necessary for storing memory and improving signaling.


I wonder if someone could make it by laser-cutting channels and pistons into acrylic sheet, adding springs, and sandwiching it between two more sheets?


Microfluidic computation is an active area of research. See, for instance, Prakash et al.:

https://youtu.be/m5WodTppevo


Very stupid question, please forgive me, but I'm really passionate about alternative computing. How feasible is this in practice? I mean, I kinda understand how it works, but will it work like I imagine if I try to build a circuit like this with a few nodes?


Yes it would work very well, if somewhat slowly. In fact, "hydraulic logic" is how automatic transmissions in cars were controlled before they had computers. Here's an example:

http://www.oldcarmanualproject.com/manuals/trans/GMControlle...


Incredibly feasible. People were building computers long before electricity; electricity is just really, really fast compared to all other methods.


I remember working with an engineering professor who had, early in his career, used a hydraulic computer to solve some equations. What I found amusing was that the model he was working on was for water flow through walls.



