
In 1935?



They had matchboxes in 1935*, so https://en.wikipedia.org/wiki/Matchbox_Educable_Noughts_and_... could've been a quarter-century earlier.

* since the 19th century? https://matchesmuseum.com/en/match-history/
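For anyone curious how MENACE actually learned: each matchbox corresponds to a board state and holds coloured beads, one colour per legal move; a move is chosen by drawing a bead at random, and after the game the beads for the moves played are topped up on a win and confiscated on a loss. A minimal sketch of that bead mechanism on a hypothetical one-move toy game (the game, bead counts, and reward scheme here are all illustrative, only loosely following Michie's originals):

```python
import random

# MENACE-style learner (Michie, 1961): a "matchbox" is a dict mapping each
# legal move to a bead count.  Toy game (an assumption for this sketch):
# the agent wins iff it picks move 2 out of moves 0..2.
WINNING_MOVE = 2

def play_and_learn(box, rounds=500, rng=random.Random(0)):
    for _ in range(rounds):
        # Draw a bead at random: moves with more beads are more likely.
        beads = [move for move, n in box.items() for _ in range(n)]
        move = rng.choice(beads)
        if move == WINNING_MOVE:
            box[move] += 3          # win: add beads for the move played
        elif box[move] > 1:
            box[move] -= 1          # loss: confiscate a bead (keep >= 1)
    return box

box = play_and_learn({0: 8, 1: 8, 2: 8})
best = max(box, key=box.get)
print(best, box)  # the winning move ends up with by far the most beads
```

The feedback loop is the whole trick: good moves become statistically more likely to be drawn next time, with no model of the game at all.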


I wonder how many valves would be needed to build an "AI-capable" computer?

And how much electricity would it consume? Although the heat it generated would probably boil the Great Lakes!


Actually, I went and read a bit about the history here, and although I left a glib comment, there was major practical research into neural network approaches, with theory in 1943 (McCulloch & Pitts) and hardware by 1958 (the Mark I Perceptron) https://en.m.wikipedia.org/wiki/Perceptron

Two major factors, though, set back the neural-net approach:

- a 1969 Minsky & Papert book on perceptrons led to a belief that neural nets, even with more than one layer, had fundamental limits, although the book only showed such limits for single-layer nets; this led to a reduction in funding during various AI winters

- the “deep” in “deep learning” is all about how much larger & deeper neural systems produce substantially better results. Even if you could speculate about this theoretically, reaching the scale/speed needed to see fruitful results was completely impractical until the late 90s, when vector/matrix accelerators (SIMD, GPU-type things) started showing up en masse. I vaguely remember reading about advances in ML in the mid 2000s which sort of had an attitude of “huh, this neural net thing we thought was a dead end turns out to just need MOAR CORES (graph up and to the right)”
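The single-layer limit in the first point is easy to demonstrate: a Rosenblatt-style perceptron converges on any linearly separable function such as AND, but no weights can classify XOR, which is the kind of result Minsky & Papert formalised. A minimal sketch in plain Python (learning rate and epoch count are arbitrary illustrative choices):

```python
def train_perceptron(samples, epochs=25, lr=0.1):
    # Single-layer perceptron: output 1 iff w.x + b > 0,
    # trained with the classic error-driven update rule.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            pred = 1 if w[0] * x0 + w[1] * x1 + b > 0 else 0
            err = target - pred
            w[0] += lr * err * x0
            w[1] += lr * err * x1
            b += lr * err
    return w, b

def accuracy(w, b, samples):
    hits = sum((1 if w[0] * x0 + w[1] * x1 + b > 0 else 0) == t
               for (x0, x1), t in samples)
    return hits / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # separable
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]  # not separable

w, b = train_perceptron(AND)
print(accuracy(w, b, AND))  # 1.0: a line separates AND
w, b = train_perceptron(XOR)
print(accuracy(w, b, XOR))  # never reaches 1.0: no line separates XOR
```

Adding one hidden layer fixes XOR, which is exactly why the book's single-layer result was an over-broad reason to abandon the field.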


Even in the 90s through the late 2000s, when I started working in ML, people pooh-poohed it: not enough data, not good enough algorithms, and computers too slow. And I worked with supercomputers/HPC; you'd think those would have been the first groups to exploit machine learning.

The perceptron was actually a remarkably cool device, way ahead of its time.


Shouldn't need a lick of electricity! If by valves you mean fluidics [1]... at which point, harnessing Niagara Falls and building out a fluidic supercomputer covering the Great Lakes would probably suffice. No worries about the waste heat, though; it's water cooled!

https://en.m.wikipedia.org/wiki/Fluidics


Apologies, I meant a vacuum tube:

https://en.m.wikipedia.org/wiki/Vacuum_tube


No apologies needed; I quite like the image your word choice inspired. Shame about the fish; wrong kind of bits.



