segfault99's comments | Hacker News

Very good. Thought-termination achieved. Branch pruned. Back-tracking...

Now where's my pony?


Back in the day you'd go into an electronics store and there'd be books containing just 555 circuit recipes. Not to mention the magazine articles.

And every EE student back when we tied onions to our belts must have had a lab assignment to spec out a PLL using a 555 and bits and bobs, and then measure transient responses, temperature stability, etc.
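
The sort of recipe those books and lab sheets boiled down to, as a quick back-of-envelope calculator -- the component values below are just made-up examples, not anything from a real design:

    import math

    # Classic 555 astable recipe: frequency and duty cycle from R1, R2 and C.
    def astable_555(r1_ohm, r2_ohm, c_farad):
        t_high = math.log(2) * (r1_ohm + r2_ohm) * c_farad  # output high time
        t_low = math.log(2) * r2_ohm * c_farad              # output low time
        freq_hz = 1.0 / (t_high + t_low)                    # ~= 1.44 / ((R1 + 2*R2) * C)
        duty = t_high / (t_high + t_low)
        return freq_hz, duty

    # Illustrative values: 10k / 47k / 100 nF gives roughly 139 Hz at ~55% duty cycle.
    print(astable_555(10e3, 47e3, 100e-9))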


The world would be a much sadder, drearier place without the 555. That's the nostalgia part out of the way.

Really, it's such a useful, almost universal Lego block of a component that it's hard to imagine it going away anytime soon. Sure, microcontrollers are as cheap as chips these days, but you get a lot more with them. Do I need to say that sometimes more is less? I can think of scenarios where you absolutely don't want to see a chip containing firmware/code that needs auditing and locking down.


Back in the second half of the 1980s there was a brief fashion trend of woollen knit sweaters with IC-mask-style patterns. I'm guessing it was related to designers playing around with the design software and knitting tech made possible by the microprocessor revolution.


We’ve come full circle - knitting tech was the basis for early computing machines!


Jacquard loom!


Yes, and early core memory was also woven by hand. I am not sure if this was just for core rope memory, or if it was more widespread than that.


In December/January 1987 I was doing a vacation EE internship at a power station in Australia. Some of the Hitachi minicomputers still used core RAM. This was in an all-Hitachi Heavy Industries turnkey coal-fired power station commissioned ca. 1985. Pretty sure they had a reference design from boilers and turbines right down to the hardware and software level and kind of cookie-cutter stamped out power stations from it. The Hitachi engineering attitude was obviously "If it works, keep doing it the same way for as long as possible". I was told that for some software (firmware?) updates, they'd simply ship out a new core RAM module -- it's non-volatile, after all.


Early core memories were woven by hand, but IBM rapidly automated the process. (Since most computers from the 1950s to early 1970s used core memory, there was a lot of demand.) However, IBM later found that it was cheaper to have the memories assembled by hand in Asia. For detailed information on core memory, see the book "Memories That Shaped an Industry".

Core rope is different from core memory and much rarer. Core rope is essentially ROM, using much larger cores with wires going through or around a core, storing 192 bits per core. Core ropes were hand-woven (with machine guidance) for the Apollo Guidance Computer.


True dat. But you see there's this thing called 'Engineering Maths'. Apparently it's really bad for real mathematicians' blood pressure.


Analytic combinatorics (the rubric under which mathematicians would place all the region-of-convergence, zeros-and-poles, etc. analysis of generating functions -- formal power/Laurent series, i.e. Z-transforms -- that engineering often focuses on) is not exactly easy-going either. Other common methods (relating convolution to multiplication, inverting transforms, etc.) would traditionally be subsumed under the Operational Calculus of Mikusiński.
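
A toy illustration of the convolution-to-multiplication point, in case it helps (my own example, nothing deeper): multiplying two generating functions / Z-transforms is the same as convolving their coefficient sequences.

    import numpy as np

    a = np.array([1, 2, 3])   # A(x) = 1 + 2x + 3x^2
    b = np.array([4, 5])      # B(x) = 4 + 5x
    # Coefficients of the product A(x)*B(x) = 4 + 13x + 22x^2 + 15x^3:
    print(np.convolve(a, b))  # -> [ 4 13 22 15]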


I forgot to mention that the converse also applies: mathematicians talking about stuff we engineers learned the paint-by-numbers way makes our heads hurt!


When I did EE, I didn't have access to any kind of computer algebra system. I have 'fond' memories of taking Laplace-transform transfer functions and converting them to z-transform form: expand, then re-group and factor. Used a lot of pencil, eraser and line-printer fanfold paper for doing the very basic but very tedious algebra. Youngsters today don't know how lucky they are... (ties onion to belt, etc., etc.)


Did you make sproingies from the tear-off side strips of the printer paper, though? That was the best bit. :P


Of course!


This continued with kids into the '90s. I miss that bit.

https://www.reddit.com/r/nostalgia/comments/b6dptv/folding_t...


Was this professionally or in school? I still did this in an EE program 15 years ago and I can't imagine things have changed since then. I think kids still have to do lots of ugly math in EE classes.


Undergrad. Mid-late 1980s.

I wasn't making a point about mathematics qua mathematics. Was thinking that if I were doing an EE undergrad today, I'd use SageMath or Mathematica to crunch the mechanical algebraic manipulations involved in doing a z-transform.
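
Something like the following, with SymPy standing in for SageMath/Mathematica -- the transfer function here is just an illustrative second-order Butterworth, and the substitution is the usual bilinear (Tustin) one, nothing specific to that coursework:

    import sympy as sp

    s, z, T = sp.symbols('s z T', positive=True)
    H_s = 1 / (s**2 + sp.sqrt(2) * s + 1)           # analogue prototype (illustrative)
    H_z = H_s.subs(s, (2 / T) * (z - 1) / (z + 1))  # bilinear substitution s -> (2/T)(z-1)/(z+1)
    H_z = sp.cancel(sp.together(H_z))               # the "expand, re-group and factor" grunt work
    print(H_z)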


I just recently got my Computer Engineering degree, which is the modern Electronics Engineering, and we had a whole class on transforms. We had to do it on paper, but that professor at Cal State LA knew what the heck she was doing. We learned it good.


No worries; as a self-proclaimed youngster I didn't manage to understand Fourier in 2 days and never bothered again. Also had no prior knowledge of algebra, so maybe that's why I struggled. I've never perceived algebra as useful in anything programming related, and will continue to see it that way, as most problems are solvable without it. I'll let the degree-havers do all that stuff.


> Never perceived algebra as useful in anything programming related

Image, video, and audio processing and synthesis, compression algorithms, 2D and 3D graphics and geometry, physics simulation, not to mention the entire current AI boom that's nothing but linear algebra… yeah, definitely algebra isn't useful in anything programming related.
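
A toy example of just the graphics/geometry item (mine, not the parent's): rotating a batch of 2D points is a single matrix multiply.

    import numpy as np

    theta = np.radians(30)
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])          # 2D rotation matrix
    points = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # three sample points
    print(points @ R.T)                                      # all points rotated at once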


Yeah, that's not what I work with, my guy.


You said "anything programming related", not "anything related to my work", "my guy".


So you're a programmer but you've never assigned a number to a variable or written any math operations? Do you just do string translations or something?


I'm talking about algebra you need a degree for. Well, algebra you learn while getting one, that is.


Plot twist: He's a Haskell guru juggling hylomorphisms blindfolded.


You might find LLMs to be a useful crutch for this to an extent, although it's very easy to take the wrong turn and go off into the deep end. But as long as you keep forcefully connecting it back to practical reality, you can get progress out of it. And of course, never actually make it calculate.


Have I got a video for you.

gingerBill – Tools of the Trade – BSC 2025 https://youtu.be/YNtoDGS4uak


"Book learnin' didn't do me no good no how!"


Apple: Hold my beer!


In the late 1980s I did an electrical engineering internship at a coal-fired power station over summer vacation. The gas furnace igniters ran continuously, but how do you detect the presence or absence of burner flames against the semi-apocalyptic background of ignited pulverised coal dust being air-blasted into the furnace? Have a little window and a photosensor pointing at the burner flame, and FFT the signal. No spectral component spike at x Hz (IIRC x ~= 13? -- it's a burner flame, the underlying dynamics aren't the same as for a candle wick) --> ringing alarms, flashing lights.
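
Roughly the idea in modern NumPy terms -- this is my after-the-fact reconstruction; the sample rate, threshold and exact 13 Hz figure are illustrative guesses, not the plant's numbers:

    import numpy as np

    fs = 200.0                      # assumed sample rate, Hz
    t = np.arange(0, 10, 1 / fs)    # 10 s window of photosensor samples
    rng = np.random.default_rng(0)
    # Simulated signal: ~13 Hz flame flicker buried in broadband furnace noise.
    x = np.sin(2 * np.pi * 13.0 * t) + 0.5 * rng.standard_normal(t.size)

    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(t.size, 1 / fs)
    band = (freqs > 10) & (freqs < 16)                   # window around the flicker line
    flame_present = spectrum[band].max() > 5 * np.median(spectrum)
    print("flame detected" if flame_present else "ALARM: no flame at this burner")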


Thank you for mentioning this! Indeed, a practical application of the flame oscillation research is fire detection and monitoring of combustion processes. I should have mentioned this somewhere.


What was the preferred way of doing FFT at that time?


hasn't the preferred way been Cooley-Tukey consistently since 1965?

https://en.wikipedia.org/wiki/Cooley%E2%80%93Tukey_FFT_algor...
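
For anyone who hasn't seen it written out, the divide-and-conquer idea in its simplest radix-2 decimation-in-time form looks something like this (an illustrative sketch only -- input length must be a power of two, and real libraries do a great deal more):

    import cmath

    def fft(x):
        n = len(x)
        if n == 1:
            return list(x)
        even = fft(x[0::2])                  # FFT of even-indexed samples
        odd = fft(x[1::2])                   # FFT of odd-indexed samples
        out = [0j] * n
        for k in range(n // 2):
            w = cmath.exp(-2j * cmath.pi * k / n) * odd[k]   # twiddle factor
            out[k] = even[k] + w
            out[k + n // 2] = even[k] - w
        return out

    print(fft([1, 1, 1, 1, 0, 0, 0, 0]))     # agrees with numpy.fft.fft on the same input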


Bingo. We certainly learned about Cooley-Tukey in undergrad back then. That power station was 100% Hitachi Heavy Industries turnkey. The control rooms had a Hitachi mainframe and some minicomputers running a proprietary real-time OS (I guess). These were the days when the video controller for a colour industrial process-control raster-display CRT was a waist-high cabinet. So you'd transduce the flicker and then transmit it via an analogue current loop to a rack in the control room annex, convert it back to voltage, A/D it... and crunch the FFT on one of the control room computers. Something like that. Cheap distributed compute just wasn't a thing at the time.


this is so cool, thanks for sharing :)


My first thought was to upload the PDF to Qwen3 and ask it to reimplement the method in Python using NumPy, Astropy, etc. Have to work on the day job, but there could be some educational fun and Jupyter plots in my near future. Anyway, the generated code looks promising and contains the requisite green-tick and bar-graph emojis, so what's not to like?


SciPy implements similar algorithms, but delegates the heavy lifting to - you guessed it - Fortran. Example: https://github.com/scipy/scipy/blob/v1.16.1/scipy/integrate/...
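
For instance (my own toy example, not from the linked file): scipy.integrate.quad hands the actual work to SciPy's QUADPACK-derived routines, which started life as Fortran.

    import numpy as np
    from scipy import integrate

    # Integrate a damped 13 Hz oscillation over one second; quad() defers to QUADPACK.
    f = lambda t: np.exp(-t) * np.cos(2 * np.pi * 13.0 * t)
    value, abs_err = integrate.quad(f, 0.0, 1.0)
    print(value, abs_err)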


The combination of Fortran and AI here starts to be close to 2001. ;)


You mean 2001 of 1969?

The X-Files AI episodes of the '90s are about as close to 1969 as to today.

All anybody could do then was to use their imagination, but is it all that much different today?


Jingle.

Virginia Woolf's Mrs Dalloway makes for a nice stream-of-consciousness study in contrast.


Colorful gossip: It's been reported that Woolf disapproved of the morals of Ulysses.


Indeed. She called him a “queasy undergraduate scratching his pimples” after Ulysses.

